---
title: AI Research Assistant
sdk: gradio
sdk_version: 4.38.1
app_file: app.py
license: apache-2.0
---
# AI Research Assistant

An advanced AI-powered research assistant that combines web search capabilities with contextual awareness to provide comprehensive answers to complex questions.
## Key Features

- **Real-time Streaming Output**: See responses as they are generated for immediate feedback
- **Contextual Awareness**: Incorporates current weather and space weather data
- **Web Search Integration**: Powered by the Tavily API for up-to-date information
- **Smart Caching**: Redis-based caching for faster repeated queries
- **Intelligent Server Monitoring**: Clear guidance during model warm-up periods
- **Accurate Citations**: Real sources extracted from search results
- **Asynchronous Processing**: Parallel execution for optimal performance
- **Responsive Interface**: Modern Gradio UI with example queries
## Architecture

The application follows a modular architecture for maintainability and scalability:

```
myspace134v/
├── app.py                   # Main Gradio interface
├── modules/
│   ├── analyzer.py          # LLM interaction with streaming
│   ├── citation.py          # Citation generation and formatting
│   ├── context_enhancer.py  # Weather and space context (async)
│   ├── formatter.py         # Response formatting
│   ├── input_handler.py     # Input validation
│   ├── retriever.py         # Web search with Tavily
│   ├── server_cache.py      # Redis caching
│   ├── server_monitor.py    # Server health monitoring
│   ├── status_logger.py     # Event logging
│   ├── visualizer.py        # Output rendering
│   └── visualize_uptime.py  # System uptime monitoring
├── tests/                   # Unit tests
├── requirements.txt         # Dependencies
└── version.json             # Version tracking
```
## AI Model Information

This assistant uses the **DavidAU/OpenAi-GPT-oss-20b-abliterated-uncensored-NEO-Imatrix-gguf** model hosted on Hugging Face Endpoints. This open-source language model offers:

- **20 Billion Parameters**: Capable of handling complex reasoning tasks
- **Extended Context Window**: Supports up to 8192 tokens per response
- **Uncensored Capabilities**: Provides comprehensive answers without artificial limitations
- **Specialized Training**: Optimized for research and analytical tasks
## API Integrations

| Service | Purpose | Usage |
|---------|---------|-------|
| **Tavily** | Web Search | Real-time information retrieval |
| **Hugging Face Inference** | LLM Processing | Natural language understanding |
| **Redis** | Caching | Performance optimization |
| **NASA** | Space Data | Astronomical context |
| **OpenWeatherMap** | Weather Data | Environmental context |
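The Redis caching layer can be sketched as a deterministic key scheme. This is a minimal example, assuming queries are normalized before hashing — the exact convention in `server_cache.py` may differ:

```python
import hashlib

# Derive a stable Redis key from a user query. Normalizing case and
# whitespace lets trivially different phrasings hit the same cache entry.
# The "research:" prefix is an illustrative namespace, not the repo's.
def cache_key(query: str, prefix: str = "research:") -> str:
    normalized = " ".join(query.lower().split())
    digest = hashlib.sha256(normalized.encode("utf-8")).hexdigest()[:16]
    return prefix + digest
```

With a scheme like this, `SETEX key ttl value` / `GET key` round-trips become a one-line check before calling the LLM.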
## Enhanced Features

### Streaming Output

Responses stream in real time, allowing users to start reading before the complete answer is generated. This creates a more natural conversational experience.
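The streaming pattern can be illustrated with a plain generator; Gradio re-renders the output component on each `yield` of the accumulated text. The token source here is simulated, not the real model stream:

```python
# Yield the response-so-far after each incoming token, so the UI can
# redraw progressively instead of waiting for the full answer.
def stream_response(tokens):
    partial = ""
    for tok in tokens:
        partial += tok
        yield partial  # Gradio updates the output on every yield

chunks = list(stream_response(["Fusion ", "energy ", "is advancing."]))
```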
### Dynamic Citations

All information is properly sourced with clickable links to the original content, ensuring transparency and enabling further exploration.
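A minimal sketch of the citation step, assuming each search result carries `title` and `url` fields (Tavily results do include these); the actual logic lives in `citation.py`:

```python
# Turn a list of search hits into a numbered markdown source list.
def format_citations(results):
    return "\n".join(
        f"{i}. [{r['title']}]({r['url']})"
        for i, r in enumerate(results, start=1)
    )
```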
### Asynchronous Operations

Weather data, space weather, and web searches run in parallel, significantly reducing response times.
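The parallel fetch can be sketched with `asyncio.gather`; the fetchers below are hypothetical stand-ins for the real ones in `context_enhancer.py` and `retriever.py`, with `asyncio.sleep` simulating I/O latency:

```python
import asyncio

async def fetch_weather():
    await asyncio.sleep(0.05)
    return {"temp_c": 21}

async def fetch_space_weather():
    await asyncio.sleep(0.05)
    return {"kp_index": 3}

async def web_search(query):
    await asyncio.sleep(0.05)
    return [{"title": "Result", "url": "https://example.com"}]

async def gather_context(query):
    # All three run concurrently: total wait is roughly the slowest
    # call, not the sum of all three.
    return await asyncio.gather(
        fetch_weather(), fetch_space_weather(), web_search(query)
    )

weather, space, results = asyncio.run(gather_context("fusion energy"))
```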
### Contextual Intelligence

Each query is enhanced with:

- Current weather conditions
- Recent space events
- Accurate timestamps
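Assembling that context block might look like the following sketch; the field names and formatting are illustrative, not the repo's actual schema:

```python
from datetime import datetime, timezone

# Build a small plain-text context header that gets prepended to the
# user's query before it reaches the LLM.
def build_context(weather: str, space_events: list) -> str:
    ts = datetime.now(timezone.utc).isoformat(timespec="seconds")
    lines = [
        f"Timestamp (UTC): {ts}",
        f"Weather: {weather}",
        f"Space events: {', '.join(space_events)}",
    ]
    return "\n".join(lines)

ctx = build_context("clear, 21 C", ["M1-class solar flare"])
```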
### Server State Management

Intelligent monitoring detects when the model server is initializing and provides clear user guidance with estimated wait times.
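Warm-up handling might look like this sketch, assuming the endpoint answers HTTP 503 while the model loads (Hugging Face endpoints commonly behave this way). `call` is a stand-in for the real request function in `server_monitor.py`:

```python
import time

# Poll until the server stops reporting 503, up to a wait budget.
def call_with_warmup(call, max_wait=60, interval=5):
    waited = 0
    while True:
        status, body = call()
        if status != 503:
            return body
        if waited >= max_wait:
            raise TimeoutError("Model is still warming up; try again later.")
        time.sleep(interval)  # back off while the server initializes
        waited += interval
```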
## Getting Started

### Prerequisites

- Python 3.8+
- Hugging Face account and token
- API keys for Tavily, NASA, and OpenWeatherMap
- Redis instance for caching

### Setup Instructions

1. Clone the repository.
2. Set the required environment variables:

   ```bash
   export HF_TOKEN="your_hugging_face_token"
   export TAVILY_API_KEY="your_tavily_api_key"
   export REDIS_HOST="your_redis_host"
   export REDIS_PORT="your_redis_port"
   export REDIS_USERNAME="your_redis_username"
   export REDIS_PASSWORD="your_redis_password"
   export NASA_API_KEY="your_nasa_api_key"
   export OPENWEATHER_API_KEY="your_openweather_api_key"
   ```

3. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

4. Run the application:

   ```bash
   python app.py
   ```
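With the variables exported, a fail-fast startup check is a useful pattern: report every missing variable at once instead of failing on a late `KeyError`. This sketch mirrors the list above; the repo's actual validation, if any, may differ:

```python
import os

# Names required by the setup instructions above.
REQUIRED = [
    "HF_TOKEN", "TAVILY_API_KEY", "REDIS_HOST", "REDIS_PORT",
    "REDIS_USERNAME", "REDIS_PASSWORD", "NASA_API_KEY",
    "OPENWEATHER_API_KEY",
]

def missing_vars(env=os.environ):
    """Return the names that are unset or empty."""
    return [name for name in REQUIRED if not env.get(name)]
```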
## System Monitoring

The assistant includes built-in monitoring capabilities:

- **Server Health Tracking**: Detects and reports server state changes
- **Performance Metrics**: Logs request processing times
- **Uptime Monitoring**: Tracks system availability
- **Failure Recovery**: Automatic handling of transient errors
## Example Queries

Try these sample questions to see the assistant in action:

- "What are the latest developments in fusion energy research?"
- "How does climate change impact global food security?"
- "Explain the significance of recent Mars rover discoveries"
- "What are the economic implications of AI advancement?"
## License

This project is licensed under the Apache 2.0 License; see the LICENSE file for details.

## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

## Support

For issues, questions, or feedback, please open an issue on the repository.