FinnewsHunter: Multi-agent financial intelligence platform powered by AgenticX. Real-time news analysis, sentiment fusion, and alpha factor mining.
# FinnewsHunter: Multi-Agent Investment Decision Platform Driven by Financial News
<div align="right"> <a href="README_zn.md">中文版</a> | <a href="README.md">English</a> </div>

<div align="center"> <img src="assets/images/FINNEWS_HUNTER_LOGO.png" alt="FinnewsHunter Logo" width="450"> </div>

An enterprise-grade financial news analysis system built on the AgenticX framework, integrating real-time news streams, deep quantitative analysis, and multi-agent debate mechanisms.
FinnewsHunter goes beyond traditional text classification by deploying multi-agent teams (NewsAnalyst, Researcher, etc.) to monitor multiple financial news sources in real-time, including Sina Finance, National Business Daily, Financial World, Securities Times, and more. It leverages large language models for deep interpretation, sentiment analysis, and market impact assessment, combined with knowledge graphs to mine potential investment opportunities and risks, providing decision-level alpha signals for quantitative trading.
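The sentiment-to-signal step described above can be pictured as a weighted fusion of per-source sentiment scores. The sketch below is illustrative only; the function and field names are hypothetical and not the project's actual pipeline:

```python
# Illustrative sketch (not the project's actual code): fusing per-source
# sentiment scores into a single decision-level signal for one stock.
def fuse_sentiment(scored_news: list[dict], weights: dict[str, float]) -> float:
    """Weighted average of sentiment scores (-1..1), weighted by source credibility."""
    total, weight_sum = 0.0, 0.0
    for item in scored_news:
        w = weights.get(item["source"], 1.0)  # default weight for unknown sources
        total += w * item["sentiment"]
        weight_sum += w
    return total / weight_sum if weight_sum else 0.0

signal = fuse_sentiment(
    [
        {"source": "sina_finance", "sentiment": 0.6},
        {"source": "securities_times", "sentiment": -0.2},
    ],
    weights={"sina_finance": 1.0, "securities_times": 2.0},
)
# signal = (1.0*0.6 + 2.0*(-0.2)) / 3.0 ≈ 0.0667
```

A real pipeline would additionally weight by recency and market-impact scores from the LLM analysis; this only shows the fusion shape.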
## 🎯 Project Features
- ✅ AgenticX Native: Deeply integrated with the AgenticX framework, using core abstractions like Agent, Tool, and Workflow
- ✅ AgenticX Component Integration: Direct use of AgenticX's `BailianEmbeddingProvider` and `MilvusStorage`, avoiding reinventing the wheel
- ✅ Agent-Driven: The NewsAnalyst agent automatically analyzes news sentiment and market impact
- ✅ Multi-Provider LLM Support: Supports 5 major LLM providers (Bailian, OpenAI, DeepSeek, Kimi, Zhipu), switchable with one click in the frontend
- ✅ Batch Operations: Supports batch selection, batch deletion, and batch analysis of news, improving operational efficiency
- ✅ Stock K-Line Analysis: Integrated with akshare real market data, supporting daily/minute K-line multi-period display
- ✅ Intelligent Stock Search: Supports fuzzy queries by code and name, pre-loaded with data for 5000+ A-share stocks
- ✅ Complete Tech Stack: FastAPI + PostgreSQL + Milvus + Redis + React
- ✅ Real-time Search: Supports multi-dimensional search by title, content, stock code, with keyword highlighting
- ✅ Async Vectorization: Background async vectorization execution, non-blocking analysis flow
- ✅ Production Ready: One-click deployment with Docker Compose, complete logging and monitoring
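The intelligent stock search feature above (fuzzy queries by code and name over the pre-loaded A-share list) boils down to a case-insensitive containment match. A minimal sketch with hypothetical names; the real implementation queries PostgreSQL:

```python
# Illustrative sketch of code/name fuzzy matching; names are hypothetical.
def search_stocks(query: str, stocks: list[dict], limit: int = 10) -> list[dict]:
    """Return stocks whose code or name contains the query (case-insensitive)."""
    q = query.strip().lower()
    if not q:
        return []
    hits = [s for s in stocks if q in s["code"].lower() or q in s["name"].lower()]
    return hits[:limit]

stocks = [
    {"code": "600519", "name": "贵州茅台"},
    {"code": "000001", "name": "平安银行"},
]
print(search_stocks("6005", stocks)[0]["code"])  # → 600519
print(search_stocks("银行", stocks)[0]["code"])  # → 000001
```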
## 🏗️ System Architecture

The system adopts a layered architecture design:
- M6 Frontend Interaction Layer: React + TypeScript + Shadcn UI
- M1 Platform Service Layer: FastAPI Gateway + Task Manager
- M4/M5 Agent Collaboration Layer: AgenticX Agent + Debate Workflow
- M2/M3 Infrastructure Layer: Crawler Service + LLM Service + Embedding
- M7-M11 Storage & Learning Layer: PostgreSQL + Milvus + Redis + ACE Framework
## 🚀 Quick Start

### Prerequisites
- Python 3.11+
- Docker & Docker Compose
- (Optional) OpenAI API Key or local LLM
- Node.js 18+ (for frontend development)
### 1. Install AgenticX

```bash
cd /path/to/AgenticX  # adjust to your local AgenticX checkout
pip install -e .
```
### 2. Install Backend Dependencies

```bash
cd FinnewsHunter/backend
pip install -r requirements.txt
```
### 3. Configure Environment Variables

```bash
cd FinnewsHunter/backend
cp env.example .env
# Edit .env file and fill in LLM API Key and other configurations
```
**Multi-Provider LLM Configuration:**

The system supports 5 LLM providers; at least one must be configured:
| Provider | Environment Variable | Registration URL |
|----------|---------------------|------------------|
| Bailian (Alibaba Cloud) | DASHSCOPE_API_KEY | https://dashscope.console.aliyun.com/ |
| OpenAI | OPENAI_API_KEY | https://platform.openai.com/api-keys |
| DeepSeek | DEEPSEEK_API_KEY | https://platform.deepseek.com/ |
| Kimi (Moonshot) | MOONSHOT_API_KEY | https://platform.moonshot.cn/ |
| Zhipu | ZHIPU_API_KEY | https://open.bigmodel.cn/ |
**Example Configuration (Recommended: Bailian):**

```bash
# Bailian (Alibaba Cloud) - Recommended, fast access in China
DASHSCOPE_API_KEY=sk-your-dashscope-key
DASHSCOPE_BASE_URL=https://dashscope.aliyuncs.com/compatible-mode/v1
BAILIAN_MODELS=qwen-plus,qwen-max,qwen-turbo

# Optional: Other providers
OPENAI_API_KEY=sk-your-openai-key
DEEPSEEK_API_KEY=sk-your-deepseek-key
```
### 4. Start Base Services (PostgreSQL, Redis, Milvus)

```bash
cd FinnewsHunter
docker compose -f deploy/docker-compose.dev.yml up -d postgres redis milvus-etcd milvus-minio milvus-standalone
```
### 5. Initialize Database

```bash
cd FinnewsHunter/backend
python init_db.py
```
### 5.1 Initialize Stock Data (Optional, for stock search functionality)

```bash
cd FinnewsHunter/backend
python -m app.scripts.init_stocks
# Will fetch all A-share data (approximately 5000+ stocks) from akshare and save to database
```
### 6. Start Backend API Service

```bash
cd FinnewsHunter/backend
uvicorn app.main:app --reload --host 0.0.0.0 --port 8000
```
### 7. Start Celery Worker and Beat (Auto Crawling)

```bash
# Open a new terminal
cd FinnewsHunter
docker compose -f deploy/docker-compose.dev.yml up -d celery-worker celery-beat
```
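Celery Beat is what drives the periodic crawling. As a rough illustration of what a per-source schedule can look like, shown as a plain dict: the task path, source keys, and 60-second interval here are assumptions; the actual schedule lives in the backend's Celery configuration:

```python
# Hypothetical sketch of a Celery beat schedule for per-source crawling.
# Source keys and the task path are illustrative placeholders.
CRAWL_SOURCES = ["sina_finance", "nbd", "jrj", "stcn"]

beat_schedule = {
    f"crawl-{source}": {
        "task": "app.tasks.crawl_source",  # hypothetical task path
        "schedule": 60.0,                  # run every 60 seconds
        "args": (source,),
    }
    for source in CRAWL_SOURCES
}
```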
### 8. Start Frontend Service

```bash
# Open a new terminal
cd FinnewsHunter/frontend
npm install  # First time requires dependency installation
npm run dev
```
### 9. Access Application
- Frontend Interface: http://localhost:3000
- Backend API: http://localhost:8000
- API Documentation: http://localhost:8000/docs
## 🔄 Service Management

### View All Service Status

```bash
cd FinnewsHunter
docker compose -f deploy/docker-compose.dev.yml ps
```
### Restart All Services

```bash
cd FinnewsHunter
# Restart Docker services (infrastructure + Celery)
docker compose -f deploy/docker-compose.dev.yml restart

# If backend API is started independently, manually restart it:
# press Ctrl+C to stop the backend process, then rerun:
cd backend
uvicorn app.main:app --reload --host 0.0.0.0 --port 8000
```
### Restart Specific Service

```bash
cd FinnewsHunter
# Restart only Celery (after code changes)
docker compose -f deploy/docker-compose.dev.yml restart celery-worker celery-beat
# Restart only the database
docker compose -f deploy/docker-compose.dev.yml restart postgres
# Restart only Redis
docker compose -f deploy/docker-compose.dev.yml restart redis
```
### Stop All Services

```bash
cd FinnewsHunter
docker compose -f deploy/docker-compose.dev.yml down
```
### View Logs

```bash
cd FinnewsHunter
# View Celery Worker logs
docker compose -f deploy/docker-compose.dev.yml logs -f celery-worker
# View Celery Beat logs (scheduled task dispatch)
docker compose -f deploy/docker-compose.dev.yml logs -f celery-beat
# View PostgreSQL logs
docker compose -f deploy/docker-compose.dev.yml logs -f postgres
# View all service logs
docker compose -f deploy/docker-compose.dev.yml logs -f
```
## 🗑️ Reset Database

### Method 1: Use One-Click Reset Script (Recommended) ⭐

```bash
cd FinnewsHunter
# Execute reset script
./reset_all_data.sh
# Enter yes to confirm
```
The script will automatically complete:
- ✅ Clear all news and task data in PostgreSQL
- ✅ Clear Redis cache
- ✅ Reset database auto-increment IDs (restart from 1)
- ✅ Clear Celery schedule files
- ✅ Automatically restart Celery services
After execution:
- Wait 5-10 minutes for the system to automatically re-crawl data
- Access the frontend to view the new data
### Method 2: Manual Reset (Advanced)

**Step 1: Clear PostgreSQL Data**

```bash
# Enter the PostgreSQL container
docker exec -it finnews_postgres psql -U finnews -d finnews_db
```
Execute in the PostgreSQL command line:

```sql
-- Clear news table
DELETE FROM news;
-- Clear task table
DELETE FROM crawl_tasks;
-- Clear analysis table
DELETE FROM analyses;

-- Reset auto-increment IDs
ALTER SEQUENCE news_id_seq RESTART WITH 1;
ALTER SEQUENCE crawl_tasks_id_seq RESTART WITH 1;
ALTER SEQUENCE analyses_id_seq RESTART WITH 1;

-- Verify results (should all be 0)
SELECT 'news table', COUNT(*) FROM news;
SELECT 'crawl_tasks table', COUNT(*) FROM crawl_tasks;
SELECT 'analyses table', COUNT(*) FROM analyses;

-- Exit
\q
```
**Step 2: Clear Redis Cache**

```bash
cd FinnewsHunter
docker exec finnews_redis redis-cli FLUSHDB
```
**Step 3: Clear Celery Schedule Files**

```bash
cd FinnewsHunter/backend
rm -f celerybeat-schedule*
```
**Step 4: Restart Celery Services**

```bash
cd FinnewsHunter
docker compose -f deploy/docker-compose.dev.yml restart celery-worker celery-beat
```
**Step 5: Verify Data Cleared**

```bash
# Check news count (should be 0)
docker exec finnews_postgres psql -U finnews -d finnews_db -c "SELECT COUNT(*) FROM news;"
# Check Redis (should be 0 or very small)
docker exec finnews_redis redis-cli DBSIZE
# Check if Celery has started crawling
docker compose -f deploy/docker-compose.dev.yml logs -f celery-beat
# Should see 10 crawl tasks triggered per minute
```
### Method 3: Use Python Script Reset

```bash
cd FinnewsHunter/backend
python reset_database.py
# Enter yes to confirm
```
### Method 4: Quick Manual Cleanup (One-Line Commands) 🔥

**Use Case:** When the reset script doesn't work, this is the fastest method.

```bash
cd FinnewsHunter
# Step 1: Clear database tables
docker exec finnews_postgres psql -U finnews -d finnews_db -c "DELETE FROM news; DELETE FROM crawl_tasks; DELETE FROM analyses;"
# Step 2: Reset auto-increment IDs
docker exec finnews_postgres psql -U finnews -d finnews_db -c "ALTER SEQUENCE news_id_seq RESTART WITH 1; ALTER SEQUENCE crawl_tasks_id_seq RESTART WITH 1; ALTER SEQUENCE analyses_id_seq RESTART WITH 1;"
# Step 3: Clear Redis cache
docker exec finnews_redis redis-cli FLUSHDB
# Step 4: Clear Celery schedule files
rm -f backend/celerybeat-schedule*
# Step 5: Restart Celery services
docker compose -f deploy/docker-compose.dev.yml restart celery-worker celery-beat
# Step 6: Verify cleared (should display 0)
docker exec finnews_postgres psql -U finnews -d finnews_db -c "SELECT COUNT(*) FROM news;"
```
