Major architectural transformation from synchronous to asynchronous processing.

## Pipeline Services (8 microservices)
- pipeline-scheduler: APScheduler for 30-minute periodic job triggers
- pipeline-rss-collector: RSS feed collection with deduplication (7-day TTL)
- pipeline-google-search: Content enrichment via Google Search API
- pipeline-ai-summarizer: AI summarization using Claude API (claude-sonnet-4-20250514)
- pipeline-translator: Translation using DeepL Pro API
- pipeline-image-generator: Image generation with Replicate API (Stable Diffusion)
- pipeline-article-assembly: Final article assembly and MongoDB storage
- pipeline-monitor: Real-time monitoring dashboard (port 8100)

## Key Features
- Redis-based job queue with deduplication
- Asynchronous processing with Python asyncio
- Shared models and queue manager for inter-service communication
- Docker containerization for all services
- Container names standardized with site11_ prefix

## Removed Services
- Moved to backup: google-search, rss-feed, news-aggregator, ai-writer

## Configuration
- DeepL Pro API: 3abbc796-2515-44a8-972d-22dcf27ab54a
- Claude Model: claude-sonnet-4-20250514
- Redis Queue TTL: 7 days for deduplication

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
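The deduplicating Redis job queue described above can be sketched with redis-py's `SET NX EX` pattern: a job is enqueued only if its dedup key can be set, and the key expires after the 7-day TTL. This is an illustrative sketch, not the pipeline's actual code — `FakeRedis`, `enqueue_once`, and the key names are invented stand-ins so the idea runs without a Redis server, mirroring redis-py's `set(name, value, nx=True, ex=...)` semantics.

```python
import time

DEDUP_TTL = 7 * 24 * 3600  # 7-day deduplication window, as in the commit


class FakeRedis:
    """Minimal in-memory stand-in for redis.Redis (SET NX EX and RPUSH only)."""

    def __init__(self):
        self._store = {}   # key -> (value, expires_at)
        self._lists = {}   # queue name -> list of jobs

    def set(self, name, value, nx=False, ex=None):
        entry = self._store.get(name)
        if entry is not None and entry[1] > time.time() and nx:
            return None  # key still alive: NX set fails, like redis-py
        expires = time.time() + ex if ex else float("inf")
        self._store[name] = (value, expires)
        return True

    def rpush(self, name, value):
        self._lists.setdefault(name, []).append(value)


def enqueue_once(r, queue, job_id):
    """Enqueue job_id only if it has not been seen within DEDUP_TTL."""
    if r.set(f"dedup:{job_id}", 1, nx=True, ex=DEDUP_TTL):
        r.rpush(queue, job_id)
        return True
    return False  # duplicate within the TTL window: dropped


r = FakeRedis()
print(enqueue_once(r, "pipeline:rss", "article-42"))  # True: first time seen
print(enqueue_once(r, "pipeline:rss", "article-42"))  # False: deduplicated
```

Against a real server, `FakeRedis()` would be replaced by `redis.Redis.from_url(settings.redis_url, db=settings.redis_db)`; the atomic `SET NX EX` is what lets multiple collector instances share one dedup window without races.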
```python
from pydantic_settings import BaseSettings


class Settings(BaseSettings):
    # MongoDB Configuration
    mongodb_url: str = "mongodb://mongodb:27017"
    db_name: str = "rss_feed_db"

    # Redis Configuration
    redis_url: str = "redis://redis:6379"
    redis_db: int = 3

    # Feed Settings
    default_update_interval: int = 900  # 15 minutes in seconds
    max_entries_per_feed: int = 100
    fetch_timeout: int = 30

    # Scheduler Settings
    enable_scheduler: bool = True
    scheduler_timezone: str = "Asia/Seoul"

    class Config:
        env_file = ".env"
        env_file_encoding = "utf-8"


settings = Settings()
```