System Overview

Real-time monitoring of LLM providers, proxy infrastructure, and search API quota usage.

Last 24 Hours

LLM Service Status: Operational
Tavily API Quota: 8,450 / 10,000
Proxy Pool Status: 142 Active
Average Latency: 245 ms
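The four figures above are derived values rather than raw samples. Below is a minimal sketch of how a backend might assemble them; the `OverviewMetrics` and `build_overview` names, and the rule that the service counts as "Operational" while any provider is reachable, are illustrative assumptions rather than the dashboard's actual code.

```python
from dataclasses import dataclass
from statistics import mean


@dataclass
class OverviewMetrics:
    """The four headline cards shown for the last 24 hours."""
    llm_service_status: str   # "Operational" or "Degraded"
    tavily_quota_used: int    # e.g. 8450
    tavily_quota_limit: int   # e.g. 10000
    active_proxies: int       # e.g. 142
    avg_latency_ms: float     # e.g. 245.0


def build_overview(provider_statuses: dict[str, str],
                   latencies_ms: list[float],
                   quota_used: int,
                   quota_limit: int,
                   active_proxies: int) -> OverviewMetrics:
    # Treat the service as operational as long as any provider is reachable;
    # the per-provider breakdown lives in the provider table below.
    any_up = any(status != "Offline" for status in provider_statuses.values())
    return OverviewMetrics(
        llm_service_status="Operational" if any_up else "Degraded",
        tavily_quota_used=quota_used,
        tavily_quota_limit=quota_limit,
        active_proxies=active_proxies,
        avg_latency_ms=mean(latencies_ms) if latencies_ms else 0.0,
    )
```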
LLM Provider Status

Provider  | Status      | Avg Latency
OpenAI    | Operational | 120 ms
DeepSeek  | Operational | 145 ms
Moonshot  | High Load   | 850 ms
Zhipu AI  | Operational | 180 ms
Qwen      | Operational | 165 ms
Ollama    | Offline     | --
Custom    | Operational | 95 ms
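Each row reduces to two observable quantities: whether a lightweight probe succeeds, and how long it takes. The sketch below maps a single probe onto the three states used in the table; the `probe` function, the 500 ms "High Load" cut-off, and the omission of averaging over the 24-hour window are all assumptions for illustration.

```python
import time
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class ProviderHealth:
    name: str
    status: str                      # "Operational" | "High Load" | "Offline"
    avg_latency_ms: Optional[float]  # None when the provider is unreachable


HIGH_LOAD_MS = 500.0  # illustrative cut-off, not a documented value


def probe(name: str, ping: Callable[[], None]) -> ProviderHealth:
    """Run a lightweight request against one provider and map the result
    onto the three states shown in the provider table."""
    try:
        start = time.monotonic()
        ping()  # any cheap call that raises on failure, e.g. a 1-token completion
        latency_ms = (time.monotonic() - start) * 1000
    except Exception:
        return ProviderHealth(name, "Offline", None)
    status = "High Load" if latency_ms > HIGH_LOAD_MS else "Operational"
    return ProviderHealth(name, status, latency_ms)
```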

Proxy Pool Health

Overall: 94% Healthy (proxies are classified as Healthy, Slow, or Dead)

Region    | Active Proxies
US Region | 85
EU Region | 42
APAC      | 15
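The healthy percentage and the three buckets suggest a latency-based classification. A sketch under assumed rules: proxies slower than 1,000 ms count as "Slow", unresponsive ones as "Dead", and the per-region totals count everything that is not dead. The threshold and the input field names are assumptions.

```python
from collections import Counter
from typing import Optional

SLOW_MS = 1_000.0  # assumed threshold between "Healthy" and "Slow"


def classify(latency_ms: Optional[float]) -> str:
    """Map one proxy's last-measured latency onto the chart's three buckets."""
    if latency_ms is None:
        return "Dead"  # proxy stopped answering health checks
    return "Slow" if latency_ms > SLOW_MS else "Healthy"


def pool_summary(proxies: list[dict]) -> tuple[float, Counter]:
    """proxies: [{"region": "US Region", "latency_ms": 230.0}, ...]
    Returns the overall healthy percentage and active (non-dead) proxies per region."""
    states = [classify(p.get("latency_ms")) for p in proxies]
    healthy_pct = 100 * states.count("Healthy") / len(states) if states else 0.0
    active_by_region = Counter(
        p["region"] for p, s in zip(proxies, states) if s != "Dead"
    )
    return healthy_pct, active_by_region
```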

Recent Activity Log

Event Source  | Action Detail                      | Status  | Timestamp
System Config | Updated Tavily API Key             | Success | Just now
Proxy Monitor | Detected high latency on Node #402 | Warning | 12 mins ago
LLM Connector | Ollama Connection Timeout          | Failed  | 45 mins ago
Search Engine | Cache cleared successfully         | Info    | 1 hour ago
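Each log row carries the same four fields as the table header. A minimal sketch of an in-memory, newest-first event buffer follows; the `ActivityLog` class and its 100-entry cap are illustrative choices, not part of the described system.

```python
from collections import deque
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class ActivityEvent:
    source: str      # e.g. "Proxy Monitor"
    detail: str      # e.g. "Detected high latency on Node #402"
    status: str      # "Success" | "Warning" | "Failed" | "Info"
    timestamp: datetime


class ActivityLog:
    """Bounded, newest-first event buffer backing a table like the one above."""

    def __init__(self, max_entries: int = 100):  # 100 is an arbitrary cap
        self._events = deque(maxlen=max_entries)

    def record(self, source: str, detail: str, status: str) -> None:
        self._events.appendleft(
            ActivityEvent(source, detail, status, datetime.now(timezone.utc))
        )

    def recent(self, n: int = 10) -> list:
        return list(self._events)[:n]
```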