Release 2025/04/10
Observ April Updates
This month, Observ brings a series of powerful enhancements focused on VLM-driven detection, performance optimization, and workflow flexibility. Here’s what’s new:
New Features:
🔍 VLM Sampling Event Detection
You can now schedule VLM sampling events by setting fixed time intervals and applying predefined VLM templates. This enables automated VLM analysis that detects contextual patterns in real-time streams.
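The interval-based sampling flow described above can be sketched generically. This is not Observ's actual API; the `get_frame`, `analyze`, and template names are hypothetical stand-ins for the platform's stream reader and VLM call:

```python
import time

def sample_stream(get_frame, analyze, template, interval_s=10.0, max_samples=3):
    """Poll the stream every `interval_s` seconds, render the VLM prompt
    template for the sampled frame, and collect the detection results."""
    results = []
    for _ in range(max_samples):
        frame = get_frame()                       # grab the latest frame
        prompt = template.format(frame_id=frame["id"])
        results.append(analyze(prompt, frame))    # one VLM call per sample
        time.sleep(interval_s)                    # fixed sampling interval
    return results
```

In a real deployment the loop would run indefinitely and push results to the event pipeline; the bounded `max_samples` here just keeps the sketch finite.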

✏️ Customizable VLM Templates
VLM prompts can now be augmented with additional parameters, giving you more control and precision over how events are interpreted and triggered.

📊 VLM Results Visualization
We’ve added analytic charts to visualize VLM detection results—helping you quickly understand detection patterns and prompt effectiveness.

🚀 Extreme Performance Optimization
Streaming pipelines now support GPU decode with the ability to assign a specific GPU, maximizing performance on multi-GPU systems.
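One common way to spread decode load on a multi-GPU host is simple round-robin assignment of streams to device IDs. The sketch below illustrates that idea only; the function and its parameters are hypothetical, not Observ's configuration API:

```python
def assign_decode_gpus(streams, gpu_ids):
    """Round-robin each stream onto one of the available decode GPUs so
    no single device is saturated on a multi-GPU system."""
    if not gpu_ids:
        raise ValueError("at least one GPU id is required")
    return {stream: gpu_ids[i % len(gpu_ids)]
            for i, stream in enumerate(streams)}
```

Pinning a stream to a specific GPU (as the new setting allows) is the degenerate case of passing a single-element `gpu_ids` list.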

⏱ Flexible Event Timing Configuration
You can now define custom detection timeframes per event, allowing for more refined and situation-specific detection schedules.
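A per-event timeframe check reduces to testing whether the current time falls inside a configured window, including windows that cross midnight. This is a generic sketch of that logic, not Observ's implementation:

```python
from datetime import time

def in_timeframe(now, start, end):
    """True if `now` falls inside the event's detection window.
    Windows that cross midnight (e.g. 22:00-06:00) are handled by
    splitting the test into the before- and after-midnight halves."""
    if start <= end:
        return start <= now <= end
    return now >= start or now <= end
```

For example, a nighttime-only event configured as 22:00-06:00 fires at 23:00 but stays silent at noon.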

📦 Expanded Batch Management Capabilities
Batch operations now support:
Batch VLM configuration
Batch editing of event time settings
Batch event creation for faster setup across multiple cameras or tasks
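Conceptually, batch creation applies one event specification across many cameras in a single operation. The sketch below shows the shape of that pattern; `create_event` is a hypothetical stand-in for the platform's single-event creation call, not Observ's actual API:

```python
def batch_create_events(cameras, event_spec, create_event):
    """Apply one event spec to every camera in the batch, returning the
    list of created events in camera order."""
    return [create_event(camera=cam, **event_spec) for cam in cameras]
```

Batch VLM configuration and batch time-setting edits follow the same pattern: one spec, many targets, one operation.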

Coming Soon:
We’re continuing to make Observ more intelligent, more powerful, and easier to operate. Stay tuned for more exciting updates in the coming months!
VLM Sandbox: A new VLM sandbox environment will allow users to directly test prompts and preview detection results, making it easier to refine and validate templates before deployment.
Performance Tuning for Large-Scale Deployments: Upcoming improvements will focus on performance tuning for streaming models and large-scale deployments, ensuring stable, optimized operation in demanding environments.
Support
For technical support, please contact our support team during business hours.