About TimeBase

TimeBase is a high-performance streaming time-series database developed by Deltix.

TimeBase was initially designed for very fast aggregation and retrieval of massive volumes of high-frequency financial market data. The same TimeBase technology excels at processing any time-series data: financial markets (MBO/ITCH), IoT (MQTT), software metrics and signals, real-time events, logging, and more.

TimeBase runs standalone or in a cluster, processes millions of messages per second, stores terabytes of data, and offers microsecond latencies.

TimeBase combines multiple solutions into a single package:

Key Differentiators

  • Unified streaming API for both history and live time-series data.
  • High performance: the system can be configured to stream data with microsecond latencies, or to read/write millions of messages per second per data producer and consumer.
  • Low latency: when streaming live data, TimeBase can feed real-time consumers from memory rather than disk, which allows for a significant latency reduction.
  • Complex message structures: TimeBase can store complex message structures that reflect the data in your business domain (no need for intermediate DTO objects).
  • Schema-based database: an embedded data serialization and modeling framework provides better visibility and easier data migration, and supports a smooth transition from rapid data prototyping to a production solution.
  • Row-based design offers better latency and throughput for streaming use cases compared with column-based databases.
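To make the "complex message structures" point concrete: several message types sharing a common base can live in one time-ordered stream, and consumers dispatch on the concrete type. The classes below (`MarketMessage`, `Trade`, `Quote`) are hypothetical stand-ins for domain types, not TimeBase's actual schema API — a minimal sketch of the idea:

```java
import java.util.*;

// Hypothetical message types -- an illustration of polymorphic streams,
// not TimeBase's actual schema or API.
public class PolymorphicStreamDemo {
    static abstract class MarketMessage {
        final long timestamp;                       // epoch millis
        MarketMessage(long ts) { timestamp = ts; }
    }
    static final class Trade extends MarketMessage {
        final double price, size;
        Trade(long ts, double price, double size) { super(ts); this.price = price; this.size = size; }
    }
    static final class Quote extends MarketMessage {
        final double bid, ask;
        Quote(long ts, double bid, double ask) { super(ts); this.bid = bid; this.ask = ask; }
    }

    // A consumer reads one stream of mixed types and dispatches per concrete type.
    static String describe(MarketMessage m) {
        if (m instanceof Trade t) return "trade " + t.size + "@" + t.price;
        if (m instanceof Quote q) return "quote " + q.bid + "/" + q.ask;
        return "unknown";
    }

    public static void main(String[] args) {
        List<MarketMessage> stream = new ArrayList<>(List.of(
            new Quote(1, 99.5, 100.5),
            new Trade(2, 100.0, 10),
            new Quote(3, 99.6, 100.4)));
        stream.sort(Comparator.comparingLong(m -> m.timestamp));   // time-ordered stream
        for (MarketMessage m : stream)
            System.out.println(m.timestamp + ": " + describe(m));
    }
}
```

Because the types carry domain fields directly, no intermediate DTO layer is needed between storage and business logic.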

Typical Tasks

  • Data replication framework: use multiple out-of-the-box integrations, or use the open multi-language API to create custom integrations.
  • Aggregation of massive volumes of heterogeneous time-series data, historical or real-time, from multiple sources, with superior latency and throughput.
  • Reliable data storage for heterogeneous time-series data.
  • Rapid retrieval/streaming of time-series data, both historical and real-time. TimeBase has a sophisticated time-series engine capable of efficiently merging multiple data streams with arbitrary temporal characteristics, on the fly, into a unified query response.
  • Live data streaming enabled by simultaneous operation of readers and writers.
  • Framework for data processing and enrichment (foundation for building normalization and validation frameworks).
  • Statistical models and machine learning: warm-up mode (initialization with historical data), parameter estimation, online forecasting, recurring learning (on-the-fly adjustment with up-to-date parameters).
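The on-the-fly merging of multiple time-ordered streams mentioned above is, conceptually, a k-way merge. The sketch below illustrates the technique with a min-heap over plain timestamp lists — a conceptual illustration of the idea, not TimeBase internals:

```java
import java.util.*;

// Conceptual k-way merge of several already time-ordered streams into one
// chronologically ordered sequence -- an illustration, not TimeBase internals.
public class StreamMergeDemo {
    // Merge k sorted timestamp lists with a min-heap: O(n log k) for n total messages.
    public static List<Long> merge(List<List<Long>> streams) {
        // Heap entries: { timestamp, streamIndex, positionWithinStream }
        PriorityQueue<long[]> heap = new PriorityQueue<>(Comparator.comparingLong(e -> e[0]));
        for (int i = 0; i < streams.size(); i++)
            if (!streams.get(i).isEmpty())
                heap.add(new long[] { streams.get(i).get(0), i, 0 });

        List<Long> out = new ArrayList<>();
        while (!heap.isEmpty()) {
            long[] top = heap.poll();           // earliest message across all streams
            out.add(top[0]);
            int stream = (int) top[1], next = (int) top[2] + 1;
            if (next < streams.get(stream).size())
                heap.add(new long[] { streams.get(stream).get(next), stream, next });
        }
        return out;
    }

    public static void main(String[] args) {
        List<Long> merged = merge(List.of(
            List.of(1L, 5L, 9L),     // e.g. a trades stream
            List.of(2L, 3L, 10L),    // e.g. a quotes stream
            List.of(4L)));           // e.g. a news stream
        System.out.println(merged);  // chronologically ordered union: [1, 2, 3, 4, 5, 9, 10]
    }
}
```

Only the head of each stream needs to be in memory at any moment, which is what makes this pattern work for streams with arbitrary temporal characteristics.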