Precision by Design

A Better Way to Stream Pricing to Users

The Challenge of Delivering Live Pricing in Capital Markets

Delivering real-time market prices to front-office users sounds straightforward in theory — connect to a data source, authorize users, and display the feed in an application. In practice, it is one of the most complex and resource-intensive engineering challenges faced by financial institutions today. A confluence of issues makes this difficult: licensing restrictions, fragmented sources, massive data volumes, and architectural limits in distribution and visualization. Each dimension introduces unique hurdles that, if not addressed holistically, can result in brittle systems, compliance risk, and ballooning costs.

1. The Licensing Conundrum

Market data licensing is not only expensive but also deeply restrictive. Exchanges, venues, and third-party data vendors each impose their own rules for how data can be consumed, redistributed, and reported.

  • Per-user entitlements: Some providers only allow streams to be provisioned on a per-user basis, forcing firms to establish costly one-to-one relationships between a user and a feed. This makes scaling across hundreds or thousands of traders and analysts prohibitively expensive.
  • Audit and compliance risk: Licensing agreements often require firms to prove at any moment that only entitled users are consuming data. Weak entitlement controls can lead to accidental data leakage, regulatory breaches, and unexpected fees.
  • Operational friction: Entitlement changes, such as onboarding a new analyst or revoking a departing employee’s access, are often manual, error-prone, and slow to implement — the opposite of what trading teams need in fast-moving markets.

Any suitable solution must include fine-grained entitlement controls, integration with identity systems, and auditable enforcement across all applications. Without this, scaling access is simply not sustainable.
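3forge's internal entitlement framework is not detailed here, but the core idea of fine-grained, auditable access control can be sketched in a few lines. Everything below is hypothetical (the `EntitlementService` class, feed codes, and user IDs are invented for illustration): each access check is both enforced and recorded, so the firm can prove at any moment who consumed which feed.

```python
import datetime

# Hypothetical sketch: per-user, per-feed entitlements with an audit trail.
# All names (feed codes, user IDs) are illustrative, not real vendor codes.
class EntitlementService:
    def __init__(self):
        self._grants = {}    # user -> set of entitled feed codes
        self.audit_log = []  # (timestamp, user, feed, decision)

    def grant(self, user, feed):
        self._grants.setdefault(user, set()).add(feed)

    def revoke(self, user, feed):
        # Fast, programmatic revocation (e.g. a departing employee).
        self._grants.get(user, set()).discard(feed)

    def check(self, user, feed):
        # Every decision is logged, allowed or not, for license audits.
        allowed = feed in self._grants.get(user, set())
        now = datetime.datetime.now(datetime.timezone.utc)
        self.audit_log.append((now, user, feed, allowed))
        return allowed

svc = EntitlementService()
svc.grant("trader1", "NYSE_L1")
print(svc.check("trader1", "NYSE_L1"))  # True
print(svc.check("trader1", "CME_L2"))   # False
```

In practice the grant store would be backed by the firm's identity system rather than an in-memory dict, but the enforce-and-log shape is the essential part.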

2. Diversity of Sources and Symbology

Global trading requires coverage across equities, fixed income, FX, commodities, and derivatives. No single provider covers all markets comprehensively, which means firms rely on a patchwork of vendors.

  • Multiple data providers: Bloomberg, Refinitiv, ICE, CME, and numerous regional exchanges all deliver feeds with different formats, protocols, and latencies.
  • Symbology translation: A single instrument may be represented differently across providers, complicating aggregation and distribution. Mapping these inconsistencies is an endless challenge for internal teams.
  • Engineering overhead: Supporting “many-to-many” distribution (many sources feeding many applications) requires not only feedhandlers but also ongoing maintenance, reconciliation, and version control.

A robust solution must act as a data virtualization layer: ingesting disparate feeds, normalizing symbology, and distributing data seamlessly to downstream systems — without requiring every application team to solve the problem independently.
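At its core, the symbology problem reduces to maintaining a mapping from (vendor, vendor-specific symbol) pairs to a single internal identifier. A toy sketch of that idea, with entirely invented vendor names and ticker formats:

```python
# Hypothetical sketch: normalizing vendor-specific symbols to one internal ID.
# Vendor names and symbol formats below are invented for illustration.
SYMBOL_MAP = {
    ("vendorA", "VOD LN Equity"): "VOD.L",
    ("vendorB", "VOD.L"):         "VOD.L",
    ("vendorC", "GB00BH4HKS39"):  "VOD.L",  # an ISIN-keyed feed
}

def normalize(vendor, symbol):
    """Map a (vendor, symbol) pair to the firm's internal identifier."""
    try:
        return SYMBOL_MAP[(vendor, symbol)]
    except KeyError:
        # Surface unmapped instruments early instead of silently dropping them.
        raise KeyError(f"No symbology mapping for {vendor}:{symbol}")

print(normalize("vendorA", "VOD LN Equity"))  # VOD.L
print(normalize("vendorC", "GB00BH4HKS39"))   # VOD.L
```

The hard part is not the lookup but keeping the map itself current across corporate actions, new listings, and vendor format changes, which is why centralizing it in one virtualization layer beats re-solving it per application.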

3. The Scale of Market Data Volumes

Streaming market data is a firehose. Level 1 (top-of-book) and Level 2 (depth-of-book) updates can generate thousands of ticks per second per symbol. Multiply that by tens of thousands of actively traded instruments across global markets, and the ingestion rate easily exceeds hundreds of thousands of messages per second.

  • Performance challenge: Many architectures collapse under sustained throughput at this scale. Bottlenecks in message buses, databases, or UI updates can cascade into outages.
  • Cost of scale: Scaling traditional infrastructures for peak throughput often requires expensive over-provisioning of hardware and network capacity.
  • Latency expectations: In capital markets, latency is measured not in seconds but in microseconds. Even small delays can render a platform unusable for traders who expect instantaneous updates.

A viable solution must be engineered for extreme throughput and low latency, capable of scaling horizontally without costly infrastructure sprawl.
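One widely used technique for taming this firehose is conflation: retaining only the latest tick per symbol and flushing snapshots on a fixed cadence, so downstream consumers see a bounded update rate regardless of input volume. A minimal sketch of the idea (not 3forge's actual implementation; all names are hypothetical):

```python
import threading

# Hypothetical sketch: conflating a fast tick stream so consumers receive
# only the most recent value per symbol at each flush interval.
class Conflator:
    def __init__(self):
        self._latest = {}
        self._lock = threading.Lock()

    def on_tick(self, symbol, price):
        # Overwrite in place: intermediate ticks between flushes are dropped.
        with self._lock:
            self._latest[symbol] = price

    def flush(self):
        # Atomically swap out the pending snapshot for downstream delivery;
        # called by a timer at the configured conflation interval.
        with self._lock:
            snapshot, self._latest = self._latest, {}
        return snapshot

c = Conflator()
for px in (100.0, 100.1, 100.2):
    c.on_tick("EURUSD", px)
print(c.flush())  # {'EURUSD': 100.2}
```

The trade-off is deliberate: consumers give up seeing every intermediate tick in exchange for predictable load, which is why conflation intervals are typically a tunable, per-view setting.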

4. The User Interface Bottleneck

Even if data is ingested and normalized at scale, there remains the question of how to deliver it to the end-user. Displaying high-frequency data in browser-based applications is deceptively difficult:

  • Rendering limits: Standard web technologies are not designed to render thousands of updates per second across large symbol universes.
  • Population scale: While a proof-of-concept might handle a dozen users and symbols, scaling to thousands of users with diverse portfolios quickly exposes performance bottlenecks.
  • Developer complexity: Building efficient real-time tables, pivots, and charts typically requires specialized front-end engineering skill sets, slowing down application delivery.

An effective solution must provide a low-latency, web-native interface framework that handles rendering at scale while abstracting away complexity for developers.
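One common way such frameworks keep browsers responsive is server-side viewport filtering: streaming updates only for the rows a user can currently see, rather than the full symbol universe. A simplified sketch of that idea, with invented class and variable names:

```python
# Hypothetical sketch: server-side viewport filtering for a streaming grid.
# Only symbols inside the client's visible window are pushed to the browser.
class ViewportSession:
    def __init__(self, display_order):
        self.symbols = display_order  # full universe, in display order
        self.top = 0
        self.height = 20              # visible rows in the client grid

    def set_viewport(self, top, height):
        # Called when the user scrolls or resizes the grid.
        self.top, self.height = top, height

    def should_send(self, symbol):
        visible = self.symbols[self.top:self.top + self.height]
        return symbol in visible

session = ViewportSession([f"SYM{i}" for i in range(10_000)])
session.set_viewport(top=5_000, height=20)
print(session.should_send("SYM5005"))  # True
print(session.should_send("SYM0"))     # False
```

Combined with conflation on the wire, this keeps each browser's update rate proportional to what is on screen, not to the size of the user's portfolio.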

What an Ideal Solution Requires

Bringing these challenges together, a “fit-for-purpose” solution must offer:

  1. Granular entitlement management – tightly integrated with identity and access control systems, compliant with exchange rules, and auditable.
  2. Feed normalization and virtualization – the ability to ingest from diverse providers, reconcile symbology, and deliver consistent data to all consuming applications.
  3. High-performance distribution layer – able to handle millions of messages per second with microsecond-level latencies, and scale horizontally without excessive cost.
  4. Web-native UI framework – enabling browser-based applications to render streaming market data responsively, across thousands of users, without requiring deep specialist coding.
  5. Low-code development environment – empowering financial engineers and analysts to assemble applications quickly, without waiting months for traditional software development cycles.

How 3forge Fits This Model

This is precisely the model around which 3forge has been built.

  • Entitlements and Licensing: 3forge’s extensible entitlement framework allows firms to define and enforce user-level access rules across applications, ensuring compliance with the most stringent licensing regimes.
  • Feed Normalization: With robust feedhandlers and a powerful data virtualization layer, disparate feeds are harmonized with little-to-no coding, reducing engineering overhead.
  • Performance at Scale: 3forge Relay and Center are performance-tuned to handle extreme volumes, supporting ingestion and distribution of over 1 million messages per second while maintaining low latency.
  • Web Interface: 3forge Web provides a real-time user interface framework, seamlessly sourcing data from Center’s real-time database and rendering it in browser-based tables, pivots, and charts.
  • Low-Code Speed: A low-code development studio allows teams to build and deploy market data dashboards in minutes, adjusting conflation settings or display logic without deep engineering cycles.

Conclusion

Distributing live pricing is one of the hardest technology challenges in capital markets — an intersection of licensing, data fragmentation, scale, and user interface complexity. A solution must address each dimension holistically, not in isolation.

With its integrated stack — from feed ingestion to entitlement management, high-performance messaging, and browser-native visualization — 3forge offers a proven path to democratizing market data applications. The result: faster delivery, lower cost, regulatory compliance, and user experiences that scale with the demands of global markets.

Written by Andy George

Solutions Architect
