Building a Historical Financial Data Lake at Bloomberg

Bloomberg’s Enterprise Data business has accumulated petabytes of historical financial data by taking point-in-time “snapshots” of financial entities and their attributes over four decades. Historical financial data is critical for backtesting models, evaluating risk, regulatory reporting, assessing data quality, and more. Our Enterprise Data Lake engineering group ingested all of these historical text files, along with the ongoing snapshots that continue to flow in, into Apache Iceberg tables. This talk will include an overview of the challenges our organization needed to address, the open source architecture and tools we chose (Apache Iceberg, Trino, etc.), and the impact this initiative has had on our business.

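As an illustration of the point-in-time access pattern the abstract describes, the sketch below shows how an “as of” query against an Iceberg table might look when issued through Trino’s Python client. This is a minimal sketch under stated assumptions: the host, catalog, schema, table, columns, and timestamp are hypothetical placeholders, not details of Bloomberg’s actual deployment.

# Minimal sketch (assumptions: a Trino coordinator at the hypothetical host below,
# an Iceberg connector catalog named "iceberg", and an illustrative table
# "historical.entity_snapshots"). Requires the "trino" Python package.
import trino

conn = trino.dbapi.connect(
    host="trino.example.com",  # hypothetical coordinator
    port=8080,
    user="analyst",
    catalog="iceberg",
    schema="historical",
)
cur = conn.cursor()

# Iceberg time travel in Trino: read the table as it existed at a past point in time.
cur.execute(
    """
    SELECT entity_id, attribute, value
    FROM entity_snapshots
    FOR TIMESTAMP AS OF TIMESTAMP '2020-01-02 00:00:00 UTC'
    LIMIT 10
    """
)
for row in cur.fetchall():
    print(row)

The same pattern works with FOR VERSION AS OF and an Iceberg snapshot ID when a query needs to pin an exact table snapshot rather than a wall-clock timestamp.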