Product Insights from the Dremio Blog
Building AI-Ready Data Products with Dremio and dbt
This guide will equip you to build an AI-ready data product with Dremio and dbt.
Dremio Blog: Open Data Insights
Extending Apache Iceberg: Best Practices for Storing and Discovering Custom Metadata
By using properties, Puffin files, and REST catalog APIs wisely, you can build richer, more introspective data systems. Whether you're developing an internal data quality pipeline or a multi-tenant ML feature store, Iceberg offers clean integration points that let metadata travel with the data.
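For a concrete flavor of the properties approach, attaching custom metadata is a small transaction in PyIceberg. A minimal sketch, assuming a catalog named "default" configured in .pyiceberg.yaml and an existing table db.events (both names are illustrative):

```python
# Sketch: attach and rediscover custom metadata via Iceberg table properties.
# Assumes a catalog named "default" in .pyiceberg.yaml and a table "db.events".
from pyiceberg.catalog import load_catalog

catalog = load_catalog("default")
table = catalog.load_table("db.events")

# Namespace custom keys so they travel with the table without colliding.
with table.transaction() as txn:
    txn.set_properties(**{"myteam.quality.last-audit": "2025-01-15"})

# Any engine reading through the catalog can discover the metadata again.
table = catalog.load_table("db.events")
print(table.properties.get("myteam.quality.last-audit"))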
Engineering Blog
Too Many Roundtrips: Metadata Overhead in the Modern Lakehouse
The traditional approach of caching table metadata and periodically refreshing it has real drawbacks and limitations. With seamless metadata refresh, Dremio now lets users query the most up-to-date versions of their Iceberg tables without degrading query performance. A user querying a shared table in the Dremio Enterprise Catalog (powered by Apache Polaris), for example, sees updates from an external Spark job immediately, with no delay and no manual refresh to think about.
Dremio Blog: Partnerships Unveiled
Using Dremio with Confluent’s TableFlow for Real-Time Apache Iceberg Analytics
Confluent’s TableFlow and Apache Iceberg unlock a powerful synergy: the ability to stream data from Kafka into open, queryable tables with zero manual pipelines. With Dremio, you can instantly access and analyze this real-time data without having to move or copy it, accelerating insights, reducing ETL complexity, and embracing the power of open lakehouse architecture.
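Once TableFlow has landed Kafka topics as Iceberg tables, querying them from Dremio is plain SQL over Arrow Flight. A minimal sketch using the ADBC Flight SQL driver; the endpoint, credentials, and the kafka_catalog.orders table name are all placeholders:

```python
# Sketch: query a TableFlow-produced Iceberg table through Dremio's
# Arrow Flight SQL endpoint. Endpoint, credentials, and table name
# are all placeholders.
import adbc_driver_flightsql.dbapi as flightsql

conn = flightsql.connect(
    "grpc+tls://dremio.example.com:32010",
    db_kwargs={"username": "analyst", "password": "change-me"},
)
cur = conn.cursor()
cur.execute("SELECT * FROM kafka_catalog.orders LIMIT 10")
print(cur.fetch_arrow_table())
cur.close()
conn.close()
```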
Product Insights from the Dremio Blog
Incremental Materializations with Dremio + dbt
Incremental materializations allow you to build your data table piece by piece as new data comes in. By restricting your build operations to just this required data, you will not only greatly reduce the runtime of your data transformations, but also improve query performance and reduce compute costs.
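The core pattern, sketched here as a dbt Python model (the same idea is more commonly written in SQL with is_incremental()); the raw.events source, the event_id/event_ts columns, and a PySpark-style adapter are all assumptions:

```python
# Sketch of an incremental model as a dbt Python model. Source and column
# names are made up, and Python models require an adapter that supports them.
def model(dbt, session):
    dbt.config(materialized="incremental", unique_key="event_id")

    events = dbt.source("raw", "events")  # hypothetical source

    if dbt.is_incremental:
        # Only transform rows newer than what the target table already holds.
        max_ts = session.sql(f"SELECT MAX(event_ts) FROM {dbt.this}").collect()[0][0]
        events = events.filter(events.event_ts > max_ts)

    return events
```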
Dremio Blog: Open Data Insights
A Journey from AI to LLMs and MCP – 10 – Sampling and Prompts in MCP — Making Agent Workflows Smarter and Safer
That’s where Sampling comes in. And what if you want to give the user (or the LLM) reusable, structured prompt templates for common workflows? That’s where Prompts come in. In this final post of the series, we’ll explore:
- How sampling allows servers to request completions from LLMs
- How prompts enable reusable, guided AI interactions
- Best practices for both features
- Real-world use cases that combine everything we’ve covered so far
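In the Python MCP SDK, a reusable prompt is just a decorated function. A minimal sketch using FastMCP; the server name and prompt body are illustrative:

```python
# Sketch: a reusable, structured prompt template served over MCP
# (Python SDK's FastMCP; server and prompt names are illustrative).
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("prompt-demo")

@mcp.prompt()
def review_code(code: str) -> str:
    """A prompt template clients can list, fill in, and hand to the LLM."""
    return f"Please review this code for bugs and style issues:\n\n{code}"

if __name__ == "__main__":
    mcp.run()
```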
Dremio Blog: Open Data Insights
The Case for Apache Polaris as the Community Standard for Lakehouse Catalogs
The future of the lakehouse depends on collaboration. Apache Polaris embodies the principles of openness, vendor neutrality, and enterprise readiness that modern data platforms demand. By aligning around Polaris, the data community can reduce integration friction, encourage ecosystem growth, and give organizations the freedom to innovate without fear of vendor lock-in.
Dremio Blog: Open Data Insights
A Journey from AI to LLMs and MCP – 9 – Tools in MCP — Giving LLMs the Power to Act
Tools are executable functions that an LLM (or the user) can call via the MCP client. Unlike resources, which are passive data, tools are active operations.
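In the Python MCP SDK, a tool is a typed function the server advertises; a minimal sketch with FastMCP (server and tool names are illustrative):

```python
# Sketch: a tool is an executable function the LLM can invoke via MCP
# (Python SDK's FastMCP).
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("calculator")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

if __name__ == "__main__":
    mcp.run()
```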
Dremio Blog: Open Data Insights
A Journey from AI to LLMs and MCP – 8 – Resources in MCP — Serving Relevant Data Securely to LLMs
One of MCP’s most powerful capabilities is its ability to expose resources to language models in a structured, secure, and controllable way.
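Resources are addressed by URI and can be static or templated. A minimal FastMCP sketch; the URIs and return values are illustrative:

```python
# Sketch: resources are passive, addressable data the MCP server exposes
# for reading (Python SDK's FastMCP; URIs and names are illustrative).
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("docs-server")

@mcp.resource("config://app")
def get_config() -> str:
    """A static resource identified by a fixed URI."""
    return "log_level=INFO"

@mcp.resource("users://{user_id}/profile")
def get_profile(user_id: str) -> str:
    """A templated resource resolved per user id."""
    return f"profile for {user_id}"

if __name__ == "__main__":
    mcp.run()
```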
Dremio Blog: Various Insights
How Leading Enterprises Transform Data Operations with Dremio: Insights from Industry Leaders
At a recent customer panel moderated by Maeve Donovan, Senior Product Marketing Manager at Dremio, three of Dremio's largest customers came together with Tomer Shiran, Founder of Dremio, to share their experiences implementing Dremio's intelligent lakehouse platform. Antonio Abi Saad, Group Chief Data Officer at Sodexo, Karl Smolka, Associate Vice President - Data Platform & […]
Dremio Blog: Open Data Insights
A Journey from AI to LLMs and MCP – 7 – Under the Hood — The Architecture of MCP and Its Core Components
By the end, you’ll understand how MCP enables secure, modular communication between LLMs and the systems they need to work with.
Dremio Blog: Open Data Insights
A Journey from AI to LLMs and MCP – 6 – Enter the Model Context Protocol (MCP) — The Interoperability Layer for AI Agents
What if we had a standard that let any agent talk to any data source or tool, regardless of where it lives or what it’s built with? That’s exactly what the Model Context Protocol (MCP) brings to the table.
Dremio Blog: Various Insights
Dremio’s Leading the Way in Active Data Architecture
Modern data teams are under pressure to deliver faster insights, support AI initiatives, and reduce architectural complexity. To meet these demands, more organizations are adopting active data architectures: frameworks that unify access, governance, and real-time analytics across hybrid environments. In the newly released Dresner 2025 Active Data Architecture Report, Dremio was ranked #1, recognized as a top […]
Engineering Blog
Introducing Dremio Auth Manager for Apache Iceberg
Dremio Auth Manager is intended as an alternative to Iceberg’s built-in OAuth2 manager, offering greater functionality and flexibility while complying with OAuth2 standards. Dremio Auth Manager streamlines authentication by handling token acquisition and renewal transparently, eliminating the need for users to deal with tokens directly and avoiding failures due to token expiration.
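Iceberg's REST catalog lets you plug in an auth manager via catalog properties. A sketch from PySpark; treat the rest.auth.type key and the manager class name below as assumptions and check the Dremio Auth Manager README for the exact names:

```python
# Sketch: pointing an Iceberg REST catalog at a pluggable auth manager from
# PySpark. The property key and class name are assumptions; consult the
# project documentation for the exact values.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.type", "rest")
    .config("spark.sql.catalog.lake.uri", "https://catalog.example.com/api")
    # Swap Iceberg's built-in OAuth2 manager for Dremio Auth Manager:
    .config(
        "spark.sql.catalog.lake.rest.auth.type",
        "com.dremio.iceberg.authmgr.oauth2.OAuth2Manager",  # assumed class name
    )
    .getOrCreate()
)
```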
Dremio Blog: Open Data Insights
A Journey from AI to LLMs and MCP – 5 – AI Agent Frameworks — Benefits and Limitations
Enter agent frameworks — open-source libraries and developer toolkits that let you create goal-driven AI systems by wiring together models, memory, tools, and logic. These frameworks enable some of the most exciting innovations in the AI space… but they also come with trade-offs.
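Stripped of any particular framework, the core of these systems is a loop that alternates between the model and its tools while accumulating memory. A toy sketch in plain Python, with a hypothetical llm_complete() standing in for a real model call:

```python
# Toy agent loop: wiring a model, tools, and memory together by hand.
# llm_complete() is a hypothetical stand-in for a real model API call.
def llm_complete(messages: list[dict]) -> dict:
    raise NotImplementedError("call your model provider here")

TOOLS = {
    "search": lambda q: f"results for {q!r}",  # stand-in tool
}

def run_agent(goal: str, max_steps: int = 5) -> str:
    memory = [{"role": "user", "content": goal}]
    for _ in range(max_steps):
        reply = llm_complete(memory)   # the model picks the next action
        if reply.get("tool"):          # tool call requested
            result = TOOLS[reply["tool"]](reply["input"])
            memory.append({"role": "tool", "content": result})
        else:                          # final answer produced
            return reply["content"]
    return "step budget exhausted"
```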