Browse All Blog Articles
- Dremio Blog: Open Data Insights
  What’s New in Apache Iceberg 1.10.0, and what comes next!
  Apache Iceberg 1.10.0 represents a turning point in the evolution of the open lakehouse. With the general availability of format-version 3, Iceberg now offers a more complete solution for organizations seeking the flexibility of data lakes combined with the reliability of data warehouses. Features like binary deletion vectors, default column values, and row-level lineage aren’t just incremental improvements; they redefine what’s possible in managing massive, ever-changing datasets.
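The default-column-values feature mentioned in this teaser can be illustrated with a purely conceptual Python sketch (this is not the Iceberg API; the `region` column and `"unknown"` default are invented for illustration). The idea is that rows written before a column was added still return a value for it at read time, without rewriting any data files:

```python
# Conceptual sketch of "default column values": older rows, written before
# a column existed, are served with the column's declared default on read.
old_rows = [{"id": 1}, {"id": 2}]        # written before the column was added
schema_defaults = {"region": "unknown"}  # hypothetical default for a new column
new_rows = [{"id": 3, "region": "eu"}]   # written after the column exists

def read(rows, defaults):
    """Fill in declared defaults for columns missing from older rows."""
    return [{**defaults, **row} for row in rows]

table = read(old_rows + new_rows, schema_defaults)
# Old rows pick up the default; new rows keep their written value.
```

The point of the sketch is that the default lives in table metadata, so no data files need rewriting when the schema evolves.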
- Dremio Blog: Various Insights
  The Model Context Protocol (MCP): A Beginner’s Guide to Plug-and-Play Agents
  By standardizing the interaction between hosts, clients, and servers, MCP unlocks true modularity. You can swap models without breaking workflows, mix and match servers for analytics, email, or storage, and grow your AI capabilities incrementally. The Dremio + SendGrid example shows how easily analytics and action can come together, transforming what used to be manual, multi-step processes into fully automated workflows.
- Product Insights from the Dremio Blog
  How Dremio Reflections Give Agentic AI a Unique Edge
  For organizations exploring agentic AI, this translates into a critical edge: AI agents can generate dynamic, ad-hoc questions and still receive sub-second, business-ready answers. With reflections, the performance layer is no longer a bottleneck; it becomes an enabler of intelligent, real-time decision-making.
- Product Insights from the Dremio Blog
  MCP & Dremio: Why a Standard Protocol and a Semantic Layer Matter for Agentic Analytics
  Dremio’s MCP server, integrated semantic layer, and autonomous reflections deliver this combination. They turn natural-language intent into secure, performant, and semantically rich analytics, enabling agents to act not just as chatbots but as trustworthy decision-makers.
- Dremio Blog: Open Data Insights
  Looking Back at the Last Year in Lakehouse OSS: Advances in Apache Arrow, Iceberg & Polaris (incubating)
  The direction is clear: the open lakehouse is no longer about choosing between flexibility and performance, or between innovation and governance. With Arrow, Iceberg, and Polaris maturing side by side, and with Dremio leading the charge, the open lakehouse has become a complete, standards-driven foundation for modern analytics. For enterprises seeking both freedom and power, this is the moment to embrace it.
- Dremio Blog: Open Data Insights
  Scaling Data Lakes: Moving from Raw Parquet to Iceberg Lakehouses
  Apache Iceberg closed that gap by transforming collections of Parquet files into true tables, complete with ACID transactions, schema flexibility, and time travel capabilities. And with Apache Polaris sitting on top as a catalog, organizations finally have a way to manage all those Iceberg tables consistently, delivering centralized access, discovery, and governance across every tool in the stack.
- Dremio Blog: Various Insights
  Partition Bucketing – Improving query performance when filtering on a high-cardinality column
  Dremio can automatically take advantage of partitioning on Parquet data sets (or derivatives such as Iceberg or Delta Lake). By understanding the dataset’s partitioning, Dremio can perform partition pruning, the process of excluding irrelevant partitions of data during the query optimisation phase, to boost query performance (see Data Partition Pruning). Partition bucketing provides a […]
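The bucketing-plus-pruning idea in this teaser can be sketched in plain Python (a conceptual illustration only, not Dremio’s implementation; the CRC32 hash and the `NUM_BUCKETS` value are assumptions). Rows are grouped into a fixed number of buckets by hashing a high-cardinality column, so an equality filter on that column needs to scan only one bucket rather than the whole dataset:

```python
# Conceptual sketch of partition bucketing with pruning: a filter on the
# bucketed column maps to exactly one bucket, so all other buckets are
# skipped ("pruned") at query time.
from collections import defaultdict
from zlib import crc32

NUM_BUCKETS = 8  # assumed bucket count; real systems make this configurable

def bucket_of(value: str) -> int:
    """Map a column value to a stable bucket id."""
    return crc32(value.encode("utf-8")) % NUM_BUCKETS

def write_partitioned(rows):
    """Group rows into buckets keyed by the (hypothetical) user_id column."""
    buckets = defaultdict(list)
    for row in rows:
        buckets[bucket_of(row["user_id"])].append(row)
    return buckets

def query(buckets, user_id):
    """Pruned scan: only the single matching bucket is read and filtered."""
    return [r for r in buckets[bucket_of(user_id)] if r["user_id"] == user_id]

rows = [{"user_id": f"u{i}", "amount": i} for i in range(1000)]
buckets = write_partitioned(rows)
result = query(buckets, "u42")  # touches ~1/8 of the data, not all 1000 rows
```

Because the hash is deterministic, the same expression that assigned each row to a bucket at write time can be re-applied to the filter value at query time, which is what makes the pruning safe.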
- Dremio Blog: Open Data Insights
  Apache Polaris Releases Version 1.1.0 (Better Federation, MinIO Support, and More)
  Support for Hive Metastore federation and modular catalog integration makes Polaris more adaptable to real-world data environments. Enhancements to external authentication and Helm-based deployment reduce friction for teams operating in secure, regulated environments. And with expanded support for S3-compatible storage, the catalog can now accompany your lakehouse architecture into hybrid and edge deployments without compromise.
- Dremio Blog: Various Insights
  The Growing Apache Polaris Ecosystem (The Growing Apache Iceberg Catalog Standard)
  What makes Polaris especially exciting is the trajectory it’s on. Today, it is a powerful, open catalog for Iceberg tables. Tomorrow, it could serve as the central control plane for managing a full range of lakehouse assets, unifying governance, access, and interoperability across an increasingly complex data ecosystem.
- Engineering Blog
  Column Nullability Constraints in Dremio
  Column nullability serves as a safeguard for reliable data systems. Apache Iceberg’s capabilities in enforcing and evolving nullability rules are crucial for ensuring data quality. Understanding null handling, along with the specifics of engine support, is essential for constructing dependable data systems.
- Product Insights from the Dremio Blog
  Dremio Reflections – The Journey to Autonomous Query Acceleration
  Reflections are a query acceleration capability unique to Dremio that works by minimising data processing times and reducing computational workloads. Debuting in the early days of Dremio, Reflections accelerate data lake queries by creating optimised Apache Iceberg data structures from file-based datasets, delivering orders-of-magnitude performance improvements. However, the game-changing aspect of this technology was not […]
- Dremio Blog: Various Insights
  Optimizing Apache Iceberg Tables – Manual and Automatic
  When combined with Dremio’s query acceleration, unified semantic layer, and zero-ETL data federation, Enterprise Catalog creates a truly self-managing data platform—one where optimization is just something that happens, not something you have to think about.
- Dremio Blog: Various Insights
  Optimizing Apache Iceberg for Agentic AI
  By using Dremio as the data gateway, organizations improve security, reduce complexity, and give their agents the reliable, performant access they need—without reinventing the data stack. This frees developers to focus less on credentials, connectors, and workarounds, and more on building the intelligent workflows that drive business impact.
- Product Insights from the Dremio Blog
  Realising the Self-Service Dream with Dremio & MCP
  A promise of self-service data platforms, such as the Data Lakehouse, is to democratise data. The idea is that they empower business users (BUs), those with little or no technical expertise, to access, prep, and analyse data for themselves. With the right platform and tools your subject matter experts can take work away from your […]
- Product Insights from the Dremio Blog
  5 Ways Dremio Makes Apache Iceberg Lakehouses Easy
  Dremio simplifies all of it. By bringing together query federation, an integrated Iceberg catalog, a built-in semantic layer, autonomous performance tuning, and flexible deployment options, Dremio makes it easier to build and run a lakehouse without stitching together multiple tools.