Auditing Your Data

At Nielsen, our data flows through a robust Kafka architecture into several ETL pipelines that receive, transform, and store it.

While we understood our ETLs’ workflow, we had no visibility into what parts of the data, if any, were lost or duplicated, and in which stage or stages of the workflow.

I will present the design process behind our Data Auditing system, Life Line, which uses Kafka, Avro, Spark, Lambda functions, and complex SQL queries. We’ll cover:

  • Avro Audit header
  • Auditing heartbeat – designing your metadata
  • Designing and optimizing your auditing table
  • Creating an alert-based monitoring system
  • Answering the most important question of all – is it the end of the day yet?
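To make the idea of an Avro audit header concrete, here is a minimal Python sketch of the pattern: each ETL stage stamps the batches it emits with a small header, and comparing record counts across stages exposes loss or duplication. The schema and field names (`batch_id`, `stage`, `record_count`, `emitted_at_ms`) are illustrative assumptions, not Nielsen's actual Life Line schema.

```python
import time
import uuid

# Hypothetical Avro schema for an audit header attached to every batch.
# Field names are illustrative assumptions, not the real Life Line schema.
AUDIT_HEADER_SCHEMA = {
    "type": "record",
    "name": "AuditHeader",
    "fields": [
        {"name": "batch_id", "type": "string"},     # unique id per batch
        {"name": "stage", "type": "string"},        # ETL stage that emitted it
        {"name": "record_count", "type": "long"},   # records seen at this stage
        {"name": "emitted_at_ms", "type": "long"},  # emission time, epoch millis
    ],
}

def make_audit_header(stage: str, record_count: int) -> dict:
    """Build the header an ETL stage attaches to a batch it emits."""
    return {
        "batch_id": str(uuid.uuid4()),
        "stage": stage,
        "record_count": record_count,
        "emitted_at_ms": int(time.time() * 1000),
    }

# Comparing counts between consecutive stages reveals lost or duplicated data:
ingest = make_audit_header("ingest", 1000)
transform = make_audit_header("transform", 998)
lost = ingest["record_count"] - transform["record_count"]
print("records lost between ingest and transform:", lost)
```

In a real pipeline the header would be serialized with the Avro schema above and written alongside the payload, so a downstream auditing table can join counts per `batch_id` across every stage.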
