Auditing Your Data

At Nielsen, our data flows through a robust Kafka architecture into several ETLs that receive, transform, and store it.
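
To make the setting concrete, here is a minimal sketch of what one such ETL stage can look like, using the kafka-python client. The topic names, broker address, and transform are illustrative placeholders, not our production code.

```python
# A minimal sketch of one ETL stage in a Kafka pipeline, using the
# kafka-python client. Topic names, broker address, and the transform
# are hypothetical placeholders.
import json

from kafka import KafkaConsumer, KafkaProducer

consumer = KafkaConsumer(
    "events.raw",                        # hypothetical input topic
    bootstrap_servers="localhost:9092",
    group_id="etl-stage-1",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda d: json.dumps(d).encode("utf-8"),
)

for message in consumer:
    record = message.value
    record["normalized"] = True          # stand-in for the real transform
    # Hand the record to the next stage; this hop is where records can
    # silently be lost or duplicated without an audit trail.
    producer.send("events.transformed", value=record)
```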

While we understood our ETLs’ workflow, we had no visibility into which parts of the data, if any, were lost or duplicated, or at which stage or stages of the workflow it happened.

I will present the design process behind our Data Auditing system, Life Line, which uses Kafka, Avro, Spark, Lambda functions, and complex SQL queries. We’ll cover:

  • Avro Audit header (a schema sketch follows this list)
  • Auditing heartbeat – designing your metadata
  • Designing and optimizing your auditing table (see the query sketch after this list)
  • Creating an alert-based monitoring system
  • Answering the most important question of all – is it the end of the day yet?
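
As a taste of the first bullet, here is a minimal sketch of an audit header each record could carry: an Avro schema in its JSON form, plus a helper that stamps it at every hop. The field names and helper are hypothetical illustrations, not Life Line’s exact schema.

```python
# A hypothetical Avro audit header, expressed as a Python dict in
# Avro's JSON schema form. All field names are illustrative only.
import time
import uuid

AUDIT_HEADER_SCHEMA = {
    "type": "record",
    "name": "AuditHeader",
    "fields": [
        {"name": "record_id",  "type": "string"},  # unique per record, for dedup checks
        {"name": "batch_id",   "type": "string"},  # groups records for count checks
        {"name": "stage",      "type": "string"},  # which ETL stage stamped this hop
        {"name": "emitted_at", "type": "long"},    # epoch millis at the stamping stage
    ],
}

def stamp_audit_header(record: dict, stage: str, batch_id: str) -> dict:
    """Attach (or carry forward) the audit fields a stage reports on."""
    record.setdefault("record_id", str(uuid.uuid4()))  # keep the id across hops
    record["batch_id"] = batch_id
    record["stage"] = stage
    record["emitted_at"] = int(time.time() * 1000)
    return record
```

For the auditing table and the alerting built on top of it, a basic completeness check compares per-batch record counts between consecutive stages. The sketch below runs Spark SQL against a hypothetical life_line_audit table with assumed stage, batch_id, and record_id columns; any batch it returns has lost or duplicated records somewhere between the two stages.

```python
# A hypothetical loss/duplication check over an auditing table named
# life_line_audit(stage, batch_id, record_id). Table, column, and stage
# names are assumptions for illustration.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("life-line-audit-check").getOrCreate()

per_batch = spark.sql("""
    SELECT batch_id,
           COUNT(DISTINCT CASE WHEN stage = 'ingest'    THEN record_id END) AS ingested,
           COUNT(DISTINCT CASE WHEN stage = 'transform' THEN record_id END) AS transformed,
           COUNT(CASE WHEN stage = 'transform' THEN 1 END)                  AS transform_rows
    FROM life_line_audit
    GROUP BY batch_id
""")

# Loss: fewer distinct ids after the hop. Duplication: more rows than
# distinct ids at a stage.
problems = per_batch.where("ingested <> transformed OR transform_rows > transformed")
problems.show()
```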