Henkel’s Laundry & Home Care division sells laundry detergents and household cleaning products and accounts for nearly 6.6 billion euros in sales and one-third of the company’s core business. The division generates massive datasets in its supply chain for demand planning/forecasting, supply network planning, production scheduling, manufacturing and logistics for 33 production plants, 70 contract manufacturers and 60 warehouses around the world.
In 2016, Henkel struggled to connect the data silos across its supply chain to get the insights needed to better manage the business. They had no visibility into the data across those silos and relied on Microsoft Excel reports generated weekly, monthly or quarterly. With Excel reports exceeding 1.4 million rows, integrating data across different silos and functions was challenging. As a result, they could only look at one aspect of the supply chain at a time, or had to turn to external consultants and vendors for cross-functional analysis.
Henkel’s Laundry & Home Care division started improving its data analytics by implementing Cloudera and Apache Spark in 2017. While this platform was a step in the right direction, significant challenges remained: Cloudera required a large team to maintain, it did not scale well, and query times were slow. To address these limitations, the team responsible for the Henkel Data Foundation, a highly integrated data management, processing and analytics platform based on fully elastic cloud technologies, recommended that the Laundry & Home Care division evaluate Dremio.
Henkel wanted to build a data lake solution that would increase the agility of advanced analytics, be able to support unstructured data and create data labs for their data scientists. With its strong query performance and semantic layer capabilities, Dremio is the perfect backbone for the Henkel data lake.
Over the past year, Henkel has migrated from Cloudera and a data warehouse architecture to a new solution platform based on Microsoft Azure Data Lake Storage (ADLS), Dremio, Databricks and Tableau. They can now natively analyze data stored in ADLS and leverage Dremio for data joins, filtering and transformations. Using Dremio to curate data as virtual datasets also enables Henkel to significantly increase the speed of critical supply chain dashboards in Tableau.
As you can see from the graphic below, both Dremio and Databricks play a crucial role in Henkel’s secured data access layer. Henkel uses Databricks mostly for the automatic pre-processing of mass data from the Henkel Data Foundation. Dremio enables the easy implementation of use cases for IT and end users.
By integrating data silos, Henkel has dramatically increased data visibility and transparency, providing one source of truth for their business data. In 2016, the team would go to meetings and everyone would bring their own charts, requiring them to spend the first 15 minutes of the meeting bringing the different charts together in a meaningful way. Today Henkel’s data quality is much higher and they can leverage data across different functions, generating valuable insights to bridge the gap between supply chain planning and production.
In the past, when production planners were asked how many bottles of laundry detergent the line could produce per minute, they would typically make a conservative estimate of the line speed/capacity, not wanting to overpromise.
With significantly increased visibility into their production capacity, Henkel can now base planning on reality. Once they had dynamic data planning tools, they compared planned speed vs. actual speed and found a huge gap in Overall Equipment Effectiveness (OEE), a key metric that measures how efficiently production lines are running. Henkel now has live, dynamic parameter settings in its planning tool, which has driven a >10% OEE increase since the system was introduced. With 33 production plants and around 400 lines, 250 of them connected in real time, this was a huge productivity gain and just one example of how data drives cost savings.
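The gap between conservative planned speeds and actual line capability shows up directly in OEE, which by the standard definition is the product of availability, performance and quality. The sketch below illustrates this with entirely hypothetical figures (not Henkel's actual data), where the performance factor captures the hidden loss from planning against an underestimated line speed.

```python
# Illustrative OEE calculation using the standard formula:
# OEE = availability x performance x quality.
# All figures below are hypothetical examples, not Henkel production data.

def oee(availability: float, performance: float, quality: float) -> float:
    """Overall Equipment Effectiveness as a fraction (0.0 - 1.0)."""
    return availability * performance * quality

# A line planned conservatively at 80 bottles/min that is actually
# capable of 100 bottles/min hides a 20% performance loss in the plan.
planned_speed = 80.0       # bottles per minute (conservative planner estimate)
actual_capability = 100.0  # bottles per minute (measured line capability)

availability = 0.90        # share of scheduled time the line actually ran
performance = planned_speed / actual_capability  # 0.80: hidden speed loss
quality = 0.99             # share of bottles produced without defects

print(f"OEE: {oee(availability, performance, quality):.1%}")
```

Comparing planned vs. measured speed in a live planning tool surfaces exactly this kind of performance loss, which is where the reported OEE improvement came from.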
Previously, Henkel had been extracting data from multiple sources into Tableau because Tableau could not connect natively to multiple data sources at once. Data volumes were exploding at such a pace that Henkel needed to invest in new hardware every six months to keep expanding computing power and maintenance, costs that were hard to justify because they did not improve any supply chain KPI.
Henkel needed a stable environment that could connect directly to the data sources. Dremio solved that problem and has had a major impact on both maintenance and cost: now that query loads are managed by Dremio, Henkel no longer needs to keep expanding its computing hardware.
The old system only had one specific dashboard with a live connection to a 3.5 billion-row dataset. If they wanted to do a query with a specific filter in Tableau, for example looking at a forecast for one brand, it would take 3-4 minutes to execute a query, which was an extremely long time to wait for an answer. Using Dremio, Henkel reduced its query time 30x to 8 seconds.
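The 30x figure follows directly from the numbers cited: a query that took 3-4 minutes now completes in 8 seconds. A quick arithmetic check, taking the upper bound of the original range:

```python
# Sanity check of the reported query speedup (illustrative arithmetic only).
old_query_seconds = 4 * 60  # previously 3-4 minutes; using the 4-minute upper bound
new_query_seconds = 8       # with Dremio

speedup = old_query_seconds / new_query_seconds
print(f"{speedup:.0f}x faster")  # -> 30x faster
```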
The Laundry & Home Care division now has >500 Tableau dashboards where business users can get fast answers to practically any supply chain query about demand and supply planning, production or inventory and use those insights to make rapid and informed business decisions.
For more information about how Henkel is leveraging Dremio to increase efficiency, reduce costs and accelerate time to business insights, read the full case study here.