March 2, 2023

10:10 am - 10:40 am PST

Delivering Self-Service Analytics on Dynamics 365 ERP

In this talk, Ricardo describes how Pon was able to streamline its data pipelines and deliver frictionless self-service analytics on Dynamics 365 ERP data stored in a data lake, leveraging ADLS, Dremio, and Tableau.

Topics Covered

Customer Use Cases
Real-world implementation


Transcript

Note: This transcript was created using speech recognition software. It may contain errors.

Ricardo Rietveld:

So my name is Ricardo Rietveld and I am a BI architect working for Pon. Pon is a Dutch family company whose head office is based in Amsterdam. It's open during visiting hours, so you can see all our mobility solutions, especially for kids. It's a great experience center where they can do some racing and some games. So if you ever visit Amsterdam and you bring your kids, please feel free to visit; I think it's a great experience. So again, my name is Ricardo Rietveld, and I'm a BI architect at Pon Equipment, Pon Power. The key figures for Pon as a whole: we have revenue of over 8 billion euros and 16,000 employees working in 65 countries. My division, or our division, is specifically Pon Equipment, Pon Power. We are an official Caterpillar dealer for the Netherlands and Norway, and we have around 1,200 employees. This ERP implementation we have done with our partner called Accept, and they are a key Microsoft partner.

A little bit about our analytics timeline. I have worked for Pon since 2012, and before that time we already had an analytical platform, introduced in 2008, which was based on SQL Server and Cognos. Before that, we had a lot of direct queries on our R series. What you can see here is that everything is marked "to present," and that's because these systems are still alive; we still run them today. In 2014 we started a new project. It was all about self-service BI, and we introduced IBM Netezza, which is a massively parallel processing appliance, so very, very fast, together with CDC to enable near-real-time reporting, and also InfoSphere DataStage together with Tableau. That was introduced in 2014, and as David already mentioned, we brought our first company live in June 2022. We actually started about a year earlier, so somewhere in 2021 we began working with Dremio and Power BI on top of a data lake. So before Dremio, what was the difficulty? We had multiple pipelines from all our legacy systems, so DataStage and

SSIS, which most people will be familiar with. If you have multiple legacy systems, each tool requires a specialized set of skills, and that is costly and not always accessible. The other issue we were facing is that moving data comes with sync issues. With our new platform we were trying to avoid these, and they were part of our selection criteria. Just some specifics on this Export to Azure Data Lake: previously Microsoft told their customers to bring their own database, and most of the time this was a SQL database, but now they have moved to the data lake. Because you can't access the data in their production environment directly, what they provide is this: they bring the data to a data lake, this is fully managed by Microsoft, and you can do that with up to 350 tables.

And all changes are constantly updated in the data lake; I think this is based on CDC technology. The takeaway here is that we don't have to worry about this anymore. It takes out all the pipelines that we once had. It is fully managed by Microsoft, and Microsoft did it in such a way that it still fulfills all the key elements of a data warehouse. So this is our starting point. With the move to a data lake, where the data just sits there, we were also decoupling the semantics from the data. Now we could put a hundred percent of our emphasis on building a semantic layer for our end customers, and no longer had to worry about moving the data from A to B.

Okay. Before we went to Dremio, we had some considerations. What we wanted to do was avoid the need to duplicate data; I have already mentioned that. Our customers were already experiencing high performance and near-real-time reporting, so that had to be maintained, because our company runs a lot of ad hoc queries, they still want to make use of that possibility, and they would rather have it yesterday than tomorrow. So the time to market needed to be very small. We are also very accustomed to Power BI and Tableau, so it needed to integrate with these tools. It also needed to have embedded security. And one of the key factors is that it is SQL based. I know that a lot of tools out

there promote no-code, but in our company we have quite a lot of people who still write SQL, and I think it's better to teach someone to write better SQL than to push them toward a very specific tool that they then have to learn. Okay, after working with Dremio myself for two years, why would we keep Dremio? Well, the main thing I have noticed is that Dremio is very intuitive. Everything Dremio does, it keeps under the hood, exposing only those things that the end user really requires, so it takes out the complexity. And what I have seen, having onboarded numerous people onto this platform already, is that it doesn't require a lot of time to onboard and train people.

It usually takes a few hours and then they can start building value for the company. I don't have to send them out to a specific training that takes ages; they can start right away, which is very nice. The only thing they have to understand is SQL, and that is still the de facto standard in the world. Yeah, so that's good. So what were the factors for our success? We were moving toward an environment where we had to build new data models, but we didn't know the ERP system, we didn't know the processes, and we didn't know the data. In other words: where to begin? There are actually three pillars I want to mention. The first is Target, which is an Accept partner; they already provide a BI analytics platform for other heavy equipment dealers.

And we just bought their data models; that's what we did. The second is SQL Server itself, not the production database but the test and UAT environments, which have so-called entities. In essence, these are database views that combine all sorts of tables, and again, this is SQL. The third is Dynamics 365 itself: if you are an admin, you can also open the forms and see what kind of pseudo code is being used underneath. These three pillars will give you a head start in building your own data models in Dremio, and this really sped up our process, so we could do it in a matter of weeks. For those technical people out there who would like to see what that looks like: on the top right you see one of those entities, and people with a little bit of coding experience will recognize the main attributes, the main tables, and the relationships between those tables. The pseudo code is not really different; it is written differently, but again, you will see all the tables, the relationships between them, and how to use them. Now, having this repository at our disposal, we were able to build all the data models quite fast.
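For readers who want a concrete picture, here is a minimal sketch of what such an entity-style view can look like. The table and column names (SalesOrderHeader, SalesOrderLine, Customer, and so on) are hypothetical and invented for illustration only; they do not match the real Dynamics 365 schema.

```sql
-- Hypothetical sketch: an "entity" is essentially a database view that joins
-- the underlying tables and exposes business-friendly columns.
-- All names below are invented for illustration.
CREATE VIEW SalesOrderLinesEntity AS
SELECT
    soh.SalesOrderNumber,
    soh.OrderDate,
    cust.CustomerAccount,
    cust.CustomerName,
    sol.ItemNumber,
    sol.QuantityOrdered,
    sol.LineAmount
FROM SalesOrderHeader soh
JOIN SalesOrderLine   sol  ON sol.SalesOrderId = soh.SalesOrderId
JOIN Customer         cust ON cust.CustomerId  = soh.CustomerId;
```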

So what do we need? Going back to onboarding and training people: you always want to keep things simple. You don't want to teach them the exceptions; what you really want to do is give them a concept and build on top of that concept. What we did was a three-layer concept. The first layer is the staging layer: the Export to Data Lake actually brings CSV data into the data lake, so it doesn't have any metadata, and this is where we enrich all that CSV data into table format with proper metadata. Then there is a second layer where all the business logic and all the integration between tables lives; that's called the integration layer. Finally, we have a semantics layer, built on a Kimball model, so with facts and dimensions. Keeping it quite simple, again, gives people the ability to build on top of what already exists. So do not train people in exceptions; always train people in best practices and key concepts.
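As an illustration of that three-layer concept, here is a minimal sketch written as generic SQL views. The source paths, table names, and columns (CustTrans, CustTable, AmountCur, and so on) are hypothetical, and the exact view DDL and lake paths depend on your Dremio setup; this is not Pon's actual model.

```sql
-- Layer 1: staging - give the raw CSV export proper column names and types.
CREATE VIEW staging.CustTrans AS
SELECT
    CAST(RecId      AS BIGINT)        AS RecId,
    CAST(AccountNum AS VARCHAR)       AS AccountNum,
    CAST(TransDate  AS DATE)          AS TransDate,
    CAST(AmountCur  AS DECIMAL(18,2)) AS AmountCur
FROM datalake."Tables"."CustTrans";
-- (staging.CustTable would be defined the same way from its CSV export)

-- Layer 2: integration - apply business logic and combine staging tables.
CREATE VIEW integration.CustomerTransactions AS
SELECT t.RecId, t.TransDate, t.AmountCur, c.CustomerName
FROM staging.CustTrans t
JOIN staging.CustTable c ON c.AccountNum = t.AccountNum;

-- Layer 3: semantics - a Kimball-style fact exposed to Tableau / Power BI.
CREATE VIEW semantics.FactCustomerTransactions AS
SELECT TransDate AS DateKey, CustomerName, SUM(AmountCur) AS Amount
FROM integration.CustomerTransactions
GROUP BY TransDate, CustomerName;
```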

Graphically, you can see that Microsoft Dynamics just puts all the data in the data lake, and Dremio is a kind of middle layer that serves as a provider of data to a number of client tools. If a customer is very familiar with Tableau, they can just use Tableau. If they would like to do some coding in Python, go ahead, because Dremio will provide the data. If you want tabular data, just go to DBeaver. And even on our platform we still see Excel as a BI tool, so Dremio can provide for that as well. One year down the line, we were already able to do things that we were never able to do before. You have to keep in mind that we are a heavy equipment dealer with a huge installed base of thousands of machines, and all these machines send out their IoT data with a frequency of 15 minutes.

Every 15 minutes, each machine sends out its GPS location and a number of other things. With a normal database or a data warehouse this is very difficult to handle because it is quite a lot of data. Now, with the capabilities of the data lake together with Dremio, we were able to build on it. What you can see here on the top right is a Tableau view, which is a near-real-time view of the use of our machines. If we want to know where the machines are at this moment in time, we are able to. So this is a real enabler. Another enabler is that the Export to Data Lake also provides all the changes; it's called the change feed. With that change feed you can see what changes have occurred, and you can use it for process mining.
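To make the "where are the machines right now" question concrete, here is a minimal sketch of the kind of query a near-real-time view could be built on, assuming a hypothetical telemetry table in the lake; the table and column names are invented for illustration.

```sql
-- Hypothetical sketch: the latest reported position per machine from the
-- IoT feed stored in the data lake.
SELECT MachineId, ReportedAt, Latitude, Longitude
FROM (
    SELECT
        MachineId,
        ReportedAt,
        Latitude,
        Longitude,
        ROW_NUMBER() OVER (PARTITION BY MachineId
                           ORDER BY ReportedAt DESC) AS rn
    FROM datalake."Tables"."MachineTelemetry"
) t
WHERE rn = 1;
```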

Again, it is a bundle of data, a lot of data, because there are many, many changes in a process for a single sales order or a service call or whatever. We were able to build process mining on top of that for the continuous improvement of our own processes. So not only can we do standard analytics with this platform, it also opens up new capabilities, and I think that's great. Just to recap and look at what's next: our journey so far has shown that we can build on Dynamics 365, and the way Microsoft brings the data to the data lake is actually quite nice. Currently it's up to 350 tables, that's the limit, but hopefully in the future that will increase so we can do even more.
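As a rough idea of how a change feed can feed process mining, here is a minimal sketch that turns change events into a simple event log (case, activity, timestamp). The path and column names are hypothetical; the real change-feed layout and the status fields you would mine will differ.

```sql
-- Hypothetical sketch: one ordered row per recorded status change of a sales
-- order, which is the typical input shape for process-mining tools.
SELECT
    SalesOrderNumber AS case_id,
    SalesStatus      AS activity,
    ModifiedDateTime AS event_time,
    ROW_NUMBER() OVER (PARTITION BY SalesOrderNumber
                       ORDER BY ModifiedDateTime) AS step_no
FROM datalake."ChangeFeed"."SalesTable"
ORDER BY case_id, step_no;
```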

What I have shown you is that if you know SQL and you can get your SQL models from somewhere else, you are able to build Dremio models in days. So the time to market and the time to value are really good; in the end, it makes a project successful in a minimum amount of time. And there are always people looking over your shoulder expecting you to bring value, to deliver reports they have not seen before, and I think the data lake together with Dremio has done that job. So what are our next actions? First of all, we have to take out the legacy systems, because again, these require different skill sets,

and those skill sets are just dying out. And the systems are expensive. We also have to show other entities what we are capable of, so they are willing to move onto our analytical platform as well, reducing cost along the way. So this is our journey. I hope all of you out there who use Dynamics 365 have seen that there is a possibility to do this together with Dremio. If you would like any detailed information, you can always contact me; I will help you along the way. I can even provide you with the data models that we have; it's all open for use. So thank you for your time, and if you need me, just reach out. Thank you.
