AI Series – Part 4: Predictive Modeling

After Data Preparation, Data Exploration and Visualization is the stage where you get to know your data. Here you perform Exploratory Data Analysis (EDA) to understand its structure and characteristics: analyzing the fields, identifying outliers, and visualizing the data to surface insights and patterns. This stage gives you a clearer picture of what you have and flags potential problems that need to be addressed. It's the first step toward building predictive models, which in turn lead to strong AI models.
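To make that concrete, here is a minimal sketch of what an EDA pass can look like in practice, using pandas on a small synthetic asset-sensor dataset (the field names are hypothetical): summarize the fields, then flag outliers with a simple IQR rule.

```python
# A minimal EDA pass on a hypothetical asset-sensor dataset:
# summarize the fields, then flag outliers with a simple IQR rule.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
df = pd.DataFrame({
    "asset_id": rng.integers(1, 20, size=500),
    "temperature_c": rng.normal(65, 8, size=500),
    "vibration_mm_s": rng.gamma(2.0, 1.5, size=500),
})

# Structure and basic statistics of each field.
print(df.describe())

# Flag readings outside 1.5 * IQR as candidate outliers for review.
q1, q3 = df["vibration_mm_s"].quantile([0.25, 0.75])
iqr = q3 - q1
outliers = df[(df["vibration_mm_s"] < q1 - 1.5 * iqr) |
              (df["vibration_mm_s"] > q3 + 1.5 * iqr)]
print(f"{len(outliers)} candidate outliers flagged for review")
```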

Predictive Models are built from algorithms that use statistical analysis and data mining to identify patterns and trends in your data. A good example is the weather app on your phone, which combines years of historical weather data with current conditions to give you a 7-day outlook so you can plan accordingly. These models are the precursors to AI models, as they train and align the data coming from all your systems.
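As an illustration only (entirely synthetic data, not how any weather app actually works), here is a tiny predictive model that learns from "historical" readings to forecast the next value:

```python
# Illustration only: fit a simple model on synthetic "historical weather"
# so the two previous days' readings predict the next day's temperature.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
days = np.arange(3 * 365)
temps = 15 + 10 * np.sin(2 * np.pi * days / 365) + rng.normal(0, 2, days.size)

# Lag features: the two previous days predict the next day.
X = np.column_stack([temps[:-2], temps[1:-1]])
y = temps[2:]

model = LinearRegression().fit(X, y)
tomorrow = model.predict([[temps[-2], temps[-1]]])
print(f"Predicted next-day temperature: {tomorrow[0]:.1f} C")
```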

EDA is a grey area with a lot of noise: no one will actually tell you what to do to get a functional predictive model running, and the information out there makes it seem far more complex than it is. What is the exercise? What is the process? Silence… They all want you to think it has to be complex because they want your business. “Just choose from one of the many models (trees, neural networks, etc.), run your historic and current data through these algorithms, and success follows – we can help.” Ok, sure.

I would like to suggest a way to handle this grey area with a focused, actionable exercise that actually provides value. Here it is, and I would appreciate your feedback.

Predictive Modeling:

Use Predictive Modeling as your EDA. It’s a solid exercise that helps you manually train your data, your teams, and your knowledge before hitting the AI button and choosing the wrong model.

Use Case: As an example for Field Service, I have built RT (Real Time) IoT simulations against assets so that actual data flows through an entire lifecycle. To keep it simple, think of a contained asset like an MRI machine with limited downstream and upstream dependencies.
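Here is a hedged sketch of what such a simulation can look like: a single contained asset emitting synthetic sensor readings, with a threshold breach standing in for the sensor being "hit". The field names and thresholds are assumptions for illustration.

```python
# A sketch of a real-time IoT simulation for one contained asset
# (e.g. an MRI machine): synthetic readings stream in, and a reading
# past a threshold triggers a simulated case event.
import random
import time
from dataclasses import dataclass

@dataclass
class SensorReading:
    asset_id: str
    helium_level_pct: float   # hypothetical field names for illustration
    coil_temp_c: float

def simulate_readings(asset_id: str, n: int = 10):
    """Yield a stream of synthetic readings for one asset."""
    for _ in range(n):
        yield SensorReading(
            asset_id=asset_id,
            helium_level_pct=random.uniform(60, 100),
            coil_temp_c=random.uniform(3.5, 5.5),
        )
        time.sleep(0.1)  # stand-in for real-time arrival

for reading in simulate_readings("MRI-001"):
    if reading.helium_level_pct < 70:
        print(f"Threshold hit on {reading.asset_id}: open IoT Case")
```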

A sensor is hit and sends an IoT Case (Record Type) for Asset X, with the data landing in assorted fields on the Case. A Work Order is also created, and the key here is the designated Work Type that drives the actions around the treatment of the Asset. The Work Type determines the parts, tools, skills, times, and many other key items that define the Asset issue.
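The record relationships described above can be sketched roughly like this; the class and field names are hypothetical stand-ins, not any specific platform's schema:

```python
# The sensor event opens a Case; the Work Type attached to the Work Order
# carries the parts, tools, skills, and time estimate that define the job.
from dataclasses import dataclass, field

@dataclass
class WorkType:
    name: str
    required_parts: list[str]
    required_tools: list[str]
    required_skills: list[str]
    estimated_hours: float

@dataclass
class WorkOrder:
    asset_id: str
    work_type: WorkType

@dataclass
class Case:
    asset_id: str
    record_type: str                       # e.g. "IoT"
    sensor_payload: dict = field(default_factory=dict)

# The sensor hit on Asset X creates the Case; the Work Type drives the job.
case = Case(asset_id="ASSET-X", record_type="IoT",
            sensor_payload={"helium_level_pct": 64.2})
wt = WorkType("Helium Refill", ["helium canister"], ["transfer kit"],
              ["cryogenics"], estimated_hours=3.0)
wo = WorkOrder(asset_id=case.asset_id, work_type=wt)
print(wo)
```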

With this design, the user can see which Vans carry the parts needed for the call, so you know the FS Tech has what they need to fulfill the order (see the sketch below). When the FS Tech visits the asset and discovers during the investigation that the Work Order is incorrect, rather than letting that valuable information go, run a dynamic checklist to show whether the real issue sits further downstream or upstream on the Asset.
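A small sketch of the van-matching idea, with assumed data shapes: compare the Work Type's required parts against each van's inventory so dispatch can see who can fulfill the call.

```python
# Match the Work Type's required parts against each van's inventory.
required_parts = {"helium canister", "transfer kit seal"}

van_inventory = {
    "VAN-01": {"helium canister", "brake pads"},
    "VAN-02": {"helium canister", "transfer kit seal", "coil sensor"},
}

# A van is capable if its inventory covers every required part.
capable_vans = [van for van, parts in van_inventory.items()
                if required_parts <= parts]
print(capable_vans)  # -> ['VAN-02']
```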

We worked with a train company where the Work Order might state a brake issue, but on inspection the real issue was in the pressure components further upstream. The dynamic checklist maps the issue correctly and the correction is applied to the Predictive Model. I am limiting space here, but hopefully this makes sense: the Predictive Model sits in front of the process (the input), while analytics sit at the back (the output) and feed back into the Predictive Model, which defines the Work Types. The better the relationship with the Asset, the better the response.
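A hedged sketch of that feedback loop, with a hypothetical checklist rule: the tech's answers re-label the Work Order, and the corrected label is kept so it can flow back into the Predictive Model's next training run.

```python
# The dynamic checklist re-labels the issue; corrections accumulate as
# training examples for the next run of the Predictive Model.
corrections = []  # (original_work_type, corrected_work_type) pairs

def run_checklist(reported_work_type: str, answers: dict) -> str:
    """Map checklist answers to the actual upstream/downstream issue."""
    if reported_work_type == "Brake Repair" and answers.get("pressure_low"):
        corrected = "Pressure Component Repair"   # upstream of the brakes
    else:
        corrected = reported_work_type
    corrections.append((reported_work_type, corrected))
    return corrected

actual = run_checklist("Brake Repair", {"pressure_low": True})
print(actual)          # -> Pressure Component Repair
print(corrections)     # relabeled examples ready for the next training run
```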

This is one example of a model that can be run for an asset, but the same approach can be executed for many Use Cases. If you want more details, just reach out; I would be happy to assist.

Start here and you will be able to run a manual test and discover some amazing things about your people, your processes, and the assets (data, products, services) your company creates.

Previous: AI Series – Part 3: Data Collection and Preparation

Next: AI Series – Part 5: Modeling Intro