Data Insight
Explaining what factors are important
In today's overload of information, knowing what matters and what is merely a flow-on effect can make the difference between survival and disaster. With Data Insights we analyse your data and tell you which factors were the key drivers and how they affected the outcome. From price rises, product releases, and market changes to risk and strategic planning, such insights give you real foresight.
Gaining Management Trust in AI
Unfortunately, high-value AI models like Boosted Trees, Random Forests, and Neural Networks, although good at predicting outcomes, make it hard to understand how they reach their conclusions (they are commonly referred to as black-box models). This leads to a lack of trust among operational managers and, in turn, poor uptake of AI.
Data Insights breaks down this barrier to success by giving managers an understanding of the underlying drivers, supporting informed strategic and operational decision making. That understanding builds confidence in Machine Learning models and in their predictions of the likely outcome when those factors change.
What Data Insights can do for you
Every data set is flawed! Not just in high-risk medical situations or bank loan approvals, but even in standard business problems like churn modelling. With Data Insights you can find these flaws, adjust for them, and build more robust models as a result.
With Data Insights we use Microsoft's Machine Learning interpretation tools, which let you see exactly what an AI model has learned, feature by feature, and really understand how it makes its decisions.
These tools produce interactive plots that help you dive into the model and understand how it works, starting with a ranking of which features the model finds most important overall.
Extract from Microsoft Research demonstration of the Microsoft Machine Learning interpretation tools.
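To make the feature-ranking step concrete, here is a minimal sketch using scikit-learn's permutation importance as a stand-in for the Microsoft interpretation tooling described above: each feature is shuffled in turn, and the drop in test accuracy measures how much the model relies on it. The dataset and model choices are illustrative assumptions, not part of the Data Insights product itself.

```python
# Sketch of global feature-importance ranking via permutation importance.
# Dataset (breast cancer) and model (Random Forest) are illustrative only.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Shuffle each feature in turn and measure the drop in held-out accuracy;
# a large drop means the model leans heavily on that feature.
result = permutation_importance(model, X_test, y_test,
                                n_repeats=5, random_state=0)

# Rank features from most to least important.
ranking = sorted(zip(X.columns, result.importances_mean),
                 key=lambda pair: pair[1], reverse=True)
for name, score in ranking[:5]:
    print(f"{name}: {score:.3f}")
```

This is the same idea an interactive explainer dashboard starts from: a single ordered list of which inputs drive the model's predictions overall, before drilling into how each one behaves.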
As a precursor to AI modelling
Understanding your data is the most important step on the path to developing correctly targeted Machine Learning models. Data Insights helps you understand what questions you should be asking and what outcomes you should really be looking for.
Or for explainability of existing models
If you aren't happy with the adoption rate of your existing models, or you're not seeing the promised returns, it may be worth stepping back and re-examining what you're working with and what your goals are.