AI – or Artificial Intelligence – has been one of the most talked-about technology topics of recent years. Sometimes it is portrayed as a threat; at other times it is pure hype, with fantasies of intelligent androids fuelling a marketing frenzy. Many IT vendors and service providers have tried to put an AI spin on their offerings, even when the reality was merely a rule-based workflow engine with a fresh coat of paint.
In fact, most of what we read or hear about AI is actually about something conceptually far simpler: Machine Learning, or ML. This is a subset of AI, but it is a long, long way from intelligent computers and the like. ML simply uses mathematical and statistical models to make automated predictions and inferences from data.
Admittedly, these can be extremely complex models, and there are well-publicized risks in building them on human decision-making or on personal data. However, things are rather different once we move into predictive maintenance, supply chain optimization, quality control or one of a host of other industrial areas. Here, AI techniques such as ML and its multi-layered subset Deep Learning (DL) have the potential to make dramatically faster and more accurate decisions, based on far more data than a human could absorb.
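To make the distinction concrete, here is a toy sketch of what "ML as a statistical model" means in a predictive-maintenance setting. All data and thresholds are illustrative assumptions, not taken from the paper: it fits a simple polynomial trend to synthetic vibration readings and predicts when the machine will cross a maintenance threshold.

```python
import numpy as np

# Synthetic bearing-vibration readings over operating hours
# (illustrative values in mm/s RMS, not real plant data)
hours = np.array([0, 100, 200, 300, 400, 500], dtype=float)
vibration = np.array([1.0, 1.1, 1.3, 1.6, 2.0, 2.5])

# Fit a quadratic trend: a statistical model, not "intelligence"
coeffs = np.polyfit(hours, vibration, deg=2)
model = np.poly1d(coeffs)

ALARM = 4.0  # assumed maintenance threshold (mm/s RMS)

# Solve for when the fitted trend crosses the threshold,
# keeping only real crossings in the future
roots = (model - ALARM).roots
future = roots[(np.abs(roots.imag) < 1e-9) & (roots.real > hours[-1])].real
print(f"Predicted threshold crossing at ~{future.min():.0f} operating hours")
```

The "decision" here is nothing more than extrapolating a fitted curve; real industrial ML uses far richer models and far more data, but the principle is the same.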
It seems clear, therefore, that AI is essential to the continued digitalization of industry – to Industry 4.0 and what comes after it. With that in mind, this paper will look at the myths and realities of AI, and at what you need to know and ask when assessing and/or implementing it in an industrial context.
Download the Business Fit paper to read more…