Abhishek Mehta
Aug 27, 2024
I will start by admitting that when it comes to envisioning a world powered by AI, I am in the ALL IN camp.
Having spent the last decade focused on all things data has only amplified my belief that the right data feeding AI will be key to this future.
As a practitioner, advisor, entrepreneur, student, and (sometimes) expert on all things data and their relative value, I do believe AI will drive the biggest overhaul of corporate business models we have ever witnessed.
And while the advances in consumer AI will influence what occurs in the enterprise (every enterprise is a B2C or B2B2C business, after all), the factors that will drive Enterprise AI adoption – and the tools, technologies, and trades needed to make it work – will be markedly different.
The biggest bet of all for Enterprise AI will be the need for DATA AT SCALE.
Which raises the question – what creates ‘DATA AT SCALE’?
To answer that question, we must understand what constrains the creation and use of systems that produce data at scale. In my view, the two main constraints to creating usable AT SCALE systems are TRUST and PERFORMANCE.
First and foremost, if we are unable to replicate the accuracy, fidelity, lineage, and consistency that we see with data ‘in test’, we will never TRUST what comes out of these systems at scale. And, consequently, we will forever be hesitant to use them.
Secondly, at-scale data systems must be PERFORMANT – with regard to speed, affordability, and flexibility – so much so that if a system ceases to be performant when scale changes, we will never be able to justify implementing it.
What enterprises are doing today to begin their Gen AI journey – under pressure from the market to show that they “get it” (are “AI ready”) – is merely stitching together existing systems with some AI overlays to show progress.
After a year of these ‘knee-jerk’ experiments, the most recent reporting on AI ROI finds that no more than 10–15% of these experiments have been institutionalized.
The reason is quite simple: existing tools, built in the ’80s and ’90s, were engineered for a vastly different scale and purpose. And a little “GPT” lipstick will not fix the systemic data challenges that have remained the bane of data-driven enterprise transformation efforts over the past two decades.
In fact, given the already crumbling state of these last-generation systems – struggling to keep up with current workloads, and so brittle that they can easily be tripped – any additional load would be disastrous. The unfortunate events that unfolded recently are proof of this.
What the world needs now are business models designed to run at (global) scale … only possible with next-generation systems and solutions fundamentally designed to deliver DATA AT (absolute) SCALE.
Image Source – Art of Intelligence | Refik Anadol