‘It saved $260 million a year and 4,500 developer years for Amazon – this is insane’: How AI’s ‘Scaling Law’ blows Moore’s Law out of the water, driving price collapse
Generative AI costs have plummeted 80 per cent in less than two years since OpenAI launched ChatGPT. Some of the world’s biggest firms have rolled out company-wide use cases, saving hundreds of millions of dollars and driving new, previously unimaginable revenues. What’s more, AI is fuelling itself. “We're using AI to build AI tools to build better AI”, per Microsoft CEO Satya Nadella. Addo AI CEO and co-founder Ayesha Khanna says the costs will keep falling thanks to the Scaling Law, AI's equivalent of Moore's Law.
What you need to know:
- Tech industry success was built on Moore's Law and the idea that processing capacity doubles every two years at no extra cost, but AI now has an alternative – what Microsoft CEO Satya Nadella calls "the Scaling Law" – under which scale doubles every six months.
- That's just one of the forces driving a collapse in the price of generative AI, down by 80 per cent in less than a year and a half, according to Addo AI CEO and co-founder and L'Oreal Scientific Advisory Board member Dr Ayesha Khanna.
- During her closing keynote speech at last week's Twilio Signal conference in Singapore, Khanna revealed why costs are falling – and the business impact.
- Large global businesses are already delivering company-wide use cases with huge returns.
- Walmart, for example, used generative AI to achieve genuine personalisation at scale, applying it to the more than 420 million SKUs in its ecommerce business.
We're using AI to build AI tools to build better AI. It’s just a new frontier.
Generative AI is 80 per cent cheaper than it was just 16 months ago, delegates to the Twilio Signal conference in Singapore heard last week.
The reasons, according to Dr Ayesha Khanna, CEO and co-founder of Addo AI, as well as a director of Johnson Controls, and a scientific advisory board member for L'Oreal, are myriad and sustained – so expect the prices to keep dropping.
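Taken at face value, an 80 per cent fall over roughly 16 months implies a steep compound monthly decline. A quick back-of-the-envelope sketch, assuming the decline were constant month to month (an illustrative assumption, not a figure from the talk):

```python
# If prices fell 80% over 16 months, 20% of the original price remains.
# Solve remaining = (1 - d)**months for the implied constant monthly decline d.
months = 16
remaining = 0.20
monthly_decline = 1 - remaining ** (1 / months)
print(f"Implied compound monthly price decline: {monthly_decline:.1%}")  # ~9.6% per month
```

In other words, a fall of that size is consistent with prices shedding nearly a tenth of their value every single month.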
During her keynote she quoted comments by Microsoft CEO Satya Nadella on “the Scaling Law”, which he says is analogous to Moore’s Law, the principle that underpinned the economics of microprocessors for decades.
Nadella made his comments at a conference in London a few days earlier, saying “the coolest things I have seen recently is … you can sort of use AI to do the next level of optimisation.”
He said, “Think about the recursiveness of it which is we're using AI to build AI tools to build better AI. It’s just a new frontier.”
According to Nadella, “The Scaling Law as people describe it in AI, it's an empirical law, like Moore's Law was also not a physical law. It was just an empirical observation that we talked about as if it were a law, but it held true."
The Scaling Law suggests we will see a doubling of capacity every six months, which puts Moore’s Law to shame. (Moore’s Law holds that the number of transistors that can be packed onto an integrated circuit doubles every two years, with no increase in price.)
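To see why a six-month doubling puts a two-year doubling to shame, compare the two over the same horizon. A minimal sketch, taking both doubling periods at face value (the six-year horizon is an arbitrary illustration):

```python
# Compound growth under each empirical "law" over the same horizon.
years = 6
moores_law = 2 ** (years * 12 / 24)   # doubles every 24 months -> 3 doublings
scaling_law = 2 ** (years * 12 / 6)   # doubles every 6 months  -> 12 doublings
print(f"Moore's Law over {years} years:  {moores_law:.0f}x")   # 8x
print(f"Scaling Law over {years} years: {scaling_law:.0f}x")   # 4,096x
```

Because the exponent compounds four times faster, the gap is not fourfold but many orders of magnitude.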
“In fact, one of the things that I think a lot about is performance [which is] tokens per dollar per watt, that's the new currency.”
Khanna, however, said the Scaling Law is just one of the reasons why the cost of generative AI is falling so aggressively.
“Why is it becoming cheaper? It's [also] becoming cheaper because, number one, there are more models available. Before it was only OpenAI or Google's Gemini.
"More models are becoming available, such as one released by Nvidia recently that is just as good. More options mean prices come down.”
She also said new chips are coming online which are optimised specifically to process AI. “You have entirely new ways of processing AI, which brings the price down as well.” And she suggested a third reason is optimisations of the models themselves.
TLDR: Use cases
However, Khanna stressed that for most people it is enough to understand that prices are falling and that those falls will continue. Of more interest, she suggested, is the emergence of use cases operating at a scale that delivers genuinely significant business impact.
You can find the evidence for this in the earnings calls of some of the biggest companies in the US.
She gave the example of Walmart, one of the largest retailers in the world and one of the largest employers in the US. In its last earnings call, CEO Doug McMillon described how the business applied generative AI to the more than 420 million SKUs on its ecommerce platform – a task that simply would not have been feasible in the past, given it involved creating content for every SKU.
"He said that without the use of generative AI, [the project] would have required nearly 100 times the current headcount."
She told attendees to the event, “Can you imagine how many people it would take to put in text descriptions, descriptions about price, descriptions about eligibility, description of sizes."
In fact, the project helped to deliver an even more powerful use case – personalisation at scale. "And by doing this, they were able to personalise it for everyone. Each one of us saw something personalised for us because AI could recognise [me] and then match it to what I needed."
"Look at Walmart's ecommerce revenues, they're going up. These are real business results."
She also highlighted comments by Andy Jassy, the Amazon CEO, who described generative AI as a game changer for its business.
"He said that they have a coding assistant called Amazon Q that has dramatically [reduced] their software upgrade time. Software upgrades typically took 50 developer days, and [the assistant] reduced that to [a few] hours. It saved Amazon US$260 million a year and 4,500 developer years. This is insane."
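Those Amazon figures can be sanity-checked with rough arithmetic. Assuming around 250 working days per developer year (an assumption of this sketch, not a figure from the talk), 4,500 developer years at 50 days per upgrade implies tens of thousands of upgrades:

```python
# Rough sanity check of the reported Amazon Q savings.
developer_years_saved = 4_500   # figure quoted in the talk
days_per_upgrade = 50           # "typically took 50 developer days"
workdays_per_year = 250         # assumption: ~250 working days per developer year
implied_upgrades = developer_years_saved * workdays_per_year / days_per_upgrade
print(f"Implied number of upgrades: {implied_upgrades:,.0f}")  # ~22,500
```

That order of magnitude – tens of thousands of application upgrades – is what makes the headline savings plausible rather than hyperbolic.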
Meanwhile, recent comments from Google CEO Sundar Pichai to Alphabet's investors revealed that 25 per cent of Google's code is now written by AI, demonstrating how pervasive the impact of generative AI has become in a short time.
Khanna also offered a suggestion for how marketers are getting in on the act, citing the example of Toys R Us.
Playtime
"Toys R Us has been around for decades and they decided to make the first AI advertisement, using Sora from OpenAI, and they partnered with an ad agency called Native Foreign."
They used generative AI to create a script around the company's origin story about how Charles Lazarus first came up with the idea.
"80 per cent of it was done with prompting in English with AI, and 20 per cent required human involvement. And that human involvement was important because they were constantly refining it. 20 people worked on it and what would have taken months normally took weeks," she said.
"Was the quality great? It was okay," she conceded, "but it's getting better."
Ubiquitous effects
Khanna spoke on the broader impacts, citing a Bank of America report which identified tangible benefits across almost all industries over the next five years. "That could be a two per cent increase in productivity or a seven per cent improvement in costs. Now, that doesn't sound very exciting after all the hype, but for anyone who works with a leading Fortune 500 company this is what they care about, real business results."
She channelled Microsoft founder Bill Gates' long-held and rarely challenged observation about technology that, "We always overestimate the change that will occur in the next two years and underestimate the change that will occur in the next ten."
That said, as Walmart, Amazon and Google demonstrate, the changes in the last two years are pretty impressive, so the next decade should be wild.