View on the winners of the AI race - Episode 165, 45th minute
The TL;DR on this is simple:
Cloud providers are likely to dominate the AI ecosystem with their end-to-end tech stacks, which give them economies of scale (price per token) and performance (tokens per second). These happen to be familiar names still - OpenAI/Microsoft, Google, Meta.
They will be closely followed, or even surpassed, by chip providers (NVIDIA, Groq), as performance improvements are likely to hinge predominantly on the chips (of course algorithms will matter too, but groundbreaking algorithms are usually published into the public domain by universities and even the Tech Giants). Whether the chip providers surpass the cloud providers depends on how effectively the cloud providers build their own chips (which they are all working on or have already established).
The value and relevance of LLM responses will be influenced by the data they are given - so there will be a long tail of proprietary data providers who will get to keep a lot of the value.
The analysis follows:
There are three types of players involved:
The LLMs themselves
Tools built for training the models (Vector Databases, Graph Databases, etc.)
Applications built on top of the models (generative AI such as Copilot, and tasks such as summarisation)
On the LLMs themselves, there seem to be three pathways at the moment, and the panel discussed which one is likely to capture the value and create long-term dominance:
Closed Models - primarily Google and Microsoft right now (OpenAI is included in the MSFT camp)
Open Models - Meta in the lead, but thousands of models on Hugging Face
Tailored models or apps built on open or closed models - Mistral AI, Character AI, Replit AI and so on - specialised use cases
There are a number of arguments worth pondering:
Large language models themselves do not hold economic value - anyone with sufficient compute (the required investment is coming down drastically) and access to human-in-the-loop reinforcement learning systems could replicate them using data from all over the internet. This is not to say it is an easy task, just that it is not a source of sustained competitive advantage.
Adding open-source models to the mix will further accelerate this value destruction and compress margins for the players.
So where does that take us? We seem to be in the same place as early cloud adoption - cloud services started out being about products like EC2 and S3, but in the end it was about how reliable, fast, and cost-effective those services were. The same process is going to repeat in the AI space. This is becoming about the parameters of inference performance:
Token rate (which defines the experience for the apps sitting on top)
Attention span (context length)
Cost per token
and so on, to name a few - sketched roughly below.
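To make the economics concrete, here is a minimal back-of-the-envelope sketch in Python. The numbers are made up for illustration (not real hardware prices or provider throughput); it only shows how tokens per second and hardware cost combine into cost per token, the lever a vertically integrated provider can pull hardest.

```python
# Illustrative unit economics of inference: how throughput (tokens/second)
# and hardware cost translate into cost per token. Numbers are hypothetical.

def cost_per_million_tokens(gpu_hour_cost_usd: float, tokens_per_second: float) -> float:
    """Serving cost per million generated tokens for one fully utilised accelerator."""
    tokens_per_hour = tokens_per_second * 3600
    return gpu_hour_cost_usd / tokens_per_hour * 1_000_000

# Two hypothetical stacks on the same hardware budget, differing only in throughput.
commodity = cost_per_million_tokens(gpu_hour_cost_usd=2.50, tokens_per_second=60)
optimised = cost_per_million_tokens(gpu_hour_cost_usd=2.50, tokens_per_second=300)

print(f"Commodity stack: ${commodity:.2f} per 1M tokens")   # ~$11.57
print(f"Optimised stack: ${optimised:.2f} per 1M tokens")   # ~$2.31
```

Same hardware cost at 5x the throughput gives roughly 5x lower cost per token - which is the economies-of-scale argument in the TL;DR above.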
And those capabilities are going to boil down to chip capabilities and who controls the availability of chips. Clearly, at this moment in time, NVIDIA races ahead in the game, but there is close competition from the likes of AMD and Qualcomm. However, it is the existing cloud providers themselves who are going to attract a lot of that value by creating AI-specific chips. Google started its TPU chips back in 2016, when it was becoming increasingly clear that custom chips were needed for AI operations such as matrix multiplication. Microsoft announced its plans a few months ago. Amazon and Meta are following. The investment required is not something small-scale players can match. So it is likely that the model inference providers (the ones serving the responses to your prompts) will keep most of the value.
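As a rough illustration of why matrix multiplication is the operation these custom chips target: for decoder-only transformers, generating one token costs on the order of 2 FLOPs per model parameter, and nearly all of that is spent in matrix multiplies. The sketch below uses illustrative model sizes and ignores attention-over-context and memory-bandwidth effects.

```python
# Back-of-the-envelope compute per generated token for decoder-only transformers:
# roughly 2 FLOPs per parameter, dominated by matrix multiplication.
# Illustrative sizes only; ignores KV-cache growth and memory-bandwidth limits.

def flops_per_token(n_params: float) -> float:
    return 2 * n_params

for name, params in [("7B-parameter model", 7e9), ("70B-parameter model", 70e9)]:
    tflops = flops_per_token(params) / 1e12
    print(f"{name}: ~{tflops:.2f} TFLOPs per generated token")
# 7B-parameter model: ~0.01 TFLOPs per generated token
# 70B-parameter model: ~0.14 TFLOPs per generated token
```

At a few hundred tokens per second, a 70B-class model needs tens of TFLOPs of sustained matrix-multiply throughput per stream, which is why dedicated matrix units (TPUs, tensor cores) rather than general-purpose cores set the performance ceiling.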
In addition, these providers will add stickiness by providing collaborative capabilities and reach into their existing user bases (network effects), to the extent that organisations are willing to host their own data on these networks (security vs flexibility).
Providers like Mistral AI and Character AI can build on NVIDIA AI chips, but economically they will not be in a particularly advantageous position.
Apart from the model and hardware players, there is another significant group: proprietary content providers. These could be:
Media outlets - WSJ, Economist, NY Times
Social media platforms - Reddit, Quora, X, LinkedIn etc
Social media platforms owned by the Tech Giants themselves - YouTube, Facebook, Instagram and so on
It will come down to getting access to proprietary data to provide the most relevant and reliable results for industry applications such as news analysis, military intelligence, financial analysis, and even education.