Today Google announced and released applications based on PaLM 2, their new smol Language Model.
While the trend until a year ago was towards ever-larger models (something reflected in our calling them LLMs, Large Language Models), now that these models have become genuinely useful we have seen the focus flip from research to engineering: getting them to a stage where they can reach billions of people without using every GPU on Earth.
We have also seen that many of the parameters in these models, and much of the underlying data, aren't actually required; we have been feeding them junk food.
The accompanying technical report is short on details about the final architecture, datasets and so on, but it does show that similar models from 400m to 14b parameters achieve great performance, something we have seen recently with the open source explosion around LLaMA and other base models as folk realised you can get large models to teach smaller ones.
This comes a few days after an article that was sent to me approximately 127387129837 times.
To the surprise of many, I didn't actually agree with this piece on a number of points, even though I am one of the bigger proponents of open source AI.
While the open language model acceleration has been wonderful to see, the truth is that a lot of the work here has been done by a handful of folk, and mostly consists of fine-tunes and some adjustments to base models.
Some of the details in the piece were also a bit odd; does anyone at Google actually use LoRA?
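For context, the core idea of LoRA (Low-Rank Adaptation) is to freeze the pretrained weights and train only a small low-rank correction on top of them. A minimal numpy sketch of that idea follows; all the names, shapes and values are illustrative, not anyone's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

d_in, d_out, rank = 512, 512, 8          # illustrative sizes
W = rng.standard_normal((d_out, d_in))   # frozen pretrained weight

# LoRA: learn a low-rank update B @ A instead of touching W.
# B starts at zero, so the adapted model initially matches the base model.
A = rng.standard_normal((rank, d_in)) * 0.01
B = np.zeros((d_out, rank))
alpha = 16.0

def forward(x, A, B):
    """Base projection plus scaled low-rank correction."""
    return W @ x + (alpha / rank) * (B @ (A @ x))

x = rng.standard_normal(d_in)
assert np.allclose(forward(x, A, B), W @ x)  # B == 0: no change yet

# Only A and B are trained: far fewer parameters than W itself.
lora_params = A.size + B.size   # 2 * rank * d_in = 8192
full_params = W.size            # d_out * d_in = 262144
```

The appeal is the parameter count: here the adapter is about 3% of the size of the full weight matrix, which is why fine-tunes built this way are cheap to train and share.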
These techniques are nothing new: distillation and student-teacher approaches are well known, and the former was actually pioneered at Google.
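The classic distillation recipe trains the student to match the teacher's temperature-softened output distribution rather than just the hard labels. A minimal numpy sketch with made-up logits (the temperature and values here are illustrative assumptions):

```python
import numpy as np

def softmax(logits, T=1.0):
    z = logits / T
    z = z - z.max()               # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distill_loss(student_logits, teacher_logits, T=2.0):
    """KL divergence between softened teacher and student distributions."""
    p = softmax(teacher_logits, T)   # soft targets from the big model
    q = softmax(student_logits, T)
    return float(np.sum(p * (np.log(p) - np.log(q))))

teacher = np.array([4.0, 1.0, 0.5])   # illustrative logits
student = np.array([2.0, 1.5, 1.0])

assert distill_loss(teacher, teacher) < 1e-12   # perfect match: zero loss
assert distill_loss(student, teacher) > 0.0     # mismatch: positive loss
```

The soft targets carry far more signal per example than one-hot labels, which is a big part of why small students can recover so much of a large teacher's capability.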
There have been a number of internal issues and walls that have prevented information sharing and even process optimisation - why, for example, we saw Chinchilla outperform the original PaLM yet the original was still used.
Google started with innovation, but innovation is a really difficult thing to build a good company around - innovation is the catalyst, but the ingredients of a good company are turning that research and innovation into engineering and operations (sound familiar?).
I wrote more on that here:
While this article fits with much of our thesis I think it has a misunderstanding of what moats actually are.
It is v difficult to build a business with innovation as a moat, base requirement is too high.
Data, distribution, great product are moats. https://t.co/Qjz3Syfzxg
— Emad (@EMostaque) May 4, 2023
The narrative will likely turn as Google integrates PaLM 2 and then Gemini into its various products to drive real customer value - maximising function and flow and reducing annoying frustration.
Similarly, while the models, even before they start slimming, are already good enough, fast enough and cheap enough to make an impact, we will see this really take off into 2024 as we start to explore some of the limits of optimisation and engineering, something Google is good at (Amazon too, actually).
This aligns with my call in January not to write off Google, as they have all the right ingredients to be one of the couple of main players in this space, something that will become more and more apparent.
Stories about how ChatGPT will kill Google are a bit silly.
Google have the best full stack LLM team and infra with custom chips (PaLM, LaMDA, Chinchilla, MUM, TPUs etc).
Nobody can beat them on innovation, cost or go to market.
Institutional inertia is the only limiting factor.
— Emad (@EMostaque) January 4, 2023
The dynamic this sets up is interesting though - if the models are this good, this fast and getting better, where is the space for other proprietary players?
I think folk need to consider: what would be the impact if it turned out you could get GPT-4 level performance in 1 billion parameters or less, something that could run offline on a phone (maybe the cool Pixel Fold)?
Looking at how things are going, I don't think this is inconceivable for proprietary models, and we will continue to see massive innovation in this space at a pace few of us expect.
I think Google, OpenAI and a few others will lead on this, and it will be really hard to compete on features, meaning that focusing on the unchanging needs of a business around this will be critical.
We shall see, but for now Sundar came to dance.
edit: heh AI is awesome