Discussion about this post

Colin Mcglynn:

I think that the 'Stochastic Parrot' take is a misunderstanding of how AI works. LLMs don't actually store text directly and then collage it back together. They store concepts and the relationships between concepts. LLMs are incredibly inefficient at learning language compared to a human, but the learning process and the skills produced are very similar. This includes LLMs' ability to work out of distribution on original problems; see Microsoft's "Sparks of AGI" paper for an example.
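
A rough way to picture "storing concepts and the relationships between them" is an embedding space: each word or idea gets a vector, and related things end up near each other. The toy Python sketch below is only an illustration with made-up four-dimensional vectors (real models learn thousands of dimensions from data, and nothing here comes from an actual model), but it shows the mechanism: similarity between concept vectors, not stored sentences.

```python
# Illustration only: hand-written toy vectors standing in for the learned
# embeddings inside an LLM. The model stores positions in a concept space
# and the relationships between them, not verbatim text.

import math

# Hypothetical 4-dimensional "concept" vectors (values are made up).
embeddings = {
    "dog":     [0.9, 0.1, 0.8, 0.1],
    "puppy":   [0.8, 0.2, 0.9, 0.1],
    "cat":     [0.9, 0.1, 0.2, 0.8],
    "invoice": [0.1, 0.9, 0.1, 0.1],
}

def cosine_similarity(a, b):
    """Similarity of two vectors: near 1.0 = closely related, near 0.0 = unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Related concepts sit close together; unrelated ones do not.
print(cosine_similarity(embeddings["dog"], embeddings["puppy"]))    # high (~0.99)
print(cosine_similarity(embeddings["dog"], embeddings["invoice"]))  # low  (~0.24)
```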

Colin Mcglynn:

"The “AI crash,” if such a thing occurs, will be akin to the dot-com crash of the early 2000s — a temporary setback from which the industry will recover. AI may have limited profit-making potential over the next few economic quarters, but its technical potential over the next few years remains high."

So much this. We love to mock pets.com while ordering dog food on chewy.com.

