Are We Chasing the Wrong AI Dragon?

Okay, so hear me out…

We’re all kind of obsessed with making AI bigger and more powerful, right? More data, more compute, more parameters. It feels like the default move. But what if we’re doing it all wrong?

Think about it. We’re pouring massive resources into simply scaling up current models. It’s like trying to build a faster car by bolting on a bigger fuel tank instead of rethinking the engine. And some of the people who pioneered this stuff, like Ashish Vaswani (you know, the lead author of ‘Attention Is All You Need’, the paper that gave us the Transformer), are starting to suggest that sheer scale may not be the magic bullet we thought it was.

This isn’t about saying AI isn’t impressive – it totally is. But the way we’re going about scaling it feels less like scientific discovery and more like a brute-force arms race. We’re hoping that by just throwing enough computational power at the problem, something truly intelligent will eventually emerge. But is that a reliable path?

What if the real breakthroughs aren’t in having more compute, but in having smarter compute? Think about how much we still don’t understand about how our own brains work. Human intelligence isn’t just about raw processing power; it’s about efficiency, creativity, understanding context, and a whole lot of things we can’t easily quantify.

Maybe we need to shift our focus. Instead of just building bigger LLMs, what if we invested more in fundamental research? Research into new architectures, novel learning methods, and a deeper understanding of the underlying principles of intelligence itself. It’s about the ‘how’ and ‘why’ of learning, not just the ‘how much’.

It feels like we’re at a crossroads. We can stay on the current path and hope that scale alone unlocks the next level of AI. Or we can take a step back, embrace a more science-driven approach, and perhaps find a more efficient, more profound way to build truly intelligent systems. That might mean a paradigm shift, moving away from just more compute and towards more understanding. And honestly? That sounds way more exciting to me.