Ask Hackaday: How Much Compute Is Enough?

How much compute is enough? It’s a question as old as the business itself, and the answer, much like the horizon, seems to recede as we approach it. Every time hardware catches up to our ambitions, software finds new ways to outrun it, leaving us to wonder where the boundary of “truly necessary” actually lies.

Over the history of this business, a lot of people have foreseen limits that look rather silly in hindsight. In 1943, IBM president Thomas Watson is reported to have said, “I think there is a world market for maybe five computers.” That was more than a little wrong. Depending on how you define a computer—particularly if you include microcontrollers—there are probably trillions of the things by now. We might as well include microcontrollers, considering how often we see projects replicating retrocomputers on them.

The Shifting Paradigm of Compute

The RP2350 can do a Mac 128K, and the ESP32-P4 gets you into the Quadra era. Which, honestly, covers the majority of daily tasks most people use computers for. The ESP32-P4 alone has more than 640 kB of RAM on chip, so that famous (and probably apocryphal) Bill Gates quote didn’t age any better than Thomas Watson’s prediction. As Yogi Berra once said: predictions are hard, especially about the future. Still, there must be limits.

We ran an article recently pointing out that new iPhones can perform three orders of magnitude faster than a Cray-2 supercomputer from the 1980s. The Cray could barely manage 2 Gigaflops—that is, two billion floating-point operations per second; the iPhone can handle more than two Teraflops. Even if you object that comparing different benchmarks is apples and oranges, the figure probably isn’t off by more than an order of magnitude. Do we really need even 100x a Cray in our pockets, never mind 1000x? It fits in your pocket now, but somehow, we were expecting warmer colours.
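As a sanity check on that claim, here’s the back-of-the-envelope arithmetic in Python. The peak figures are the published ballpark numbers, not results from a common benchmark:

```python
from math import log10

# Rough peak figures: Cray-2 (1985) vs. a recent iPhone GPU.
cray2_flops = 1.9e9     # ~1.9 GFLOPS peak
iphone_flops = 2.0e12   # ~2 TFLOPS, order-of-magnitude estimate

ratio = iphone_flops / cray2_flops
print(f"{ratio:.0f}x faster, {log10(ratio):.1f} orders of magnitude")
# prints: 1053x faster, 3.0 orders of magnitude
```

Round numbers, sure, but even generous error bars on either figure leave the phone a thousand-ish times ahead.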

The Teraflop Barrier and Beyond

Moving forward in time, the Teraflop Barrier was first broken in 1997 by Intel’s ASCI Red, built for the US Department of Energy’s physics simulations. By 1999, upgrades had bumped it up to 3 Teraflops. I don’t know about you, but my phone doesn’t simulate nuclear detonations very often.

According to Steam’s latest hardware survey, NVIDIA’s RTX 5070 has become the single most common GPU, at around 9% of users. When it comes to 32-bit floating-point operations, that card is good for 30.87 Teraflops. That’s close to NEC’s Earth Simulator, the fastest supercomputer in the world from 2002 to 2004, for which NEC claimed 35.86 Teraflops. Is that enough? Is it ever enough?
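That 30.87 Teraflop figure isn’t magic, by the way: peak FP32 numbers like this fall straight out of core count times clock speed, with each fused multiply-add counted as two operations. A quick sketch, assuming the commonly published RTX 5070 specs (6144 CUDA cores, ~2512 MHz boost clock):

```python
# Peak FP32 = cores x clock x 2 (an FMA counts as two floating-point ops).
cuda_cores = 6144        # published RTX 5070 shader core count
boost_clock_hz = 2512e6  # published boost clock, ~2.51 GHz

peak_tflops = cuda_cores * boost_clock_hz * 2 / 1e12
print(f"{peak_tflops:.2f} TFLOPS")  # 30.87
```

It’s a theoretical ceiling, of course; real workloads rarely sustain anywhere near peak.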

The Software Engineer’s Dilemma

The fact is that software engineers will find a way to spend any amount of computing power you throw at them. The question is whether we’re really gaining much of anything. At some point, you have to wonder when enough is enough.

Take, for example, a 2011 MacBook Pro. I don’t stress it out very much these days; for me, personally, it’s more than enough compute. If it weren’t for YouTube, I could probably drop back a couple of generations to the PPC days, if not all the way to the ESP32-P4 mentioned above—quite possibly my next workstation. The 3D models for my projects really aren’t more complex than what I was rendering in the 90s. Routing circuit diagrams hasn’t gotten more complicated, either, even if KiCad uses a lot more resources.

The Bleeding Edge

How about you? The bleeding edge has always been driven by edge cases, and it’s left me behind. Are you surfing that edge? If so, what are you doing with it? Training LLMs? Simulating nuclear explosions? Playing AAA games? Inquiring minds want to know!

Furthermore, for those of you who are still at that bleeding edge that’s left me so far behind—how far do you think it will keep going? If you’re using teraflops today, will you be using petaflops tomorrow? Are you brave enough to make a prediction? I’ll start: 640 gigaflops ought to be enough for anybody.

Conclusion

In the end, how much compute is enough depends on what you’re computing. A daily driver and a render farm have very different answers, and as we keep pushing the boundaries of what’s possible, it’s worth occasionally asking what’s actually necessary.

One thing seems certain, though: software engineers and the market will keep finding uses for whatever the hardware folks deliver. So join the conversation and tell us how much compute is truly necessary for what you do. Together, maybe we can figure out when enough is enough.
