While helping my dad shop for a new mid-grade laptop over the Christmas holiday, I was absolutely floored by the low-cost options on hand: roughly $400 got you a dual-core 2.2GHz processor, 4GB of RAM, and ~500GB of storage. If you wanted to go lower, you could nab one of those wildly popular <$250 Chromebooks, which skip a traditional desktop OS entirely.

Every model available, no matter how cheap, had more than enough power to handle basic consumer applications reliably and virtually lag-free. For the vast majority of users, computers really aren’t “slow” anymore. It’s an incredible change from just five years ago, and the pace keeps accelerating.

Keeping Moore’s Law alive and well, Intel will be debuting an 8-core processor in the 2014 cycle. HDD (non-flash) storage costs declined a further 20% in 2013, standing at 0.00000015% of what they did in 1970. A friend recently posted a bandwidth speed test in which he clocked a non-theoretical download speed of 630.85 MB/s, making all regular bandwidth use virtually instantaneous. He could download the world’s first (160GB) 4K video in roughly four minutes.
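That four-minute figure holds up to a quick back-of-the-envelope check – a minimal sketch, assuming decimal units (160GB = 160,000MB), as speed tests typically report:

```python
# Back-of-the-envelope check on the download claim above.
# Assumes decimal units: 160 GB = 160,000 MB.

FILE_SIZE_MB = 160_000      # the 160GB 4K video
SPEED_MB_PER_S = 630.85     # the measured download speed

seconds = FILE_SIZE_MB / SPEED_MB_PER_S
print(f"{seconds:.0f} seconds, or about {seconds / 60:.1f} minutes")
# -> 254 seconds, or about 4.2 minutes
```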

The world of computing tech seems, very quickly, to be outpacing its practical applications. I don’t believe (unlike a certain apocryphal patent office employee) that everything that can be invented has been invented, but I remain genuinely stumped as to what future inventions might look like.

Consoles

Least exciting console war ever.

So, right now, is everyone else. Take this year’s release of the PS4 and Xbox One, a face-off which couldn’t possibly have been more boring. The gaming industry has reached a point where photo-realistic graphics and complex physics modeling are now par for the course. While incremental improvements in face-pore fidelity might be interesting to some people, they don’t exactly fire up the gaming population en masse.

Basically, major game developers haven’t figured out how to harness their new hardware and innovate beyond the previous (now basically accomplished) goal of lifelike graphics. Indie studios and some zany hardware designers are trying, but they simply don’t have the same resources.

While I follow the gaming industry most closely, the same question seems to be stumping developers across every corner of computing – even those most famous for their forward-thinking agility. The possibilities may be endless, but they also seem pretty unfathomable.

I suspect this is why so much attention has shifted to horizontal innovations: miniaturization, wearable computing, and (in the incredible ecosystem of network-centric startups) an increasing push toward streamlining and consolidation. These are all great advances, sure, but I really want to see the piece of consumer software that requires all eight of those Intel cores to function – and dramatically improves my quality of life in the process.

As food for thought, take Google’s incredible suite of cloud-based applications. If bandwidth is no longer a limiting factor (and it won’t be at 630 MB/s, especially since .doc, .ppt, and .jpeg files have stayed so constant in size), what do you do next? Minor interface improvements and compatibility fixes…or wholly new file types to take advantage of all that untapped capability? Are there new audiovisual, resource-heavy, mass-collaborative applications you could invent that would address a legitimate need a simple text document can’t?
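To put rough numbers on that untapped capability, here’s a minimal sketch – the file sizes below are assumed typical values, not measurements – of how quickly today’s common formats move through a 630 MB/s pipe:

```python
# Rough illustration: at ~630 MB/s, common file types transfer in
# milliseconds. The sizes below are assumed typical values, not data.

SPEED_MB_PER_S = 630.85

typical_sizes_mb = {
    ".doc": 0.5,    # a text-heavy document
    ".ppt": 5.0,    # a slide deck with some images
    ".jpeg": 2.0,   # a photo
}

for ext, size_mb in typical_sizes_mb.items():
    ms = size_mb / SPEED_MB_PER_S * 1000
    print(f"{ext}: ~{ms:.1f} ms")
# -> .doc: ~0.8 ms, .ppt: ~7.9 ms, .jpeg: ~3.2 ms
```

Per-file transfer time is already effectively zero; only new, far heavier file types would actually put that kind of bandwidth to work.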

Most fundamentally: will we still exchange information and ideas the same way when the media by which we do it – data processing, storage, and online relay – no longer have practical technical limitations?

I have no clue, I really don’t. But I can’t wait to see the next generation of zany new ideas we come up with, and which ones stick.