I’m currently sitting in front of a TouchSmart PC from half a decade ago. Back then it was on the bleeding edge, pioneering desktop touch innovation. It was bold, daring, and for the most part completely useless. Despite this it still managed to become HP’s poster child, a product they could be proud of. A sizeable market quickly emerged for kitchen-worktop computing devices, which nicely coincided with Microsoft’s push for Windows Media Centre and digital TV watching.
Despite its many flaws – awful boot time, bulky design, poor screen resolution, terrible viewing angles, and resistive touch technology – I still love this thing. It was different, and different takes guts. For the most part, it worked. I could quickly watch a DVD before school (assuming it wasn’t from a cold start-up), catch up with the news on BBC Breakfast via digital TV, and even stream digital content straight from a Flash-enabled web browser.
A few days back the very same company that pushed for such invention announced this, a product that bears a remarkable resemblance to another Cupertino-based company’s work:
But was this descent into ubiquity inevitable? Did Apple, in true troll fashion, simply get to glass and aluminium first, leaving the rest late to the game? It’s hard to say. The materials lend themselves brilliantly to such devices, but Nokia have shown that there are real alternatives in industrial design. The near-identical touchpad and chiclet keyboard, though, seem a step too far.
Apple pushed the mantra of controlling both the hardware and the software, and others – Microsoft with Surface, Amazon with Kindle, and Google with Nexus – are now following suit. Over the next half decade I think we’ll see a continuation of this shift. Companies lacking in-house software excellence and a rich media ecosystem will find it increasingly hard to compete. Microsoft’s once-universal one-OS-for-every-PC strategy will shortly be over.
As a side note, Apple’s new spaceship-themed campus is being built atop the remnants of the former home of Hewlett-Packard, the very same company that gave Apple’s co-founder his first job in the industry. Despite recent failed efforts with webOS, another once-great consumer electronics company has slipped into the hands of the enterprise sector.
Apple’s TV will disrupt the television market in a massive way; it’ll be akin to how the iPhone immediately antiquated the phones that predated it. I won’t speculate on the ways their product could differ to cause such a revolution (interface paradigm, content delivery methods, etc.), but be sure that it will.
Apple’s motives in suing Samsung over patent and trade dress infringements aren’t to protect future iPhones or iPads, but to safeguard their TV offering against a repeat of such events.
Apple famously only move into a new market segment when they believe their input will be entirely beneficial and unique. Just months before his death, the company’s former founder and CEO, Steve Jobs, told his biographer that the Apple TV would “have the simplest user interface you could imagine” and that he’d “finally cracked it”. By now it isn’t a question of “if” they’ll release it, but of “when”.
We don’t do printers, monitors, Windows installs, or anything to do with IT for that matter. Mostly, we work as developers writing software. You probably live in a blissful, ignorant bubble where software, native or web-based, just works. You might imagine that building software is much the same as designing a PowerPoint slideshow, drag and drop and all that, but it isn’t.
All computers – your iPad, your iPhone, that ATM on the High Street, and even your dishwasher – at the lowest hardware level only interpret binary commands: 1s and 0s.
Writing such code, known as machine code, by hand would make our lives very difficult. Instead, we build layers of abstraction on top of this basic language, giving us more human-readable, friendly, and accessible development environments. Some languages are more abstracted than others, depending on their purpose and when they were designed.
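As a small illustration of those layers (a sketch, not anything from the original post): CPython first compiles the Python we write into a lower-level bytecode before executing it, and the standard-library `dis` module lets us peek one layer down. The exact instruction names printed vary between Python versions.

```python
import dis

# A trivial high-level operation...
def add(a, b):
    return a + b

# ...which CPython compiles to stack-machine bytecode: still several
# layers of abstraction above the CPU's actual 1s and 0s, but already
# far less friendly than the source we wrote.
dis.dis(add)
```

Running this prints a handful of bytecode instructions for a one-line function, which is a decent hint at how much the higher layers are doing for us.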
In a Computer Science degree, the primary focus is to ensure graduates think in a logical, efficient manner. We’re taught dozens of programming and scripting languages, but fashion and rapid innovation mean most are irrelevant by the time we graduate. This taught mentality of problem-based thinking therefore ensures we can adapt to new languages with little friction.
It’s an extensive field. Some spend their time writing the back end of web applications like Facebook or Google, some design AI algorithms for use in the aerospace industry, and others work for banks ensuring the safe transaction of billions of pounds. These things have little relation to what many people think of as IT.
If you want help with your printer, ask tech support.
As a side note, I’m not your typical Computer Science student or programmer. I don’t say this in a vain or arrogant way, but in a matter-of-fact way. I love developing software but am also extremely passionate about the design, the usability and the core concept of a product. I enjoy programming because it’s extremely empowering. I have ideas and I can build them.
Some people delude hope and trust and love and happiness. You run and leap and jump for some people, but it’s no use. They’ll never change.
They’re all the same, these people. Once you realise that, you’ll be free to bound over them. And bound I will.
Facebook have replied to my request; it’s looking like good news.