Messages From The Future: There’s No Freaking iRing

Ok, look, I get it. Most of you don’t believe I’m from the future. It’s absurd, right? I know, believe me, I do. Fine. But you shouldn’t have to believe that to see how galactically stupid that iRing rumor is, and, conversely, how sensible and likely my detailed description of Apple’s iTV is.

Today I checked to see how many of the media outlets picked up my accurate divulgence of the actual future of Apple’s actual, honest-to-god iTV, and what stories do you all run?

No, not the guy from the future who actually owned numerous iterations of future Apple’s iTV and described it in detail. No, you run with some complete Bozo’s made-up goofball story about a ring and some crappy second screen, neither of which ever happens.

Really?!

No, there was no ring. White, the rumor’s imaginative source, and every media outlet that ran with it are just doing that hyperventilatey thing. A ring. I’m so sure. But it’s amusing how you can hear the mental wheels slowly turning on this one.

Imagine a pair of small eyes blinking with each weak spark of a synapse:

Blink.
“Apple watch?”
Blink.
“That would be so cool. I want an Apple watch.”
Blink.
“Gee, maybe Apple will make other fashion accessories too… With technology built in.”
Blink.
“Like… A necklace… Or…
Blink.
“… a ring!”
Blink.
“Yeah! I mean, I don’t know why they would make a rin… ”
Blink.
Gasps as a weensy lightning bolt strikes. Blink blink.

” …gestures!”

And this, people, was the entire scientific basis for yesterday’s completely bogus Apple TV rumor about a ring.

And this thing about a second screen that will – omg – let you take the show with you as you walk around the house!

Fine, don’t believe I’m from the future and actually saw how it all worked. But come on, since when has Apple ever created a crappy, hobbled device? This makes no sense no matter your historic reference.

Apple would never – ever – want a semi-functional pseudo iPad replacement in the world (even if it’s called a “remote”). One that does a little of what iPad can do, but that you can’t do most other iPad stuff on. That’s stupid. Apple’s actual iTV worked because it fit into their ecosystem. It encouraged sales of iPads and Minis and iPhone Pluses. Apple is not afraid to bite that bullet, the one where the amazing new device (iTV) works perfectly fine with a simple, screen-less remote, but so much better if you own an iDevice. Because, news flash, jillions of people own iPads, and they generally buy the first iTVs anyway. The only people who complained about needing an iDevice were Android users.

It’s so simple, this. And I mean, maybe being from the future gives me some added confidence to say “c’mon, surely you have to see this!” But c’mon, surely you have to see this!

So for crying out loud, read my last post. It was long, I know, and trust me, it could have been a lot longer; there were screens and states I couldn’t possibly describe coherently in the flow. But I assumed you would want to know with reasonable accuracy what an iTV really turns out to be.

And I know because I’m from the future.

Messages from the Future: How Apple Finally Cracked TV

I seriously debated revealing the future of TV. I worry that I’ll change things too much. I’ve already noticed that little things are different than I remember. Alicia Keys as global creative director of Blackberry?! Seriously, that was totally my fault, that so wasn’t supposed to happen. But as I say, it’s just been little things so far.

Oh what the hell. No one reads this blog, right?

So this all came up because I was trying to watch TV last night on a clunky old HDTV, and all I could find were old infomercials.

Which reminds me, we still have Snuggies in the future.

You know what those are, right? Those full-body fleecy blankets with sleeves – comfy as hell? Those. Have they taken off here yet? Well, they will. I recall Alicia Keys was photographed texting on her iPhone wearing one and – boom – everyone wanted a Snuggie. I mean I guess it makes sense, if you don’t have money, sex or power, I have to think a Snuggie and some cookie dough ice cream is maybe the next best thing.

So being back here again I am reminded of all the hoopla about 3D TV and 4K being the next big things.

Well, despite all its hype, 3D TV never – ever – advanced past being slightly dumb and underwhelming; the people who bought new TVs specifically for 3D, sort of the same. In fact, before Facebook croaked, Graph Search famously showed that 3D TV was purchased almost exclusively by men who drank Pabst Blue Ribbon. I don’t remember anyone sufficiently explaining that correlation, but Pabst Brewing was so excited about it that all their ads went 3D.

I guess the whole 3D glasses thing was just too cumbersome and annoying for most regular people who weren’t drunk on cheap beer. By double 20 there were even lots of native, glasses-free 3D systems, but then if you tilted your head the whole illusion broke, so a generation of overweight Pabst drinkers went and bought those airplane neck pillows which also had the effect of making them look alert after they’d passed out watching Snuggie ads. Of course Snuggie then started building the neck pillows right into the collars and adding can-holders to the sleeves… it was a vicious circle.

Anyway, 4K wasn’t “the next big thing” either. I get it though, hindsight being 20/20, it’s easy to look back at the pre-paradigm-shifted TV industry and see why a generation of manufacturers might have focused almost exclusively on image resolution. It’s mainly what their business had been competing on forever. Form factor and resolution. That was the extent of TV manufacturing innovation. But so myopic was that focus that they completely missed the area of improvement most needed. And ultimately most desired by consumers. Even if consumers hadn’t yet realized it themselves, having yet to see it.

For those of us with taste buds, there were 3 general, somewhat overlapping differences between TV in the future and TV that you watch Snuggie ads on today: the OS, the remote control, and the way you used these to access, and interact with, the content.  You’ll note that resolution and form factor were not what changed the TV industry.

Incidentally, on this topic of remote controls – in your time there are lots of people proposing that gesturing in mid air is the future of navigating computers and TV. Nyea – no it’s so totally not. That never caught on. I do recall by about 2014-16 there were a couple of peripherals and TVs that were meant to enable this – Samsung and Sony especially were trying to unseat Apple and went down this path briefly. But in the end I guess we were lazy. I don’t say that to be insulting – it’s just frankly easier to wiggle one finger on a small handheld remote than it is to lift your arm in the air and wave it around in an articulate manner. People always want to argue this – “but Joel, waving your arms around and making things happen is magic”. No, no it’s not, trust me, it was tiring and annoying. And anyway I’m just telling you what actually happened. I even remember reading about some study where someone explained how gesturing in the air, even at the elbow (as opposed to the shoulder), was actually quite a lot harder, by a significant factor, than tapping a handheld device, since for one, it requires your gesturing limb to be generally unsupported and, since momentum builds through a gesture, it must be equally countered by your muscles to slow that momentum and stop. Combined, these factors had the effect of making gestures in air much more work than holding a small device on your lap. So that whole idea died. You know, just FYI. Anyway…

The OS
I guess I could have cut to the chase and said “iOS”, but that’s a cheap shot, isn’t it.

That said, surely you’re not surprised (even if you’re annoyed as I was) that in the future Apple finally launched a home media solution that was so vastly superior to anything developed during the 70-year-old paradigm of traditional TV that it changed everyone’s perception overnight, and that the usual suspects immediately tried copying it?

I know, total shocker, right? Who would have thought.

Sure enough, Apple had indeed “cracked it”, as Steve Jobs had famously asserted before passing. That quote was a parting gift to his Apple survivors, one last reality distortion field from the great beyond.

Even with its initial slow start (due to limited network launch partners, mostly), anyone who wasn’t overtly biased against Apple could clearly see how much better it was. There was a lot of joyful applause that day at the Apple event as the vision for TV’s future was presented. Mostly because you sort of felt this strange weight lifted. A weight you didn’t realize was even there until it was gone; the inveterate accumulation of every confoundedly obtuse tool you’d tolerated throughout your life, quashed. There was a collective release in the auditorium, everyone had the same reaction – “About freaking time!”

And then there was the other corner. The cynical, sad little corner of beady-eyed squinters poking at their Androids. I think the first reaction from the Apple-can’t-possibly-keep-doing-this-can-they(?!?!) contingent was blustery disbelief and a scrambling to discredit the reveal to save face, most of them having dug a rather big pit prior to launch. The naysayers (friends of the old guard) had brayed relentlessly leading up to the day – “the cable providers are too powerful”, “the networks will never agree”, “the advertisers will abandon it”, “TV manufacturers are already there”, “Google is on their heels” – such that regular people even started to believe… that maybe a real Apple TV would be a dud. But they were all left a pretty toothless bunch, since ultimately it was profoundly great. And in the end it didn’t matter, because TV had finally changed.

There have been few reveals in the history of Apple that were as satisfying. The iPhone was one – that moment when Steve repeated “an iPod, a phone, and an internet communicator!… are you getting it?”, and you saw the magic of multitouch on a consumer device for the first time. The classic bit where he pulled the first Nano out of his Levi’s fifth pocket. And the first MacBook Air – in an envelope.

And finally seeing this – well, it was the way you always wished TV would work. In a flash you realized you had just been beaten down over a lifetime to believe that the obviousness of your own intuition shouldn’t be possible. Like how we passively accepted TV commercial interruptions in the middle of a show, those FBI warnings that inexplicably disabled the skip button, or the criminally fabricated cost of texting on a carrier network.

Really, in some ways “it” wasn’t even some surprising or ingenious leap. I mean in hindsight I’m not sure what anyone thought a bigger idea might be when they’d hyperventilated their wacky Apple TV rumors. The solution was so sensible and obvious it left me wondering what the hell the other manufacturers were doing all those years before Apple came along.

In fact what I realized about Apple that day was that the reason they always appeared to shake up new industries, and TV was no exception, was because they offered consumers a perfect new solution – without regard for the legacy businesses and their entrenched assumptions. I think Apple looked at these legacy businesses as unwieldy things, grid-locked by their own interdependencies and politics. When these companies tried to innovate it was always within the existing architecture, so as not to unduly upset delicate relationships, the status quo. Essentially, no one at the table actually had the power to change the game… except the consumer.

But where consumers had the power, they lacked organization, they lacked the tools, they lacked a meaningful choice.  And that was Apple’s genius.  Apple just empowered them.  Routinely gave consumers a disruptive option.  “Here you go, now, let’s show this industry what you really want.”  And this was true when Apple “cracked” TV. That said, you may be surprised to learn…

…that Apple’s invention wasn’t a TV.

Nope. They made a point about that, in fact. I mean everyone “knew” they were launching one that day. But in a classic, only-at-Apple moment of legacy industry shake-up, Tim Cook invited Phil Schiller on stage, who built up to the moment where he dropped the bomb: “…and today, Apple is ushering in the ‘post-television era’”. The room erupted in bubbling laughter and applause. The point he was making was that the old business, and old way of interacting with televised media – cable companies who arbitrarily controlled limited, locked bundles of networks, content interrupted by ads, the weeding through countless channels you didn’t want, overly complicated dumb remotes, and so much more – all of that was behind us.

And although I remember a fair number of pundits debating that claim, the new devices indeed weren’t technically televisions. Sure they announced some tear-jerkingly beautiful displays – but none had television tuners. The new devices were just computers.

Phil unveiled 2 product form factors that day: the first was called iTV Mini, an AppleTV box more or less as you know it today, but totally redesigned. It was super slim – like a flattish, squarish aluminum plate or a blade, the thickest portion housing the ports. Eye level with it, you barely noticed it, but off-angle it was striking. Gorgeous. (By the third version they’d cut out the legacy ports, making it even thinner.)

And the other product, the main event, was the new iTV, a screen in 3 sizes, with the box guts built in. The screens were gorgeous in person and had a giant, round-cornered iPad feel about them. Blade-like edges. Very flat. Aluminum back. Even had the requisite front-facing camera.

In all honesty – and I’m remembering out loud here, in some way – even though these were indeed lovely screens, I remember my inner fanboy wanting more. The screens were refined, and the fit and finish was better than any other display on the market, and I’m not sure what I’d hoped for exactly (maybe a smaller bezel?), but this looked sort of predictable. I mean, it looked like an Apple product for sure, all minimalist and clean, but I guess I’d hoped for some profoundly unexpected design-gasm. Well, as I say, I still bought one and came to respect and appreciate its design after all. Naturally I bought the biggest one – it was prohibitively expensive. (Incidentally, buying this device was what inspired me to make money selling 3D models on plasticly.com, which I actually don’t think you have yet. Plasticly is the biggest 3D printing retail community on the net. That probably needs to be a whole post some day.)

And anyway, years later, that fanboy wish was answered. Around 2018-19 they released the first display ever to utterly lack any bezel at all, where the picture came literally right up to the edge of the display. When it was “on” it was like the lit image was almost floating or projected in thin air. Totally minimal, all picture. No one had a screen like that, lit so close to the edge. It was OLED at some insane resolution (they didn’t call it 8K then but I recall it was in that neighborhood). Stunning. I remember they made one of those Apple videos where Jony Ive and crew spent a fair bit of it talking about how the technology and engineering required to achieve that edge was, like, “totally amazing”. Some new glass bonding process. And of course they’d bought out most of the global capacity.

Even so, as I’ve explained, at first launch the industrial design and resolution weren’t even the story – those weren’t the big idea. The beauty was in the way it all worked. That’s when you realized things were changing. That’s when consumers got excited.

iRemote
Today with your completely stupid, paleolithic remote control (now, grudgingly, “ours” – thank you very much, time travel), you type a cryptic number, 042 for Adult Swim, say, or flip through a ludicrously long linear sequence of channels, one of which is the “program guide” channel where you find these ugly little blurbs that were graphically designed by some mediocre “Web 2.0” Photoshop reject, describing what’s on. Or you hit “Menu” and the show gets squished in half as you try to tab through the same cryptic mess that way. Sorry cable operators and set top manufacturers, but every bit of that is completely, utterly lame.

iRemote was an iOS app. Naturally, it was included by default in iOS 8 and ran on any iPad / Mini, iPhone / Plus, or iPod Touch. To the chagrin of some, iTV didn’t ship with any of those devices, however; it shipped with a small, typical Apple remote control that allowed users to navigate the on-screen version of the iRemote app (if much less effectively). Even so, anecdotally, no one really used the on-screen version or that little remote. At first critics blasted Apple, complaining that it meant customers had to buy an iDevice in addition to iTV. But then the only people who shared that complaint were Android users. And some time later, in a complete surprise and controversial announcement, Apple released an Android version of iRemote, which as I recall turned out not to be such a crazy idea in the end, since it opened iTV to a whole new segment and minimized the impact of Samsung and the other iTV competitors who were trying to compete on that front.

Anyway, with iRemote running on one of your iDevices, you didn’t have to change to a different channel or obscure the on-screen content to see your program guide since all that information was displayed on your iPad’s screen. And you didn’t have to power through the endless list of channels to find the few you like. Rather you assembled your personal, custom collection of favorite networks and shows on a single screen, laid out like large app icons (really they were more like thumbnails, but it’s easier to talk about if you think of them as app icons).

Collecting your favorite shows and networks was sort of like browsing the iTunes Store, only you could move things around. You could select discrete shows or whole networks to appear in “My Guide”, sort of like putting apps into folders, and once it was loaded with your stuff, you generally used the “My Guide” screen(s) the most for choosing shows to watch.

For me that was an “aha” moment.

Apple’s “My Guide” personalized program guide was to changing the channel what visual voicemail was to an answering machine. You kind of went “Oh my God. No duh.” Think about it – only the networks and shows you like, with none of the annoying flip-past channels your cable provider hoisted on you through their coarse bundling.

On the My Guide screen you had 3 main, semi-segregated sections. The first, on top, was what’s “On Now” – a lineup of the show icons currently on, either because you chose the show or, more often, the network.

Below that, you had your full list of favorite network icons – aligned under their corresponding On Now shows. So when you swiped through the show icons the network icons moved too (and vice versa).

Finally, at the bottom (scrolling down vertically), you had your full list of favorite show icons (whether they were on now or not).

I should point out that to put a show on the big screen you could do it a couple of ways, but in either case you had to tap twice to switch shows. You tapped either a show icon or a network icon, and then you tapped an area at the top of that to put it on the big screen. I remember whiny critics of the system complaining about this, saying you should be able to tap once – but in use it really wasn’t that big a deal – these people forgot how many buttons they’d had to tap before. I rather liked it because you came to rely on the idea that you could always see what a show was – see more about it – before you committed.
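
For those of you who think in code, here’s a minimal sketch, from memory, of the My Guide layout and that two-tap flow – in Swift, a language you don’t have yet (trust me, you’ll like it). Every type and method name here is my own invention for illustration, not Apple’s actual API:

```swift
import Foundation

// Hypothetical model of the My Guide screen as described above: an "On Now"
// row, favorite networks aligned beneath it, and a scrolling grid of favorite
// shows. All names are my reconstruction, not anything Apple shipped.
struct Show {
    let title: String
    let network: String
    var isOnNow: Bool
}

struct MyGuide {
    var onNow: [Show]        // top row: shows airing right now
    var networks: [String]   // middle row: favorite networks, aligned with onNow
    var favorites: [Show]    // bottom grid: all favorite shows, on now or not
}

// The two-tap interaction: the first tap only selects, so you can see what a
// show is before committing; the second tap sends it to the big screen.
final class Remote {
    private var selected: Show?

    func tap(_ show: Show) {
        selected = show
        print("Previewing \(show.title) on the iPad – big screen untouched")
    }

    func tapWatch() {
        guard let show = selected else { return }
        print("Putting \(show.title) on the big screen")
    }
}

let madMen = Show(title: "Mad Men", network: "AMC", isOnNow: true)
let remote = Remote()
remote.tap(madMen)   // first tap: inspect
remote.tapWatch()    // second tap: commit
```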

Tapping network icons worked a bit like folders. It opened the network and revealed show art for what was playing now on that network at the top (something is always playing now at the network level) and below that you had access to all that network’s other shows (similar to network screens in iTunes as I recall). Embedded in the On Now show artwork at top was a “Watch” button, and if you tapped it, boom – the show appeared on the big screen.

There was another sweet little feature – you could join the live show in progress, or you could start it from the beginning instead. Without paying more. Logical.

That was the network icon. Alternatively, from My Guide, you could have tapped a show icon.

When you tapped one of the show icons (no matter if it was On Now or not) it opened the show’s screen. Every show had one, and they worked rather like apps (or “App States”, since it was all in the iRemote app). Among other things, it was how producers and networks could offer show-specific info, other episodes and interactive experiences.

Each show’s app state had at least 2 tabs. You’d land on a different tab depending on whether the show was “On Now” or not when you tapped it.

If it wasn’t On Now, you arrived at “Episodes & Seasons”.
There was custom art at the top of this screen, with the next upcoming showtime and the upcoming episode description embedded in it. From there you could tell your iTV to jump to that show once it started, or you could set an alert (which hit all your iDevices). Below that you could select other episodes and seasons from the show archive to watch on demand. You could also watch trailers and read reviews and all the stuff you do today in the iTunes Store. But in this case it happened on the tablet – and didn’t interrupt the big screen – unless you wanted it to.

But if the show was actually running live, you arrived at another tab called “Showplay” (a name I hated at first, but got used to).
As on the Episodes & Seasons tab, at the top you had the main show artwork, but this time with the “Watch” button embedded in it. Naturally you tapped that and boom, it’s on the big screen.
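
If it helps, the landing-tab rule (plus that join-live-or-start-over choice from a few paragraphs back) boils down to something like this little sketch – again, my own hypothetical names, recalled loosely:

```swift
// Tapping a show icon lands you on "Showplay" if the show is live right now,
// otherwise on "Episodes & Seasons".
enum ShowTab {
    case episodesAndSeasons   // upcoming showtime, archive, trailers, reviews
    case showplay             // Watch button plus the interactive side-channel
}

// The sweet little playback choice: join the live broadcast in progress, or
// restart the episode from the beginning – without paying more.
enum PlaybackStart {
    case joinLive
    case fromBeginning
}

func landingTab(isOnNow: Bool) -> ShowTab {
    isOnNow ? .showplay : .episodesAndSeasons
}

print(landingTab(isOnNow: true))    // showplay
print(landingTab(isOnNow: false))   // episodesAndSeasons
```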

Below that was another game-changer – you had access to all the interactive content that the show’s producers had developed. This is what Showplay was about. In the early days most of that was supplementary, sort of like extras on a DVD, but eventually that was where the action was.

Most big show producers eventually took advantage of Showplay, creating games, chat, voting, and other interactive experiences. After an initial period of intense resistance, even advertisers finally figured out how to employ these side-channel experiences, and started participating in new types of shows that actually hinged on this live, articulate interaction.

A New Kind of Show
It was the development of this new kind of content, interactive shows, enabled by Apple’s interactive ecosystem, that ultimately cemented Apple’s dominance in the living room for so many people. Like Angry Birds on the iPhone, these shows came to iconify the new model.

The originator of the most popular of these shows was General Amusements (some production company that got into this early). I have to admit to the guilty pleasure – their shows were insanely entertaining, and captured the system’s potential. Two shows in particular seemed the most popular: one was called “We The People”, co-hosted by Jon Stewart and Bill O’Reilly, and another was called “Mob Pile” with Daniel Tosh (there were a bunch of others, but these two were the big ones). Both shows were based on live voting and live feedback.

And it was a marketer’s dream come true. Basically the host brought out ideas, people or things, and viewers just voted whether they agreed or disagreed, liked it or didn’t. Or answered simple multiple choice questions about things. This was all done through Showplay. I’ve read that this had been tried before through inferior systems for years, but never so successfully. Sound boring? It so wasn’t.

The shows were paced really well. Some were even frenetic, with new things to vote on every 15 seconds or so. Others were more story driven and made you want to help, or express your anger, or whatever. But the bread and butter of these were about products. For example in one of the daytime shows, “I Want That!” with Howie Mandel (which was all product-based), a retailer – J. Crew, say – would show options for their upcoming season. Beautiful models would walk out wearing the outfits while some designer was interviewed, and then Howie screamed “J.Crew’s new Fall line – you’ll only see one this season, and it’s your call – tell us what you want!” and all these numbers start rolling in and a few seconds later – there’s a thumbs up or thumbs down – that fast. And it wasn’t just binary – because when you played, your expanded profile included info you opted to share, like age, gender, style, all sorts of preferences.

And the shows could break the data up in different ways live – they’d show maybe what men vs. women voted, or parents vs. kids, or various cities or states. People ended up aligning with all sorts of causes and groups, and it told marketers a lot about the viewers’ psychographics. These shows had crews who searched for interesting stories in the data. Depending on the thing being voted on, the results were often funny or frustrating or heart-warming. And the live, in-studio crowd was always going “ooooh!” whenever tension was created by a vote. They even pitted the studio audience against the rest of the country to see how “normal” the audience was. And I’m telling you, they voted everything. Product packaging options, color schemes for cars, flavors for packaged foods, music, casting actors for shows, movie reviews (the We The People movie ratings were the most reliable gauge of a good or bad movie out there), charities for funding, social topics. It went fast. You wanted to vote because you had a direct hand in the creation of these products and things that you would then be able to watch or buy or whatever. Advertisers paid a lot to buy vote segments for their products on the shows, which served to help remove interruptive advertising.
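
Mechanically, the whole thing was simpler than it sounds: a vote plus whatever profile facets the viewer opted to share, tallied live by segment. Here’s a rough reconstruction of how that could work – entirely my guess, none of this is a real API:

```swift
import Foundation

// A Showplay-style vote: a binary opinion plus opted-in profile data,
// so the show can cut the results live by segment (hypothetical model).
struct Vote {
    let approve: Bool
    let facets: [String: String]   // e.g. "gender": "F", "state": "NY"
}

// Tally yes/no counts for each value of one facet, e.g. men vs. women.
func breakdown(_ votes: [Vote], by facet: String) -> [String: (yes: Int, no: Int)] {
    var result: [String: (yes: Int, no: Int)] = [:]
    for vote in votes {
        guard let segment = vote.facets[facet] else { continue }   // facet not shared
        var tally = result[segment] ?? (0, 0)
        if vote.approve { tally.yes += 1 } else { tally.no += 1 }
        result[segment] = tally
    }
    return result
}

let votes = [
    Vote(approve: true,  facets: ["gender": "M", "state": "MA"]),
    Vote(approve: false, facets: ["gender": "F", "state": "MA"]),
    Vote(approve: true,  facets: ["gender": "F", "state": "NY"]),
]
print(breakdown(votes, by: "gender"))   // e.g. ["M": (yes: 1, no: 0), "F": (yes: 1, no: 1)]
print(breakdown(votes, by: "state"))    // e.g. ["MA": (yes: 1, no: 1), "NY": (yes: 1, no: 0)]
```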

There were also longer form segments where they would bring out random Bozos, neo-nazis and other goofballs, who would come on looking for affirmation and more often than not get totally humiliated. Couples who were fighting would each tell their side of the story and the mob would “pile” on the one they thought was wrong. The winner always came away pretty affirmed.

We The People achieved notoriety and credibility when the 2020 presidential candidates held their debate on the show and it accurately predicted the vote virtually to the state. “GMO food! Let’s see the numbers! Red States vs. Blue! America, VOTE!”

This was a whole new class of entertainment and social interaction. And most of them worked even if you weren’t watching live – you could still add your vote later (though the host couldn’t respond). The shows spun off variants related to regions and topics. And most traditional shows started doing it too. Ellen, the Jimmys (Fallon & Kimmel) and the other talk shows were easy. Part gameshow, part social experiment and part focus group. In some way you just wanted to know how you measured up with the rest of the world.

Anyway – this was one of the ways marketers made the new “ad-less” model work for them.

Shopping
Then in the third major update of iRemote they added shopping. Everyone talked about “buying the sweater on Friends” in the ’90s, but even in this time, now, no one has a reasonable way to execute on that. Well, Apple did. So if you selected the new shopping tab in the show or network apps, there were little windows in your iRemote “side-channel” that would display products as they appeared casually in the show. Not an ad – just a thumbnail. And like a notification it would then disappear. Although you could buy stuff right then, you were usually engrossed in your show, so all you had to do was tap the item, which would drop it into a viewer that you could inspect later. From the product view, you could usually replay the scene you saw the product in. This too threw off tons of data for marketers – even when selling their products this way didn’t always make much sense.
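
In other words, products were cued to moments in the show, surfaced briefly, and a tap just stashed them for later – scene timestamp included. Something like this hypothetical sketch (my names, my guesses):

```swift
import Foundation

// A product cued to a timestamp in the show (hypothetical model).
struct ProductCue {
    let name: String
    let sceneTime: TimeInterval   // where in the show the product appeared
}

final class ShoppingSideChannel {
    private(set) var savedForLater: [ProductCue] = []

    // Called as playback passes a cue: show the thumbnail, then let it fade.
    func surface(_ cue: ProductCue) {
        print("Thumbnail: \(cue.name) – not an ad, it just appears, then disappears")
    }

    // A tap while engrossed in the show: stash it, don't interrupt anything.
    func save(_ cue: ProductCue) {
        savedForLater.append(cue)
    }

    // Later, from the product view, jump back to the scene it appeared in.
    func replayScene(for cue: ProductCue) {
        print("Replaying scene at \(Int(cue.sceneTime))s for \(cue.name)")
    }
}

let sweater = ProductCue(name: "That sweater on Friends", sceneTime: 1042)
let shopping = ShoppingSideChannel()
shopping.surface(sweater)
shopping.save(sweater)
shopping.replayScene(for: sweater)
```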

Siri
I saw a “prediction” in this time that Apple would use Siri in iTV – it also predicted the use of hand gestures in the air. Well I already told you that the gestures thing tanked, but Siri was indeed part of iTV.

That said, I really hated Siri at first. It worked for some people, but I just always felt like a dork calling out the actions. Sometimes I used it when I was alone, but it was way better to get previews of the content and info on iRemote. Because with Siri you didn’t get all the advance info and interaction the second screen afforded you without interrupting your show.

Even so I have to confess, by 2021-ish Siri actually started seriously kicking butt. I had this one day where I realized it finally understood my personal idiot language – like all of it.   It was when I said, and I’m paraphrasing here, “Siri, what was that sci fi movie… with the… you know.. that kid… agh… I don’t remember his name, a bit player from the uh …  he was in the last season, maybe(?) of Mad Men where they did that store… with the big moose… thing? …”

She knew I’d meant Star Wars VIII and the child actor who played Han and Leia’s son. And also informed me that the “moose thing” I’d referred to was Macy’s Bullwinkle parade balloon.  That’s when I started using Siri, like all the time.

Games
Jesus, I haven’t even gotten into games. Ok, real fast. Yes, you could connect your peripheral game boxes. But in short order we generally stopped doing that. Games and developer apps for iTV were sold in the App Store, but they had to follow a format distinct from the other iDevices, which everyone screamed about at first. Even so, that format afforded added benefits. You could push or pull the content to and from the big screen. That last point probably sounds humdrum, but it was really cool, especially when it came to games. I guess it was a little like the AirPlay thing today, but by then the developers were really putting the effort into making games designed for the display and interaction layer split. The networking was very fast and easy. If your iPad detected a big screen, it alerted you and you could throw the presentation layer (image/audio) up to iTV while the controls stayed on your iPad. If you walked out of the room, with a tap you could pop it right back. It was just effortless.
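
That display/interaction split is the whole trick, and it’s easy to picture in code. A minimal sketch, assuming a made-up GameSession type (loosely analogous to your AirPlay, nothing official):

```swift
// The presentation layer (image/audio) can live on the iPad or the big
// screen; the touch controls always stay on the iPad.
enum PresentationTarget {
    case iPad
    case bigScreen
}

final class GameSession {
    private(set) var presentation: PresentationTarget = .iPad

    // The iPad detects a nearby iTV and offers the hand-off.
    func bigScreenDetected() {
        print("iTV found – tap to throw the picture up")
    }

    func push() {
        presentation = .bigScreen
        print("Presentation on iTV; controls stay on the iPad")
    }

    func pull() {
        presentation = .iPad
        print("Presentation back on the iPad – take it with you")
    }
}

let session = GameSession()
session.bigScreenDetected()
session.push()   // picture and audio go to iTV, touch controls remain local
session.pull()   // walking out of the room? pop it right back with a tap
```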

Wrap Up
With Apple’s iTV, users basically had the best remote control, customized program guide, show interaction platform, and side channel on the planet – all in one device. We had all the content of the iTunes Store, all of which was available on all your devices (iCloud, naturally) – plus a whole new category of social interactive experiences. And it all worked effortlessly and the devices were well-made (except for a brief but very annoying “lower left green corner problem” which became a huge media dog pile, but Apple replaced those iTVs for free).

The Networks were better able to promote their shows and aggregate audiences, and create an interactive domain for their properties that extended the consumer relationship beyond merely watching linear content once.

And marketers had new, unimaginably powerful tools to engage consumers meaningfully, and collect insane amounts of useful data for the first time in their history.

Do you see what I meant when I said it was so much better than legacy TV for almost everyone at the table?

The casualties of Apple’s living room dominance were generally worthy of their fate.

The cable companies indeed became dumb pipes. But they always were, really. It’s relative. It never made much sense that the guys who lay wire should so strongly influence what I watch and how I watch it. The more they fought the Apple iTV migration, and the more they pressured the networks to stay away, the weaker their entertainment services seemed, and the less anyone believed they would ever catch up. They were fighting for stagnation, for legacy, for a self-centered corporate benefit that didn’t make the customers’ lives any better in the face of such rational improvements elsewhere.

Playstation, XBox, Nintendo… at that point their atrophying from the living room wasn’t a shock to anyone.

Yes – Samsung and Sony fought hard. And they did a pretty good job replicating similar models. But the action was on iTV. Lots of people went for the alternatives. There was a fairly strong Apple-fatigue manifesting by then, mostly among kids, but it seemed that fatigue came and went every couple years or so and never really undermined Apple’s foothold.

Advertisers screamed bloody murder at first – fought right alongside the cable companies for different reasons, pressuring the networks not to do deals with Apple. All they saw was a world where consumers would pay to disconnect from ads. Their viewers no longer held in optionless captivity – the horror. But as usual, the advertisers just hadn’t spent enough time thinking about the new models to see the historic options they might have. Thankfully a few renegade advertisers did embrace the change and were some of the earliest to exploit the GA-style shows in service to their clients. But no, the ad business was admittedly no longer about the development of “creative messages”, those interruptive :30 spots. They became more like business or entertainment consultants, really. And although there was some blood-letting in the ad industry at first, they started hiring the right people and then profited hugely.

The Networks that joined early prospered. HBO, ABC, Disney, and the other launch networks became more popular than ever thanks to their early involvement, good will and experimentation. There was a lot of network squirming and sniping at the very beginning; it took forever for CBS to come aboard – though not many cared, since “Two and a Half Men” had virtually run its course and it had become commonly known that the integrity of CBS hinged on its corporate relationships and not on some moral commitment to honesty, truth or viewers’ value. But even they eventually realized how much more they could do in the new paradigm. Years later, as it seemed things had finally settled into balance, virtually every popular network was represented in the iTV ecosystem.

Indeed, Apple had cracked it. Not by playing the game. Not through improvements in resolution. But by completely changing the way we approached in-home media, commerce and socializing. It took a while, and there were bumps, but they did it.

And as I say, I know, because I’m from the future.

Messages From the Future: The Fate of Google Glass

Man, time travel sucks. I mean think about it, you know all this stuff – and I mean you really know this stuff – but of course you can’t say, “You’re wrong. And I know, because I’m from the future.”

So you pretend like it’s just your opinion and then sit there grinding your teeth while everyone else bloviates their opinions without actually knowing anything. Of course my old friends hate me. I mean I was always a know-it-all, but I really do know it all this time, which must make me seem even worse.

Anyway I was catching up on current events and was surprised to realize that I had arrived here smack dab before Google started selling Glass.

Truth is, I’d actually forgotten about Google Glass until I read that they are about to launch it again. Which itself should tell you something about its impact on the future.

So here’s the deal on Google Glass. At least as far as I know – what with my being from the future and all.

It flopped.

Nobody bought it.

Oh sure, they sold SOME. Ultimately Google Glass got used mostly by very specialized workers who typically operated in solitude and didn’t have to interact with other humans. Of the general public, there were a few geeks, opportunistic future-seekers and Silicon Valley wannabes who bought them to keep up with developments, or hoping to look as “cool” as Sergey did when he was famously photographed sitting on the subway (some PR guy later admitted that the whole “I’m just a normal guy slumming on the subway looking like some hipster cyborg” thing was just an orchestrated Glass marketing ploy arranged by Google’s PR firm). But they didn’t. That’s because none of those geeks were young, mincingly-manicured-to-appear-casually-hip billionaires. No. They just looked overtly dorky and, as I recall, slightly desperate for the smug rub-off that comes with publicly flashing a “cool” new product. But that didn’t happen for them. Quite the opposite.

Glass just smacked of the old I’m-an-important-technical-guy-armor syndrome. The ’90s cellphone belt holster. The ’00s blinky blue Bluetooth headset that guys left in their ears, blinking away, even while not in use. And then Google Glass.

You know, sometimes you see a new innovation and it so upsets the world’s expectations, it’s such a brilliant non sequitur, that you can’t imagine the events that must have led to such an invention. You wonder what the story was. The iPhone was one of those.

But Google Glass was so mis-timed and straightforward that the exact conversations that led to it seemed transparent. In hindsight, they were just trying too hard, too early, to force something that they hoped would be a big idea – and eventually would be, if only a little over a decade later, by someone else.

Here’s the scene:

Sergey and his hand-picked team sit in a super secret, man cave romper room on the Googleplex campus. Then Sergey, doing his best to pick up the magician’s torch as an imagined version of Steve Jobs, says:

“As we have long discussed, the day will come when no one will hold a device in their hand. The whole handheld paradigm will seem old and archaic. And I want Google to be the company that makes it happen – now. We need to change everything. I want to blow past every consumer device out there with the first persistent augmented reality solution. The iPhone will be a distant memory. Money is no object, how do we do it?”

And then within 10 minutes of brainstorming (if even), of which 8 mostly involved a geek-speak top-lining of the impracticality of implants, bioware and direct neural interfaces, someone on the team stands – with a self-satisfied twinkle of entitlement in his eye, stemming from his too-good-to-be-true ticket to Google’s billion-dollar playground wonder-world, which he secretly fears is little more than the result of his having been in the right place at the right time and might rather be more imaginatively wielded by half a dozen brilliant teenagers scattered throughout that very neighborhood, let alone the globe – and says:

“We can do this, think about it. We need to give the user access to visual content, right? And audio. And our solution must receive voice commands. So the platform that would carry all that must naturally exist close to each of the relevant senses – somewhere on the head. And that platform – already exists. (murmurs around the room) Ready? Wait for it… a HAT!”

A sniff is heard.

A guy wearing a t-shirt with numbers on it says: “…Augmented Reality …Hat?”

And then someone else, who is slightly closer to being worthy of his access to the Google moneybags-action playset, says, “No, not a hat… Glasses! Think about it – glasses have been in the public consciousness forever as a device for seeing clearly, right? Well, enter Google, with glasses… that let you see everything clearly, more… clearly.”

Everyone in the room nods and smiles. Even obvious ideas can carry a certain excitement when you happen to experience their moment of ideation. This effect of course must be especially pronounced when you’ve passed through a recruitment process that inordinately reveres academic measures of intelligence.

Either that, or it was just Sergey’s idea from the shower that morning.

In any event, the iPhone was such a truly disruptive idea that one cannot as easily pick apart the thought process that led to it. Too many moving parts. Too much was innovative.

But Glass was a simple idea. Not simple in a good way, like it solved a problem in a zen, effortless way. No, simple like the initial idea was not much of a leap and yet they still didn’t consider everything they needed to.

What didn’t they consider?

Well having seen it all play out, I’d say: Real people – real life. I think what Google completely missed, developing Glass in their private, billion dollar bouncy-house laboratory, were some basic realities that would ultimately limit adoption of Glass’ persistent access to technology: factors related to humanity and culture, real-world relationships, social settings and pressures, and unspoken etiquette.

Oh and one other bit of obviousness. Sex. And I mean the real kind, with another person’s actual living body – two real people who spend a lot of money to look good.

But I guess I get why these, of all über geeks, missed that.

While admittedly, sunglasses have found a long-time, hard-earned place in the world of fashion as a “cool” accessory when well appointed and on trend, in hindsight, Google Glass should not have expected to leap across the fashion chasm so easily. There are good reasons people spend umpteen fortunes on contact lenses and corrective eye surgeries. Corrective glasses, while being a practical pain in the ass, also effectively serve to make the largest swath of the population less attractive.

Throughout history, glasses have been employed predominantly as the de facto symbol of unattractiveness, of loserdom. They are the iconic tipping point between cool and uncool. The thin line separating the Clark Kents from the Supermen. Countless young ugly ducklings of cinema needed only remove that awkward face gear to become the stunning beauty, the glassless romantic lead. How many make-over shows ADD a pair of glasses?

Sure, there are a few fetishists out there, but for every lover of glasses-wearing geekery, there are a thousand more who prefer their prospective mates unadorned.

Leave it to a bunch of Halo-playing, Dorito-eating engineers to voluntarily ignore that basic cultural bias. And worse, to maybe think all they had to do was wear them themselves to make them cool somehow.

“But didn’t you SEE Sergey on the subway?” you ask. “He looked cool.”

Well, Sergey had indeed been styled by someone with taste and had been valiantly strutting his little heart out on the PR runway in an obviously desperate effort to infuse some residual “billionaires wear them” fashion credibility into his face contraption.

But look at that picture again, he also looked alone, and sad.

And to think Google Glass was a really good idea, you sort of had to be a loner. A slightly sad, insecure misfit. Typically riding the train with no one to talk to. Incidentally, later – before Facebook died – Facebook Graph showed that Glass wearers didn’t have many friends. Not the kind they could hug or have a beer or shop with.

Wearing Google Glass made users feel like they didn’t have to connect with the actual humans around them. “I’m elsewhere – even though I appear to be staring right at you.” Frankly the people who wore Google Glass were afraid of the people around them. And Glass gave them a strange transparent hiding place. A self-centered context for suffering through normal moments of uncomfortable close proximity. Does it matter that everyone around you is more uncomfortable for it?

At least with a hand-held phone there was no charade. The very presence of the device in hand, head down, was a clear flag alerting bystanders to the momentary disconnect. “At the moment, I’m not paying attention to you.”

But in its utterly elitist privacy, Google Glass offered none of that body language. Which revealed other problems.

In the same way that the introduction of cellphone headsets made a previous generation of users on the street sound like that crazy guy who pees on himself as he rants to no one, Google Glass pushed its users past that, occupying all their attention, their body in space be damned – mentally disconnecting them from their physical reality. With Glass, not even their eyes were trustworthy.

Actually, it was commonly joked that Glass users often appeared downright “mentally challenged” as they stared through you trying to work out some glitch that no one else in the world could see. They’d stutter commands and tap their heads and blink and look around, lost and confused.

Suddenly we all realized what poor multi-taskers these people really were.

Any wearer who actually wanted to interact with the real world quickly found they had to keep taking off their Google Glasses and stowing them, or else everyone got mad.

It was simply deemed unacceptable to wear them persistently. And in fact users reported having been socially pressured to use them quite a lot as they had previously used their phones. Pulling them out as needed. Which utterly defeated the purpose. On some level – that’s what broke Google Glass. It wasn’t what it was supposed to be. It wasn’t persistent. It was more cumbersome and socially uncomfortable than the previous paradigm.

People who left them on in social situations were openly called “glassholes”.

They were smirked at and laughed at walking down the street. I know, because I did it too.

There were lots of news reports about people who got punched for wearing them in public. In fact, anecdotally, there were more news reports about people getting beat up for wearing Google Glass in public than there were people I actually saw wearing them on the street. The court of public opinion immediately sided with the position that Google Glass was little more than some random stranger shoving a camera in your face. Other people stopped talking to wearers until they took them off. They didn’t even want it on top of their heads.

In hindsight it was pretty quickly clear Google Glass wasn’t going to be a revolution.

I read an interview somewhere (years from now) in which someone on the Google team admitted that they had more than once asked themselves if they were on the right track – but that the sentiment on the team was that they were doing something new. Like Steve Jobs would have done. Steve Jobs couldn’t have known he was on the right track any more than they did – so they pushed forward.

Except that I think Steve Jobs sort of did know better. Or rather, he was better connected to the real world than the boys at Google’s Richie Rich Malibu Dream Labs were. Less dorky and introverted, basically.

The problem with innovation is that all the pieces need to be in place. Good ideas and good motivation can be mistimed. Usually are. That’s all Google Glass was. Like so many reasonable intentions, it was just too early. Selling digital music didn’t work until everything was in place – iPods and iTunes were readily available and insanely easy to sync. HDTV didn’t hit until content and economics permitted. And the world didn’t want persistent augmented reality when Google created Glass.

All the above disclosed, Augmented Reality is still indeed your future. It’s just that when it finally comes, well, when it happened, it didn’t look like Google Glass.

Like, at all.

And I know, because I’m from the future.

My First Message From the Future: How Facebook Died

It was a hot, sunny Boston morning in July, 2033 – and suddenly – it was a freezing London evening in Feb 2013, and I had an excruciating headache.

I have no clue what happened. No flash, no tunnel, no lights. It’s like the last 20 years of my life just never happened. Except that I remember them.

Not knowing what else to do I went to the house I used to live in then. I was surprised that my family was there, and everyone was young again. I seemed to be the only one who remembered anything. At some point I dropped the subject because my wife thought I’d gone crazy. And it was easier to let her think I was joking.

It’s hard to keep all this to myself though, so, maybe as therapy, I’ve decided to write it here. Hardly anyone reads this so I guess I can’t do too much damage. I didn’t write this stuff the first time around, and I’m a little worried that the things I share might change events to the point that I no longer recognize them, so forgive me if I keep some aspects to myself.

As it is I already screwed things up by promptly forgetting my wife’s birthday. Jesus Christ, I was slightly preoccupied, I mean, I’m sorry, ok? I traveled in time and forgot to pick up the ring that I ordered 20 years ago… and picked up once already. All sorts of stuff changed after that for a while. But then somehow it all started falling back into place.

Anyway – that’s why I’m not telling you everything. Just enough to save the few of you who read this some pain.

Today I’ll talk about Facebook.

Ok, in the future Facebook, the social network, dies. Well, ok, not “dies” exactly, but “shrivels into irrelevance”, which was maybe just as bad.

Bets are off for Facebook the company. I wasn’t there long enough to find out – it might survive, or it might not, depends on how good they were… sorry, are at diversifying.

At this point perhaps I should apologize for my occasional shifting tenses. I’m finding that time travel makes it all pretty fuzzy. But I’ll do my best to explain what happened…  Happens. Will happen.

Anyway, seeing Facebook back here again in full form, I marvel at the company’s ability to disguise the obviousness of the pending events in the face of analyst and corporate scrutiny, with so many invested and so much to lose.

But hindsight being 20/20, they should have seen – should see – that the Facebook social network is destined to become little more than a stale resting place for senior citizens, high-school reunions and, well, people whose eyes don’t point in the same direction (it’s true, Facebook Graph showed that one, it was a joke for a while – people made memes – you can imagine). Grandmothers connecting with glee clubs and other generally trivial activities – the masses and money gone.

There were two primary reasons this happened:

First – Mobile (and other changing tech – including gaming, iTV and VR). I know, I know, I’m not the first, or 10,000th, guy to say “Mobile” will contribute to Facebook’s downfall. But there is a clue that you can see today that people aren’t pointing out. While others look at Facebook with confidence, or at least hope, that Facebook has enough money and resources to “figure mobile out” – they never do. In fact there is a dark secret haunting the halls of the Facebook campus. It’s a dawning realization that the executive team is grappling with and isn’t open about – a truth that the E-suite is terrified to admit. I wonder if some of them are even willing to admit it to themselves yet.

Here is the relevant clue – the idea that would have saved Facebook’s social network, that would make it relevant through mobile and platform fragmentation – that idea – will only cost its creators about $100K. That’s how much most of these ideas cost to initiate – it rarely takes more. Give or take $50K.

That’s all the idea will cost to build and roll out enough to prove. 3-6 months of dev work. Yeah it would have cost more to extend it across Facebook’s network. But that would have been easy for them. So, Facebook has gobs of $100Ks – why hasn’t it been built yet?

The dark secret that has Facebook praying the world doesn’t change too fast too soon (spoiler alert, it does), is that – they don’t have the idea. They don’t know what to build.

Let me repeat that: Facebook, the company, doesn’t have the one idea that keeps their social network relevant into mobile and platform fragmentation. Because if they actually did… it’s so cheap and easy to build, you would already see it. Surely you get that, right? Even today?

Perhaps you take issue with the claim that only “one idea” is needed. Or perhaps you think they do have the vision and it’s just not so easy; it requires all those resources, big, complex development. And that today it’s being implemented by so many engineers, in so many ways across Facebook with every update. Perhaps you will say that continually sculpting Facebook, adding features, making apps, creating tools for marketers, and add-ons, will collectively add up to that idea.  This is what Facebook would prefer you believe.  And it’s what people hope I guess.

Well, that’s not how it works. Since the days Facebook was founded, you have seen a paradigm shift in the way you interact with technology. And that keeps changing. I can report that the idea that will dominate within this new paradigm will not merely be a collection of incremental adjustments from the previous state.

Hell, Facebook was one simple idea once. One vision. It didn’t exist, and then it did (and it didn’t even cost $100K). It answered a specific need. And so too will this new idea. It won’t be a feature. It won’t look like Facebook. It will be a new idea.

I know, I’ve heard it, “Facebook can just buy their way into Mobile”. You’ve seen that desperation already in the Instagram land grab. It’s as if Mark said “…oh… maybe that’s it..?? …or part of it … Maybe…?”

Cha-ching.

The price was comically huge. Trust me, in the future a billion dollars for Instagram looks even dopier. How much do you think Instagram spent building the initial working version of Instagram? Well, I didn’t work on it, but like most projects of their ilk I am willing to bet it was near my magic number: $100K. I read somewhere that Instagram received $250K in funding early on and I seriously doubt they had to blow through more than half that on the initial build.

And Facebook’s desperate, bloated buy of Instagram is virtual confirmation of the point. See, you don’t buy that, and pay that much, if you have your own vision. If you have the idea.

Unfortunately, Facebook will eventually realize that Instagram wasn’t “it” either. No, the idea that will carry social networking into your next decade of platform fragmentation and mobility isn’t formally happening yet. Rather the idea that will make social connections work on increasingly diverse platforms will come about organically. Catching all the established players mostly by surprise. It will be an obvious model that few are thinking about yet.

And that leads us to the second, and most potent, reason Facebook withers – Age.

Facebook found its original users in the mid ’00s. It started with college-age users and quickly attracted the surrounding, decidedly youthful psychographics. This founding population was united by a common life-phase; young enough to be rebelling and searching for a place in the world they could call their own, and just barely old enough to have an impact on developing popular trends.

Well, it’s been almost a decade for you now – time flies. Those spunky little 20+ year-old Facebook founders are now 30+ year-olds and Facebook is still their domain. They made it so. And they still live their lives that way. With Facebook at its center.

But now at 30 things have started to change – now they have kids. Their kids are 6-12 years-old and were naturally spoon-fed Facebook. That’s just the nature of life as a child living under Mom and Dad. You do what they do. You use what they use. You go where they go. Trips to the mall with Mom to buy school clothes. Dad chaperoning sleep-overs.  Messages to Grandma on Facebook.  It’s a lifestyle that all children eventually rebel against as they aggressively fight to carve out their own world.

So give these kids another 6 years and the same rules will apply. They’ll be full-blown teenagers. They’ll start entering college. They’ll want their own place. And importantly, they’ll inherit your throne of influence over future socializing trends. Yup, the generation of Mark Zuckerbergs will graduate to become the soft, doughy, conventionally uncool generation they are… or rather, were, in the future.

So project ahead with me to that future state, do you really think Facebook is going to look to these kids like the place to hang out?? Really? With Mom and Dad “liking” shit? With advertisers searching their personal timelines?

No – way.

Don’t even hope for that. See, the mistake a lot of you are making is that Facebook was never a technology – for the users, Facebook has always been a place. And 6-7 years from now these kids will have long since found their own, cooler, more relevant place – where Mom and Dad (and Grandma, and her church, and a gazillion advertisers) aren’t. And it won’t be “Social Network Name #7”, powered by Facebook (though Facebook tries that – so I bought their URL yesterday – I recall they paid a lot for it).

You will find it to be a confoundedly elusive place. It will be their own grass-roots network – a distributed system that exists as a rationally pure, mobile, platform-agnostic solution. A technically slippery bit-torrent of social interaction. A decisive, cynical response to the Facebook establishment, devoid of everything Facebook stood for. At first it will completely defy and perplex the status quo. That diffused, no-there-there quality makes advertisers crazy trying to break in to gain any cred in that world. But they don’t get traction. The system, by design, prohibits that. At least for a year or two. Not surprisingly, some advertisers try to pretend they are groups of “kids” to weasel in, and it totally blows up in their faces. Duh. It will be a good ol’ wild west moment. As these things go. And they always do go. You’ve seen it before. And the kids win this time too.

Then a smart, 20-year-old kid figures out how to harness the diffusion in a productized solution. Simply, brilliantly, unfettered by the establishment.

And at this point, you might say – “… well… Facebook can buy that!”

Sorry, doesn’t happen. I mean, maybe it could have, but it doesn’t. Don’t forget, Yahoo tried to buy Facebook for a Billion Dollars too.

For a kid, the developer of this new solution is shrewd and decides that selling out to Facebook would weaken what he and his buddies built – rendering it immediately inauthentic.

Seeing the power of what he holds, this kid classically disses Mark’s desperate offer. It’s all very recursive, and everyone wrote about that. My favorite headline was from Forbes: “Zucker-punched”. And anyway, Google offers him more (which is not a “buy” for Google – later post).

Look, it doesn’t matter, because at that point Facebook is already over because Facebook isn’t “where they are” anymore.

Their parents, Facebook’s founding user base, stay with Facebook for a while. And then some – those who still care how their bodies look in clothes (again, Facebook’s Graph famously showed this) – switch over, presumably because they suddenly realize how uncool Facebook has become. Then even more switch because they need to track their kids and make sure they’re not getting caught up in haptic-porn (something I actually rather miss now). And that kicks off the departure domino effect (or “The Great Facebalk”, The Verge, 2021 I believe).

Later, Grandma even switches over. But some of her friends are still so old-timey that she’ll keep her Facebook account so she can share cat pictures with them. And of course, she won’t want to miss the high-school reunions.

So that is Facebook’s destiny. And you know, I am from the future. So I know.

Oh one last thing, in Petaluma there’s a 14 year-old kid I bumped into the other day – quite intentionally.  He’s cool.  He’s hungry. When he turns 20, I plan on investing exactly $100K in some crazy idea he’ll have. I have a pretty good feeling about it. I’ll let you know how it goes.

Why Apple’s Interfaces Will Be Skeuomorphic Forever, And Why Yours Will Be Too

“Skeuomorph…”  What??  I have been designing interfaces for 25 years and that word triggers nothing resembling understanding in my mind on its linguistic merit alone.  Indeed, like some cosmic self-referential joke, the word skeuomorph lacks the linguistic reference points I need to understand it.

So actually yes, it would be really nice if the word ornamentally looked a little more like what it meant, you know?

So Scott Forstall got the boot – and designers the world over are celebrating the likely death of Apple’s “skeuomorphic” interface trend.  Actually I am quite looking forward to an Ive-centric interface, but not so much because I hate so-called skeuomorphic interfaces, but because Ive is a (the) kick ass designer and I want to see his design sensibility in software.  That will be exciting.

And yet, I’m not celebrating the death of skeuomorphic interfaces at Apple because – and I can already hear the panties bunching up – there is no such thing as an off-state of skeuomorphism. That’s an irrelevant concept.  And even if there were such a thing, the result would be ugly and unusable.

Essentially, every user interface on Earth is ornamentally referencing and representing other unrelated materials, interfaces and elements.  The only questions are: what’s it representing, and by how much?

For example, there is a very popular trend in interface design – promoted daily by the very designers who lament Apple’s so-called “skeuomorphic” leather and stitching – where a very subtle digital noise texture is applied to surfaces of buttons and pages.  It’s very subtle – but gives the treated objects a tactile quality. Combined with slight gradient shading, often embossed lettering and even the subtlest of drop shadows under a button, the effect is that of something touchable – something dimensional.
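
To make that concrete, here’s a minimal sketch of the treatment – plain DOM calls and inline styles, with every value invented by me for illustration (this is nobody’s actual design spec):

```typescript
// A hypothetical "flat but tactile" button, built the way the trend prescribes.
// All values are made up for illustration.
const button = document.createElement("button");
button.textContent = "Save";

// Slight gradient shading: suggests a gently curved, dimensional surface.
button.style.background = "linear-gradient(#fcfcfc, #e4e4e4)";

// The subtlest of drop shadows: implies the button sits above the page,
// lit by a light source that does not actually exist.
button.style.boxShadow = "0 1px 2px rgba(0, 0, 0, 0.25)";

// "Embossed" lettering: a one-pixel light text-shadow mimics letters
// pressed into a physical surface.
button.style.textShadow = "0 1px 0 rgba(255, 255, 255, 0.8)";

// A hairline border and rounded corners: more physical references.
button.style.border = "1px solid #b8b8b8";
button.style.borderRadius = "4px";

// (The trendy noise texture would be one more line: a tiling
// background-image layered over the gradient.)
document.body.appendChild(button);
```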

Excuse me, how can this not be construed as skeuomorphic?

Is that texture functional – lacking any quality of ornamentation?  Is the embossing not an attempt to depict the effect of bumps on real-world paper?  Are the subtle drop shadows under buttons attempting to communicate something other than the physicality of a real-world object on a surface, interacting with a light source that doesn’t actually exist?  The most basic use of the light-source concept is, by definition, skeuomorphic.

Drop shadows, embossing, gradients suggesting dimension, gloss, reflection, texture, the list is endless… and absolutely all of this is merely a degree of skeuomorphism because it’s all referencing and ornamentally rendering unrelated objects and effects of the real world.

And you’re all doing it.

This whole debate is a question of taste and functional UI effectiveness.  It’s not the predetermined result of some referential method of design.

So when you say you want Apple to stop creating skeuomorphic interfaces – you really don’t mean that.  What you want is for Apple to stop having bad taste, and you want Apple to make their interfaces communicate more effectively.

The issues you have had with these specific interfaces are that they either communicated things that confused and functionally misled (which is bad UX), or simply felt subjectively unnecessary (bad taste).  And these points are not the direct result of skeuomorphism.

“But,” you say, “I don’t use any of that dimensional silliness.  My pages, buttons and links are purely digital – “flat” and/or inherently connected only to the interactive function, void of anything resembling the real world, and void of ornamentation of any kind.  Indeed, my interfaces are completely free of this skeuomorphism.”

Bullshit.

I’ll skip the part about how you call them pages, buttons and links (cough – conceptually skeuomorphic – cough) and we’ll chalk that up to legacy terminology.  You didn’t choose those terms.  Just as you didn’t choose to think of the selection tool in Photoshop as a lasso, or the varied brushes, erasers and magnifying glass.  That’s all legacy – and even though it makes perfect sense to you now – you didn’t choose that.  Unless you work at Adobe, in which case maybe you did, and shame on you.

But you’re a designer – and your interfaces aren’t ornamental – yours are a case of pure UI function.  You reference and render nothing from anywhere else except what’s right there on that… page… er, screen… no … matrix of pixels.

For example, perhaps you underline your text links.  Surely that’s not skeuomorphic, right?  That’s an invention of the digital age.  Pure interface.  Well, let’s test it:  Does the underline lack ornamentation?  Is it required to functionally enable the linking?  Well, no, you do not have to place an underline on that link to technically enable linking.  It will still be clickable without the underline.  But the user might not understand that it’s clickable without it.  So we need to communicate to the user that this is clickable.  To do that we need to reference previous, unrelated instances where other designers have faced such a condition.  And we find an underline – to indicate interactivity.
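
And to be clear, the underline really is pure convention, not mechanism – here’s a minimal sketch you can paste into any browser console (the URL is hypothetical, the styling choices mine):

```typescript
// A link with its underline stripped is still, mechanically, a working link.
const link = document.createElement("a");
link.href = "https://example.com"; // hypothetical destination
link.textContent = "click me";

// Remove the underline: nothing functional breaks.
link.style.textDecoration = "none";
link.style.color = "inherit"; // remove the color cue too, for good measure

document.body.appendChild(link);
// The browser will happily navigate on click. All you removed was the
// borrowed, ornamental signal telling a human "this text is different".
```

Functionally identical.  Communicatively crippled.  The difference is ornament – borrowed ornament.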

“Wait,” you say, “the underline indicating linking may be referencing other similar conditions, but it’s pure UI, it’s simply a best practice.  It is not a representation of the real world.  It’s not metaphorical.”

Nyeah actually it is.  It just may not be obvious because we are sitting in a particularly abstract stretch of the skeuomorphic spectrum.

Why did an underlined link ever make sense?  Who did it first and why?

Well, although my career spans the entirety of web linking, I have no clue who did it on a computer first (anyone know this?).  But I do know that the underline has always – or for a very looooong time, well before computers – been used to emphasize a section of text.  And the first guys who ever applied an underline to a string of text as a UI solution for indicating interactivity borrowed that idea directly from real-world texts – to similarly emphasize linked text – to indicate that it’s different.  And that came from the real world.  We just all agree that it works and we no longer challenge its meaning.

Face it, you have never designed an effective interface in your whole life that was not skeuomorphic to some degree.  All interfaces are skeuomorphic, because all interfaces are representational of something other than the pixels they are made of.

Look, I know what the arguments are going to be – people are going to fault my position on the subject of currency, and how referencing other digital interface conventions “doesn’t count” – that it has to be the useless ornamental reproduction of some physical real-world object.  But you are wrong.

Skeuomorphism is a big, fat gradient that runs all the way from “absolute reproduction of the real world” to “absolute, un-relatable abstraction”.

And the only point on that spectrum truly void of skeuomorphism is the absolute, distant latter: pure abstraction.  Just as zero is the only number without content.  And you know what that looks like?  It’s what the computer sees when there’s no code.  No user interface.  Pure abstraction.  That is arguably a true lack of skeuomorphism.  Or rather, as close as we can get.  Because even the underlying code wasn’t born in the digital age; it’s all an extension of pre-existing language and math.

Look at it this way – an iPad is a piece of glass.  You are touching a piece of glass. So as a designer you need a form of visual metaphor to take the first step in allowing this object to become something other than a piece of glass.  To make it functional.  And that alone is a step on the skeuomorphic spectrum.

Sure you can reduce the silliness and obviousness of your skeuomorphism (good taste), and you can try to use really current, useful reference points (good UI), but you cannot design without referencing and rendering aspects of unrelated interfaces – physical or digital.  And that fact sits squarely on the slippery slope of skeuomorphism.

I read a blogger who tried to argue that metaphoric and skeuomorphic are significantly different concepts.  I think he felt the need to try this out because he thought about the topic just enough to realize the slippery slope he was on.  But it ultimately made no sense to me.  I think a lot of people want a pat term to explain away bad taste and ineffective UI resulting from a family of specific executions, but I don’t think they have thought about it enough yet.  Skeuomorphic is metaphoric.

OK so let’s say all this is true.  I know you want to argue, but come with me.

In the old days – meaning 1993-ish – there was something much worse than your so-called skeuomorphic interface.  There were interfaces that denied the very concept of interface – and looked completely like the real world.  I mean like all the way.  A bank lobby, for example.  So you’d pop in your floppy disk or CD-ROM and boom – you’d be looking at a really bad 3D rendering of an actual bank teller window.  The idea was awful even then.  “Click the teller to ask a question” or “Click the stapler to connect your accounts”.

And that was a type of “skeuomorphism” that went pretty far up the spectrum.

Back then my team and I were developing interfaces where there were indeed, buttons and scroll bars and links but they were treated with suggestive textures and forms which really did help a generation of complete newbie computer users orient themselves to our subject and the clicking, navigating and dragging. You would now call what we’d done skeuomorphism.

My team and I used to call these interfaces – the ones using textures and forms ornamentally suggestive of some relevant, real-world concept – “soft metaphor interfaces”, where the more literal representations (the bank lobby) were generally called “hard metaphor interfaces”.

And these terms allowed for acknowledgment of variability, of volume. The more representative, the “harder” the metaphoric approach was.  The more abstract, the “softer” it could be said to be.

To this day I prefer these qualifiers of metaphor to the term “skeuomorphic”.  In part because “skeuomorphic” is used in a binary sense which implies that it can be turned off.  But the variability suggested by the softness of metaphor is more articulate and useful when actually designing and discussing design. Like lighter and darker, this is a designer’s language.

I hope after reading this you don’t walk away thinking I believe the leather and stitching and torn paper on the calendar app was rightly implemented.  It wasn’t – and others have done a solid job explaining how it breaks the very UX intent of that app.

But the truth is – there are times when some amount of metaphor, of obvious skeuomorphism, in interface design makes tons of sense.  Take the early internet.  Back then most people were still relatively new to PCs.  Ideas we take for granted today – like buttons, hover states, links, dragging and dropping, etc. – were completely new to massive swaths of the population.  Computers scared people.  Metaphorical interfaces reduced fear of the technology – encouraged interaction.

And I think, as Apple first popularized multi-touch – an interface method that was entirely new – it made all the sense in the world to embrace so-called skeuomorphism as they did.  I don’t begrudge them that at all.  Sure – there are lots of us who simply didn’t need the crutch.  We either grew up with these tools or create them, and feel a bit like it talks down to us.  But Apple’s overt skeuomorphic interfaces weren’t really aimed at us.

Remember the launch of the iPad, where Steve Jobs announced that this was the “post PC era”?  Apple didn’t win the PC war – and instead deftly changed the game.  “Oh, are you still using a PC?  Ah, I see, well that’s over.  Welcome to the future.”  Brilliant!

But the population WAS still using a PC.  And Apple, with its overt skeuomorphic interfaces, was designing for them.  Users who were figuratively still using IE6.  Who were afraid of clicking things lest they break something.

These users needed to see this new device – this new interface method – looking friendly.  It needed to look easy and fun.  And at a glance, hate it though you may, well-designed metaphorical interfaces do a good job of that.  They look fun and easy.

Communicating with your users is your job. And to do that you must continue to devise smart UI conventions and employ good taste – and that means choosing carefully where on the skeuomorphic spectrum you wish to design.  Skeuomorphic is not a bad word.  It’s what you do.

Steve Jobs

Years ago my business partner at Red Sky, CEO Tim Smith, used to tell a story about having met Steve Jobs in a most unusual, almost comic, situation. Tim has, after all these years, felt the pull to write it for posterity, or therapy maybe.

My favorite picture of Jobs and Woz. It reminds me that you can achieve anything, from any starting point. Here, juvenile and awesome, Jobs inspects a home-made "Blue Box" which would allow them to hack the Bell System touch-tone telephone system and place long-distance calls from pay phones. My kind of guys.

It’s a great read.  If you’re a bit stunned at the loss of Steve Jobs you will appreciate it as I did.

Read Tim’s story here.

I never met Steve.  I always thought I would someday, egoist that I am.  The man shaped the lives and careers of so many of us, and we (I) invested so much of who we are in him.  He played such a central role in our days.

But as I sit here and write this I feel a tugging that I recall having only once before.  And although it was understandably quite a lot stronger and more personal then, I recognize the feeling.  It happened on the morning my grandmother, my father’s mother, passed away.

I drove to be with my grandfather and we spent the day together alone in their house.  It was an emotional day, her presence was everywhere.  But the most poignant moment came when the two of us sat down and, in thick silence, ate a slice of fresh pie that my grandmother had made only the day before.  Her fingerprints were in the crust.

Nothing had been said before, or subsequently, that was ultimately more emotionally meaningful to me than that moment.  The feeling washed over me as I realized simultaneously that she was gone forever, and how fresh and delicious the pie was.

It was a strange, ghostly feeling – both utterly empty and yet full of meaning.

I guess sitting here, writing this now, I feel something similar that must be playing out in so many ways all over the world tonight.

I usually delete the following… but not today.

Sent from my iPad.