Messages From The Future: The Decline of Apple

I’m sure you’ve had your own debates with the “Apple is about to die” crowd. I’ve had those too. Except that being from the future, of course I’m the only one who actually knows what I’m talking about. And yet even though the future is not always rosy for Apple, even though some of these people sometimes have a point, they still piss me off just like they did the first time I was here.

Usually the argument centers around the tired meme that Apple has nothing significantly visionary or profitable to jump to that comes close to the potential of the iPhone, which supposedly means that Apple is going to die under its size and its obsessive, unsustainable inclination to polish and “perfect” in the face of speedier, less precious competition.

But that is so not how it goes down.

The other day Marco Arment read about Viv, the AI virtual assistant still being developed by the creators of Siri. This in particular, coupled with widely cited reports of the AI efforts of Google and Facebook, inspired his first post on this possible chink in Apple’s armor. Or so he told me years from now, after we’d met online, which hasn’t happened yet (Hi Marco – you dropped it in the potato salad – remember I said that). It was about then that he, and a handful of others, came to their conclusion, one that was not too far off from what actually happened.

Though it wasn’t quite as simple as “Apple showing worryingly few signs of meaningful improvement or investment in…big-data services and AI…”, nor as some had suggested, “When the interface becomes invisible and data based, Apple dies”.

Actually, interfaces remained visible, tactile, and exceptionally alive and well in the future. AI (via natural language interfacing) did not herald the death of the visual or tactile interface.  We used each for different things and in different places. Trust me – there are still a million reasons you’ll want to see and touch your interfaces, and maybe more importantly, a million places in which you still don’t want to sound like a dork talking to your virtual assistant.  Even in the future.

But there was some truth floating within the “big-data services” thread.

Apple mastered the hardware/software marriage. With rare exception, Apple excelled in virtually any device category they ventured into. So you might argue that so long as there were devices to build and software to make for them, even if it was indeed powered by advanced AI, Apple, with resources beyond any other company, stood a chance. But there was more going on here than the advance of AI.

There was also the ongoing fragmentation of your platforms.

20 years ago most of you still had one computer: a desktop. 15 years ago you probably had two, including a laptop.  10 years ago you added a smartphone and a “smart” TV. 5 years ago you added a tablet. Last year you added a watch. Now you have six computing devices plus peripherals and are only a few years from adding the first real VR platforms. (Incidentally, real VR catches all the currently uncertain Silicon Valley trenders and mobile hypeists off guard. VR is so not this unbelievably temporary, phone-based, turn-your-head-to-look-another-direction ridiculousness. What a joke. That’s the equivalent of the 1998 choose-your-ending interactive CD-ROM, or red and blue 3D glasses. Trust me, speaking from the future – your phone didn’t become much “VR” anything. It took a lot more hardware. If you’re investing, focus on in-home solutions and content creators; that’s where it all went down in the future. Another post.)

And do you think this platform fragmentation will stop? (Spoiler: it doesn’t.) Having to remember to put your device in your pocket is totally kludgy; you can see that even now, right? That you have to manage where all your various form factors are and independently charge, set up, maintain and connect them all is grossly unwieldy, right?

You should know this, because it has been commonly theorized by now: computers will continue to get cheaper and more powerful so as to become ubiquitous. In everything. Not just the so-called internet of things, but the interconnection of EVERYTHING. And indeed, that happened. Exponentially. And yet few were ready. Businesses were disrupted and died.

Computers became part of almost everything.

Seriously, for example, computers were in fish. And for that matter your toilet became a massively parallel processing platform… I’m not kidding, 1.84 YottaFLOP power flushers.  Everything that touched it, and went into it, was measured, DNA-decoded, and analyzed in a million ways.  And this, more than any other advance in medicine, led to quantum advances in health care and longevity. Who knew. There was a toilet company on the Forbes Top 100 US Companies list.  No, really.  Though Apple never made a toilet.

Aside from your watch (and a headset which eventually you didn’t need anyway), you didn’t carry anything. Everything was a potential display, even the air, when it was dark enough. My Apple Watch was still the last and only device I carried with me (I posted about this before – it is laughable that people today think Apple Watch has no future. Oh man, just you wait.) But I’m getting ahead of myself.

This platform fragmentation, and not just AI, and not the feared loss of interface, was what ultimately changed things for Apple. Suddenly – there were thousands of device form factors everywhere. A virtual fabric. Rampant commoditization. Experience democratization.

In hindsight, it became clear that what Apple required to be at its best was a market limitation in consumers’ access to devices. Limited form factors, limited production and distribution. These limitations, which were common during the PC and mobile eras, allowed Apple to qualitatively differentiate itself. To polish and make something insanely great – in contrast to the others. To design a noticeably superior experience.

But the more the device ecosystem fragmented, the harder it became for Apple to offer uniquely valuable and consistent experiences on top of all those infinitely unique functions and form factors. It just became unmanageable. I mean Apple had an impact for sure. Companies knew design mattered. And Apple went down in history as the company that made all that so.

Your six devices became upwards of fifteen devices, at which point I think most of us just stopped counting. And the number kept growing so fast. The car was a big one. Yeah, Apple did a car. A few actually. The car was perfect for Apple then because there were still significant market limitations in that category. And Apple could focus on the qualitative experience. Later, as the dust settled, the watch, being the last device we needed to carry with us, also served as a different kind of market limitation – a focal point for Apple’s strengths.

But as device fragmentation continued to explode, as the hardware market massively commoditized, the idea of a “device” of any specific, personalized sort had begun to lose meaning. In the future, the term “device” sounds much the way “mainframe” or “instamatic” might sound to you today. Quaint, old; it’s just not a relevant concept anymore. Everything was a device. Focus instead shifted to the services that moved with you. Which was part of Apple’s problem.

Like Facebook, Apple, the wealthiest company in the world at the time, did a good job buying its way into various service categories (including AI), and innovating on some. Apple had a huge finance division; they had media and content production, and utilities; but then, so did other companies by then. Ultimately, none of it was in Apple’s sweet spot.

Without discrete devices, it was services and systems that became consumers’ constant. I hope you can see that when this happens, it fundamentally changes Apple’s proposition; the so-called perfect marriage of hardware and software itself becomes an antiquated paradigm.

No, Apple did NOT die, but it became, to some significant degree, less of a focal point for all of us. And yet… maybe now, armed with this knowledge, they can change how things played out.

In the meantime I’m watching the toilet sector. And I plan to invest heavily.

The Presentation of Design

There was an excellent post on Medium recently called “13 Ways Designers Screw Up Client Presentations,” by Mike Monteiro, which contained thoughtful, if rather strident, recommendations related to the selling of design work. It was a relatively enjoyable read. I agreed with all 13 points. However, in his first paragraph, establishing the primary rationale for the article, Mr. Monteiro made a statement that caused me to choke on my coffee:

“I would rather have a good designer who can present well, than a great designer who can’t.”

Like a punch to the gut — it caught me off guard. I had to reread it a few times to make sure I’d read it correctly. After reading the article I kept coming back to that line. “Really?” I kept asking myself.
He went on to say:

“In fact, I’d argue whether it’s possible to be a good designer if you can’t present your work to a client. Work that can’t be sold is as useless as the designer who can’t sell it.
And, no, this is not an additional skill. Presenting is a core design skill.”

My emphasis added.

Undoubtedly that pitch goes over super well in rooms filled with wannabe designers who can present really well, busy account executives and anyone whose primary tool is Excel. Certainly for people who look on the esoteric machinations of designers as a slightly inconvenient and obscure, if grudgingly necessary, part of doing business.

But surely it can’t be the mantra of someone who cares supremely about the quality of the design work – about achieving the greatest design?

I never do this, but I posted a brief opinion of disagreement on this point in the margin comments of Mr. Monteiro’s article. And I would have moved on and forgotten all about having done that, but my comment was subsequently met with some amount of resistance and confusion by Mr. Monteiro and other readers writing in his defense. It frankly depressed me that there were professionals in our industry who might sincerely feel this way, and more so that the article might convince even a single talented young designer, for whom presentation is a non-trivial challenge, that this particular thought, as worded, has any industry-wide merit. And then I came across a recent talk he gave where he doubled down on the idea.

So I wanted to explain my reasoning more fully (it’s an interesting debate), but the limited word-count allotted to side comments didn’t allow for meaningful explanations or exchanges (particularly by people who are as verbose as I am). So rather than pollute Mr. Monteiro’s otherwise fine article further, I decided to explain myself more completely in a post of my own. Maybe more as personal therapy than anything.

Let me first state — presentation proficiency is a useful skill. No matter your field or role, you will never do worse by having strong presentation skills. It will help you align the world in your best interest, without question. Everyone should cultivate these skills to the best of their ability.

But to what degree does this affect a designer? Does lacking it utterly obliterate one’s potential as a great designer, as Mr. Monteiro asserts? And should designers further be “had” principally on presentation skill over other attributes?

Language Logic

Linguistically speaking, his choice of words “A good designer”, and “who can present well” clearly contemplate two separate states of being. This compels one to infer that not all good designers can present well, which is further supported by the fact that Mr. Monteiro evidently turns away “great designers who can’t.”

Which leaves me wondering:

  1. What is demonstrably “great” in a “great designer” who can’t present well, if presenting is a core design skill that dictates the ultimate usefulness of the entire role?
  2. Wouldn’t that mean, then, that there never was any such “great designer” to begin with? That this designer must have been, rather, a “poor designer” for lacking presentation skill?
Perhaps a better way to say what I believe Mr. Monteiro meant is:

“There is no such thing as a great designer who can’t present well, because presenting is a core design skill.”

On the one hand this revised statement at least avoids contradicting itself, but on the other I still absolutely disagree with it because, to me, it inexorably expands into the following thought:

“I would rather have a designer who has relatively weaker creative problem-solving, conceptual, aesthetic and technical skills so long as he can, alone, persuade the client to pay for the work, than I would a designer who has vastly superior creative, conceptual, aesthetic and technical skills who unfortunately happens to lack presentation skill.”

Based on his specific wording, I think Mr. Monteiro would have to concede, at the very least, that design and presentation are separate, independently measurable skills unrelated to one another except within the context of what he prioritizes – in this case the selling, as opposed to the designing, of the work.

Part of what troubles me, then, is that no other option appears to exist for Mr. Monteiro except that every designer present and sell his own work – full stop. And that the quality of the design work is naturally the first thing that should be compromised to enable this.

And I think that’s an unnecessary, limited, unrealistic supposition.

When Design Requires Explanation

Great design does not exist in some vacuum, opaque and impenetrable until, thank God, some good presenter comes to our rescue and illuminates it.

Nor is presentation inexorably required in order to perform the act of designing. If it were, that would mean that a tongueless person, who perhaps also lacked the ability to play Charades, could never be a designer. Which is ridiculous, of course. Not least because I cannot name any tongueless designers who could not also play Charades, but I trust that within the expanse of probability such a person could nevertheless exist.

But what about basic language and cultural barriers?

I now work and live in Switzerland with a team of highly international designers: German, Swiss, Swedish, French, Ukrainian, British.  And so perhaps I see this more acutely than Mr. Monteiro, who lives and works in America.  But the native languages and references of these great designers are all quite different – and this would obviously affect their ability to present to, say, an American audience. If I valued their universal presentation skill above their great design skill, well – there would be no team.

That said, it would be interesting to see Mr. Monteiro present to a roomful of native Chinese executives.  I wonder whether he would attempt to learn Mandarin, or choose to have a Mandarin translator interpret his words and meaning, or ask the Mandarin speaker on his team (if he has one) to assist in the presentation.  More critically, I wonder if he would be eager to define his presumed lack of fluid, confident Mandarin presentation skill as weakness in his design, or in his skill as a designer.

I’m admittedly being obtuse here, but only to illustrate the fault in the mindset. Great design is worth defending with presentation support, and I would argue there are even those projects where, counter to Mr. Monteiro’s opinion, design actually does speak for itself.

This is because design is, in part, a language of its own. Indeed great design results in, among other things, the communication of function.

So where design is truly “great”, as opposed to “good”, its value must be nearly, if not sometimes wholly, self-evident. Great design is observable — at the very least, by the designer’s own team, for example. More on that later.

In contrast I find that design which is not “great” rather usually does require a fair amount of explanation. Enter “good design”, or worse, which may in fact require some presentation skill merely to compensate for its relative lower quality, its relatively weakened ability to self-communicate.

Supporting Talent

If you limit what you value in design talent by requiring that it absolutely be accompanied by self-sufficient sales skill, then you are shutting yourself off to some of the most creative and talented people in the world.  Indeed many people become designers and artists in part specifically because their brains don’t connect with the world the way people who are good presenters do!  From my point of view it rather requires a kind of tone-deafness to the psychology of creatives to not see this.

My old friend, Sir Ken Robinson, speaks on the topic of creativity all over the world, and he often points out that exceptional intelligence and creativity take many forms. Rather, it is our reluctance and systematic inability to recognize and accommodate these varied forms of intelligence and creativity – our resistance to individualizing our interaction with and support of them – that results in an utterly wasted natural resource. He points to many famous creative people — at the top of their respective fields — who simply didn’t fit in “the box”; they didn’t easily align with the system. And only through acknowledgment of their unique skills and provision of personalized support could their inordinate brilliance find its way into the world. These are the people who often dominate their profession once the standardized models surrounding them are challenged to support their unique strengths.
And I suppose I feel something similar is certainly true here. From my perspective, great talent must always be nurtured and supported. Even if, no, particularly if, that merely requires the support of a presentation.

My expectation is that the people who buy into Mr. Monteiro’s stance don’t like this idea in part because, for them, it probably perpetuates an old archetype of entitled, high-maintenance designers; insulated royalty who idealistically prefer to ignore business realities and design in a bubble. Of the managers and the operational and sales functions having to serve and adapt to the designer’s whims — of having to support and compensate for someone who isn’t carrying his weight in the business sense.

In reality, the type of extra effort required to support the development of truly great creative work in any field is exhausting, and something that anyone lacking sufficient constitution gets quickly fed up with. So it must feel good, refreshing even, to be able to rally behind this concept, to shed all those feelings of subordination and responsibility, and demand that designers do that work themselves, to say:

“Designer, if you can’t sell the work yourself you’re not good enough! Because guess what, it’s always been your job – alone!”

And although that stance may feel refreshing and proactive, it’s misguided.

The Business of Design

“Work that can’t be sold is as useless as the designer who can’t sell it.”

With this excerpt from the article, here again, I take issue. Sure, in the business of design, work that can’t be sold is (usually) useless. Agreed. But why on Earth is it the only option that the designer alone sell the work? And why does that make one’s world-class, insanely-great design “useless”? This designer obviously works on a team, since Mr. Monteiro “would rather have” one of a different sort. So where is the rest of this team?

Of course in business, presentation must happen — it’s a requirement in the client-based sales of design. But how we go about accommodating that requirement within our agencies, I think, is a fair debate, and a relevant topic.

In my teams we frankly rely on one another. Does that sound odd?

Since we have already established that great design can be identified in isolation without the accompaniment of a formal sales presentation, that means great design is observable. At the very least, it’s certainly not going to be missed by a seasoned team. Especially, I assume, by someone like Mr. Monteiro, or his fans, who have all undoubtedly worked in design for a very long time. Surely each would acknowledge being able to recognize great design work if it were shown to them without the benefit of a sales presentation?

So when this truly great designer who can’t present comes to you, lays an unbelievably brilliant piece of design work on your desk, perhaps the best you’ve ever seen, and mumbles to his feet:

“Yeah, um…well, this is what I did. ….er… I uh…. don’t know what else to say. (inaudible… something about “…my mom… ”)

What does Mr. Monteiro, or any of the people who would argue with me, do?

I’ll tell you what I wouldn’t do, I wouldn’t yell:

“Somebody get a worse designer in here and start all over! Pronto!”

I would sit down patiently with this great designer who can barely put two words together, along with members of our team, and talk it through.

This is where a couple of things happen. First, it’s at this time that a strong director is sometimes called upon to be a mentor, a psychologist, a parent or friend; to nurture, to listen and understand, to pull words and thoughts from someone whose mind literally doesn’t work that way. Yes, that sometimes takes work, but in my world-view, great design is well worth it.  This is also when the team comes together to build our common language.  The fact is, the whole team needs to understand the project anyway. We all need to internalize why it works and what makes it so insanely special. Each of us.

If the design is actually great, this exercise takes at most one hour.  Usually quite a lot less.  Rather, I find we enjoy discussing truly great work; it sets the bar.  And we probably spend more time than necessary doing that because we love doing it.

And I have never in my 30+ year career been faced with a situation where someone on the team who was indeed exceptionally skilled at presenting could not assist a great designer who can’t present well.

Oh sure, it’s super-duper convenient to have great designers who are also great presenters — but those are rare creatures. Unicorns. You better believe that your search results get exponentially narrower with each search term you add. To combat this natural rarity, Mr. Monteiro claims he would rather broaden his search results by dropping the term “great design” from a search that includes “can also present”.

Whereas I prefer the reverse.

What the Client Wants

Obviously Mr. Monteiro is a busy person who runs a company that hires designers. This company cannot survive if design work is not paid for by clients. Perhaps because he has very little time, he has therefore decided that he needs his designers to be able to present well, as well as design. In fact, his preference for designers who can present is so strong, he will choose a designer with lesser design talent to accommodate that.

Hierarchically this clearly places the quality of the work below one’s ability to persuade the client to buy it.

If one were to take this to heart (and I am not suggesting that Mr. Monteiro necessarily takes his own advice in running his studio), to me this would be a very cynical, virtually dishonest platform on which to operate a design firm that promises great design solutions. Indeed it’s a hiring platform that is perfectly engineered to systematically lower, rather than raise, the qualitative bar. One that prioritizes not the best work, but the ease of the financial transactions. And one that takes advantage of unsuspecting clients.

Where I come from that’s called selling out, and as a client, if truly great work is what I’m in the market for, a team that operates that way is one I wouldn’t knowingly hire.

Good and great are relative of course, but in principle, I simply cannot imagine passing on what I would perceive of as great design in favor of something lesser-than just so that the rest of my team and I don’t have to put effort into assisting with a presentation. Because in the end — that’s all this boils down to — a willingness to apply the required effort to sell the greatest solution.

If you’re not willing to support a great designer with help in presentation — you might as well tell your clients you routinely compromise on quality because you don’t like to work that hard.
Surely your clients would vastly prefer having the best possible talent in the world on their project.

Common Ground

Honestly, when I originally wrote this post, I just didn’t think Mr. Monteiro had yet had the opportunity to consider the language of that particular statement as critically as I am doing. I guessed he might have accused me of splitting hairs.  Of playing semantics. I was sure that if he were really pushed against the wall on this topic he would probably concede that this particular stance seriously needs to be re-worded.  But his lengthy magnification of the idea at his recent talk makes me think he sincerely believes it, as I interpreted it.

That said, the two points on which I think Mr. Monteiro’s language and my beliefs on this topic are in absolute alignment are:

  1. Regarding a fully independent designer, one who wishes to work in total solitude — not part of any team — contracting design skills directly to paying clients: then I agree, having presentation skills will be critical in the event that you wish to support yourself on the sales of that design work.
  2. And that, as a universal rule, presentation is an excellent skill to nurture in yourself if you are a designer, or occupying any other role on the planet, quite frankly. It’s a skill that will always serve you well in affecting the world to suit your interests. Everyone should do what they can to improve their presentation skills.

The lone swordsman aside, if you have even one other partner or team member, there are almost always alternatives that will allow great work to be presented and sold.

And if you are indeed a great designer, a lone swordsman, and feel genetically incapable of presenting well, I’d suggest you develop a professional relationship with a strong sales/presentation partner.

FYI — that’s generally called starting a company.

Apology

I’d like to sincerely apologize to Mr. Monteiro for being so hard on him in this piece. I’m sorry. I rather respect his thinking in every other way so I have felt conflicted the whole time writing this. I think if you haven’t, you really should read his article because it otherwise contains some solid advice and can help you be a better presenter.

…Just maybe completely skip the first two paragraphs. Truth is, the third paragraph is actually a much better opening bit.

Lastly —  To any truly great designers out there who can’t present well or don’t feel comfortable doing so:

I admire and respect you. I’m hiring great designers all over the world.  Send me a message.

iOS Ad Blockers: Why Advertisers Are Suddenly Going Diarrhea In Their Pants

Apple recently released ad blocking capabilities in iOS, and the ad and publishing industries began frothing at the mouth. Every emotion from spitting panic to disdain has been hurled into the webversphere over the capability. And as a consumer, and an ex-advertising shill, I love it.

I am particularly fond of the most vicious ad blockers, the so-called ‘blunt instruments’. The ones that leave gaping, blank maws between thin slices of actual content. The ones that so severely disable Forbes’ ‘welcome page’ (an interruptive page of ads feigning value with some irrelevant ‘quote of the day’) that you are required to close the resulting blank window and click the article’s original link again to see the content.

Yes, I even revel in the extra effort it requires to get past all the newly broken, well-blocked bits. It’s harder in some ways. But you know what? It’s payback time. And that extra effort? It’s a pleasure. I know that each tap and empty window is sending a message.

With every whiny press release and industry insider wailing about the “end of content as we know it”, a delightfully warm, glowing feeling washes over my insides.

I admit it, it’s an unhealthy pleasure in general. And in any other context I wouldn’t celebrate it. But here? I’m gonna party like it’s 1999.  Because for all the ad industry has learned since then, it might as well still be.

This is what selfish, self-inflicted industry ruin smells like. Banners in ashes, melted trackers. A stockpile of suddenly outmoded scripts and tactics, all in embers. The dumbfounded expressions of dim-witted middlemen watching the gravy dry up.  Ah, there’s that warm glow again.

Unfortunately, ruin is what this will take.

I realize there is a risk that the arms race will result in even more devious forms of advertising, that the penicillin will result in resistant strains. But the relief for now is unquestionably worth it.

Even so, some are feeling guilt.  Under peer pressure, I assume, a few creators of ad-blocking technology are trying to give a crap.

Marco Arment pulled his ad blocker from the iOS App Store after three days as the top seller, with, I assume, a last-minute guilty conscience.

He said: “Ad blockers come with an important asterisk: while they do benefit a ton of people in major ways, they also hurt some, including many who don’t deserve the hit.”

I believe his observation is mostly correct but his response was wrong. And his kids will probably hate him someday for leaving a sizable portion of their inheritance to someone else’s family. To wit, other excellent ad blockers have already moved in happily.  At least he hopefully slept better that week.

Then there is the new “AdBlock Acceptable Ads Program”, where the previously dependable ad blocker now whitelists so-called ‘acceptable ads’ – allowing these ads through by default.  They define acceptable ads as adhering to a manifesto they’ve concocted which attempts to qualify approved types of interruptions.  I commend the attempt – but it is critically flawed: a fundamentally incomplete manifesto that sits precariously on an arbitrary portion of the slippery slope.

In an article posted to The Verge, Walt Mossberg wrote: “browser ads represent both an unwanted intrusion and a broken promise”. I read that and wanted to virtually high-five him, since I momentarily thought he shared a core belief. But then I kept reading and discovered that the only ‘intrusion’ he referred to was the surreptitious collection of your information, and the ‘broken promise’ was the delivery of ads that weren’t as personalized and useful as he felt should be possible.

Well, OK, he has a point, a reasonable one, but he completely misses THE point. He’s a Kool-Aid drinker debating flavors.

So, What Is the Point?

Those of you who have read this blog in the past know that my world view of interactive media has, since the early 90s, been based on a small handful of very stable principles. Interactive Axioms.

The most sweeping of all, what I call “The First Axiom of Interactive”, is that the user is, by definition, in control. “The User is your King. You, the creator, are merely a subject.”

People don’t often acknowledge that this medium would simply not even exist if delivering control to the user was not the singular top-most goal.  There is nothing inconsistent or squishy about this reason for being.  Any functional capability you can point to will distill upwards to the quest for control.

The sheer existence of an affordance, a button say, anywhere on a remote control, or a website, or app, is a promise. It’s not one that we talk about much. But the obvious, unspoken promise is that it will react predictably and instantaneously.

The medium itself is an affordance – and the expected result of that affordance is control.

THAT is the promise.

If you remember DVDs, and you happened to be in the USA, you might recall the FBI duplication warning at the start of every movie. Upon seeing these warnings, every one of us pressed the “skip” button. And then we experienced a moment of inner outrage, because the button had been temporarily disabled, requiring us to view the FBI warning in its entirety.

The promise of control had been intentionally wrested away from us. And it felt like a violation.

Because it was.

Today interactive media is based on an even wider and more articulate provision of such control. It is a ubiquitous and fundamental condition of the medium. As such, any time anything happens that is not what we wish, we feel something similar to a sense of injustice. A violation of the medium.

So, yes, of course Walt Mossberg is right, spyware and irrelevant ads sit somewhere on the spectrum of broken promises. But what he does not acknowledge is that the mere existence of interruptive ads in the first place, ads that were not explicitly requested, is the spectrum.

That is also the problem with the AdBlock Acceptable Ads Program manifesto. It attempts to carve out a little plateau on the slippery slope, one that allows for *some* control to be wrested away from you. But they miss the point, which is that sheer interruption of any kind, not degrees of interruption, is the violation. My rewritten manifesto would be very simple and would contain only one test: “Acceptable ads do not, in any way, interrupt the user’s attention.”

That would be acceptable.

But the problem for advertisers, then, is that such an ad will take up no screen real estate.  It will call no attention to itself. It will not seek to draw the user.

In short therefore, it will not exist – until explicitly sought out. That is an acceptable ad, because that is an ad that honors the promise of the medium.

John Gruber occasionally points to his ad partner The Deck as a viable ad model, intimating that it is less invasive, and more relevant, and therefore an appropriate ad format. Ads, but not “garbage”. He claims not to understand someone who wants to block ads. But I hope you can see that he is still defining The Deck’s format merely by contrasting it with the grosser violations of other advertisers. Yes, it’s a degree less offensive, sure. A comparison to “garbage” ads actually makes sense because they are, after all, genetically closer, interruptive cousins. But we are not comparing it in context to, say, the content the user sought out in the first place. Because if we did, we would see that such an interruptive ad is still quite a lot further away.

If you’re an advertiser, or an interruptive-ad-funded writer or publisher, I’m sorry if your livelihood may yet suffer as a result of ad blockers. That’s no one’s goal. But it’s you who’ve chosen to base your livelihood on such a patently inauthentic payment format, one that defiles the very medium it exists in. Tidy and convenient though it may have seemed for you at the start.

It’s a kind of Faustian bargain. Content creators agree to include interruptive advertising to afford creation of their content or derive wealth. But the ads are, by definition, not the content. I seriously doubt a single one of these content creators would choose to include an interruptive ad on the merit of the ad alone. Which reveals a truth.

Interruption of the user’s quest, of the user’s wishes, is not allowed in this medium. If you break this rule – you must accept the penalties.

You say, “But ads are the necessary cost of receiving content!”  No, actually they are not. It’s the cost of receiving your content. And if you stop, unable to afford creation of your content any longer, don’t worry, someone else will be there to take up the slack.  And I think you know that.

Do you seriously think that without advertising, content creation will go away? Please. It will result in industry upset, perhaps. It will inspire more authentic payment systems, or not. But it won’t go away. Advertising revenue is not a prerequisite for the creation of content.

All these publishers and content creators who complain about the bluntness of the ad blockers, arguing about which interruptive ads should be blocked, are already working far outside true use of the medium, ignoring the basic fact that they stand on stolen ground to begin with. Rather, they seem to be suggesting that there is a way to break the law of the medium in a good way. They remain hopeful that they can remove maybe just a little of your control, and that this should be totally ok with you.

Well, sorry, I appreciate the work many of you do – but you’re wrong. It’s not ok. You have merely gotten away with the violation until now.

Authentic Advertising

Authentic advertising (if you can even call it advertising) requires an advertiser to be part of the very business it’s selling. To promote the product through authentic interaction with the product itself (I’ve written about this before). And/or to create something that is so inordinately valuable and powerful that it will be sought out. To become the very content, services and products that people want.

To create authentic advertising you must embrace being CHOSEN (or ignored) by the King. If you interfere in any way in your King’s journey to suit your own interests – even daring to appear when the King doesn’t wish it – you are a violator. A criminal.

Since you are not allowed to present yourself until invited, authentic advertising is hard. Much harder than the ad industry is accustomed to.  Traditional interruptive ads need only be good enough that users maybe won’t look away after their control has been wrested away. That kind of traditional, interruptive advertising of course is much easier to produce.

Honest-to-god valuable content that people might be willing to pay for, or invest their time and networks into, on the other hand, takes the same effort, risk and expense that developing a successful product does.

Do not confuse this with so-called ‘native advertising’ as it’s been disingenuously referred to, which is little more than a cheap ad aping the appearance of content.

Authentic advertising in interactive is not easy to produce, and success is often the subject of inordinate luck. This means advertisers wishing to dependably game that system have to resort to great expense and extravagance. And precious few are willing to do that.

Conversely, interruptive advertising requires little to no luck, and demands roughly the same work and expense that advertisers are used to applying. The difference is that these advertisers are still, unbeknownst to them, spending wildly. The resource they have been spending rampantly, without qualm, is your goodwill – your willingness to continue to tolerate their violations.

Well advertisers, you’re in a deficit now. A really big, fat overwhelming deficit. Hope you enjoyed the ride, because interruptive advertising has drawn down your accounts and built tremendous debt.

And ad blockers are just the latest means of putting holds on your well-worn credit cards.

An Open Letter to the Creators of the New Muppet Show

Dear Disney and ABC,

Holy crap, how could you assholes so monumentally blow it!?

Whoa… I’m… I’m so sorry, that just came out. I totally meant to intelligently build up to that point.  Sorry, let me start over:

Dear Disney and ABC,

There is precious little joy in our world. Such little magic and wonder. Far too little care-free innocence.

When we connect through media today, our lives are more commonly associated with terrorism, disease, economic meltdowns resulting from greed, natural disasters, child shootings, police brutality, suffering, intolerance, hatred and the ongoing horror and assault of the seedy bottom half of real life.

Hold on, before you write me off as some “the world is going to hell, whatever happened to the values of this great country” kind of person, who worries that video games are perverting our youth or thinks television should be censored or whatever, please know that I enjoy violent video games and I like seeing movies and TV shows that push the limits of acceptability. Things change. Boundaries get crossed. That’s progress, art and evolution. Live and let live. No, I truly don’t give a crap about breaking the rules and pushing the limits of inappropriateness.

But over the last several weeks I’ve discovered that I actually do give a crap – a really big, fat, loving crap, about the Muppets.

A New Muppet Show!

When I heard that you were bringing the Muppet Show back – with a new, more modern take – I remember thinking, “YES! It’s about time!”

The news was reason enough for celebration. I told my wife and some friends and they too shared the very same sentiment. Who wouldn’t? One would have to carry a very cold, dark heart not to feel that way.

But as I have been exposed to your new pre-show clips, teaser, pseudo press releases and marketing, a slow dawning has crept over me. It took a while, but I have begun to feel something unsettling that has taken some effort to define.

In fact my initial elation has now settled into deep disappointment.

Although as of today the show has yet to premiere, I believe (but continue to hope you’ll prove me wrong) that you have mistranslated and misunderstood Henson’s great, iconic legacy. Worse, I believe you may be in the process of undermining it, surely unintentionally. But surely nonetheless.

Until I saw the teaser for the New Muppet Show (UPDATE: the video has been pulled) I confess I took the long-standing values of the Muppets and the reality of their world quite for granted: how they behaved, how they deftly interacted with our real world at arm’s length. They made it seem effortless.

And one of those values, a key attribute, perhaps the most critical of all, is that the Muppets never – ever – fell below THE LINE.

The Line

When I talk about the line, I mean the line above which the Muppets remain arguably pure creatures at heart, connected to the joyful world they came from, and largely driven by the pursuit of friendship and the spreading of happiness. Sounds a bit corny – but in fact isn’t. And the line below which the Muppets would become just another part of real life’s ugly bottom half: inconsistent, undependable, self-centered and cynical.

Naturally, good humor demands breaking boundaries, stepping over some line. And at their strongest, the Muppets were so very good at doing that. The Muppets always broke boundaries. They understood magic – of playing with the medium (whatever medium they were contemplating), of breaking the 4th wall, of being surprisingly self-referential. And in so doing they concocted their own, very recognizable, brand of magic.

And I imagine, aside from Henson’s obvious challenge of inventing, or rather, raising the art form to a new level, it must have taken tremendously hard work and commitment to that vision to maintain that position – above the line.

Henson, Oz and company always stayed above the line. Dependably. They clearly worked very, very hard to find new humor and boundaries to break above the line. Satire and social comment are all possible above the line, of course. Tear-inducing laughter is possible above the line. Pixar, for example, dependably and successfully lives only above the line. Boundaries can be broken above the line. And like it or not, the Muppets made clear that being above the line was a fundamental tenet of the brand.

As I reflect on my feelings upon seeing your new teaser, the pseudo PR and marketing for the new show, I believe the tone with which you are approaching the new Muppet Show, the direction your underlying compass is aimed, is fundamentally inauthentic and careless.

I further argue, that this approach you’ve taken, this direction, required very little effort. You merely chose the easy path.

You’ve just dragged The Muppets way below the line, sacrificing everything that came before, in exchange for a few cheap laughs.

You chose the dark side.

Did you think, for one second, that the temptation to do what you have just done was not an easy temptation all along to the original teams, just as it was for you?

Do you think that living in 2015 somehow suddenly makes such a thing a good idea? Perhaps that only now would we “get” such a joke? Give me a break.

Yes, yes, the Muppet Show was made “for adults”. And quite often the show would venture briefly into comically dark places. But these ventures always fell short of true cynicism, of cold reality. Never would the Muppets cross the line into the seedy underbelly of real life – into genuine cynicism, grime and fear. The Muppets were never cynical; they were never crass. They always reassured us with a deft wink. They were always tethered to a balloon that kept them floating, kept them from descending. And in so doing they defended and insulated us from the bottom half of life. That was their role and very reason for being, after all! They gave us a world that we could escape into. One that wasn’t reality.

Why then have you concluded that being an “adult” today must equate to being cynical, inwardly conflicted and cold?

I have no doubt that as the new show and its tone were being developed, words like “edgy, fresh, and real” were used. Which always, bar none, sounds like a good idea in any board room. Who wants to be the opposite of that? Further, you probably felt the writing of the later Muppet movies and presentations was growing stale, and you must have talked at length about breaking through that staleness with a “modern, fresh take”.

We can debate the quality of much of the writing in later movies and years. Some of it was admittedly a bit tired and occasionally not very good. Not as good as Pixar. Some of the later movies suffered a kind of lack of meaningful stakes for the characters to respond to (one might also argue this was true of Most Wanted). But, and I suppose this is one of my main points, I do not believe, as you must, that the Muppets’ innate lack of cynicism, and consistent distance from the grotesqueness of the bottom half of real life, was the reason the material was not compelling enough, not fresh. That, I believe, would be a misdiagnosis on your part.

What’s worse, by depicting The Muppets in our often tragic, imperfect real world via the reality-TV, documentary style, and imbuing the characters with peculiar new behaviors, inconsistent with their legacy, you have, perhaps unintentionally, established that anything the Muppets may have been before – any purity or innocence they may have shown us in the past – was actually unreal, an illusion, just show. By rewriting their characters and motivations to exist in our world, you have introduced the idea that whatever we thought they were – their original personalities – these were just parts, roles they’d been playing. That only now are we seeing the “real” Muppets, their real lives, behind the scenes, for the first time. The suggestion is that they were actually like this all along; we were just never exposed to what they do off-camera before now.

As a result, you have instantaneously undone and debunked Henson’s entire great legacy.

What a shame.

Yes, Miss Piggy, and others as well, have made occasional appearances in our “real world” for decades. And it was always met with a level of heightened enthusiasm from audiences. It’s pretty transparent that this partly inspired your approach. But it was not so simple. These appearances, the occasional overlaps with our real world, were always a delicate balancing act. Those brief appearances were a magic trick that only worked because their world, the Muppets’ world, still existed somewhere. Piggy was only visiting us, breaking the 4th wall of their world. And we were all in on the joke. Her dips, so precariously close to the line on those occasions, were handled with extreme care and awareness.

And on a superficial level, yes, most of the movies even appear to happen in our world – but they never did. It was always the Muppets’ own world; it only looked a lot like ours. On the Muppet Show and in every movie, human actors always joined the Muppets in their world. Not the other way round.

But by eliminating the existence of the Muppets’ safe, insulating world, as the new show appears to have done, you have scraped them raw. Laid them bare. There is no Muppet world left to poke through or join into. The magic has been surgically removed. Like pulling the skin off a live animal, we now see, with discomfort, the organic muscle, ligaments and bones hidden underneath. And one instantly feels that we were never meant to see any of this.

Fozzie

I think the appearance of Fozzie in your teaser best captured this problem, and as such caused my heart to sink most of all.

So, Fozzie has a sexy human girlfriend. Um… ok. It feels quite out of character and slightly creepy, but alright, I’m sort of with you. Maybe, MAYBE that could work – Miss Piggy was briefly attracted to William Shatner years ago (although that WAS absolutely in character for her).

Howard the Duck and Hot Girlfriend

But then you bring us to the real home of his girlfriend’s disapproving human parents: they reveal their “secret” romance, she calls him “Honey”, Fozzie panics over his pending unemployment, and all that stark reality is run through a way-below-the-line, icky exploration of a kind of cartoon bigotry and a clear intimation of sexuality. Such ideas were somewhat funny in the Howard the Duck comics – a comic world specifically designed to explore these topics – but they feel utterly out of place and even grotesque here, because this is not some random bear. This was, we all thought, our innocent, beloved Fozzie. But it slowly dawns on us that, no, this is not our Fozzie; it’s a strange imposter. Even Fozzie’s voice change, no longer performed by the brilliant Frank Oz, might have passed by without bothering us much, but packaged within a sweaty real world it just makes us feel queasy.

As a result of this scene, we are not so subtly asked to consider Fozzie’s underlying drive for survival and even his reproductive needs. Requisite mental images of the two of them “sleeping together” conjure naturally as a side effect of your scenario. Oh, sorry, you didn’t even think of that, right? Images of the two of them having sex? Never occurred to you? Uh huh, sure, how sick of me to even think that. Yeah, right, convince yourself of that.

Face it, the joke of that scene – the uncomfortable humor – comes from the fact that a real woman is really truly dating a bear puppet. Ha ha ha. The rest of the mental images are just falling dominoes.

Gonzo’s love affair with Camilla the Chicken never had this kind of real-world context and intimated follow-through.

You’re showing us inauthentic things that, speaking as a viewer, we never wanted to see. You’ve pushed deep below the line and opened a big ol’ can of slimy worms: if Fozzie can be unemployed, does he get unemployment checks? Since he’s in our world, well, one must assume that he does. If he can’t afford food, does he go hungry or beg? Does he mooch off his friends? Either way, this is all a kind of undeniable, below-the-line thread that just feels icky. But why stop there? One is almost encouraged to wonder all sorts of things – whether Fozzie gets feces stuck to his fur when he defecates, whether he wears a condom. No, don’t feign surprise at all this. Please see that this is the natural result of pushing below the line as you have. You have broken those boundaries, and opened these thoughts – not us.

Good god, Disney! You have whole buildings full of departments in place devoted to ensuring that Mickey is never caught in compromising positions. How dare you turn around and do this to our dear old friends.

Kermit and Piggy

Really, now Kermit actively dates and is “hopelessly attracted to pigs”… in general? Ugh, too much information, yet again.

Gonzo was insane – and loved chickens. That worked. It never dipped into sexuality, because he was truly an eccentric. But Kermit is sane; he’s our hero, the reasoned one – and therefore this new intimation that Kermit sleeps around is once again moving towards the too-real grotesque. And this focus on his and Piggy’s TMZ break-up – as though they actually ever had a relationship – seems totally misguided.

The garbage can in the foreground says it all.

Yeah, yeah, we get it, if they are “broken up” it gives the characters and narrative something to build to. And it’s tabloidy which plays into the whole theme, and maybe most important of all, serves as free marketing.

Brilliant.

Hey, you’re the writers, but throughout the Henson years, the beauty of their story was that, well, Kermit and Piggy never really had an official relationship to break up over. They flitted around the idea, flirted with it you might say, Piggy always on the offensive, and Kermit never quite connecting. Like so many other things, even their relationship always hovered just above the line. Their so-called relationship was a slippery and elusive concept. Totally non-committal. By design.

But by bumbling into the Muppet universe flailing, mouth-breathing and drooling as you seem to be, you are knocking over these delicate constructions, it seems, without much care for the original rationale or their great benefits.

Gonzo

This failure, like the others, is so obvious and easy to see.

In your teaser you chose, of all characters, Gonzo to criticize the use of “the office interview” format.

It should have been funny. I wanted to chuckle, because the observation was a good one. But I found myself wincing a bit. Don’t you see that you chose perhaps the only character in the entire Muppets main cast, next in line perhaps to Animal, who lacks enough self-awareness to even have such an opinion in the first place? So it just feels strangely “off” somehow. Not to mention that Gonzo’s, well, “GONZO” has been completely denied. Now Gonzo is suddenly just some calm, rational guy? Seriously?

Hello?! Gonzo is many things, but calm and self-aware was never – and I mean like ever – one of them.

Gonzo recites the seven-times table… balancing a piano. Naturally.

This is the guy who overenthusiastically agrees to every insane, wrong idea, no matter how absurd – in the name of art. That’s who he is. You see that, right? The guy who shoots himself from cannons, wrestles a brick, tap dances in oatmeal, recites Shakespeare while hanging from his nose, plays bagpipes from the top of a flagpole, recites poetry while defusing a bomb, hypnotizes himself, and wants to go to Bombay, India to become a movie star because it’s not the “easy way”.

Did you, even for a second, consider that just maybe Gonzo was the completely wrong guy to be self-conscious and introspective enough to care about such a subtle little narrative device? Do you really think he, of all characters, would really care? Piggy, sure; Fozzie, maybe; but freaking Gonzo?! You’ve lost me. The reason that joke wasn’t funnier (and it should have been) is that you chose the wrong guy. You went fully against his long-standing character. And we all felt it. Maybe younger viewers don’t remember enough to care, and maybe most long-time viewers couldn’t quite put their finger on why – but sure enough – it just felt weird. And it’s another example of your apparent inability to defend and shepherd The Muppets at a most basic level.

What Worked

Lest you think I did not appreciate any of your effort: there were a few, what I would call, “authentic, above the line, classic Muppet moments” in the teaser too.

Miss Piggy’s walk, smack, into the glass, leaving a nose print. A brilliant moment.

That creepy “incredibly obscure character” with glasses who talked with his tongue between his teeth was funny as crap.

I’m conflicted on Rowlf wearing the big surgery collar. I laughed authentically at that. And although that doesn’t sit above the line – maybe ON the line – a very careful, self-aware break like that can clearly work, so long as the Muppet universe is still intact.

Work Harder

Look, truth is – the idea that, say, a puppet is dating a real girl, probably has sex, meets her disapproving, real parents, and maybe loses his job and all that, that’s actually really funny.

Ted smokes weed.

Seth MacFarlane’s Ted did that a couple of years ago and it was a good movie. A teddy bear that has sex, smokes weed, swears – it’s totally juvenile and funny as hell. I loved it.

And then there’s “Meet the Feebles”, Peter Jackson’s obscure, disturbing puppet movie, which includes a frog prone to Vietnam War flashbacks, a pornography-directing rat, suicide, adulterous three-ways, alcoholism, drug-running and all sorts of other far-below-the-line topics.

A gun to the head of a character in “Meet the Feebles”

But the Muppets? In one of your bumpers Rowlf talks about being followed by cameras into his bathroom at home. It’s kind of funny, but so now Rowlf uses the can? This is a very slippery slope you’re on.

In the old days these topics could never find their way into the Muppet consciousness. The Muppet world was intentionally disconnected from all that. But now, stripped from their world, these real-life concepts begin to co-mingle. And indeed they will.

And that’s not who the Muppets are. You should have known better.

“Hey – you’re making all this up! We never said they had sex, and we definitely would never show them doing drugs or taking a dump!!” you say.

No? But that is the world you have directed them to inhabit. All these ideas, and a lot more, exist in our real world, and you have placed them in that exact real world. You have provided no buffers. No signals. No insulation from the edges of that very cold reality. Indeed, your every creative decision has amplified it. You have said, “They live with us here, amidst our real-life challenges, filth, and complexity.”

What a monumentally bad call.

If that’s the show you wanted to make, why, oh why didn’t you just work harder, take some risk (e.g. by not relying on the automatic, positive associations we all have for the characters), and instead invent a new set of colorful characters of your own – some who could more naturally play out the decidedly unMuppet-like topics you are shoe-horning our old friends into? I would have actually enjoyed seeing that show, to be honest. I would have tuned in, and I’m sure I would have laughed. Ironically, the connection to the Muppets and every other pillar of innocent puppetry would have been obvious. But at least then you would have been arguably protecting and defending something the world still needs.

We needed the Muppets that Jim Henson left us.

But instead you chose to drag our gentle, rainbow-yearning friends into the same old daily gutter that we were all, ironically, trying to escape. All in trade for a couple of easy, if uncomfortable, laughs and the benefit of a built-in audience.

“Hey, the Muppets were all about cheap laughs.” True, but you did it at the utter expense of their very long and hard-earned legacy. You threw that gentle, magical, innocent legacy under the bus of reality. And that is not where The Muppets’ great and endearing humor belongs.

In doing so you have so far proven yourselves unworthy guardians of these beloved icons.

And from Disney of all places. Hard to imagine.

Well, I’ve made my point ad nauseam. So all I can do now is beg you, please, please be more careful.

These are our dear friends.

And corny as it sounds, the world still needs that rainbow.  Maybe now more than ever.

Messages From The Future: What Happened to Apple Watch

As some of you know by now, I am from the future.  And slightly annoyed to be here.  But anyway, this is what became of Apple Watch.

Truth is, being back in 2015 is such a trip. All this talk about “wearables”. I have to laugh – I remember that! Ugh, it’s so quaint to hear that again. “Wearables”. For the record, in the future no one talks about “wearables” like it’s some classification of device. That’s just you guys coming to grips with the fact that technology is everywhere. It’s in everything, it’s networked, and no, you have no privacy. But that’s a different post.

Today I wanted to let you in on Apple Watch since I guess you’re only now about to see it launch. Weird.

A lot of you are asking, “Why would I use it?”, “What’s the killer app?”, “Why would I pay so much for it?” Yeah, yeah. You do that every time Apple launches a new device – did you realize that? Android users are staring at it dismissively, thinking they would never want one since it probably doesn’t do that much.

Admittedly, what the first Apple Watch did was only a glimpse at its value. A few years after Apple Watch was released it became pretty obvious what it was all about, and yet it still took a decade before absolutely everybody stopped doubting.

Indeed, Apple Watch not only survived a decade – it survived quite a lot longer than that. It outlasted PCs. It outlasted iMac, iPhone and iPad. The Apple Watch line functionally outlasted almost every other strand of Apple device and consumer hardware you are aware of today. It was still going strong when I popped back here, though by then auto-implanted alternatives were becoming pretty common – even though they gave me the willies.

So, what did Apple Watch do that was so useful?

Much to the chagrin of a fair number of iOS app developers in this time, Apple Watch was not a platform that was ideal for, well, running apps. At least not like they run on iPhone and iPad. Sure, people tried. But in short order it became clear that Apple Watch was about being used in conjunction with other devices. If your app did not involve another device or platform, its life was probably short-lived. As a result, many of the best app makers were also developers of apps on other platforms, or device makers. You almost never made an app for Apple Watch alone.

And that was a clue into Apple Watch’s true conquering strategy.

Apple Watch became your key. First and foremost. It was your unique identifying digital self. Your ID for all manner of technical configuration in every other device and context.

When I look back, the clues are all around you today:

Apple Pay, Continuity, Apple ID, iCloud, Apple TV.

These are some of the “existing” components that dovetailed to make Apple Watch what it was.

Ultimately, Apple Watch was not a device for consuming media, or even much in the way of experiences (with the exception of communication). Primarily, Apple Watch identified you, it was the key that unlocked your information and preferences and configured all your other devices and environments.

Secondarily, Apple Watch served as an interface for simple tasks (related to these devices and environments) and as a communicator.

This is not to say that the devices around you became dumb devices (dumb screens, dumb terminals, etc). They were never that. They still carried the lion’s share of computing power required to perform their specialized tasks. They simply sat “un-configured” by default.

My Apple Watch connected to any friendly Apple TV and suddenly all my movies and shows appeared. All my content was in “the cloud” after all. (Btw, we don’t call it “the cloud” in the future, in fact we don’t call that anything, it’s just “storage”.)

Within a few years an iPad or iPhone in your household could switch between users depending on who was using it. Your unique desktop and apps would appear on any workstation you sat down to.  Because it knew it was you.

Apple Pay was just another variation on the theme. Apple Watch validated your identity and gave you the choice of credit card to use.

And I should mention, since there is a flurry of speculation, that yes, Apple Watch worked amazingly well with what you guys are calling the Apple Car (and other cars by the way). The Apple Car was particularly excellent. Your digital environment on wheels. Once identified, all your media was available, your seat, mirrors, mood-lighting, common destinations, and temperature adjusted to you, and of course you locked, unlocked and started your car with your Apple Watch.

There was some other stuff of course – once Apple and others started making things for the home. Thermostats, lighting, door locks and home security. It all responded to and was partly controlled by, your Apple Watch.

This system was ultimately more secure as well. None of your other devices had to hold content or information. It was encrypted in storage (sorry, in “the cloud”), and your Apple Watch merely unlocked it. In this way, none of your other devices became points of vulnerability.

Do you see what I mean? Apple Watch – plus our fingerprint (and later a more convenient biometric ID – another post) – was our digital key. So it was with us literally all the time.

And this is why so many of us were so willing to spend so much on our Apple Watches. It was the most central piece of hardware we owned; a functional part of every other device we used and every modern environment we entered. It was perpetually on display, occupying the familiar, ornamental status of horological watches of the past. But even more important than that, it was the sole material manifestation of our digital selves.  And in the future, let’s just say, our digital world doesn’t get less important. For these reasons it was plainly worthy of inordinate expense and pageantry.

It was so much more than critics today seem able to wrap their heads around. More than a hobbled phone, more than the convenience of ready alerts and messaging. It was your key, your hub; it was you.

There was admittedly an awkward phase where Apple Watch was lovely, if a little bulky. You’re in that phase now – well, or are about to be. But Apple quickly slimmed the device and generated many more models. Once the dimensions were improved and battery life extended, Apple Watch found its sweet spot. One that lasted for many years. I could have spelled that “maaaaaannnny”, which is an actual word in the future, but I believe that’s still bad grammar in this time.

Anyway, having seen it all play out, I think Apple understood this larger system before most. Being the Apple with vision, they got all this at a time when other companies were scrambling around calling goofy, little, one-off technical experiments “wearables” when in reality few of them really were. No one wanted to wear visible gewgaws. It was just a fact. The mere existence of these technologies never sold anyone on wearing some device prominently on our bodies. Not on our clothes (except underwear, for mostly medical reasons), and definitely not on our glasses. Not anywhere on display BUT OUR WRISTS. Oh, and our finger of course… ah, but that’s another post.

Apple Watch is NOT Replacing the Mechanical Watch

My little voice is nothing in the breathless rush of chatter about the Apple Watch. But I keep hearing the same set of sentiments from my friends and I think they have it all wrong.

In various ways, friends are lamenting the loss of the mechanical watch. Others are asking “Why do I need this accessory? What’s the killer app?”

Back in the day people had pocket watches. You’d dig in your pocket, and pull out your pocket watch to tell the time.

Then the wristwatch came along. It was smaller – but so much more convenient. The time was right there at a glance.

The thing people have wrong is that Apple Watch is not replacing the watch. It’s replacing your phone. Or it will, rather. Apple is just hoping it can provide sufficient value through the form factor in the meantime.

“But they call it a watch.”
Yes, it’s called “watch”, but calling the Apple Watch a “watch” is akin to calling the iPhone a phone, and not, say, a pocket computer. The Apple Watch is a wrist computer and will eventually replace your pocket computer. All based on pure convenience.

“But I need a bigger screen!”, friends have then said. Of course you do for some things, and bigger screens will become accessories. And that’s another paradigm shift here – the watch is not the accessory, the screen is.

There is no way this first Apple Watch is the fully expressed big idea. This is just the first step.  Surely the plans for Apple Watch are long.

It’s long been acknowledged that anyone under 30 who wears a mechanical watch today is essentially wearing jewelry, and that they use their phones to tell the time now. For these users, wristwatches are merely quaint objects on par with vinyl LPs and 50s geek glasses. So for a generation of users who have abandoned mechanical watches for “pocket computers”, a wrist computer is so much more convenient, and it does not replace anything already there. For them, sheer convenience is the killer app.

For hipsters and us old farts who still think mechanical watches are beautiful and functional jewelry, yes, we do need to “replace”. And if one is contemplating that switch, there is no killer app. But there are three dozen small, functional features – in addition to telling time – that make the switch quite worthwhile.

Over time, I believe that switch will happen – even for them – as the Apple Watch replaces the iPhone.