The Great Web Design Crisis of 2017

Beginning in 1993, and several times each decade since, the interactive industry’s reigning crop of web creators has faced new challenges that required concerted design responses to overcome. Some of these challenges have been the result of advances in codebases and web standards, changes to hardware, economic shakeouts and new business trends. And with each challenge the industry responded decisively. But this year web design faces a new kind of challenge, one we have never confronted, and one we are failing to overcome. This is not the result of external forces; it is a monster from within, ironically ushered in by the very designers and developers who are subject to it. On the surface we can see only symptoms: an industry-wide homogenization of web design, accompanied by a sharp decline in the innovation of new interactive conventions. And while those critical failures would be bad enough, the underlying cause is complicated and runs much deeper. The real crisis is that our entire state-of-the-art web design methodology, our roles and teams, and even our qualitative values are the product of a misunderstanding.



Narrowing The Cause

Despite now providing access to countless, wide-ranging categories of content, products and services, today’s websites are aesthetically and functionally blending, becoming indistinguishable from one another save for a logo in the topmost banner. More and more, the brands that occupy these sites are losing their identities in a sea of sameness.

Further, in a medium whose defining attribute is interactivity, and with the technology more advanced and more capable of handling challenges than ever, designers seem for the most part to have all but abandoned the pursuit of new, improved interactive models, settling instead into a non-confrontational, follow-the-leader approach to web design.

I reject the claim that the pursuit of theoretically optimal usability releases us from the strategic need to notably differentiate and innovate. There is not one absolute way things should look, communicate and behave on the web any more than there is one absolute in architecture, interior or industrial design. Great design has always included a quotient of subjectivity.

To which one might then swoop in with the oft-quoted web-design hammer, “Yeah but it’s been scientifically proven that users prefer generic, prototypical designs. Generic interfaces have been shown to convert better.”

Yes. That’s true. At least that is until it is measured against something that converts better than a prototypical design, at which point the opposite will have been scientifically proven.

Which raises the question: did you ever stop to wonder how that original prototypical design established itself in users’ minds in the first place?

It exists because its parts were innovated. They began life as disruptions. Non-sequiturs. In essence, risks. And that’s something web designers, and the companies they serve, don’t appear to have the guts to do much of in 2017; instead, they take short-term refuge in the safety of the status quo, confidently back-pedaling into the historic innovations of braver others. Surely you can see that the meme “users prefer generic interfaces” might also be regarded as the convenient mantra of a designer who takes the passive route to short-term profiteering.

Finally, you may be thinking, “Oh come on, any of us could break out of any design trend tomorrow if we so chose. We’ve done it before.”

Actually, we haven’t. This is not merely some aesthetic design trend. It’s not some fashionable phase that will change with taste. The root causes are intertwined with our very state-of-the-art thinking. To solve this problem we must dismantle many of our current best ideas, a contradiction that ends in a defense of the status quo. It would appear that we face systemic incentives not to fix this.



Hot Zone: The Web Design Ecosystem

It bears noting that everything I will describe happens within an ecosystem that is almost perfectly engineered to focus and amplify the problems. For example, near-universal access to computing platforms has enabled more people than ever before in history to lay claim to Designer and Developer roles.
Ironically, this also happens to be a period of “Flat” design, which is characterized by a minimum of affordances and fewer discrete design elements than ever. So these precious few design elements are being endlessly, incrementally adjusted – on a massive, global scale.
The narrow result of this spray of minutiae is then further massively shared on sites like Dribbble, Behance, GitHub, CodePen and dozens of other design and developer communities, which allow for favoriting and sorting of a select few common pieces of work. The top minority of these are in turn re-re-re-referenced ad nauseam and co-opted more widely and freely than ever before.

Sorry, gimme a second while I fill my lungs again…

So of course everything looks the same, for Christ’s sake! This is the systemic equivalent of a perfect storm; a made-to-order, global, make-everything-be-exactly-the-same machine. A worldwide design purification filter. If you wanted any category of design to fall into complete homogeneity, you couldn’t do much better than to set up the exact ecosystem above. Indeed, such a complaint has been voiced before.

There is strength in numbers, confidence in the choices of the mob. And the majority of web designers are falling into that trap.

Despite the obviousness of this, it’s far from the worst offender. The problem cuts a lot deeper.



Patient Zero: UX

And lo, the Internet bubble burst and out of the ashes came the single most valuable invention to hit the medium since its birth: the User Experience Design discipline (UX).

If there had been a general irrational exuberance and a lack of due diligence on the web before the bubble, there was an equally irrational fear of the medium immediately following it. It was the fear-based era of rote “Web 2.0” utilitarianism and functionality. It was still before Apple had proven the value of great design to even the CFOs of the world, when aesthetics were still regarded as largely gratuitous endeavors.


All cures contain side effects.

The UX design discipline evolved out of this period of fear and uncertainty. Following years of unfettered, chaotic experimentation, exploration, and a willingness (and sometimes a likelihood) to fail, UX stepped in, rationally culled and refined the best practices of that era, and established a sensible process whereby optimal performance was achieved through an ongoing cycle of testing, analysis and incremental revision. Today it is the scientific process that dependably leads to incrementally optimized, defensible results.

In some ways, those of us who have worked in this medium, who have lived through decades of recurring skepticism and doubt about the value of design, are thrilled to finally have such an empirical, validating hammer. The relevant impact of this development is that, for the first time in the history of the interactive industry, the web design discipline has been able to systematically validate its own effort: to prove the ROI of design. That’s a heady maturity that is still new in this industry.

So far so good.

But the accountability and new-found reassurance that come from the ongoing, incremental effectiveness of the UX process have led most web teams to go so far as to promote UX designers to project leadership roles. You can understand, of course, why UX designers have been tapped to lead, since the whole of the discipline is effectively accountable for both strategic design and the tracking of success. That makes sense if you want to ensure projects stay focused on addressing the business and end up proving the ROI of the effort. What’s more, the UX discipline itself has further come to oversee the web design and development process at large in countless organizations.

On the other hand, our promotion of that sensible, responsible oversight has resulted in several unexpected, debilitating side effects.


UX Side Effect 1: The Fracturing of Design

One of the principal ways UX has unintentionally undermined innovation is that it has caused a fracture down the middle of the design process; a fracture that starts with the people and their roles.

UX, focusing on the translation from business strategy to web functionality, tends to attract and reward highly analytical people, more the “responsible scientists” than the “non-linear artists” among us, people who are ultimately accountable for the results of a project. These are people who can articulate data; visualize, organize and document information; and manage a process. I’m not suggesting that such a sensibility is the wrong one to manage those rational responsibilities. However, by placing UX in project leadership roles we face an unintended outcome: the “down-breeding” of a second, unfortunate sub-species of designer whose sole focus is merely the UX design leftovers. The scraps. Specifically, aesthetics.

What pesky stuff, that.

The area of focus of this aesthetically-sensitive design species is no longer on the overall emotional, expressive, dramatic experience of Interactive Theater, but on the appearance of the graphic layer alone. As such, these designers have largely been relegated to colorers, or house painters, if you will.

In an attempt to draw a clear distinction between them, we call this secondary role a UI (user interface) Designer. In reality, what the majority of today’s UI Designers focus on is not really the whole of the UI but the “GUI” (graphical user interface). And even the title “GUI Designer” may be too sweeping, since today’s UX Lead has already, presumably, decided exactly what the interface will do, what components it includes, and generally how it will move and behave. UI Designers do not so much design the interface as merely design how it looks.

Let’s take a moment here – because this is huge.


When we innocently split (sorry, “specialized”) UX and UI design, we unintentionally peeled the whole of great design in two. More importantly, we created a stark imbalance of power and priority between design’s yin and yang, which should always be in equal balance if truly great interactive design is the intent. We removed from the origination of interactive theater the influence of the unexpected, emotional, improvisational performer’s sensibilities that come from an artist’s muse and mindset. Which is too bad, because those are the things that disrupt, that result in innovation, and that delight users. The things users would otherwise remember.

So is it really any wonder that 90% of the web sites you visit today all look the same? That the apparent “ideal” aesthetic approach is the ubiquitously coveted Flat design, which is itself merely a direct extension of UX’s own wireframes, the flattest design of all? That they all share some variation of the same tired parallax scrolling effect, one that pre-dates widespread UX leadership? I’ve been in rooms where the question was asked, “Where can I find other effects and transitions?”

Me: (Stares. Blinks) “What, seriously? Well… uh, I don’t mean to be a dick, but that’s what your imagination is for. Invent it! Good lord, this isn’t a #%$@ing IKEA bookshelf!”

Today, most sites lack creative invention and inspiration. Oh sure, we love to point out how this page load animation is timed slightly differently than that page load animation, or the scrolling effect is nuanced in some way, but it’s all the same. And part of the reason is that we have removed the reliably surprising imagination, the randomizing, theatrical showman, the disruptive artful inspiration from the UX process. We have removed the chaos and unpredictability of art and inspiration.

Look, I realize that every UX-centric digital agency on Earth has some line about how their UX designers are storytellers. Which I think shows that at some level they must understand how important this is. But God love ’em, today’s predominant breed of UX designer is really a figurative storyteller, and not much of a showman. And I don’t mean that disparagingly; the discipline simply doesn’t naturally attract those people.

Take a summer blockbuster movie; that’s also a user experience. Sure, it’s low on user and high on experience, but such theatrics and production value merely sit at one distant end of the UX spectrum. What about theme parks? Who on your user experience team has even the slightest bit of proven experience in storytelling at that level? That’s an extreme, OK, but the reality is that there is huge potential on the web somewhere between that fully immersive, drama-led linear story and a registration form. Who on your UX team is responsible for spanning that? For even thinking about it? For imagining the magic in the middle? For finding new ways to move from a wireframe toward the Theater of Interactive? For making the experience surprise and delight us? How often are your projects led or directed by the performer’s mindset?

Since most of the web looks and behaves the same today, like the answer to a graphic design problem, most of you should have answered, “no one, and rarely”.

At this point there is always that one person who feels compelled to point at a generation of really crappy, Flash-based ad-ware from the early 2000s – the antithesis of “usable”, the epitome of unnecessarily dense interactive gibberish – as though that proves great interactive work can’t exist. We agree: much of that wasn’t any good. But neither is this future of timid, mincing goals.

Our overzealous response to Flash’s demise was to throw the baby out with the bath water, to largely abandon the pursuit of disruptive new interactive models. I guess it was understandable; UX comes to the table with facts and empirical data, whereas UI designers tend to come with playful, colorful, unproven imaginings. We’ve seen what happens when the artist’s mindset has too much control; the stereotypical result is pretty, but doesn’t work. Looking at such a comparison, one can easily argue that it’s not even a fair fight. You might think, “Of course the UI designers are secondary to UX in the process.”

But you’d be wrong. The UI Design role (perhaps not the graphic-design-only purists, but this archetypal imaginative soul) has been intentionally positioned that way specifically to keep the unpredictable, chaotic forces of self-expression and imagination, for which there are no best practices, from running roughshod over a methodical, user-centered, prototypical approach.

In fact, fostering imagination should be half the job of a project leader who works with tools that are still being collectively learned and understood. But the data-seeking mindset of UX resists this, and as a result limits imagination. It locks one into what one knows. It breeds fear as one contemplates disruption. It magnetically holds one near to what’s been done before.


UX Side Effect 2: The Failure of Experts

In 2010, MIT professor Laura Schulz, graduate students Elizabeth Bonawitz and Patrick Shafto, and others conducted a study with 4-year-olds that showed how instruction limits curiosity and new discoveries. In the study, the children were split into two groups, and each group was introduced to a new toy in a different way.

Photo: Patrick Gillooly


In the first group, the experimenter presented the toy in an accidental, inexpert way, “I just found this toy!” and pulled out one of its tubes as if by accident, making it squeak. She acted surprised (“Whoa!”) and pulled the tube a second time to make it squeak again.

With the second group the experimenter presented the toy with authority and expert instruction. “I’m going to show you how this new toy works.” And showed only the function of the one squeaking tube without mentioning the toy’s other hidden actions.

In reality, the toy could do quite a lot more than the children were shown in either group.

In the end, the children who were shown the toy in the more open, accidental, non-expert way were more curious and discovered all the hidden aspects of the toy, whereas the children who were expertly instructed in the use of the toy did not; they discovered only the action they were specifically taught.

Yeah, these were just 4-year-olds, but don’t discount the relevance of this study. This story plays out in some form on every interactive design team that puts UX in the lead role.

UX-centric project leaders are experts in historic best practices and proven usability patterns; they explain “what works”, which is then backed by data, and this practice results in measurable, incremental ROI: or, as most business owners would call it, “success”. But as with most young companies that begin to turn a profit, such success tends to shift team focus in subtle but profound ways; attention turns away from innovation, toward optimization.

And this trend shows no signs of stopping. There’s been a splooge of new web design tools, such as Adobe’s brand-new “Experience Design CC”, which are the design-tool equivalent of all this incremental, best-practice, templatized thinking fed back to us as next-generation design platforms. Platforms in which rather significant assumptions have already been made for you about what it is you will be creating.


In their attempt to make these tools, and thus the UX job, easier, their makers have tried to dramatically close the gap between raw code (hard work and complete control) and whatever it is you have in your head. Said another way, these tools encourage a limited range of ideas.

On the other hand, an app like Adobe’s Photoshop is, for an image creator or manipulator, a very powerful tool that gives one complete, atomic control over the 2D image. But it is therefore also quite hard to learn and master.

And I think that may be one of the trade-offs with tools like these. This popular, UX-ified state we are in has reduced the complexity and possibilities such that “easier-to-use” tools like these can exist at all.

For that matter, web site template providers have had such ample opportunity to observe and reproduce all of your favorite, reused, UX-led design trends that they can offer them back to you in fully designed, pre-fab form. Honestly, if you have no intention of innovating any new user experiences today, their designs are quite good.

All these apps and templates are merely serving as a different kind of expert: limitation, wrapped in the subtext of “what can and should be done”.

There can be no question that starting projects on the “expert’s” platform, while incrementally beneficial in the short term, staunchly limits site creators’ understanding, imagination and exploration of new models and techniques.

No question.


UX Side Effect 3: The Separation of Content and Interface

Form follows function. That’s the fundamental mantra of great design. But what exactly is being designed on the web? UX designers say they design “user experiences”, so what constitutes a user’s experience?

If you bother to read the definition on Wikipedia someday, be sure to have a cup of coffee handy. Man, what a yawner. Supposedly the “first requirement of a great user experience” is to meet the needs of the product “without fuss and bother”.

Wait, seriously? That’s a “great” user experience, is it? Literally, it says that. Hey while we’re at it – maybe it should also be without any old tunafish cans and sadness.

Ok, look, whoever wrote this is both part, and indicative of the problem. They have totally nailed the literal “user” part, but they’ve left this otherwise really intriguing notion of an “experience” off the table.

So what is it really? Let’s cut to the chase. An “experience” must largely be defined by what the user does, and in turn what the user does on the web is enabled in large part through an interface.

Shouldn’t the interface itself therefore also be regarded partly as content of the experience? Well, yes, of course it should. Because it is.

If this initially innocent thought ruffles your feathers a bit, if it seems to crack some core belief, hold on tight; I’m not done. Because with the interface being part of the content, the line between the form and the function of an “experience” naturally blurs to an uncommon degree. Which is an inconvenient truth that today’s UX designers, who overwhelmingly prefer to fully segregate interface and content, have left generally unacknowledged.

For years most designers have uncritically regarded their specific web interfaces, their chrome, as being more or less separate from whatever content is being served. Almost as a kind of wrapper, packaging design, or container, rather than an extension of the content itself. Indeed, that’s why flat design, as a universal, distinct, separate design trend, independent of any site content, can exist at all. Flat design can only exist as a solution if you draw a hard line between interface and content. If you regard content as something that should be placed in a template. Period.


In fact, the persistence of most of the best practices we share today, the underlying support tools like content management systems and authoring apps, and even design template sites are all products of the same segregation-thinking. They can only exist if content and interface are broken apart.

On the other hand, we’ve all had the very rare experience of visiting a website where something different happened. Where the site creators bucked the segregation of content and interface, and clearly married those components from the beginning. They developed some unique expression of the content through interactivity as opposed to relying on content management systems and best-practice thinking. And when it’s done well (of course all things can conversely be done poorly) you feel it. You gasp. You say “Hey you guys, check this out”. It feels magical. It wins awards. It feels true to this medium. It says more than any templated, flat-designed interface, images and body copy ever could.

Why did this happen? Why have so many chosen segregation? Why has separating interface and content become the norm? Well, if you have only been working in the industry as long as UX has been a thing, you might imagine that it’s because this is the right way, that we are headed down the true and righteous path. You might look around at all the smart people who are confidently on this path and imagine that it’s because they are in pursuit of some empirically ideal interface; some perfect range of optimized graphic affordances and interaction principles, and that continually refining toward that ideal is our purpose. Indeed if that were true, that might even validate the idea that interface and content are meant to be separate.

But that would be woefully incorrect.

Ultimately, the reason we chose to separate interface and content (consciously or unconsciously) is that the nature of true experience design is just… well, it’s just really hard. Sorry, that’s it. Too hard, too expensive, too much time. The truthful, authentic approach to user experience design is harder than whatever it is we have all agreed to call “user experience design” today. So we have rather thrown up our hands and admitted a kind of meta-failure. We gave up. Our continued avoidance of this truth is a sign of our willingness to admit defeat right from the get-go.

So everything we design after that, all the busyness and importance, is an openly pale compromise. At least we know that now.

As if that wasn’t enough… are you sitting down? I hope so, because I am about to say something further that will cause the remaining half of you to grit your teeth, particularly because I don’t have comments enabled on this site to serve as an outlet.

User experience design is not merely the strategic design of a site’s architecture, flow and interface – true user experience design is also interactive content creation. As such, form and function become the same thing.

Yeow! OK, that smarts. I’m sorry, I know this is painful and unclear. Just so you know, it is for me too. Moving on.

When designing a television, there is a clear demarcation between the interface of the television and the content it presents. The TV hardware has one user experience, and the content another. And in such a design world there is clearly no expectation that the content creators are in any way responsible for designing the TV interface, and vice versa.

The same can be said when designing any other hardware device for which we cannot know precisely what content will be presented. Or an OS, which must serve all manner of possible experiences. We must work at arm’s length from the content. Cognizant and supportive, but ultimately behind the wall. In the development of a browser, the classic rules of form and function still come into play. There is a clear delineation between interface and content that gives a designer focus.

But when you step inside a browser window those tenets of design blur. Here, an idealized site creator, a true UX designer, would have complete control over the content to be presented. Strategically speaking, that’s their job, after all. As such, drawing a line between content and interface, or not, suddenly becomes a matter of choice, not requirement. The corollary is a designer of a television who is also expected to fill it with content, and must therefore make movies.

Can we apply the tenets of design to the creation of a feature film? What is the “function” of a movie that comes before the “form”? Is it to tell a story? To entertain? To market products? To part consumers from their money? To make an audience feel something?

To be honest, I think that probably depends on who you ask. But it’s safe to say that a feature film is less a design problem and more an art problem. The same condition exists for user experience design on the web. In fact it’s a bit more complicated, because the medium and its surrounding business have been so strongly influenced, for so long, by people who come from design. They’ve had their say. They’ve pushed web design into their comfort zone and reduced “experience” to a myopic set of design elements and interactive conventions. And they have relegated the unpredictable sensibility of improvisation and showmanship down the food chain, to an out-of-the-way place that is largely gratuitous.

A movie produced that way would probably turn out to be a middling documentary, a news report, or a corporate training video. It probably wouldn’t be breaking any box office records.

What about those aspects of an experience design which really, truly need to be segregated from the content? Well, you might ask why that’s even part of your site; I mean, you might argue that any interface elements which really are unrelated to the content belong somewhere else.


Virologist studies the hamburger icon

Take the ubiquitous “Hamburger” icon, for example. Since it appears to play a role on the vast majority of new sites, one could safely assert that it’s clearly not specific to any brand or strategy. Ubiquitous and non-specific, the hamburger icon, one could argue, might even bubble up into the browser chrome. I mean, why not? We have a back button up there, and yet your forefathers used to design back buttons into their websites. Theoretically, if prototypical sites are so great, and if you take the generic user experience trend to heart, every website should have content to fill a browser-based hamburger menu. It would free up space in the window and give users the prototypical experience we are told they crave. It’s a win-win, right?

Ok, I know it has issues, but let’s pretend you think this isn’t as crazy as it sounds. I hope you can see that as we extend the idea of prototypical function over form, we rather quickly get into a situation where chrome owns your “UX” and you merely fill it with pre-determined content.

And hopefully you see that we are basically causing that today.



External Stimuli: The Fall of Flash and the Rise of Multitouch & Apps

But why haven’t we recovered in our quest to innovate and differentiate on the web? Where did our aspirations go? Why do we even accept such a simplistic view of what’s possible on the web?

Both the popular rise of UX and the en-masse surrender of interactive innovation by web designers took hold around 2007, roughly following the unveiling of multitouch on the iPhone.

Was that timing just a coincidence? I don’t think so; I think they were directly related.

After more than a decade of interactive experimentation with the often kludgy, challenged tools available to web designers (Shockwave, Flash, etc.), multitouch arrived via the iPhone.

Holy crap.

That was true interactivity! Our fingers and hands mucking through the very digital mud? Pinch to zoom, seriously?! Humanity’s continuum toward the Holodeck suddenly snapped into sharper focus. Like the sun overpowering a flashlight, one could see nothing else. It wasn’t even close.

It was at that moment that anyone who was truly passionate about the design of interactive experiences, and about innovating new interactive models on the web, redirected some key part of their attention to the implications of this new domain. Adobe’s Flash, the in-browser tool which until then had been the de facto authoring tool for rich interactive innovation, seemed, in conjunction with a PC mouse, almost immediately antiquated. Next to multitouch, the resolution of interactivity on the web was pathetic.

And I believe a sufficiently large swath of interactivists, at that moment, had a (possibly subconscious) epiphany:

“Maybe,” the thinking went, “Maybe the browser isn’t really an optimal platform for innovating immersive, new interactive experiences. Maybe we have been over-shooting all this time, and the browser is already kind of figured out after all. It’s certainly boring by comparison. Maybe interactive innovation is rather the domain of countless new technical platforms yet to come. Maybe we should just re-approach the browser and refocus our work there towards the simple, obvious things that we already know it does well. Just the basics.”

You can sympathize with this thread. I mean, focusing on the known strengths of any technology, including the web, is sensible, and feels like a more mature, nuanced approach to the medium. And yet that simple recalibration, so universally adopted, sucked the oxygen out of web design, squelching the drive and appetite for explosive innovation in our browser-based experience designs.

Some of you will read this and still believe that the browser is not a reasonable platform for aggressive interactive innovation today. That we “know” what web sites should be now – better than ever before.

Yes, it’s easy to fall into that trap. I was with you as recently as a year ago. But there is one thought that obliterates that opinion for me.

Let’s play out a hypothetical.

What If Technology Froze?

Let’s imagine that our technical landscape froze. It doesn’t matter when. Say, today. It just stopped advancing. The development tools you use today would stay exactly the same for the next 20 years: no new versions, no new web standards, no bug fixes or updates, no faster processors, just these exact tools with these exact attributes, flaws and all. What do you suppose would happen?

Would the way we choose to wield those tools remain static and unchanging as well? Would web site design and the experiences we create for the next 20 years stay exactly the same as they are today?

Of course not! Our knowledge of those frozen tools’ true capabilities would explode. We would continue to discover capabilities and nuances that we have simply not had the time or wherewithal to discover today. We would fully explore everything these tools could do, every aspect. We would see massive advances in our collective understanding and use of that static technical state. We would see a renaissance in our collective skills, interactive vocabulary and creative concepts. A new language. We would get vastly more sophisticated and better at creating, understanding and using content within that static medium. In short, we would master those tools.

Paint does not have to advance for a painter to develop the skills of a master.

The tools wouldn’t change – we would.

What that suggests to me is that such depth of innovation has always been possible, and remains openly possible today, but, intentionally or not, we don’t pursue it. It has always been possible to deepen our understanding of any given technical state and to innovate aggressive new experiences; we just don’t. We simply choose not to set the bar high enough.


Surely, one can argue that this is partly because technology changes too fast for creators to develop mastery. It advances out from under us. Indeed, most of us can no more appreciate our true, collective lack of insight and skill than a cave painter could have imagined the insight and skill required to paint the Mona Lisa.

Rather than bother trying to master the tools, many of us now patiently rely on the novelty of new technical tricks and advancements to fill the void. We have off-loaded the responsibility for creative innovation onto the developers of the platform.

We wait around for the medium – to master us.

There are ways to fight this condition. There always have been. Ways to reach further, despite technical change, and get closer to the ideal of mastery and innovation than the vast majority of web design teams do. A very small handful of great teams know those tricks and techniques and achieve it today, such as North Kingdom in Sweden. But it starts with acknowledging that there is more. There are better, bigger ideas than those which the best practices and incrementalism of UX have delivered us so far.

It means you must look at the product of your design effort today and see the potential to have innovated much further.

You have to believe, once more, that your medium, exactly as it is, can do much more than you have bothered to discover.



The Tragic Death of The Masters

Those of us who created interactive work in the early 90s were long ago forced to come to peace with this. Those of you who created Flash-based projects for many years were probably forced to face it only recently. And as sure as you are reading this, those of you who have been working in the medium for less than a decade will unfortunately face it soon enough.

That we live in the Digital Dark Ages.

The one thing most of us will agree on is that technology changes. But as our industry incessantly celebrates the new, we less often notice what fades from view as a result. Yet fade away is exactly what our work does. Unlike virtually every other preservable medium, the interactive work we create erodes from the view of users of future technologies. This is not just the result of changing software, versions and standards, but also of changing hardware, input devices and platforms, and of the unwieldy nature of maintaining functionality across the exponentially vast body of work being produced. A basic change in hardware can obliterate a decade’s worth of work or more.

A century from now, your grandchildren who bother to look back at your career will see little to nothing, wonder fleetingly whether you ever created anything of value, and then assume “probably not”, since absolutely nothing you created will exist. Lost forever, as though it were merely a live performance. And then they will design some experience themselves, using future tools, that attempts to produce exactly the same emotional payoff or dramatic moment that you are producing today. Maybe they won’t do it as well as you did. But no one will be the wiser.

Unlike any other recorded, preservable medium such as literature, audio recording, film & television, interactive work is the first where the great masters and masterworks that came before you disappear. Have disappeared. Vanished for being fully dependent on a temporary technical state. Consider what literature, music, film & TV would be like today if every book, song, movie and show ever recorded vanished forever 8–10 years after its creation. If you’d never seen Charlie Chaplin, Buster Keaton, David Lean or, hell, even Star Wars. If the only content you could experience and reference today was all post-2008? Think about that. Across every medium. Because although we chronically celebrate the “new” in the interactive space, it’s the previous work that first addressed all the fundamental challenges interactive designers face today. And often without the burden of being instructed by experts in what can’t be done.

There are countless versions of the Internet (and earlier platforms: CD-ROMs, floppies, consoles, etc.) full of astounding interactive experiences, conventions and designs: beautiful, delightful work that you have lost the ability to reference and learn from. Even now you probably mistakenly assume whatever work existed then wasn’t all that good, because “we’ve moved so far past it; our tools are so much more advanced now”.

But if you imagine this, you are mistaking platform and interactive resolution for experiences, content, emotion, and behavior.

Unfortunately, many of you are relegated to stumbling blindly into re-conceiving and rediscovering those same been-done inventions, stories and discoveries all over again, as if for the first time. And sometimes you won’t.

The Digital Dark Age has cut today’s designers and developers off from the vital inventions and experiments of the medium’s previous masters; rich resources of experimentation and invention which might have challenged the gross commonality and safe standardization we see today. Might have allowed today’s designers to go farther. But their absence has instead relegated today’s industry to being perpetually inexperienced.



Taking Control & Crushing the Crisis

We can fix this. It’s huge and has massive momentum, and it will require us to humbly accept our recent errors, but we can fix this.

Approaching project leadership differently than we do today is going to be the best lever we have for effecting this change. We need to start with the right people and priorities in leadership positions.

Individually, each of us can start by acknowledging less certainty in the things we currently take for granted. To once again accept the possibility that aspects of our process and beliefs, bleeding-edge though they may seem, are woefully incomplete. I realize that back-pedalling from progress feels counter-intuitive in a medium that is still being understood – where we’ve only just begun to feel like we “get it”.

Where so many are still only now climbing onto the UX wagon.

But this medium has always been destined to evolve past the domain of GUIs, toward “Interactive Theater”. Consider ongoing advances in multitouch, AI, and VR, among others. More and more, interactive media is going to be truly defined by experiences, real ones, not the scroll-and-click brochures we see today.

User Experience will be designed to generate improvisational drama and emotion, delight and magic: in short, a show. Leading such a project, you’ll need usability-knowledgeable performers and showmen, not analysts and best practitioners.

Although this shift in user experience is more obvious when you project out to such an advanced technical future, it is still starkly relevant today.

In no way am I saying that we should abandon the best practices we have developed. But I am saying that it’s patently wrong for the same psychographic that inordinately defends those best practices to lead an interactive project.

Some of you are UI Designers who truly love exploring the aesthetics of an interface. I get that. And thanks to the present state of the medium that will remain a valid role a bit longer. But to build and maintain relevance in your career you must move past the focus on mere graphic design that UX has relegated you to. You must be thinking about motion, behavior and improvisation with your users. And needless to say, you must resist drawing your inspiration from so narrow a pool of peers.

There was a time before Bobby McFerrin developed his unique vocal style, the one that made his sound instantly recognizable in the early 80s. In interviews he described developing that style by cutting himself off from listening to his peers for 2 years, specifically to avoid sounding like everyone else. He used this time to explore new, original ideas. He then spent 4 more years practicing and perfecting.

Perhaps for you, developing an original approach won’t take 2 years of isolation; perhaps you can find your personal, inspirational vein without retreating from the industry altogether. But part of being a strong designer is tapping into a unique creative vein that inspires you. Not taking your inspiration from the work that has been “Most Appreciated”, or “Most Viewed”, but from a thread of your own. It takes guts to put yourself out there. To accept the professional risk of stepping away from the work of the popular kids. To avoid clicking on the “Upcoming Design Trends of 2017” prediction articles, and to do something appropriately true to you. If you aren’t on such a path, then you need to get on it; either that or take satisfaction in being regarded as a craftsman. Good craft is valid, but distinct from design.

So Who Leads?

  • When designers lead projects you get solutions that look pretty but don’t work.
  • When technologists lead projects you get solutions that function but look like spreadsheets.

And we must now add a new, equally unfair generalization to the list:

  • When UX leads projects, you get solutions that convert better but are just like everyone else’s.

The ideal interactive experience takes equal insight and invention from all of these very disparate specializations: creative, performance and design, strategy, best practice, analysis, and numerous technologies.

That’s why we must rather encourage much more systemic debate between specializations. This is a team alchemy issue.

In particular, we must try to undo the damage we did to the design community when we began deprioritizing the “artist’s muse” in the origination of user experience. The fractured sensibilities of strategic UX and aesthetic UI, as we define them today, must be re-merged, either through skill development or team structure.

We then must empower that sensibility. We must reprioritize expressiveness, artistic exploration, play and the chaos of inventiveness as critical aspects of the UX Design process. Equal in importance to the logical, rational aspects that exist today.

Project planning and scheduling further need to accommodate this change in process by building space for experimentation and the inevitable stages of failure and problem solving.

I believe that the best, most advanced interactive teams will address this leadership issue in one of three ways:

  1. They will assign project leadership to a new role, a Director, hierarchically above what stands in for UX today. UX will take on an advisory role. This director will defend the dramatic experience, like a film director who leads many technical specialists across wide ranging fields, to a creative end.
  2. They will significantly change the job requirements of UX leadership to favor people who have a strong showman’s sensibility, an artist’s openness to new ideas, a true innovator. A playful romantic. In addition to managing strategic, functional, best-practice authorities.
  3. They will remove the singular leader altogether and move to a multi-disciplinary leadership model, raising the status and empowerment of each leader of the multidisciplinary teams. This is hard, and risks committee thinking, but in a team lacking ego, whose focus is laser-set on ground breaking experiences, it absolutely works.

Many combinations of these approaches would work. Each has risks and challenges. But if done well, each also increases the likelihood that we will see more differentiation and innovation than we do today. Hopefully we will see more magic.



I’m sure by now it’s occurred to you that I’m a total hypocrite. That here I’ve produced this stupidly long blog post that might as well have been a book, and it’s everything I have railed against.

Ok, I admit it, you’re dead right; ideally I would have expressed these ideas interactively. At least I would have supplemented my ideas that way. What can I say? I did this alone. I don’t code much myself, so I have to rely on the tools that I do have sufficient command of.

But at least I’ll admit here in front of you that I have totally failed the medium because of that. That I have produced a piece of totally inauthentic work. I only hope you can see past that. And I truly wish the rest of the industry would similarly admit the same failure.

Tomorrow we’re all going to go to work, and we’re going to be surrounded by a wide range of highly talented people who don’t yet think this way. Who don’t yet see what they sorely lack. People who are comfortable with the current definitions and trends, but who collectively have all the necessary skills to conquer this problem.

Many will not see that they are more than members of the design community, but that they are also on a stage, in a theater, in front of an audience. Putting on a nicely typeset, but very dull, show.

The idea that their project might be off target will not be met with ease. The whole system is aligned right now to squelch your innovative inclinations. But I encourage you to evangelize the need to break the stagnation, to find a new definition for UX and a new team structure to support it. At the very least, be the person who questions the status quo and is confident that we can and should invent again.

And look, changing your mindset to favor innovation will help you in your career. The medium is going to change as technology and the world change, as it has changed countless times already since the early 90s: the birth and death of browsers, the birth and death of Shockwave and Flash, the propagation of social interconnection, the fragmenting of screens, the ubiquity of mobile, the Internet bubble and ad-blockers. The only certainty you have is that the next fundamental changes will be so profound that they’ll catch you off guard. And if you have merely specialized in the current state, if your days are filled with the processes and skills that serve this narrow, off-target, flat-graphic, UX-best-practice-based website, circa 2016, then there is more than a good chance that, when such change comes, you’ll have precious few applicable skills to bring to bear.

Focus on the big picture. Study and understand the future ideal of this medium, and work back from there.

This medium is beautiful and powerful. It carries more latent potential than any other medium before it. And yet we barely understand it. We can barely speak its language. We even lack respect for our own inadequacy. So as the young, best and brightest enter our industry to be routinely trained against such misaligned conventions, led to believe in our self-satisfied celebration of “design”, all while the true reason for being of this medium is so weakly embraced, it breaks my heart.

In those very rare instances that I discover someone who actually does get the medium, who bucks all these constitutionally weak, generic trends and produces a delightful, innovative, true-use piece of work, that effectively marries strategy and functionality, imagination and performance with a fine sense of taste and craft, it gives me goose bumps and makes me want to yell and fist pump in their direction.

This medium can support magic. We only need to try.




Special thanks to Tom Knowles, Gisela Fama, and Marcus Ivarsson for some truly thought-provoking, spirited debates.