UX Part I: Wicked Problem
We are at a crossroads in the design of User Experience (UX) today:
A wicked problem is a problem that is difficult or impossible to solve because of incomplete, contradictory, and changing requirements that are often difficult to recognize. The use of the term “wicked” here has come to denote resistance to resolution, rather than evil. Wikipedia
Fifty years ago Gordon Moore published an article in Electronics magazine predicting that the number of transistors on an integrated circuit would double over a fixed period of time, forever. The rate works out to about every year and a half; put another way, IT hardware capacity doubles approximately every 18 months. This rate has held essentially without interruption from the mid-1960s to the present day.
Who is Gordon Moore?
Gordon Earle Moore is an American businessman, co-founder and Chairman Emeritus of Intel Corporation, and the author of Moore’s law. As of January 2015, his net worth is $6.7 billion. Wikipedia
Due to Moore’s Law, when comparing computers from fifty years ago (1966) to those of 2016, we can observe that within the same physical space we now have:
- Chips with 3,500 times the performance;
- 90,000 times the energy efficiency; and a
- Price per transistor that has fallen 60,000 times (to 1/60,000)
For comparison, if cars had progressed at the same technological rate, we would now be driving cars that could:
- Go 300,000 miles per hour
- Get 2,000,000 miles per gallon of fuel, and that cost about
- 4 cents apiece (per car, that is)
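The arithmetic behind these comparisons is simple compounding. A minimal sketch (illustrative only; the performance and efficiency figures cited above are observed engineering metrics, which grow more slowly than the raw transistor-count factor):

```python
def moores_law_factor(years, doubling_period_years=1.5):
    """Compound capacity growth after `years`, doubling every 18 months."""
    return 2 ** (years / doubling_period_years)

# Fifty years of doubling: 50 / 1.5 ≈ 33 doublings,
# a raw compound factor of roughly ten billion.
fifty_year_factor = moores_law_factor(50)
```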
Xerox PARC invents the modern interface in 1972
Meanwhile we are operating computers with interfaces (UX) designed on plans drawn up around the same time that Moore made his famous prediction. Most people today are unaware that the graphical user interface we still use was developed almost 45 years ago at PARC, the Palo Alto Research Center, founded in 1970 as a research arm of the Xerox Corporation. PARC’s critical early-1970s contributions to computer science included:
- Development of the laser printer
- Ethernet (the local networking standard that helped build today’s Internet)
- Various email delivery systems
- Graphical user interface, or GUI (pronounced “gooey”)
- The first modern version of the computer mouse.
The Xerox Alto was the first computer designed from the start to support an operating system based on a graphical user interface (GUI) using the desktop metaphor. The first machines were introduced in March 1973, a decade before mass-market GUI machines arrived.
In 1979, Steve Jobs arranged a deal in which Apple Computer would license the concepts from Xerox in exchange for Xerox being able to purchase stock options in Apple. After two famous visits to see the Alto, Apple engineers used the concepts to introduce the Apple Lisa and Macintosh systems, sparking the GUI revolution that took hold during the 1980s.
Paradox: 300,000 miles per hour in a Model T Ford
So, in other words, we are now operating computers that can go 300,000 miles per hour, but we are still using a “Model T” dashboard. The Ford Model T (1908–1927) was the vehicle that brought mass-manufactured automobiles to the masses.
Starting a Model T was nothing like starting today’s vehicles: many of its essential controls, like the gas valve, spark advance and choke, do not even exist on today’s dashboards. Meanwhile, controls for today’s many standard features, like windshield wipers, opening windows, door locks and climate control (not to mention power steering, brakes or automatic transmission), were non-existent because the features had yet to be invented (Model T dashboard illustration follows).
Two generations of Americans knew more about the Ford coil than about the clitoris, about the planetary system of gears than the solar system of stars. With the Model T, part of the concept of private property disappeared. Pliers ceased to be privately owned and a tire iron belonged to the last man who had picked it up. Most of the babies of the period were conceived in Model T Fords and not a few were born in them. The theory of the Anglo Saxon home became so warped that it never quite recovered. –John Steinbeck, Cannery Row
Wow! Sound familiar when compared to today’s IT? The Model T (1908 – 1927) had its day—in fact almost two decades—bringing cars to the masses at cheaper and cheaper prices. It just wasn’t that sophisticated. We could say the same of today’s popular smart phones, browsers, operating systems and social apps.
You may be sitting back as you read this, thinking, ‘Not that sophisticated? What is he talking about!?’ The most sophisticated parts of today’s User Experience are arguably the voice-recognition bits that are favored advertising fodder for iPhone and Android. But the truth is these programs are not that sophisticated. They don’t even work if the Internet is not connected. That is because they work through brute force. How? Using ‘big data’, your voice patterns are compared with millions of others, producing a statistically likely interpretation of the words being used. But the device still doesn’t understand your intention. It’s like talking to an alien who has a dictionary of all our words but no cultural context or understanding.
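The brute-force matching described above can be reduced to a toy sketch (entirely hypothetical and simplified to two-number ‘voice patterns’; real systems use far richer acoustic models): the device returns the statistically closest stored transcript, with no model of what the speaker means.

```python
def best_guess(pattern, corpus):
    """Return the transcript whose stored acoustic pattern is nearest the input.
    Pure pattern matching: no model of the speaker's intent."""
    def distance(a, b):
        # squared Euclidean distance between two equal-length patterns
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(corpus, key=lambda item: distance(pattern, item[0]))[1]

# Millions of stored (pattern, word) pairs stand in here for 'big data'.
corpus = [((0.10, 0.90), "play"), ((0.80, 0.20), "stop"), ((0.15, 0.85), "play")]
guess = best_guess((0.12, 0.88), corpus)  # a statistically likely word, no grasp of intent
```

The dictionary-wielding alien, in three lines of code: the match can be very good while the understanding remains zero.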
You’re operating a 300,000 miles per hour device, with the dashboard of a hand-cranked Model T Ford.
Power Versus Force
The fact is we are leaning heavily on transistors costing 1/60,000 of what they did fifty years ago: rather than be smart about UX, we can just process the hell out of data. The languid result (for it’s really not a quick or efficient processing model) is that you can use voice control on an iPad to open the music-playing software, but you can’t use it to do anything very specific, such as ‘play Classical Music’. Although it recognizes (many of) the words you say, it still has no clue about the intention or meaning behind them.
Raw processing power has been used as a shortcut to the results we all want. Basically: brute force. But Power is a friendlier and more useful concept than Force. Power can be efficient, directive and effortless; Force, less so. We need both Power and Force working together in computers to help transition the world to a better place. We’ve got plenty of Force—now isn’t it time to get our Power back?
The imbalance of computing Power versus Force is a Wicked Problem that shows up in numerous ways. Google results are symptomatic: hundreds of thousands (or millions!) of results… How is that of any value or use as an answer to a human being? It is not. ‘All of the answers, yet none of the wisdom or knowledge we are seeking’ frequently sums up the outcome of our online queries about life, the universe and everything.
What’s The (Real) Problem?
The way the problem has been posed is largely at fault: the unstated ‘problem’ we are all solving is that we need ‘one more app’. Oh, gosh, no! The fact that real possibilities (like freedom, fun, self-expression!) have been all but forgotten is sad.
Operating environments have come nowhere near offering enough guidance to their players. This is obvious when the language of 2016’s IT infrastructure still considers ‘users’ an adequate description of today’s interactive player. (Oxymorons may be considered a literary sign of cultural decadence.)
Even the most state-of-the-art systems today do not aspire to fulfill the promises of computers brought to us by the science fiction of fifty years back, like the voice-operated HAL of Stanley Kubrick’s 1968 masterpiece 2001: A Space Odyssey. Instead, we are faced with finance-driven entities such as Apple and Google, not to mention Facebook, that do not even pretend to be helpful. OK, Apple pretends.
My vision of the Web is a place so ‘top-down’ that you can relax, join a party, and get real: “Live, live, live, that’s the message. Life is a banquet and most poor suckers are starving to death.” Auntie Mame
Instead of ‘living’ we continuously contend with everything from pop-ups to unsolicited and inappropriate messages and images. So while in some sense we actually avoid our devices, ‘getting on and getting off’ as quickly as we can, we are simultaneously being lulled by the monotony of Facebook-esque services that never QUITE hit the mark—else why would we be continually opening and investigating new ones?
The problem is the grid: or, to be precise, the lack of a well-defined player grid.
The closest we have to a personalized grid today is the collection of ‘apps’ on our ‘smart phone’ screens—a hodge-podge of interfaces, data and security protocols roughly masquerading as an interface. The real grids are the thousands, possibly even millions, of grids for all kinds and sorts of purposes, run by our corporate overlords, who do not share their information with us. None, however, is designed for and allied with OUR needs, yours and mine. A few pretend, but there is not one structure you can truly hang all your data and communications on and operate from a singular real-time dashboard.
The problem is made many times worse by the fact that the level of granularity (detail) we expect to receive (and transmit) through our many devices continues to grow exponentially. It seems we may have to find a taxonomy of/for everything! The scary direction of things at present? There’s an ‘app’ for everything—at last count well over 100,000 (or was it a million?). Point being: who has the time or desire to download and learn ANY new app? (Thank you, I’d rather the device just did what I mainly wanted it for, and did it really, really well.)
Trouble is, our devices DON’T do a good job of the basics: calendars, clocks, document development, business communication, personal conversations, planning, scheduling, organizing. All should be integrated. They just aren’t. It’s laughable how poorly everyday language maps to Siri or her Android counterpart. It’s not their fault; it will, however, be our fault going forward if we gullibly continue to be force-fed this corporate pablum. Phew! Let’s clean things up a little.
We have fought and continue to fight for civil rights and gay rights around the world. When it comes to Women’s rights we are still in the dark ages. –Madonna
Misogynists, while still with us, are not generally welcome in an enlightened society. What about human haters? This concept has a similarly developed term: misanthrope. A misanthrope is literally a person who dislikes humankind and avoids human society. Or, in a more general sense to do with things, we employ the term misanthropic, or:
hatred, distrust or contempt of the human species or human nature. A misanthrope or misanthropist is someone who holds such views or feelings. The word’s origin is from the Greek words μῖσος (misos, “hatred”) and ἄνθρωπος (anthrōpos, “man, human”).
Unfortunately, when it comes to devices, misanthropy is all too common. Today’s UXs are largely misanthropic. In greatest measure, it is as though they are designed, conceived and built for anything BUT what the human species requires. In contrast, we have the study of ergonomics, which promises quite a different reality:
Ergonomics (or human factors) is the scientific discipline concerned with the understanding of interactions among humans and other elements of a system, and the profession that applies theory, principles, data and methods to design in order to optimize human well-being and overall system performance.
Today’s siren song is to solve our ‘wicked problem’: with the average 13-year-old spending upwards of five hours a day on a smart phone, today’s UX designers owe it to today’s youth to discover what’s missing in today’s technology and resolve the battle caused by mismatching Power (a Model T interface) against Force (300,000 mph processors).
UX Misanthropy, examples
UX Misanthropy Example 1: Word
I fell in love with Word version 1.0 in 1990. The first of its kind, it showed you on screen pretty much what would be printed out, including bold, underlined and italic text. And it was a fully featured word processor, allowing me to create mail merges, use glossary entries and develop style profiles, all with a mouse or, if you preferred, with lightning-quick keyboard shortcuts taught simply by the structure of the menus (the capital letter of any item was its shortcut). Word was quick, clean and clever. How I loved it!
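That menu convention can be sketched in a few lines (a hypothetical illustration of the capital-letter rule, not actual Word code):

```python
def shortcut_for(menu_item):
    """Derive the keyboard shortcut from a menu item's first capital letter,
    the convention the early Word menus taught by their very structure."""
    return next((ch for ch in menu_item if ch.isupper()), None)

shortcut_for("print Merge")  # the capitalized "M" announces the shortcut
```

The point of the design: the menu itself was the documentation, so mousing and touch-typing reinforced each other instead of competing.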
Flash forward 26 years. I’ve fallen out of love. The current versions of Word I use (different versions on different devices, because I don’t like what many of them do) still show me pretty much on the screen what will print (once I ignore all the unwanted screen pollution). The similarities stop there. Today’s glossary, style sheet and other options have become buried in byzantine menus that have changed so much over the years that most of my early learning is wasted. Worst of all, operating this word processor now REQUIRES that you remove your hands from the keyboard and either touch/tap the screen or use the mouse for many basic commands. The newest version does not even start out with the clean, clear editing screen that set Word 1.0 apart from its predecessors. Now, like a dedicated word processor of years gone by, it comes up with a screen highlighting by default only the files stored on Microsoft’s proprietary cloud. And if I don’t have cloud access on my iPad at the time, guess what? No files.
Microsoft Word Player Power Score: -1
Side note; big surprise: Word’s original interface was developed out of the same pool that also created the computer mouse at, you guessed it: Xerox.
The first version of Microsoft Word was developed by Charles Simonyi and Richard Brodie, former Xerox programmers hired by Bill Gates and Paul Allen in 1981. Both programmers worked on Xerox Bravo, the first WYSIWYG (What You See Is What You Get) word processor.
UX Misanthropy Example 2: Apple
For my second example of User Experience Misanthropy I’m not quite sure where to start! Whether it is Apple’s arrogant and high-handed attitude towards customers on its Music service (including deleting customers’ own music, original tunes included, from their own devices, then renting access back to them for an infinite number of rental payments), or the dumbing down of Siri from the original extraordinary and fun vision to the very limited concept standardized on iOS; Apple has demonstrated in too-numerous ways of late that its User Experience is truly misanthropic: human-hating!
These aspects of the Apple UX have been well documented by many writers who can describe what has happened better than I can. If you want more details, do feel free to check out the following extracts.
“The software is functioning as intended,” said Amber.
“Wait,” I asked, “so it’s supposed to delete my personal files from my internal hard drive without asking my permission?”
“Yes,” she replied.
I had just explained to Amber that 122 GB of music files were missing from my laptop. I’d already visited the online forum, I said, and they were no help. Although several people had described problems similar to mine, they were all dismissed by condescending “gurus” who simply said that we had mislocated our files (I had the free drive space to prove that wasn’t the case) or that we must have accidentally deleted the files ourselves (we hadn’t). Amber explained that I should blow off these dismissive “solutions” offered online because Apple employees don’t officially use the forums—evidently, that honor is reserved for lost, frustrated people like me, and (at least in this case) know-it-alls who would rather believe we were incompetent, or lying, than face the ugly truth that Apple has vastly overstepped its boundaries. – Vellum Atlanta
Siri’s backers know Apple’s version of the assistant has not yet lived up to its potential. “The Siri team saw the future, defined the future and built the first working version of the future,” says Gary Morgenthaler, a partner at Morgenthaler Ventures, one of the two first venture capital firms to invest in Siri. “So it’s disappointing to those of us that were part of the original team to see how slowly that’s progressed out of the acquired company into the marketplace.” –Huffington Post
Apple Player Power Score: -2
UX Misanthropy Example 3: Print, hardware integration
As the founding President of the Vancouver Electronic Publishers Association (1990s), I have always been fascinated by the translation from screen to printed page. One of my first customizations was to create a Word printer driver that would drive high-speed, high-quality mail merges for downtown offices, printing page one on letterhead, page two on second sheet and page/item three (envelope) on a (you guessed it) envelope. This was about 1995. The high-capacity, robust office printers originally designed for dedicated word processors had versatile and easily programmable forms-feeding options.
Today, over twenty years later, with a state-of-the-art computer, printer and software, the same feat is no longer duplicable. Between HP (printer), Apple (computer) and Microsoft (software), all entities refuse to acknowledge the lack of handshaking, or any responsibility to make products perform as advertised. Result? While I have a 3-bin printer (just like in the 90s) that can theoretically sit up and do tricks, what I really have is a lot of extra hardware that is useless. I must put paper in the manual feed tray or it just won’t print. Do not get me started about attempting to print from my Android devices, or my iOS (iPad) device. I’m sure anyone reading this has his or her own experiences trying to solve related issues, in which forum after forum frequently has YEARS’ worth of frustrated user entries describing the identical problem and nary a peep from any of the manufacturers involved. The situation is so endemic and irritatingly frustrating that the only rational conclusion in the moment is: fuhgeddaboudit.
To be honest, it’s just too much work to do something that was figured out perfectly well over two decades back: misanthropic! Can we really, truly not do any better?
2016 Hardware Integration: Print Player Power Score: -3
We have begun moving backwards in our expectations of UX, a sure sign of a decadent and out-of-touch industry. The time is now to take back control of our IT devices!
Just remember, whatever has been done, can be outdone! –Gordon Moore
Introduction to UX Part II:
If misanthropy in UX is evidence of the problem, can a solution be crafted in terms of its opposite: anthropomorphism?
anthropomorphism definition. (an-thruh-puh-mawr-fiz-uhm) The attributing of human characteristics and purposes to inanimate objects, animals, plants, or other natural phenomena, or to God.
The classical visual representation of anthropomorphism is Leonardo da Vinci’s Vitruvian man:
This image demonstrates the blend of art and science during the Renaissance and provides the perfect example of da Vinci’s deep understanding of proportion. In addition, this picture represents a cornerstone of da Vinci’s attempts to relate man to nature. Encyclopaedia Britannica online states, “Leonardo envisaged the great picture chart of the human body he had produced through his anatomical drawings and Vitruvian Man as a cosmografia del minor mondo (cosmography of the microcosm). He believed the workings of the human body to be an analogy for the workings of the universe.”
We will take from this drawing all the qualities we hope to craft in our solution:
- easy to grasp
Please subscribe for UX Part II:
Seen through the lens of branding, archetype is not only an amazing access point to UX design fundamentals, it is a lighthouse shining towards a fresh notion of history and even the development of paradigm; towards a sustainable and ongoing renaissance of society, creativity and commerce.
Bryce Winter is a chef, inventor, consumer and gastronome in the field of brands and branding. His foundational work as a UX Architect is in the area of taxonomy, a field necessarily influenced by archetype within his humanist approach to communal knowledge. Bryce’s public branding work was with significant global and local players and was effective for scores of brands, from cigarette nationals to top bottled-water brands and major Canadian, French and UK brands including CHANEL, Evian, Coca-Cola and TD Canada Trust.