Samsung didn't know user experience until Apple slapped them in the face with it

While most of us have spent the week researching ways to circumvent NBC's Olympics time delay and worrying about earthquake weather in LA, Apple and Samsung have been duking it out in court over a $2.5B patent dispute with grave implications for technological innovation in this country. From my understanding (read: I'm a product guy, not a lawyer), Apple is attempting to enforce a patent on rounded rectangles, and hence on the design of the iPhone, which Samsung (and countless others) have since emulated. The design patents at the center of the case are not only hilariously audacious (sorry, Charlie, your device can't have rounded corners, those are mine), they're also a serious and immediate threat to innovation. For reasons that will become apparent, I want to make it clear upfront that I think Apple's patent claims are baloney, but that's not what I want to talk about in this post.

What I want to talk about is the central thesis that Apple is trying to prove in the case: that Samsung intentionally copied the design of the iPhone. Based on documents released this week, the answer seems to be, more or less, "yes." But what's fascinating to me as a product designer is that despite their best efforts to copy the iPhone, the user experience of Samsung's devices remains vastly inferior.

Two documents in particular reveal that by early 2010, Samsung's senior management had realized that the iPhone was eating their lunch, and the reason was that the device was simply more enjoyable to use than anything Samsung had managed to produce.

The first document revolves around a meeting in February 2010, in which the head of Samsung’s mobile division characterized the difference in user experience between the iPhone and Samsung smartphones as "the difference between heaven and earth". The memo portrays a company in which design decisions are dictated from the top down, and hierarchy is strictly enforced. The problem is, no one, no matter how awe-inspiring their executive title is, can ever dictate how a product should be designed and expect it to have a great user experience. The best products evolve out of an iterative process of observing problems, solving them, and repeating. This can only be done when decisions are made by designers and engineers, not managers and memos, because only the people in the trenches, who interact with real users with real problems, have the observational evidence necessary to design meaningful solutions.

When design decisions come from the top down, user experience always suffers, because the designers and engineers end up working for the approval of their superiors, rather than the optimal experience for the end user.

The second document, dated a month after the first, and presumably a direct response to the management mandate to bring Samsung's smartphone UX up to par with the iPhone, details 126 differences between the iPhone and the Galaxy S1 and gives recommendations for improvement. I'd bet dollars to donuts that the document was prepared by non-designers, because it's a surface-level analysis that fails to grasp the underlying principles behind the differences.

For example, on the iPhone, the keypad interface is visually similar to the interface for actions only available during a call, so the designers employ an animated transition between the two to communicate to the user that the interface has changed. The need for that animation stems from the decision not to display the keypad and the in-call actions at the same time, and it's that decision which actually adds to the simplicity and clarity of the interface. The authors of the Samsung comparison note only the absence of the transition on their device, not the more important difference between the elegant minimalism of the iPhone interface and the cluttered Samsung interface. The result? A recommendation to add an animation to the screen, without addressing the underlying problem.

Imagine the scene when this report was delivered to the designers and engineers responsible for executing the recommendations: 130-plus pages of instructions to replicate superficial design elements from another product without understanding the decisions that led to their implementation. A report that keeps a straight face while making recommendations like: “Need to develop dynamic effects to bring out fun factor to user.” It must have been a soul-crushing experience to have such mandates handed down from on high; a double blow of having no choice but to implement a bunch of foolishness, and being pulled away from doing things that might actually be well-informed (gasp) or even innovative (oh, the humanity!).

With the context that's come out in these documents, it's no surprise that even given the example of the iPhone, and a management directive to replicate the user experience, Samsung still fell short on UX. I'd go so far as to say that even with a team of the most talented designers in the world and an explicit order to create an exact replica of the iPhone, Samsung would still fail to reproduce the user experience of the iPhone.

Why? Because executives removed from the craft of product design could pop their heads in at any time, and say “Oh, that pocket knife works really well as a pocket knife, but what would really be cool is if that pocket knife also worked as a spoon. Oh, and a fork. Also a compass. And also it should be more fun to use.” They would fail because producing a product with a user experience as elegant and polished as the iPhone requires a level of discipline and integrity that most companies simply cannot muster.

Great user experiences are transparent. They let users focus on what they want to do and how they want to do it, not on how the designer of the device thought they should. When you use the phone or email on the iPhone, you're not "using the phone app" or "using the mail app", you're simply making a call or checking your email. You don't have to think about the tool, only the task at hand. Such transparency requires tremendous courage and discipline, because it means paring an experience down to its bare minimum, to its essence. It requires putting design decisions in the hands of designers, not managers.

Samsung may have developed iPhone-like prototypes prior to the release of the iPhone, but they never would have made it to production with those prototypes if Apple hadn't shown them the way. Samsung's attempts to copy the iPhone may have bought them market share (and a $2.5B lawsuit from Apple along with it), but it hasn't bought them user experience parity. Apple's competitive advantage isn't that they're the only company with designers capable of putting out the iPhone; it's that they're the only company with the balls and integrity to put decision-making power in the hands of those designers. Unless that changes, it's highly unlikely that Samsung, or any other competitor for that matter, will be able to match Apple when it comes to user experience.

Wasted Use: The Problem with Healthcare IT Software

Software has the potential to revolutionize the practice, experience, and business of healthcare, but the current generation of systems is more apt to cause frustration among doctors and confusion among patients than to deliver increased efficiency or a better patient experience.

A Huge, Hostage Market

It's no secret that healthcare is one of the largest industries in this country. By some accounts, our national health expenditure is expected to hit $3 trillion in 2012, so it's no wonder that investments in Healthcare Information Technology (HIT) are growing at double-digit rates, with spending on Electronic Health Records (EHR) projected to hit $3.8 billion by 2015. The explosive growth is driven, in part, by $28.5 billion in governmental investment and incentive programs that kick in this year. Starting in 2012, switching to EHR opens up $44,000 in incentives for individual physicians (over 5 years), and $2 million as a base payment for hospitals that switch. This will be further compounded by adjustments to Medicare payments in 2015 based on meaningful use of the systems. As a result, hospitals and healthcare providers have had very little choice but to switch to EHR, and to switch now.

Software That Ignores Doctors' Existing Behavior

Talk to any doctor about the current state of EHR software, and you're likely to receive a colorful, expletive-ridden description of just how cumbersome and difficult the software they're forced to use on a daily basis is, followed by some surreal story of a Kafkaesque experience with an IT department or vendor trying to convince them that a critical defect is a feature, not a dangerous bug. (Full disclosure: I happen to have a father, a sister, and a soon-to-be brother-in-law who are all physicians, and like every physician I've spoken with, they are quite vocal on the subject.)

My father, Dr. Val Catanzarite, puts it in perspective:

Instead of looking at the existing workflows, which included the good, the bad, and the ugly, and attempting to keep the best and improve the worst, IT developers figured that they could cure healthcare's ills with gigabytes of…text. The EHRs that are supplanting the old paper records are text based, eliminating the problem of handwriting but also losing the functionality of drawings, etc. A checkoff sheet that took less than a minute to complete a year ago might have been replaced with an EHR page that takes 5 to 10 minutes. Touch screens? In your dreams. Speech recognition? Speech what?

Compare the front end of a gigabucks EHR implementation to an iPad and it's immediately apparent why most EHR implementations result in a huge increase in provider time to document a patient encounter. In the past, a quick checkup might be 15 minutes: 12 minutes of face-to-face doctor and patient time, and 3 minutes to complete a form or write a note. Now, the same visit might require 8-10 minutes to complete the note, that is, if the EHR is "up". (EHRs are famous for long lunch breaks, and often like an afternoon nap as well.) Even a year or two in, after educating the users, fine tuning the systems, etc., provider productivity typically remains at least 10%, and often over 20%, below baseline.

Keep in mind, Dr. Val Catanzarite is no Luddite. It's no wonder many doctors and nurses reflect fondly on the simpler days of paper charts. The bottom line is that healthcare software is failing physicians, and it's not the users' fault: these systems appear to have been designed with a sole focus on technical requirements and budgetary constraints, with no consideration for aesthetics or usability. The market seems ripe for a disruptive startup to deliver a product that works and wipe the floor with the competition.

Unfortunately, in the healthcare industry, the costs of switching software platforms are prohibitively high. For a hospital or network of healthcare providers, the software license itself is a relatively small part of the overall cost of a system; the majority goes toward services to customize the implementation and provide ongoing support and maintenance, and more still toward training staff to be proficient with the software. Lock-in is further compounded by the questionable cross-compatibility of data between systems, which shows little hope of being resolved in the near term. Because of these factors, it's likely that the current generation of HIT software will dominate the industry for the foreseeable future.

Unusable Software for Patients

It's not just software for physicians that leaves much to be desired. Examine Kaiser Permanente's industry-leading HIT implementation, showered in prestigious awards and copious praise, like the *Stage 7 Award for Excellence*, awarded by an organization that has obviously never used the system as a patient, because it's terrible. Yes, Kaiser's system gives 9 million subscribers access to their electronic health records, but how many of them can actually figure out how to use it?

As a software product manager, designer, and rabid early adopter, I'd consider myself to be somewhere above the 90th percentile in terms of tech-savviness and the ability to intuit new technologies. That said, I am regularly so confused by Kaiser's online portal that I end up (somewhat embarrassedly) calling their phone system, which is only slightly less awful. The problem is, the system seems to have been designed to comply with some massive set of business requirements mandated to meet the standards of a "viable" EHR, with total disregard for usability, information architecture, or user experience. It's as if the designer took the bulleted list of requirements and literally translated it to the page.

The question is, why is the software so bad? Kaiser Permanente generated $42 billion in revenue in 2009. Obviously, they have the resources to secure quality software. Like many organizations, they chose the system developed by a multi-award-winning, industry-leading software provider: Epic. Epic is a rapidly growing company whose sole purpose is developing HIT software, and it generated $1.2 billion in revenue in 2011. So again, the question is: why is the software so bad?

Revenue Growth is More Attractive than Innovation

My sister, Dr. Tatiana Catanzarite, made the point that the big HIT corporations are “so focused on outbidding the competition for contracts with major hospital systems that there are few resources actually left for innovation.” Consider Cerner, the second largest HIT provider by market share. 21% of their 2007 operating expenses went to research and development, but the majority of that went towards upgrading legacy systems, and only 3% of their operating revenue actually went towards developing new software. How much of that small slice of the pie went to evolutionary improvements, and how much went to real innovation?

The truth is, the revenue growth of most HIT companies is driven by winning contracts with new hospitals and provider networks, not by creating better software. In the first 9 months of 2011, 67% of Cerner's revenue came from services, support, and maintenance, while only 31% came from software sales. From the perspective of a revenue machine like Cerner, if existing offerings are making a killing, and there is still room for growth, why rock the boat? The top 25 HIT companies grossed over $26.7 billion in 2010, but an average of just 22% of that revenue came from software. For these companies, building better software actually erodes revenue over the short term, due to the cost of upgrades for existing contracts and cannibalization of existing products! It's a classic Innovator's Dilemma.

Unrealized Potential

Software has enormous potential to increase the quality, accessibility, and overall experience of healthcare, but to realize that potential, the developers of the software need to refocus on usability, user experience, and innovation instead of short-term revenue growth.

Consider for a moment that the industry-leading EHR solutions lack even a simple feature to let a doctor query their own patients by medical condition or medication, or to compare the charts of recently treated patients. The physician literally has to contact the IT department to run queries directly against the database to collect that kind of information, and you can forget about reports on really useful stuff like which medications worked best, the average number of visits until diagnosis, etc.
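To make it concrete, here's a minimal sketch of the kind of query a physician should be able to run without calling IT. The schema, table names, and diagnosis code are hypothetical, invented purely for illustration; no actual EHR product is implied:

```python
import sqlite3

# Hypothetical, simplified schema (not any real EHR product's schema):
#   patients(id, name, provider_id)
#   diagnoses(patient_id, icd9_code, diagnosed_on)

def patients_with_condition(db, provider_id, icd9_code):
    """Return this provider's patients who carry a given diagnosis code."""
    return db.execute(
        """
        SELECT p.id, p.name, d.diagnosed_on
        FROM patients p
        JOIN diagnoses d ON d.patient_id = p.id
        WHERE p.provider_id = ? AND d.icd9_code = ?
        ORDER BY d.diagnosed_on DESC
        """,
        (provider_id, icd9_code),
    ).fetchall()

db = sqlite3.connect("ehr.db")  # assumes a local database with the schema above
for patient_id, name, diagnosed_on in patients_with_condition(db, 42, "250.00"):
    print(patient_id, name, diagnosed_on)
```

A physician-facing version would of course hide the SQL behind a form, but the point stands: this is a trivial query, not a research project.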

There are some very promising startups in the healthcare space, such as drchrono, Practice Fusion, and One Medical Group, each innovating around one or more of the core problems such as EHR UX, practice management, or the digital patient experience. But who will challenge the Goliaths of HIT like Cerner, McKesson, Epic, and Siemens, which control the lion's share of the market through integrated solutions and contracts with hospitals and large provider networks?

The reality is, the opportunities for innovation are much, much larger than just building usable systems for doctors and patients to manage electronic health information. There is an entire, virtually unexplored realm of possibility that includes data analysis, pattern recognition, and other forms of intelligence derived from data that is already being collected.

Connecting the Dots

Picture a system in which doctors could not only run queries searching for patterns within their own patients, but could use an intelligent software tool to organically discover previously overlooked patterns or correlations among medical conditions and diseases, not just within their own patients, but across the larger data sets of provider networks, hospitals, or even anonymized data for entire countries or the whole world. What new relationships between conditions, diseases, medications, etc. would we discover if we could harness the power of the data we are already collecting?
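The simplest version of that idea is plain co-occurrence mining. Here's a toy sketch (my own illustration with invented inputs, not a production epidemiology tool) that flags diagnosis pairs appearing together more often than chance across an anonymized record set:

```python
from collections import Counter
from itertools import combinations

def diagnosis_lift(patient_diagnoses, min_support=50):
    """Find diagnosis pairs that co-occur more often than chance predicts.

    patient_diagnoses: one set of diagnosis codes per anonymized patient.
    Returns (code_a, code_b, lift) tuples; lift > 1 means the pair shows up
    together more often than the two codes' base rates would suggest.
    """
    patients = [set(d) for d in patient_diagnoses]
    n = len(patients)
    singles = Counter(code for p in patients for code in p)
    pairs = Counter(
        pair for p in patients for pair in combinations(sorted(p), 2)
    )

    results = []
    for (a, b), count in pairs.items():
        if count < min_support:
            continue  # skip rare pairs; too noisy to trust
        lift = (count / n) / ((singles[a] / n) * (singles[b] / n))
        results.append((a, b, lift))
    return sorted(results, key=lambda r: r[2], reverse=True)
```

A real system would need rigorous statistics (and rigorous privacy safeguards) on top of this, but even the toy version hints at what's sitting untouched in all that collected data.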

In order to realize the true data intelligence potential afforded by electronic health information, there needs to be a centralized, secure way to share information between organizations. Even if EHR providers were to magically fix all of the problems with their existing software overnight, we would still be no closer to a solution to the fact that the EHR landscape is hopelessly fragmented by data incompatibility and server siloing. A nationwide or worldwide EHR database is a pipe dream today, but standardization would create an enormous amount of value, and it will happen eventually, although certainly not overnight.

A paradigm shift looms on the health IT horizon, and it won't remain a pipe dream forever. Imagine an intelligent EHR system that learns about patients and automatically looks for patterns by matching data points against a massive, centralized database. Imagine a patient experience that makes managing personal health information as simple as Mint.com makes managing financial information. Such systems have been envisioned for decades, and they could literally transform healthcare, but only when a company enters the space that thinks and operates like Netflix, Intuit, Apple, or Amazon, and my guess is it will be a startup, not an incumbent player.

An Inside Look at Kaiser Permanente's Award-Winning EHR System

Let's play a game called: "Guess which link contains the actual details of my health plan."

Solving that one took clicking through half a dozen dead-end links (some of which literally have no content to display).

Another fun one came up recently when I tried to make an appointment with my doctor after dislocating my shoulder. When I attempted to schedule a same-day appointment, I got an error message:

That seemed reasonable, so I expanded the search to a week, and got an error message:

Perplexed, I expanded the search to three weeks, and got an error message:

Obviously, I had a time-sensitive problem and needed to be seen, so I called the appointment line, where the operator informed me that my doctor was on vacation for the month, so obviously that's why I couldn't schedule the appointment. She then booked me an appointment with another doc, and cheerfully reminded me that in the future I could perform most simple tasks, such as appointment booking, using the online portal. Too bad Kaiser's system doesn't make it visible when a doctor is on vacation, and even if it did, it's not possible (as far as I have been able to discern) to schedule appointments with other doctors without a referral.

Here's one last example of award-winning HIT system design.

Yes, Kaiser, my email changed, so the one you have is the right one. Who writes this stuff?

User experience is ripe for a revolution

I had a lovely chat on Saturday with an elderly couple who were interested in the Kindle I was reading while eating breakfast. What struck me about the exchange was something I have been mulling over for some months, an idea slowly building up over time that the conversation brought to a head: technology has outpaced the ability of people to use it, and user experience is ripe for a revolution.

The man, a former math teacher, and the woman, a former nurse and staunch library advocate, were interested in how I use the device: do I use it for books, newspapers, magazines, for pleasure, work, school? They asked how much books cost, and how easy it is to put other reading materials onto it. I love the Kindle and am quite an enthusiastic advocate of new technology and shiny gadgets, so I gushed about the battery life, the way it has replaced heavy stacks of books when I travel, the merits of reading on a matte screen versus the back-lit eye strain of the iPad, how cool it is that I can sync the text of articles and blog posts from the web to my Kindle with a simple click of the Instapaper bookmarklet in my web browser or on my iPhone, and how I can put pretty much any text document on the Kindle by simply emailing it.

They were quite intrigued by the promise of the new technology and the merits of a small, lightweight device that can hold thousands upon thousands of books, but their question was: is it easy to use? I thought about it for a second, and then said "No." The Kindle has solved a lot of the problems associated with reading digital content quite elegantly. The screen is nearly as eye-friendly as paper, the battery life is astounding, and the experience of reading is refreshingly simple. However, the process of putting non-Kindle-store content on the device is not user friendly at all.

If I didn't spend 60-80 hours a week working on internet-connected devices, devouring tech news via Twitter, Hacker News, and my increasingly neglected Google Reader account, I probably wouldn't have discovered Instapaper, and without Instapaper I don't think I would have realized on my own that I could email documents to my Kindle and download them for free via wifi. Even with the happy accident of stumbling upon instructions for Kindle delivery through Instapaper, it took me a while to figure out that there was a way to do this for free rather than paying for 3G delivery of freely available content. The point is, as good as the Kindle is, it took some technological proclivity and more than a little luck to truly take advantage of the device. The irony is, the Kindle is one of the simplest devices I've ever used, and even it still has some major shortcomings. What does that say about the broader state of human-computer interaction?

The pace of technological advancement has increased exponentially over the past century; everything from computing power to storage has been essentially doubling every couple of years. Raymond Kurzweil calls this incredible rate of advancement the law of accelerating returns (often confused with Moore's Law, which relates specifically to the doubling rate of transistors on a circuit), and it means that between 1950 and 2000, computing power increased by a factor of roughly 100 million. The pace of technological adoption has also increased radically over the same period. Radio, invented shortly before the dawn of the 20th century, took 31 years to reach mass adoption. 100 years later, the web reached mass adoption in only 7 years. Facebook reached mass adoption in roughly half that time, and Groupon in half again of that, reaching an astonishing 50 million users in just over 2 years.
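That factor-of-100-million figure is easy to sanity-check. Assuming a doubling every 1.9 years or so (my round number for illustration, not Kurzweil's exact parameter), 50 years of compounding lands right around that order of magnitude:

```python
# Back-of-the-envelope check of the 1950-2000 growth factor.
# The 1.9-year doubling period is an assumed round number, not a measured one.
years = 2000 - 1950
doubling_period = 1.9
growth = 2 ** (years / doubling_period)
print(f"{growth:.2e}")  # ~8.3e+07, i.e. on the order of 100 million-fold
```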

However, in the face of this nearly incomprehensible rate of technological advancement and adoption, the way that we interact with computers has advanced painfully slowly. The WIMP graphical user interface (windows, icons, menus, pointing device) that dominates computing from cell phones to desktops has remained fundamentally unchanged since its invention over 30 years ago. Evolutionary advancements have been made since, and Apple's iOS has introduced some more natural, post-WIMP interactions outside the box of the prevailing mental model, but these are really baby steps, nothing compared to the staggering advancements made in computing hardware, algorithms, and technological capacity over the same period.

Today's technological landscape frames a unique moment in history. From cheap, abundant computing power, to an unprecedented availability of public data, to powerful APIs that let developers rapidly build applications that would have been prohibitively expensive for all but the largest organizations just 5-10 years ago, even the most advanced technology is becoming a commodity. Perched on this vast technological platform, we're standing on the precipice of a paradigm shift that will be even more radical than the shift from the command line interface to the graphical user interface. The opportunity to develop an intuitive computing experience that allows the masses to harness the vast technology at our disposal is so obvious and so enormous that a revolution in user experience seems all but inevitable.

The question is, what will the coming user experience revolution look like? Perhaps it will be driven by the current generation of Natural User Interface explorations, but just as likely it will be something as radically innovative, and in hindsight as obvious, as the WIMP graphical interface was compared to the command line interface that preceded it. As cool as the famous Minority Report interface looks on a movie screen, I'm personally not going to be lining up anytime soon to trade in my comfy chair and laptop for a big fat case of gorilla arm. Even the now ubiquitous multi-touch interface has some serious limitations when it comes to serious full-time use cases, and multi-touch laptop screens certainly aren't positioned to replace the mouse and keyboard.

To be successful, the user experience revolution needs to promote natural and intuitive experiences, humane interfaces, and adaptive technologies that interpret context in order to meet individual users' unique needs. The great challenge will be to build upon the abundant and rich technological platforms that surround us, from machine learning to location data, and to create user-centered applications that learn the user rather than requiring the user to learn them, and that anticipate and adapt to the user's needs and expectations rather than demanding that the user conform to a rigid mold defined by the software.

Whatever it looks like, one thing is certain: there is an immense and immediate opportunity for innovation in user interface and experience design, and companies like Apple, which understand that technology is intrinsically worthless to the end user unless it is usable, stand to profit enormously by making technology easier and more enjoyable to use, not just for geeks like me, but for curious but intimidated individuals like the couple I met this weekend, who stared longingly at me absently flipping the pages of my sleek Kindle with one hand while leisurely eating with the other, as they fumbled clumsily with their unwieldy newspapers.

iPhone vs. BlackBerry: superior UX trumps inferior messaging

As a long-time Apple geek but newly minted iPhone user, I've been spending way too much time this week poking and prodding and petting my shiny new toy. Now, it's not like I've never used an iPhone before, and I've had an iPad since they came out, but in the process of switching to full-time iPhone duty after years of using a BlackBerry, I've been absolutely delighted by the user experience. What's been most striking to me is how in love with the experience of using the phone I am, even though it is obviously inferior to my BlackBerry in some ways.

When it comes to messaging, the iPhone isn't even in the same league as the BlackBerry. RIM's BlackBerry Internet Service push technology syncs email in near realtime without destroying battery life, and when it comes to instant messaging, BlackBerry Messenger is in a class by itself. Clearly, BlackBerry's technology trumps the iPhone in one of the most critical mobile use cases: messaging. The thing is, the rest of the iPhone experience is so good, I just don't care.

The iPhone user experience starts before you even open the box. Before you hold the product in your hand, you have the experience of opening a meticulously crafted box so precise it doesn't even need the plastic it comes wrapped in to hold together. The level of craftsmanship and care that Apple lavishes upon its products is immediately obvious.

What struck me immediately after the always enjoyable Apple unboxing was how sleek and solid the iPhone felt in my hand. My old BlackBerry Tour felt cheap, clunky, and unwieldy by comparison.

The next thing that struck me was the subtle magic of the iPhone keyboard. The first line of text I typed on the iPhone was riddled with errors, but by the second, the error rate had dropped significantly. The keyboard quickly and quietly adapted to my fat fingers, and it just worked.
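Apple hasn't published how this works, but the behavior is consistent with the keyboard invisibly resizing tap targets based on which letters are likely to come next. Here's a toy sketch of that idea; the probabilities, coordinates, and function names are all invented for illustration:

```python
# Toy model of probabilistic tap targets (invented; not Apple's actual code).
# Keys keep their visible size; only the invisible touch-accept region grows
# with the probability that a given letter comes next.

NEXT_LETTER_PROB = {"e": 0.55, "a": 0.15, "i": 0.12, "o": 0.10, "r": 0.08}
KEY_CENTERS = {"e": (30, 10), "a": (10, 20), "i": (80, 10), "o": (90, 10), "r": (40, 10)}
BASE_RADIUS = 5.0

def resolve_tap(x, y):
    """Pick the key whose probability-scaled target best matches the tap."""
    best_key, best_score = None, float("-inf")
    for key, (cx, cy) in KEY_CENTERS.items():
        dist = ((x - cx) ** 2 + (y - cy) ** 2) ** 0.5
        radius = BASE_RADIUS * (1 + NEXT_LETTER_PROB.get(key, 0.0))
        score = radius - dist  # positive when the tap lands inside the target
        if score > best_score:
            best_key, best_score = key, score
    return best_key

# A sloppy tap that drifts toward "r" still resolves to the far likelier "e".
print(resolve_tap(33, 12))  # -> "e"
```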

Another nice touch, so obvious it's hard to even notice, is how the keyboard state locks automatically when you switch from QWERTY to numbers and symbols. One of my pet peeves on my BlackBerry was the outrageously inefficient process of typing a phone number in a text message: alt-6 alt-1 alt-9… it sucked. And yes, I know there is some way to lock keyboard states on the BlackBerry, but I never bothered to figure it out, and that's the crucial difference on the iPhone: I hardly noticed it, and it just worked.
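The cost of getting this wrong is easy to quantify. Here's a quick toy model (my own illustration; the tap counts are simplified, not measured from either device) of typing a phone number with and without a locking numeric mode:

```python
# Toy comparison of modal keyboard designs (simplified, illustrative only).
# With a locking numeric mode, one tap switches modes and it stays switched;
# without it, every digit costs a modifier press plus the key itself.

def keystrokes(text, locks_state):
    count, mode = 0, "letters"
    for ch in text:
        needed = "numbers" if (ch.isdigit() or ch == "-") else "letters"
        if locks_state:
            if mode != needed:
                count += 1  # one tap to switch modes
                mode = needed
            count += 1      # the character itself
        else:
            # modifier + key for every number-row character
            count += 2 if needed == "numbers" else 1
    return count

print(keystrokes("555-1234", locks_state=True))   # 9 taps
print(keystrokes("555-1234", locks_state=False))  # 16 taps
```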

There are dozens, if not hundreds, of other brilliant UX touches on the iPhone, like obscenely detailed transitions and intuitive message refreshing, that make the device a joy to use, and that allow me to totally look past the fact that the BlackBerry absolutely slays the iPhone when it comes to the backend infrastructure that serves email and messages.