Samsung didn't know user experience until Apple slapped them in the face with it

While most of us have spent the week researching ways to circumvent NBC's Olympics time delay and worrying about earthquake weather in LA, Apple and Samsung have been duking it out in court over a $2.5B patent dispute with grave implications for technological innovation in this country. From my understanding (read: I'm a product guy, not a lawyer), Apple is attempting to enforce a patent on rounded rectangles, and hence on the design of the iPhone, which Samsung (and countless others) have since emulated. The design patents at the center of the case are not only hilariously audacious (sorry, Charlie, your device can't have rounded corners, those are mine), they're also a serious and immediate threat to innovation. For reasons that will become apparent, I want to make it clear upfront that I think Apple's patent claims are baloney, but that's not what I want to talk about in this post.

What I want to talk about is the central thesis that Apple is trying to prove in the case: that Samsung intentionally copied the design of the iPhone. Based on documents released this week, the answer seems to be, more or less, "yes." But what's fascinating to me as a product designer is that despite Samsung's best efforts to copy the iPhone, the user experience of its devices remains vastly inferior.

Two documents in particular reveal that by early 2010, Samsung's senior management had realized that the iPhone was eating their lunch, and the reason was that the device was simply more enjoyable to use than anything Samsung had managed to produce.

The first document revolves around a meeting in February 2010, in which the head of Samsung’s mobile division characterized the difference in user experience between the iPhone and Samsung smartphones as "the difference between heaven and earth". The memo portrays a company in which design decisions are dictated from the top down, and hierarchy is strictly enforced. The problem is, no one, no matter how awe-inspiring their executive title is, can ever dictate how a product should be designed and expect it to have a great user experience. The best products evolve out of an iterative process of observing problems, solving them, and repeating. This can only be done when decisions are made by designers and engineers, not managers and memos, because only the people in the trenches, who interact with real users with real problems, have the observational evidence necessary to design meaningful solutions.

When design decisions come from the top down, user experience always suffers, because the designers and engineers end up working for the approval of their superiors, rather than the optimal experience for the end user.

The second document, dated a month after the first, and presumably a direct response to the management mandate to bring Samsung's smartphone UX up to par with the iPhone, details 126 differences between the iPhone and the Galaxy S1 and gives recommendations for improvement. I'd bet dollars to doughnuts that the document was prepared by non-designers, because it's a surface-level analysis that fails to grasp the underlying principles behind the differences.

For example, on the iPhone, the in-call keypad is visually similar to the interface for actions only available during a call, so the designers employ an animated transition between the two to communicate to the user that the interface has changed. The need for such an animation arises from the decision not to display the keypad and the in-call actions at the same time, and it's that decision that actually adds to the simplicity and clarity of the interface. The authors of the Samsung comparison note only the absence of the transition on their device, not the more important difference between the elegant minimalism of the iPhone interface and the clutter of the Samsung interface. The result? A recommendation to add an animation to the screen, without addressing the underlying problem.

Imagine the scene when this report was delivered to the designers and engineers responsible for executing the recommendations: 130-plus pages of instructions to replicate superficial design elements from another product without understanding the decisions that led to their implementation. A report that keeps a straight face while making recommendations like: "Need to develop dynamic effects to bring out fun factor to user." It must have been a soul-crushing experience to have such mandates handed down from on high; a double blow of having no choice but to implement a bunch of foolishness while being pulled away from doing things that might actually be well informed (gasp) or even innovative (oh, the humanity!).

With the context that's come out in these documents, it's no surprise that even given the example of the iPhone, and a management directive to replicate the user experience, Samsung still fell short on UX. I'd go so far as to say that even given a team of the most talented designers in the world, and an explicit order to create an exact replica of the iPhone, Samsung would still fail to reproduce the user experience of the iPhone.

Why? Because executives removed from the craft of product design could pop their heads in at any time, and say “Oh, that pocket knife works really well as a pocket knife, but what would really be cool is if that pocket knife also worked as a spoon. Oh, and a fork. Also a compass. And also it should be more fun to use.” They would fail because producing a product with a user experience as elegant and polished as the iPhone requires a level of discipline and integrity that most companies simply cannot muster.


Great user experiences are transparent. They let users focus on what they want to do and how they want to do it, not on how the designer of the device thought they should. When you use the phone or e-mail on the iPhone, you're not "using the phone app" or "using the mail app", you're simply making a call or checking your email. You don't have to think about the tool, only the task at hand. Such experiences require tremendous courage and discipline to pare an experience down to its bare minimum, to its essence. Such experiences require putting design decisions in the hands of designers, not managers.

Samsung may have developed iPhone-like prototypes prior to the release of the iPhone, but they never would have made it to production with those prototypes if Apple hadn't shown them the way. Samsung's attempts to copy the iPhone may have bought them market share (and a $2.5B lawsuit from Apple along with it), but it hasn't bought them user experience parity. Apple's competitive advantage isn't that they're the only company with designers capable of producing the iPhone; it's that they're the only company with the balls and integrity to put decision-making power in the hands of those designers. Unless that changes, it's highly unlikely that Samsung, or any other competitor for that matter, will be able to match Apple when it comes to user experience.

Wasted Use: The Problem with Healthcare IT Software

Software has the potential to revolutionize the practice, experience, and business of healthcare, but the current generation of systems is more apt to cause frustration among doctors and confusion among patients than to deliver increased efficiency or a better patient experience.

 

A Huge, Hostage Market

It's no secret that healthcare is one of the largest industries in this country. By some accounts our national health expenditure is expected to hit $3 trillion in 2012, so it's no wonder that investments in Healthcare Information Technology (HIT) are growing at double-digit rates, with spending on Electronic Health Records (EHR) projected to hit $3.8 billion by 2015. The explosive growth is driven, in part, by $28.5 billion in governmental investment and incentive programs which kick in this year. Starting in 2012, switching to EHR opens up $44,000 in incentives for individual physicians (over 5 years), and $2 million as a base payment for hospitals that switch. This will be further compounded by adjustments to Medicare payments in 2015 based on meaningful use of the systems. As a result, hospitals and healthcare providers have had very little choice but to switch to EHR, and switch now.

 

Software That Ignores Doctors' Existing Behavior

Talk to any doctor about the current state of EHR software, and you're likely to receive a colorful, expletive-ridden description of how cumbersome and difficult to use the software they're forced to work with on a daily basis is, followed by some surreal, Kafkaesque story about an IT department or vendor trying to convince them that a critical defect is a feature, not a dangerous bug. (Full disclosure: my father, sister, and soon-to-be brother-in-law are all physicians, and like every physician I've spoken with, they are quite vocal on the subject.)

My father, Dr. Val Catanzarite, puts it in perspective:

Instead of looking at the existing workflows, which included both the good, the bad, and the ugly, and attempting to keep the best and improve the worst, IT developers figured that they could cure healthcare’s ills with gigabytes of…text. The EHRs that are supplanting the old paper records are text based, eliminating the problem of handwriting but also losing the functionality of drawings, etc. A checkoff sheet that took less than a minute to complete a year ago might have been replaced with an EHR page that takes 5 to 10 minutes. Touch screens? In your dreams. Speech recognition? Speech what?  

Compare the front end of a gigabucks EHR implementation to an iPad and it's immediately apparent why most EHR implementations result in a huge increase in provider time to document a patient encounter. In the past, a quick checkup might be 15 minutes: 12 minutes of face-to-face doctor and patient time, and 3 minutes to complete a form or write a note. Now, the same visit might require 8-10 minutes to complete the note, that is, if the EHR is "up". (EHRs are famous for long lunch breaks, and often like an afternoon nap as well.) Even a year or two after educating the users, fine-tuning the systems, etc., provider productivity typically remains at least 10%, and often over 20%, below baseline.

Keep in mind, Dr. Val Catanzarite is no Luddite. It's no wonder many doctors and nurses reflect fondly on the simpler days of paper charts. The bottom line is that healthcare software is failing physicians, and it's not the users' fault: these systems appear to have been designed with a sole focus on technical requirements and budgetary constraints, with no consideration for aesthetics or usability. The market seems ripe for a disruptive startup to deliver a product that works and wipe the floor with the competition.

Unfortunately, in the healthcare industry, the costs of switching software platforms are prohibitively high. For a hospital or network of healthcare providers, the software itself is a relatively small part of the overall cost of the system; the majority goes towards services to customize the implementation and provide ongoing support and maintenance, and additional cost is driven by the need to train staff to be proficient with the software. Lock-in is further compounded by the questionable cross-compatibility of data between systems, which shows little hope of being resolved in the near term. Because of these factors, it's likely that the current generation of HIT software will dominate the industry for the foreseeable future.

 

Unusable Software for Patients

It's not just software for physicians that leaves much to be desired. Consider Kaiser Permanente's industry-leading Health Information Technology (HIT) implementation, showered in prestigious awards and copious praise, like the *Stage 7 Award for Excellence* bestowed by an organization that has obviously never used the system as a patient, because it's terrible. Yes, Kaiser's system gives 9 million subscribers access to their electronic health records, but how many of them can actually figure out how to use it?

As a software product manager, designer, and rabid early adopter, I'd consider myself to be somewhere in the upper 90th percentile in terms of tech-savviness and the ability to intuit new technologies. That said, I am regularly so confused by Kaiser's online portal that I end up (somewhat sheepishly) calling their phone system, which is only slightly less awful. The problem is that the system seems to have been designed to comply with some massive set of business requirements mandated to meet the standards of a "viable" EHR, with total disregard for usability, information architecture, or user experience. It's as if the designer took the bulleted list of requirements and literally translated it to the page.

The question is, why is the software so bad? Kaiser Permanente generated $42 billion in revenue in 2009, so they obviously have the resources to secure quality software. Like many organizations, they chose the system developed by a multi-award-winning, industry-leading software provider, Epic. Epic is a rapidly growing company that generated $1.2 billion in revenue in 2011, and its sole purpose is developing HIT software. So again, the question is: why is the software so bad?

 

Revenue Growth is More Attractive than Innovation

My sister, Dr. Tatiana Catanzarite, made the point that the big HIT corporations are "so focused on outbidding the competition for contracts with major hospital systems that there are few resources actually left for innovation." Consider Cerner, the second largest HIT provider by market share. 21% of their 2007 operating expenses went to research and development, but the majority of that went towards upgrading legacy systems; only 3% of their operating revenue actually went towards developing new software. How much of that small slice of the pie went to evolutionary improvements, and how much went to real innovation?

The truth is, the revenue growth of most HIT companies is driven by winning contracts with new hospitals and provider networks, not by creating better software. In the first 9 months of 2011, 67% of Cerner's revenue came from service, support, and maintenance, while only 31% came from software sales. From the perspective of a revenue machine like Cerner, if existing offerings are making a killing and there is still room for growth, why rock the boat? The top 25 HIT companies grossed over $26.7 billion in 2010, but an average of just 22% of that revenue came from software. For these companies, building better software actually erodes revenue over the short term due to the cost of upgrades for existing contracts and cannibalization of existing products! It's a classic Innovator's Dilemma.

 

Unrealized Potential

Software has enormous potential to increase the quality, accessibility, and overall experience of healthcare, but in order to realize that potential, the developers of the software need to realign their focus towards usability, user experience, and innovation instead of short-term revenue growth.

Consider for a moment that the industry-leading EHR solutions lack even a simple feature to let a doctor query a set of their own patients around a certain medical condition or medication, or compare the charts of recently treated patients. The physician literally has to contact the IT department to run queries directly against the database to collect that type of information, and you can forget about reports on really useful stuff like which medications worked best, the average number of visits until diagnosis, and so on.
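
To make the gap concrete, here's a minimal sketch of the kind of query physicians describe wanting to run for themselves. The schema, names, and data below are entirely hypothetical, invented for illustration; they don't correspond to any real vendor's database.

```python
# Hypothetical example: "show me my patients on metformin who also carry a
# type 2 diabetes diagnosis." Schema and data are fabricated for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE patients    (id INTEGER PRIMARY KEY, name TEXT, attending TEXT);
    CREATE TABLE medications (patient_id INTEGER, drug_name TEXT);
    CREATE TABLE diagnoses   (patient_id INTEGER, icd_code TEXT);
    INSERT INTO patients    VALUES (1, 'Jane Doe', 'Dr. Example');
    INSERT INTO medications VALUES (1, 'metformin');
    INSERT INTO diagnoses   VALUES (1, 'E11.9');
""")

rows = conn.execute("""
    SELECT p.name, m.drug_name, d.icd_code
    FROM patients p
    JOIN medications m ON m.patient_id = p.id
    JOIN diagnoses   d ON d.patient_id = p.id
    WHERE p.attending = ? AND m.drug_name = ? AND d.icd_code LIKE ?
""", ("Dr. Example", "metformin", "E11%")).fetchall()

print(rows)  # [('Jane Doe', 'metformin', 'E11.9')]
```

A query like this is trivial for any modern database; the point is that today's EHR front ends don't expose anything like it to the clinician.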

There are some very promising startups in the healthcare space, such as drchrono, Practice Fusion, and One Medical Group, each innovating around one or more of the core problems such as EHR UX, practice management, or the digital patient experience. But who will challenge the Goliaths of HIT like Cerner, McKesson, Epic, and Siemens, which control the lion's share of the market through integrated solutions and contracts with hospitals and large provider networks?

The reality is, the opportunities for innovation are much, much larger than just building usable systems for doctors and patients to manage electronic health information. There is an entire, virtually unexplored realm of possibility that includes data analysis, pattern recognition, and other forms of intelligence derived from data that is already being collected.

 

Connecting the Dots

Picture a system in which doctors could not only run queries to search for patterns among their own patients, but could also use an intelligent software tool to organically discover previously overlooked correlations between medical conditions or diseases, not just within their own practice, but across the larger data sets of provider networks, hospitals, or even anonymized data for entire countries or the whole world. What new relationships between conditions, diseases, medications, and more would we discover if we could harness the power of the data we are already collecting?
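
At its very simplest, that kind of pattern discovery starts with something as mundane as counting which conditions co-occur across an anonymized patient set. Here's a toy sketch with fabricated data; a real system would of course need millions of records and proper statistics to control for confounders.

```python
# Count which pairs of conditions co-occur most often across an anonymized
# set of patient records. Data is fabricated purely for illustration.
from collections import Counter
from itertools import combinations

anonymized_records = [
    {"hypertension", "type 2 diabetes", "sleep apnea"},
    {"hypertension", "type 2 diabetes"},
    {"asthma", "eczema"},
    {"hypertension", "sleep apnea"},
]

pair_counts = Counter()
for conditions in anonymized_records:
    for pair in combinations(sorted(conditions), 2):
        pair_counts[pair] += 1

for (a, b), count in pair_counts.most_common(3):
    print(f"{a} + {b}: {count} patients")
```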

In order to realize the true data-intelligence potential afforded by electronic health information, there needs to be a centralized, secure way to share information between organizations. Even if EHR providers were to magically fix all of the problems with their existing software overnight, we would still be no closer to addressing the fact that the EHR landscape is hopelessly fragmented by data incompatibility and siloed servers. A nationwide or worldwide EHR database is a pipe dream today, but standardization would create an enormous amount of value, and it will happen eventually, though it certainly won't happen overnight.

A paradigm shift looms on the health IT horizon, and it won't remain a pipe dream forever. Imagine an intelligent EHR system that learns about patients and automatically looks for patterns by matching data points against a massive, centralized database. Imagine a patient experience that makes managing personal health information as simple as Mint.com makes managing financial information. Such systems have been envisioned for decades, and they could transform healthcare, but only once a company enters the space that thinks and operates like Netflix, Intuit, Apple, or Amazon, and my guess is it will be a startup, not an incumbent.

Minimum Viable Values

Curebit is under fire after a heated exchange with DHH of 37signals over copying aesthetics, code, and assets from the Highrise website. In the ensuing discussion, some commenters have questioned whether it's such a big deal. After all, copying isn't really stealing, so isn't it OK for a "small startup" to hustle and use "inspiration" to run quick and dirty tests? The answer is no, and here's why:

 

The actions of a business reveal its values

All the small, trivial-seeming actions that a business takes, when viewed in aggregate, are a reflection of its values. If you take a short view on decision making, if you allow the constant tyranny of urgency to drive your actions, if you fail to carefully articulate the values you want your business to be built on, then you run the risk of allowing capricious actions to corrupt the values of your business.

Building a business takes courage, tenacity, and integrity. It requires that you balance the needs and desires of a myriad of stakeholders: customers, employees, investors, and more. A business needs to stand for something, or else it will bend too easily to outside influence and be pulled in a thousand different directions. In order to build a business that's worth a damn, you need a strong foundation of values to build on.

 

Buzzwords are not a substitute for knowing what you're talking about

The "we're lean" excuse is a cancer that we need to excise from the startup community immediately. This is just the most recent and highest profile case of a startup using a concept it clearly doesn't understand as an excuse for cutting corners. The Lean Startup may be the most important challenge to entrenched, traditional management in recent memory, but unfortunately the principles it defines are so widely misunderstood that they have been repeatedly used to defend shoddy work, and at times to argue the complete opposite of the points made in the book. The Lean Startup is a set of guidelines to help in the execution of entrepreneurial ideas; it is absolutely not an excuse for cutting corners in your product or in your business. The concept of a "Minimum Viable Product" has nothing to do with delivering half-assed work; it is simply a strategy for validating ideas with the minimum amount of work. Sometimes that means testing a product concept with a landing page and a sign-up button before writing code. It doesn't mean pushing out half-finished products, or copying others to get things done faster.

 

The bottom line is: ideas are cheap, execution is hard, but there are no shortcuts to building things that matter, especially when it comes to defining what you stand for.

An Inside Look at Kaiser Permanente's Award Winning EHR System

Let's play a game called "Guess which link contains the actual details of my health plan."

Solving that one took clicking through half a dozen dead-end links (some of which literally have no content to display).

Another fun one came up recently when I tried to make an appointment with my doctor after I dislocated my shoulder. When I attempted to schedule a same-day appointment, I got an error message:

That seemed reasonable, so I expanded the search to a week, and got an error message:

Perplexed, I expanded the search to three weeks, and got an error message:

Obviously, I had a time-sensitive problem and needed to be seen, so I called the appointment line, where the operator informed me that my doctor was on vacation for the month, which was obviously why I couldn't schedule the appointment. She then booked me an appointment with another doc, and cheerfully reminded me that in the future I could perform most simple tasks, such as appointment booking, using the online portal. Too bad Kaiser's system doesn't show when a doctor is on vacation, and even if it did, it's not possible (as far as I have been able to discern) to schedule appointments with other doctors without a referral.

Here's one last example of award-winning HIT system design.

Yes, Kaiser, my email changed, so the one you have is the right one. Who writes this stuff?

How Google+ Got its Groove Back

When Google+ first launched, I hustled my way to an early invite so that I could grope the shiny UI touches like Circles and ogle the sexy tech features like Hangouts. But up until now, I haven't been an active user of Google+, preferring to rely on my established networks: friends on Facebook, and news and industry knowledge on Twitter. Then I saw the Google+ Hacker News circle, added myself, and followed the circle, realizing it was an awesome, immediate way to tap into the thoughts of the most interesting network on the web.

Today, however, I realized that the value of the circle had increased exponentially with the launch of Search plus Your World, which adds a layer of magic, it-just-works utility to Google+ that Facebook and Twitter just can't match. Paired with the right network, Google+ now delivers the most powerful and natural curated search out there, surfacing vetted results from your own network (you'll need to be a member of Google+ and be logged in in order to see the change).

The service just rolled out, and Google+ is still a fairly young network, but based on my initial impressions I really think this is a brilliant play that 1) elegantly solves a need lots of users have, 2) takes Google+ beyond a defensive strategy against Facebook and Twitter, and 3) doesn't feel too creepy or big-brotherish.

Facebook's Timeline would be better if it were really a timeline

Facebook’s Timeline is a beautiful evolution of what online sharing has become, and has the potential to redefine the way we preserve and interact with our memories, but only if Facebook gets serious about integrating their most popular existing feature into the Timeline.

Whether we share with friends and family or publish our posts to the public, most of us generate a digital history of what we're reading, doing, and thinking about on a daily basis. New relationships, new jobs, bad days, spontaneous trips, epic parties, major news stories, and all the mundane and magnificent events that make up the beauty and magic of everyday life flow through Facebook and are quietly documented for posterity. The timeline metaphor brilliantly illustrates the twists and turns, the starts and stops of lives lived online, and taps into nostalgia, regret, triumph, and hope.

The flood of rediscovered and reposted content from years past that has filled my news feed as Timeline was switched on for the bulk of my social graph over the past week is a testament to just how much potential Timeline has as another powerful flavor of social opiate to keep users riding the blue dragon, in large part because the transition was seamless and immediately makes so much sense. Timeline apps like music, social news readers, and run trackers add some spice to the experience, but the real meat, and the core of what has triggered the response I've been seeing thus far, is what has been Facebook's most popular feature since it rolled out: photos. The unexpected delight of going through your own timeline and finding all of your trips, friendships, and relationships elegantly summarized through photos is like finding the journal you forgot to keep five years ago, and the new Timeline interface makes the experience easy, intuitive, and enjoyable. There has never been such a frictionless [*cringe* I know, but it's true] way to look back on what you were doing, who you were spending your time with, and who you were in the past.

There’s just one problem: just about all of the dates are wrong. Except in the rare (in past years) cases when someone posted a mobile upload or was extremely diligent about posting photos right away, most of the dates are off by a few days to a week or more, because they reflect the date when the photos were posted as opposed to the date when the photos were taken. This is problematic because the navigation of the timeline revolves around dates, specifically months, so when the pictures from Halloween go up in November rather than October, they’re a bit hard to find. It’s also just kind of a bummer to browse through a Timeline that’s not really a timeline, and find photos from the same event scattered across several days or weeks rather than grouped nicely into one event.

This issue makes me wonder how focused Facebook is on integrating Photos into the experience of this new product (as opposed to, say, integrating apps, apps, and more apps). Unless there are a lot of Facebook users uploading scanned copies of printed photos, Facebook should be able to grab the metadata from uploaded photos and automatically tag the appropriate date (and in some cases, even the location) of each photo. Unfortunately, the photos we've put into the system so far have been stripped of their lovely, delicious metadata, but hopefully Facebook gets its act together quickly and starts using the data that's already available to deliver on the promise of its newest product.
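
For what it's worth, the capture date is usually sitting right there in the file: most cameras embed it in the photo's EXIF tags. Here's a rough sketch of reading it with the Pillow imaging library; the filename is a placeholder, and scanned or metadata-stripped photos will simply come back empty.

```python
# Read a photo's capture date from its EXIF metadata using Pillow (pip install Pillow).
from PIL import Image

def capture_date(path):
    exif = Image.open(path).getexif()
    camera_ifd = exif.get_ifd(0x8769)   # the EXIF sub-IFD, where camera fields live
    return camera_ifd.get(36867)        # 36867 = DateTimeOriginal, e.g. "2011:10:31 19:42:07"

print(capture_date("halloween_party.jpg"))  # hypothetical filename
```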

User experience is ripe for a revolution

I had a lovely chat on Saturday with an elderly couple who were interested in the Kindle I was reading while eating breakfast. What struck me about the exchange was something I have been mulling over for some months, and the conversation was sort of a culmination of an idea that has been slowly building up over time: technology has outpaced the ability of people to use it, and user experience is ripe for a revolution.

The man, a former math teacher, and the woman, a former nurse and staunch library advocate, were interested in how I use the device: do I use it for books, newspapers, magazines, for pleasure, work, school? They asked how much books cost, and how easy it is to put other reading materials onto it. I love the Kindle and am quite an enthusiastic advocate of new technology and shiny gadgets, so I gushed about the battery life, the way it has replaced heavy stacks of books when I travel, the merits of reading on a matte screen versus the back-lit eye strain of the iPad, how cool it is that I can sync the text of articles and blog posts on the web to my Kindle with a simple click of the Instapaper bookmarklet in my web browser or on my iPhone, and how I can put pretty much any text document on the Kindle by simply emailing it.

They were quite intrigued by the promise of the new technology and the merits of a small, lightweight device that can hold thousands upon thousands of books, but their question was: is it easy to use? I thought about it for a second, and then said "No." The Kindle has solved a lot of the problems associated with reading digital content quite elegantly. The screen is nearly as eye-friendly as paper, the battery life is astounding, and the experience of reading is refreshingly simple. However, the process of putting non-Kindle-store content on the device is not user friendly at all. If I didn't spend 60-80 hours a week working on internet-connected devices, devouring tech news via Twitter, Hacker News, and my increasingly neglected Google Reader account, I probably wouldn't have discovered Instapaper, and without Instapaper I don't think I would have realized on my own that I could email documents to my Kindle and download them for free via wifi. Even with the happy accident of stumbling upon instructions for Kindle delivery through Instapaper, it took me a while to figure out that there was a way to do this for free rather than paying for 3G delivery of freely available content. The point is, as good as the Kindle is, it took some technological proclivity and more than a little luck to truly take advantage of the device. The irony is that the Kindle is one of the simplest devices I've ever used, and even it still has some major shortcomings, so what does that say about the broader state of human-computer interaction?

The pace of technological advancement has increased exponentially over the past century; everything from computing power to storage has been essentially doubling every couple of years. Raymond Kurzweil calls this incredible rate of advancement the law of accelerating returns (often confused with Moore's Law, which relates specifically to the doubling rate of transistors on a circuit), and it means that between 1950 and 2000, computing power increased by a factor of roughly 100 million. The pace of technological adoption has also increased radically over the same period. Radio, invented shortly before the dawn of the 20th century, took 31 years to reach mass adoption. 100 years later, the web reached mass adoption in only 7 years. Facebook reached mass adoption in roughly half that time, and Groupon in half again of that, reaching an astonishing 50 million users in just over 2 years.
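
The arithmetic behind that 100-million figure is worth spelling out. Here's a back-of-the-envelope version with an assumed doubling period; these are my rough numbers, not Kurzweil's exact ones.

```python
# Back-of-the-envelope check: if computing power doubles roughly every 1.9 years
# (assumed), how much does it grow between 1950 and 2000?
years = 2000 - 1950
doubling_period = 1.9                  # years per doubling (assumption)
growth = 2 ** (years / doubling_period)
print(f"{growth:.1e}")                 # ~8.4e+07, i.e. on the order of 100 million
```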

However, in the face of this nearly incomprehensible rate of technological advancement and adoption, the way we interact with computers has advanced painfully slowly. The WIMP graphical user interface (windows, icons, menus, pointing device) that dominates computing from cell phones to desktops has remained fundamentally unchanged since its invention over 30 years ago. Evolutionary advancements have been made since, and Apple's iOS has introduced some more natural, post-WIMP interactions outside the box of the prevailing mental model, but these are really baby steps, nothing compared to the staggering advancements made in computing hardware, algorithms, and technological capacity over the same period.

Today's technological landscape frames a unique moment in history. With cheap, abundant computing power, an unprecedented availability of public data, and powerful APIs that allow developers to rapidly build brilliantly advanced applications that would have been prohibitively expensive for all but the largest organizations just 5-10 years ago, even the most advanced technology is becoming a commodity. Perched on the foundation of this vast technological platform, we're standing on the precipice of a paradigm shift that will be even more radical than the shift from the command line interface to the graphical user interface. The opportunity to develop an intuitive computing experience that allows the masses to harness the vast technology at our disposal is so obvious and so enormous that a revolution in user experience seems all but inevitable.

The question is, what will the coming user experience revolution look like? Perhaps it will be driven by the current generation of Natural User Interface explorations, but just as likely it will be something as radically innovative, and in hindsight as obvious, as the WIMP graphical interface was compared to the command line interface that preceded it. As cool as the famous Minority Report interface looks on a movie screen, I'm personally not going to line up anytime soon to trade in my comfy chair and laptop for a big fat case of gorilla arm. Even the now ubiquitous multi-touch interface has some serious limitations when it comes to serious full-time use cases, and multi-touch laptop screens certainly aren't positioned to replace the mouse and keyboard.

In order to be successful, the user experience revolution needs to promote natural and intuitive experiences, humane interfaces, and adaptive technologies that interpret context in order to meet individual users' unique needs. The great challenge will be to build upon the abundant and rich technological platforms that surround us, from machine learning to location data, and to create user-centered applications that learn the user rather than requiring the user to learn them, and anticipate and adapt to the user's needs and expectations rather than demanding that the user conform to a rigid mold defined by the software. 

Whatever it looks like, one thing is certain: there is an immense and immediate opportunity for innovation in user interface and experience design, and companies like Apple, who understand that technology is intrinsically worthless to the end user unless it is usable, stand to profit enormously by making technology easier and more enjoyable to use, not just for geeks like me, but for curious but intimidated individuals like the couple I met this weekend, who stared longingly at me absently flipping the pages of my sleek Kindle with one hand while leisurely eating with the other, as they fumbled clumsily with their unwieldy newspapers.

iPhone vs. BlackBerry: superior UX trumps inferior messaging

As a long-time Apple geek but newly minted iPhone user, I've been spending way too much time this week poking and prodding and petting my shiny new toy. It's not like I've never used an iPhone before, and I've had an iPad since they came out, but in the process of switching to full-time iPhone duty after years of using a BlackBerry, I've been absolutely delighted by the user experience. What's most striking to me is how much I love the experience of using the phone, even though it is obviously inferior to my BlackBerry in some ways.

When it comes to messaging, the iPhone isn't even in the same league as the BlackBerry. RIM's BlackBerry Internet Service push technology syncs email in near real time without destroying battery life, and when it comes to instant messaging, BlackBerry Messenger is in a class by itself. Clearly, BlackBerry's technology trumps the iPhone's in one of the most critical mobile use cases: messaging. The thing is, the rest of the iPhone experience is so good, I just don't care.

The iPhone user experience starts before you open the box. Before you even hold the product in your hand, you have the experience of opening a meticulously crafted box so precise that it doesn't even need the plastic it comes wrapped in to hold together. The level of craftsmanship and care that Apple lavishes upon its products is immediately obvious.

What struck me immediately after the always enjoyable Apple unboxing, was how sleek and solid the iPhone felt in my hand. My old BlackBerry Tour felt cheap, clunky, and unwieldy by comparison.

The next thing that struck me was the subtle magic of the iPhone keyboard. The first line of text that I typed on the iPhone was riddled with errors, but by the second line the error rate had dropped significantly. The keyboard quickly and quietly adapted to my fat fingers, and just worked.

Another nice touch that seems so obvious that it's hard to even notice is how the state of the keyboard locks automatically when you switch from QWERTY to numbers and symbols. One of my pet peeves on my BlackBerry was the outrageously inefficient process of typing a phone number in a text message: alt-6 alt-1 alt-9… it sucked. And yes, I know there is some way to lock keyboard states on the BlackBerry, but I never bothered to figure it out, and that's the crucial difference on the iPhone: I hardly noticed it, and it just worked.

There are dozens, hundreds, of other brilliant UX touches in the iPhone, like obscenely detailed transitions and intuitive message refreshing, that make the device a joy to use and allow me to totally look past the fact that BlackBerry absolutely slays the iPhone when it comes to the backend infrastructure that serves email and messages.

Wash the Dishes When Nobody Else Will

When you're starting a business, you have to wear a lot of hats. You don't have an HR department, or an IT department, or a marketing department, or any specialized department, so if a press release needs to be written, or a networked printer needs to be set up, or you need to find the personal email address of a prominent celebrity chef, you have to figure out how to do it, and do it yourself. When you're starting up you have to take the initiative to identify things that need to be done, and do them, even if they're outside of your “job description” or area of expertise. Even if you don’t know how to do them. Especially if you don’t know how to do them.

Here at InTheMO we've been growing lately, and we're fortunate enough to be staffing specialized roles for things like PR, HR, and quality assurance, but one of my personal goals is to preserve that scrappy sense of startup hustle we all felt back when we were sitting in a cramped, black-widow-infested office in a back alley in Culver City, making calls to Florence at 2am trying to book our first Michelin-starred restaurant shoot.

When Cary Levine, our CEO, talks about owning projects and taking responsibility, he often says, "Operate as if you were the CEO of the company." He says this all the time, and I think the tendency is to assume that being CEO of the company is glamorous: having authority and freedom, being the big dog, sitting at the top of the chain of command.

When I hear Cary say "Be the CEO," I think of a story from when I first started working at InTheMO. Back at our first real office (and I say "real" because before that we were essentially squatting in empty office space), we had a kitchen, but no dishwasher and no cleaning service. A lot of people brought their lunches, cooked in the kitchen, used the dishes, and generally made a huge mess. Everything would just accumulate until it got really out of hand, but somehow, it always got cleaned up. I was an intern at the time, only working 3 days a week, so I just figured everyone kind of pitched in when I wasn't there.

One day, I walked by the kitchen and saw Cary washing the dishes. “Huh” I thought, “The CEO is pitching in. Cool.” I figured it must have been his turn or something, and sort of forgot about it.

A week or so passed, and then it happened again. The kitchen became a ginormous mess, and then one day, there’s Cary, washing dishes. Then it happened again, and again. Each time, he’d look up, and nod, then go back to scrubbing dishes. One day, I walked by the kitchen and noticed it was a huge mess. So I washed the dishes.

Being the CEO of the company isn’t about power, authority, or glamour, it’s about washing the dishes when nobody else will.

Be the CEO of the company. Take responsibility for things that other people ignore. That’s the definition of leadership, and if you make a habit of it, pretty soon you'll be inspiring the people around you to do the same.

Update: great discussion going on at Hacker News

Starting is the Hardest Part

"What is not started today is never finished tomorrow" -Johann Wolfgang Goethe

Why is starting so hard?

Earlier this year, my friend Justin Levine sent out an email about a list of goals that his father, Gary, created early in his career. Young and broke, but determined and ambitious, Gary wrote down detailed personal and professional goals for what he wanted to accomplish in life. Recently, he rediscovered his list of goals, and was able to cross off every goal that he had set for himself 25 years earlier.

Inspired by Gary's story, I set about creating my own list of goals, and realized immediately that the first few things I put down were things I'd wanted to do for years but just hadn't gotten around to starting. A few days later I bought some books on programming and started spending a few hours each night working through the lessons, and at the same time I began thinking about ideas for what is now this blog. These are just beginnings, but taking the first steps made me ask the question: if I've known for years that I wanted to learn to program and write a blog, why was it so damn hard to get started?

 

Inertia 

Sometimes the strongest resistance to starting something new comes from resistance to change. If you have a routine, or a way of doing something that works, it is very difficult to take the first step towards finding something better, because why rock the boat?

For me, it had simply been easier to come home from work and veg out in front of the TV every night, rather than spend a couple of hours working on a personal project.

The thing is, inertia works both ways. An object at rest tends to stay at rest, but an object in motion tends to stay in motion. Now that I've broken that complacency and established a new routine, I can feel myself hurtling along a new path. Now that I've started, continuing seems much easier than it did when I was just thinking about it.

 

Fear of Failure 

One of the most daunting aspects of starting is the possibility of failure. It's pretty easy to talk yourself out of starting something once you ask yourself questions like "What if I'm not capable of this? What if I'm not good at it? What if, what if, what if?"

I'm deeply afraid of failure, a fact that I think stems at least partially from how competitive I am. If I'm not already good at something, every attempt feels like failure, which is so frustrating that I often don't want to do it at all.

What's interesting is that fear of failure only applies to things that I'm not good at. With things that I'm good at, my sentiment towards failure is quite the opposite: I don't fear it at all, in fact I welcome it, even celebrate it.

For example, the best designs I create come out of innumerable failures. The more iterations I throw away, the better the final product inevitably becomes. In fact, with designs for products and interfaces, I am wary of not failing. If I get to a final version too quickly without throwing away enough iterations, I find my design highly suspect and question whether it is in fact as good as I think it is. Conversely, if I have gone through hundreds of iterations, exhausted all possibilities, and considered every scenario possible, I know that it's good, and I am prepared to defend my position. Each failure opens up new ideas that influence the final product, even if it's just knowing what won't work.

The reality is, failing is good, valuable, even essential. The more you fail, the closer you get to succeeding. The more solutions you've found that don't work, the more confident you are when you find the one that does. Rather than fearing failure when starting something, it is much more productive to embrace failure and learn how to fail quickly and efficiently. This is, of course, easier said than done.

 

Insecurity 

Something else that makes starting difficult is the fear that people will judge me or think I am incompetent, stupid, or naive. I tend to expect myself to be an expert at everything, and specifically to be better than other people. As a result I tend to over-analyze how I will look to others, rather than just going for it.

This over-analysis can be absolutely paralyzing. Take starting a conversation with a stranger, for example. The hardest part is the opening. You want to say something clever, witty, or interesting; the question is: what? How many conversations have I failed to start because I didn't have the perfect opening line? The irony is how easy conversations become once they start. The first response is infinitely easier than the opening, the next even easier than that, and at some point the conversation becomes natural and takes on a life of its own. Who cares if you start with a stupid comment or a boring observation? Getting started is the important part.

Such is the nature of starting anything. When I first started coming up with ideas for client projects, it would take me days, even weeks, to produce a viable idea. I would avoid saying anything until I thought I had something amazing, for fear of looking stupid. In the process I would throw away lots of ideas that were probably pretty good, or at least good enough to consider. Now, little more than a year later, I can spitball ideas on the fly, come up with dozens of possibilities, and chase each one in many different directions without being embarrassed when I reach a dead end or realize that one of the ideas won't work. I'm secure enough in my creative abilities and technical knowledge that I'm no longer afraid of small failures, because I know that in the end I can figure it out and make it work.

It's hard not to feel insecure when starting, because of the uncertainty. Experience grants you the power to know, or at least feel very confident, that you can get things done, but when you're starting something new you don't have that luxury. You're forced to leave your comfort zone and step into an uncertain future. Over time you gain confidence and become more comfortable putting yourself out there, because you're secure in your abilities.

 

Fear of Commitment 

Another thing that makes starting hard is committing to something without being sure I will succeed. You can't truly start something without committing to seeing it through to the end; otherwise you're not giving yourself a fair shot at it.

When I decided in 2008 that I was going to run my first marathon, I made a commitment that had a significant impact on my life for a year. I couldn't say "I'm going to run a marathon and then decide later if it is something I want to pursue long term." I had to commit to it fully and without reservation from the very beginning, because the moment I let myself lapse on the training, I would have already quit.

When you're starting something new you have two choices: honestly and truthfully commit to giving it everything you've got, or quit immediately. This is terrifying, because it makes you vulnerable to giving it everything you've got and simply not being able to do it. Starting requires a great deal of faith in yourself: you have to trust that you are capable of accomplishing your goal.

 

What Haven't You Started? 

A few months ago I was talking to Gary about business ideas and he said "Opportunity is everywhere, the hard part is recognizing it and choosing which ones to answer."

A blank page is only a few thousand unwritten words away from being an eloquent blog post. A stranger is only a few unspoken sentences away from being a friend. Your body is only a few hundred un-run miles from being ready to run a marathon. Your startup idea is only a dozen years of hard work away from being a billion dollar business. The truth is, opportunity is everywhere, and endings are easy to see; the hard part is committing to pursue an opportunity, and starting.

So, the question is, what haven't you started?