Mobile

The future of computing is mobile, they say, and they’re not referring to that lovely city in Alabama. Being a fan of smartphones and their UI design, I’ve considered myself mostly in the loop with where things were going, but recently it’s dawned on me I might as well have been lodged in a cave for a couple of years, only to just emerge and see the light. The future of computing is more mobile than I thought, and it’s not actually the future. (It’s now — I hate cliffhangers).

I had a baby a few years ago. That does things to you. As my friend put it, parenthood is not inaccurately emulated by sitting down and getting up again immediately. It’s a mostly constant state of activity. Either you’re busy playing with the child, caring for it, planning for how you’re going to care for the child next, or you’re worrying. There’s very little downtime, and so that becomes a valuable commodity in itself.

One thing parents do — or at least this parent — is use the smartphone. I put things in my calendar and to-do list because if it’s not in my calendar or to-do list I won’t remember it. I don’t email very much, because we don’t do that, but I keep Slack in my pocket. I take notes constantly. I listen to podcasts and music on the device, I control what’s on my television with it, and now I’m also doing my grocery shopping online. (I’d argue that last part isn’t laziness if you have your priorities straight, and having kids comes with an overwhelming sense that you do — it’s powerful chemistry, man.)

So what do I need my laptop for? Well, I’m an interface designer, so I need a laptop for work. But when I’m not working I barely use it at all. To put it differently, when I’m not working I don’t need a laptop at all, and if I were in a different business I’d probably never pick one up. There’s almost nothing important I can’t do on my phone instead, and oftentimes the mobile experience is better, faster, simpler. By virtue of there being less real estate, there’s just not room for clutter. It requires the use of design patterns and a focus on usability like never before. Like a sculptor chipping away every little piece that doesn’t resemble the subject, a good mobile experience has to simplify until only what’s important remains.

It’s only in the past couple of years that the scope of this shift has become clear to me, and it’s not just about making sure your website works well on a small screen. Computers have always done what they were told, but the interaction has been shackled by a lack of portability and obtuse interfacing methods. Not only can mobile devices physically unshackle us from desks, but their limitations in size and input have required the industry to think of new ways to divine intent and translate your thoughts into bits. Speaking, waving, swiping, tapping, throwing and winking all save us from repetitive injuries, all the while being available to us on our own terms.

I’m in. The desktop will be an afterthought for me from now on — and the result will probably be better for it. I joked on Twitter the other day about watch-first design. I’ve now slept on it, and I’m rescinding the joke part. My approach from now on is tiny-screen first, then graceful scaling. Mobile patterns have already standardized a plethora of useful and recognizable layouts, icons and interactions that can benefit us beyond just the small screen. The dogma of the desktop interface is now a thing of the past, and mobile is heralding a future of drastically simpler and better UI. The net result is not only more Instagrams browsed, it’s more knowledge shared and learned. The fact that I can use my voice to tell my television to play more Knight Rider is just a really, really awesome side-effect.

The Plot To Kill The Desktop

As a fan of interface design, I’ve always found operating systems — Android, iOS, Windows — a tremendous point of fascination. We spend hours in them every day, whether cognizant of that fact or not. And so any paradigm shift in this field intrigues me to no end. One such paradigm shift appears to be happening now: the phasing out of the desktop metaphor, the screen you put a wallpaper and shortcuts on.

Windows 8 was Microsoft’s bold attempt to phase out the desktop. Instead of the traditional desktop being the bottom of it all — the screen beneath all of your apps, which you would get to if you closed or minimized them — there’s now the Start screen, a colorful bunch of tiles. Aside from the stark visual difference, the main difference between the traditional desktop and the Start screen is that you can’t litter the latter with files. You’ll have to either organize your documents or adopt the mobile pattern of not worrying about where files are stored at all.

Apple created iOS without a desktop. The bottom screen here was Springboard, a sort of desktop-in-looks-only, basically an app launcher with rudimentary folder support. Born this way, iOS has had pretty much universal appeal among adopters. There was no desktop to get used to, so no lollipop to be taken away. While sharing files between apps on iOS is sort of a pain, it hasn’t stopped people from appreciating the otherwise complete lack of file management. I suppose if you take away the need to manage files, you don’t really need a desktop to clutter up. You’d think this was the *plan* all along. (Italic text means wink wink, nudge nudge, pointing at the nose, and so on.)

For the longest time, Android has tried to offer the best of both worlds. The bottom screen of Android is a place to see your wallpaper and the apps pinned to your dock. You can also put app shortcuts and even widgets here. Through an extra tap (so not quite the bottom of the hierarchy) you can access all of your installed apps, which, unlike on iOS, have to be put on your homescreen manually if so desired. You can actually pin document shortcuts here as well, though it’s a cumbersome process, and as with iOS you can’t save a file there. Though not elegant, the Android homescreen works reasonably well and certainly appeals to power users with its many customization options.

Microsoft and Apple both appear to consider the desktop (and file-management as a subset) an interface relic to be phased out. Microsoft tried and mostly failed to do so, while Apple is taking baby-steps with iOS. If recent Android leaks are to be believed, and if I’m right in my interpretation of said leaks, Android is about to take it a step beyond even homescreens/app-launchers.

One such leak suggests Google is about to bridge the gap between native apps and web apps, in a project dubbed “Hera” (after the mythological goddess of marriage). The mockups posted suggest apps are about to be treated more like cards than ever. Fans of WebOS[1] will quickly recognize this concept fondly.

The card metaphor that Android is aggressively pushing is all about units of information, ideally contextual ones. The metaphor, by virtue of its physical counterpart, suggests the card holds a finite amount of information, after which you’re done with it and can swipe it away. Like a menu at a restaurant, it stops being relevant the moment you know what to order. Similarly, business cards convey contact information and can then be filed away. Cards as an interface design metaphor are about divining what the user wants to do and grouping the answers together.

We’ve seen parts of this vision with Android Wear. The watch can’t run apps and instead relies on rich, interactive notification cards. Android phones have similar (though less rich) notifications, but are currently designed around traditional desktop patterns. There’s a homescreen at the bottom of the hierarchy, then you tap in and out of apps: home button, open Gmail, open email, delete, homescreen.
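
For the technically curious, here’s roughly what such a card looks like from the developer’s side. This is a minimal sketch using the Android support library’s WearableExtender; the ReplyActivity, icon and message text are made-up placeholders, not anything from Google’s actual apps:

```java
import android.app.PendingIntent;
import android.content.Context;
import android.content.Intent;
import android.support.v4.app.NotificationCompat;
import android.support.v4.app.NotificationManagerCompat;

public class WearCard {
    // Posts a notification that surfaces on the watch as a swipeable,
    // actionable card: a task to act on or dismiss, with no app to open.
    public static void postCard(Context context) {
        // Hypothetical ReplyActivity and ic_reply drawable, for illustration only.
        PendingIntent replyIntent = PendingIntent.getActivity(
                context, 0, new Intent(context, ReplyActivity.class), 0);

        NotificationCompat.Action reply = new NotificationCompat.Action.Builder(
                R.drawable.ic_reply, "Reply", replyIntent).build();

        NotificationCompat.Builder builder = new NotificationCompat.Builder(context)
                .setSmallIcon(R.drawable.ic_reply)
                .setContentTitle("New message")
                .setContentText("Lunch on Sunday?")
                // The wearable extender is what turns this into a rich card
                // with its own action page on the watch.
                .extend(new NotificationCompat.WearableExtender().addAction(reply));

        NotificationManagerCompat.from(context).notify(1, builder.build());
    }
}
```

The same notification degrades gracefully to a plain one on the phone, which is rather the point.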

I think it’s safe to assume Google wants you to be able to do the same (and more) on an Android phone as you can on an Android smartwatch, and not have them use two widely different interaction mechanisms. So on the phone side, something has to give. The homescreen/desktop, perchance?

The more recent leak suggests just that. Supposedly Google is working to put “OK Google” everywhere. The little red circle button you can see in the Android Wear videos will, when invoked, scale down the app you’re in and show it as a card you can apply voice actions to. Presumably the already expansive list of Google Now commands would also be available; “OK Google, play some music” to start up an instant mix.

The key pattern I take note of here is the attempt to de-emphasize individual apps and instead focus on app-agnostic actions. Matias Duarte recently suggested that mobile is dead and that we should approach design by thinking about problems to solve on a range of different screen sizes. That notion plays exactly into this. Most users probably approach their phone with particular tasks in mind: send an email, take a photo. Having to tap a home button, then an app drawer, then an app icon in order to do this seems almost antiquated compared to the slick Android Wear approach of no desktop/homescreen, no apps. Supposedly Google may remove the home button, relegating the homescreen to simply another card in your multi-tasking list. Perhaps the bottom card?
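
Android’s implicit intents already sketch this action-first model in miniature: you declare the task and let the system route it to whatever can handle it. A rough illustration, assuming nothing beyond the stock SDK (the email address is of course made up):

```java
import android.app.Activity;
import android.content.Intent;
import android.net.Uri;
import android.provider.MediaStore;

public class ActionFirst {
    // "Send an email": declare the action and let the system pick the app;
    // no hunting for an app icon first.
    public static void sendEmail(Activity activity) {
        Intent email = new Intent(Intent.ACTION_SENDTO,
                Uri.parse("mailto:someone@example.com")); // placeholder address
        email.putExtra(Intent.EXTRA_SUBJECT, "Hello");
        if (email.resolveActivity(activity.getPackageManager()) != null) {
            activity.startActivity(email);
        }
    }

    // "Take a photo": same idea, different verb.
    public static void takePhoto(Activity activity) {
        Intent capture = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
        if (capture.resolveActivity(activity.getPackageManager()) != null) {
            activity.startActivityForResult(capture, 1);
        }
    }
}
```

Generalize that from “pick an app for the action” to “skip the app entirely” and you’re most of the way to the card model.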

I’ll be waiting with bated breath to see how successful Google can be in this endeavour. The homescreen/desktop metaphor represents, to many people, a comforting starting point. A 0,0,0 coordinate in a stressful universe. A place I can pin a photo of my baby girl, so I can at least smile when pulling out the smartphone to confirm that, in fact, nothing has happened since I last checked five minutes ago.

  1. Matias Duarte, current Android designer, used to work on WebOS.

Good Decisions, Else Options

There’s a mantra in the WordPress development community: decisions, not options. It’s meant to be a standard to which you hold any interface design decision: making a decision for users is ultimately better than forcing them to make it themselves. It’s a decent mantra — if you’re not mindful you’ll end up with feature creep and UI complexity, and it’s orders of magnitude more difficult to remove an old option than it is to add one in the first place. Adding an option instead of making a decision for the user is almost always bad UI design.

Except when it’s not.

The problem with a mantra like this is that it quickly gets elevated to almost biblical status. In the hands of a disgruntled developer it can shoot down just about any initiative. It’s like Godwin’s law for WordPress: once you drop the “decisions, not options” bomb, rational discussion comes to a halt.

The thing about open source development is that it’s much like natural evolution: it evolves and adapts to changes in the environment. Unfortunately, that also means features that were once useful can become vestigial when the problem they used to solve goes away. Baggage like this can pile up over the years, and maintaining backwards compatibility means it can be very difficult to get rid of. Sure, “decisions, not options” can help prevent some future baggage from accruing, but it’s not a blanket solution.

The problem is that sometimes the right decision is infeasible, beckoning an option in its absence. WordPress is many things to many people. Some people use it for blogging; others use it for restaurants, portfolios, photo galleries, intranets, you name it. Every use case has its own set of needs and workflows, and it’s virtually impossible to make a stock experience that’s optimal for everyone. Most bloggers would arguably have a better experience with a slew of WordPress features hidden or removed, whereas site owners might depend on those very same features for dear life. By catering to many use cases at once, user experiences across the board are bound to be unfocused in some way or other.

The “Screen Options” tab is an example of a feature that would probably not exist were “decisions, not options” taken at face value. Screen Options exists on the premise that not everyone needs to see the “Custom Fields” panel on their Add New Post page, yet acknowledges that some users will find that feature invaluable. It is an option added in the absence of a strong decision (for example, a decision to limit WordPress to being a blogging-only tool). I consider it an exception to the mantra for the good of the user. Sure, the UI could use some improvement (let’s work on that), but I really appreciate the ability to hide the “Send Trackbacks” panel.

I’m a fan of WordPress. I’m a fan of good decisions, and I’m a fan of good UI design. I believe that if we relieve ourselves of arbitrary straitjackets and approach each design objective with a sense of good taste and balance, we can make excellent open source software. Cauterizing entire avenues of UI simply because they add options, however, ignores the fact that sometimes those options exist in the absence of a higher-up decision that just can’t be made, for whatever reason.

Jony’s iOS 7

Back in October last year, Scott Forstall was replaced by Jony Ive, and I asked the question: did iOS just get interesting again? Last night we found out, and the answer is yes.

[Image: iOS 7 design hero screen]

There’s a lot to like about the new iOS 7. As a whole, the result looks mostly unique. There’s a nice clean aesthetic going with the thin Helvetica, the white UI chrome, the sandblasted layers and the almost complete absence of gaudy textures. It’s also colorful. Which is a good thing. Right?

Leading up to this there were jungle-drums touting how flat the new UI was going to look (as though every UI will suddenly be clean and uncluttered if you just run it over with a bulldozer). Fortunately that’s not what happened. Don’t get me wrong: I do like my UIs to be clean and simple, I just find the term “flat” to be mostly meaningless when applied to design. There are no magic bullets, there’s only good design and bad design, and I think Jony Ive gets that. So instead of trumpeting flat, Apple trumpeted true simplicity. Oh, and grid-based icons:

Sure, there’s certainly a grid there. I was mostly paying attention to the light source for those gradients, though: why does Phone look embossed while Mail looks inset? Also: Game Center? Again?

There will be no tears shed for the linen texture. I will not mourn the loss of green felt. Still, the new iconography alone makes iOS 7 such a departure that there’s bound to be some learning curve, which raises the question: why didn’t they go further while they were at it?

They had a real opportunity here. Jony could’ve said to his team:

Team! We’ve dominated the smartphone market for the last 5 years with a grid of round-rect icons. How do we re-think it from the ground up for the next decade? How do we create something that’ll make Samsung scramble to copy us again?

Perhaps they did just that. Conceivably they created giant mood boards. Maybe they decorated hip little cubicles with smiling model faces and photos of subway signs and collages of differently colored Post-it notes. Could be they brainstormed all the places they see the mobile space going in the next ten years: creepy glasses, holographic watches, voice-controlled smart underwear. No doubt they considered the convergence of the cloud with all these new-fangled features. Perchance they arrived right back at a grid of icons: Eureka! We had it right all along!

[Image: car integration icon]

I hope that’s not the case. I hope they had grander ideas… post-smartphone ideas. I’m hoping they were just so laser-focused on shipping on time that they had to punt their ideas for replacing Springboard. I’m hoping Jony felt the most important thing was to uproot the old linen-clad ways and set out a strong new direction for all future Apple UIs. I want to believe.

I want to believe that maybe one day we’ll have smartphones whose strongest visual cues aren’t defined by the graphical prowess of 3rd party icon designers. I want to believe that maybe one day we’ll look back at websites that use confirm() to alert us of their mobile apps as a dark age. I want to believe that maybe one day it’ll be possible to avoid all social interaction in a manner more impressive than tapping in and out of apps. Is that so much to ask?

Did iOS just get interesting again?

Scott Forstall, head of iOS and apparent fan of skeuomorphism, has been booted out of Apple. Jony Ive, designer of beautiful hardware and previous critic of Apple’s software interface design, takes over:

Amazingly, it’s said that Forstall’s coworkers were so excited to show him the door that they volunteered to split up his workload — Eddy Cue takes on Siri and Maps while OS X’s Craig Federighi gets iOS. And Ive, who has cemented his reputation as a legendary industrial designer over his two-decade Apple career, gets the opportunity to refresh an iOS user experience that has stagnated over the last several generations.

It’s no secret that I’m not a fan of the current iOS design philosophy, but I’m a huge fan of Jony Ive’s design sensibilities (even if he sometimes takes them too far, like the half-sized arrow keys on the keyboards). Now I’m suddenly excited to see iOS 7.

Reimagining the omnibar

Sean Whipps reimagines Google Chrome’s omnibar. The result is super clean and minimal, and I certainly like it. What I’ve come to learn, however, is that we’re in a state of interface design flux. We’re moving away from the keyboard and towards touchscreen-friendly huge buttons and voice recognition — both of which should be welcome to sufferers of RSI. That future doesn’t look quite like the keyboard- and command-line-friendly future Jef Raskin imagined.