iPhone switch observations, just a few days in

A few days ago, I made a temporary switch to an iPhone 5C as my daily driver, just to get it under my skin. Here are some observations I’ve made so far:

  • Man, there are a lot of passwords to type in on a new phone.
  • The fact that I have to type in my password in the App Store all the time, even for free apps, is driving me crazy. I know the fingerprint scanner makes this a non-issue, but it still seems sub-optimal that there isn’t even an off-by-default option to not ask for passwords.
  • The camera… Even on this two-year-old tech, it takes better photos than most Android cameras I’ve used.
  • The 3rd-party keyboard implementation is so janky it’s almost not worth using a sliding keyboard. And on the stock keyboard, the letters are shown capitalized even when what they output is not. That has to be the last vestigial skeuomorph in the ecosystem.
  • The physical mute switch is a stroke of genius, especially when the Settings > Sounds > Vibrate on Silent option is unchecked.
  • The decision not to let me pick a default browser to open links in feels completely arbitrary and archaic, especially since some apps, like Digg Reader, implement workarounds to give you the choice.
  • The app situation is good in this ecosystem.
  • Notifications aren’t great. Clearing them even less so.
  • I miss the permanent Android back button in the bottom left. It seems every back button in the system is different. Some screens, but crucially not all, let you swipe in from the left edge to go back. I bet this is an issue on the 6+.
  • I’ve missed this small form factor. Imagine if they removed all the bezels to make the screen larger; I bet they could fit a 4.5-inch screen without an increase in size.

Switching to iPhone for a bit

I’ve been a fan of Google’s products ever since I switched from AltaVista. So it felt like a natural fit to get an Android device back in the day when it was time to upgrade from my dumbphone, and I’ve been on Android ever since. I wrote about ecosystems a while ago, and the ecosystem is exactly what’s kept me there: you sign in to your phone with your Google account, and mail, calendar, notes, contacts and photos sync automatically. Also, there’s a really great maps application.

In my day job I make web apps that have to work on mobile first, and iOS is an important platform for me to know. I’m no stranger to it: it’s the phone I bought for my wife and recommended to my dad, we have an iPad, and I’ve used an iPhone for testing for years. But I feel like something special happens when you make a conscious switch to a platform and make it your daily driver. Phones have become such utterly personal devices: they’re always with us, and we invest ourselves in them. Unless I jump in fully, I have a feeling there’s some bit I’m missing.

So starting today I’m an iPhone user. No, I wouldn’t call this a switch — call it a “soak test”. I fully expect to switch back to Android — I’m actually eyeing a Moto X 2014. That is, unless the experience of investing myself fully in the iPhone is so compelling that I have no desire to go back, which is entirely possible. I won’t know unless I give it a proper test. Since I’m in the fortunate position to be able to make this switch, there’s no good reason not to. I’ll be using my white iPhone 5C testing device. I expect to be impressed by the camera, and to enjoy the jank-free fluidity of the OS, even if I’ll probably turn off extraneous animations. I’m curious how much I’ll enjoy the homescreen and its lack of customizability compared to Android, and I can’t wait to see if the sliding keyboards in the App Store are as good as they are on Android. I should have some experiences to share on this blog in a month or so. Let me know any apps you want me to try!

Archive, Don’t Delete

I’m one of the lucky … actually I have no idea how many or few have Google Inbox. In any case, I was graciously sent an invite, and have been using it on the web and on my Android phone since then. I love almost everything about it. I particularly love the fact that Inbox seems to be able to divine what archetype an email has. Is it spam? Don’t show it to me. Is it travel-related? Bundle it up. Same with purchases, social network notifications, promos, etc. It even does a good job of prioritizing each bundle, and only showing notifications when it thinks it’s urgent — configurable of course. It’s pretty great.

I don’t love how hard it is to delete an item. You have to dive deep into an overflow menu on a particular email to find the “Trash” button. I wish it were more easily accessible — I don’t know man, I guess I’m a deleter. I remember buying a 320 MB hard drive called “Bigfoot” because it was so humongous, but even then I had to manage my space in order to fit everything. So I can’t help but feel like this is a generational issue, and I’m now a relic of the past. It had to happen eventually, and I’m getting a really strong vibe that the ceremonial burial of the trash button was very much intentional. It’s behaviorism: teaching you not to delete, because archiving is faster and safer.

The crux of the Inbox app is its embrace of the idea that an email is a task. This runs contrary to the very popular notion that you should keep those two paradigms as separate as you can, so it’s very interesting to see Google leaning into it. Combined with their concept of “bundles”, I think they’ve made it work.

Let’s walk through it: it’s Monday morning and you’ve just arrived at the office to open up your email. You received a couple of promos from Spotify and Amazon in one bundle, an unbundled email from mom, 9 bundled Facebook notifications, and two shipping notifications in another bundle. The one email worth looking at is immediately obvious, so you can either tap “Done” on the “Promos”, “Purchases” and “Social” bundles to end up with only the one email, or you can pin mom’s email and tap the “Sweep” button. Everything but the email that needs your attention is archived and marked “Done”, and it took seconds.

This is how Inbox is supposed to work. You archive tasks you’re done with, you don’t delete. If something important did happen to be in one of the tasks you quickly marked done, it’s still there, accessible via a quick search. If you get a lot of email, I really do believe that embracing Inbox will take away stress from your daily life. All it asks is that you let go of your desire to manage your archive. You have to accept that there are hundreds of useless Facebook notification emails in your archive, emails you’d previously delete. It’s okay, they’re out of sight, out of mind, and no you won’t run out of space because of them. Checking 9 boxes and then picking the delete button, as opposed to simply clicking one “Done” button — the time you spend adds up, and you need to let go.

I know this. I understand this. As a web designer myself, I think there are profound reasons for hiding the delete button. It’s about letting machines do the work for you, so you can do more important things instead, like spending time with your family. It’s the right thing to do. And I’m not quite ready for it yet. Can I have the trash button be a primary action again, please, Google?

Atheism is not a religion

Every once in a while, the topic of religion (or lack thereof) comes up in discussion among my friends and me. I often try to explain what atheism is, or rather what it isn’t, and almost like clockwork it comes up: “sounds a lot like religion.” It’s an innocent statement, but it also means my explanation failed yet again. It’s a rousing topic full of nuance and passion, no matter the religion, agnosticism, or atheism of the participants. And it fascinates me precisely because it’s supposed to be simple! After all, it’s just semantics:

atheism, noun
disbelief or lack of belief in the existence of God or gods

religion, noun
the belief in and worship of a superhuman controlling power, especially a personal God or gods.

Clearly just by looking at the dictionary, one seems incompatible with the other. All the delicious nuance stems from the fact that the term “god” is part of both definitions.

Quick intermezzo before we get into the weeds: I have many friends of a multitude of different religions, people whom I love and respect deeply. I’m not here to take anyone’s faith away. This is not about whether religion is a force for good or not; there are far more intelligent debates to be had elsewhere. I just like discussing semantic nuances.

What makes it so difficult to pin down is the fact that atheism is really just a word in the dictionary. We’re not even very protective of such words, so we change their meaning from time to time. New information comes to light! The term evolves and mutates and comes to include more meaning still. Looking broadly, though, the definition of atheism forks in two general directions. One direction has it defined mainly as a disbelief in a god or gods, while the other considers it a lack of belief in a god or gods. Did you catch the difference between the two? It’s quite subtle, yet substantial.

Disbelief means you believe there are no gods. You’ve put two and two together, and decided hey — it just doesn’t make sense to me. This is unlike religion in a couple of obvious ways, first of all in the fact that there’s no holy text that describes what you must or must not believe. There’s no promise of an afterlife or lack thereof if you don’t, err, not believe in god. There’s no codex of laws you have to follow to be a “true” atheist. And there are no places you can go to meet other atheists to, uh, not pray with. (Actually you can still say a prayer if you want to; it’s not like The Atheist Police comes knocking on your door if you do.)

The absence of belief, on the other hand, is a bit trickier to pin down. If for whatever reason you never learned about god, well, then you are without belief in god. How could you believe in something you’ve never heard of? Take my daughter, for instance. She’s 3, and she’s only been talking for the past year or so. I don’t think anyone has told her about religion, not that I know of at least. So she is, by definition, without belief in god. Literally atheos — Greek for “without god(s)”. It wasn’t her choice; how could she even make one? I’m not even sure she’d understand what I was talking about if I tried — she’d probably ask for her juicebox and crayons. From this perspective, being an atheist is, in many ways, a default position. It’s what you’re born as. Even if you later in life find solace and happiness in religion, until you found that religion you were, for all intents and purposes, an atheist. There’s no shame in that, it’s just a word.

I half expect some readers (thanks for reading 737 words discussing semantics by the way) to ask me: why so defensive, are you sure you’re not describing a religion? Sure, once in a while you’ll encounter someone who takes their atheism so seriously it borders on being a religious experience for them. But that’s fine, they can call themselves atheists too. It’s not like you get a badge at the door. Atheism isn’t organized behind a hashtag, and it’s not about ethics in games journalism.

You are an atheist until you choose not to be, and there’s room for all of us.

The Old World Display

Maybe a decade ago, a web designer friend of mine told me a classic “client from hell” story. The details have since become fuzzy, but the crux of the story revolved around a particular design the client wouldn’t approve. There was this one detail that was off, a particular element that just wouldn’t center properly in the layout (or so the client insisted). Thankfully the client had come up with a seemingly simple fix: just draw half a pixel! Who would’ve guessed that just a decade later, the Apple Retina display would herald the arrival of exactly that: the halfpixel?

While the term “retina” is mainly marketing chatter meant to imply that you can’t see the pixels on the screen, it’s not just about making the display arbitrarily high resolution. In fact, it’s pixel-doubling. The 1024×768 iPad doubled to 2048×1536 when going retina, and while the implicit goal was clarity and crispness, the exact doubling of each screen dimension means UI elements scale perfectly: 1 pixel simply becomes 4 pixels. It’s quite brilliant.
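To make that arithmetic concrete, here’s a minimal sketch in TypeScript using the browser’s standard window.devicePixelRatio property; the numbers are just the iPad figures from above:

```typescript
// On a 1x "old world" display devicePixelRatio is 1; on a 2x retina display it's 2.
const dpr = window.devicePixelRatio;

// The iPad's logical canvas, in CSS pixels: the size designs are laid out against.
const logicalWidth = 1024;
const logicalHeight = 768;

// The physical canvas the screen actually paints, in device pixels.
const physicalWidth = logicalWidth * dpr;   // 2048 on a retina iPad
const physicalHeight = logicalHeight * dpr; // 1536 on a retina iPad

// Each logical pixel covers dpr x dpr device pixels, so 1 pixel becomes 4 at 2x.
console.log(`${physicalWidth}×${physicalHeight}, 1 logical pixel -> ${dpr * dpr} device pixels`);
```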

There’s only one pitfall: for a designer, it’s all too easy to get carried away with quadruple the canvas. In the plebeian resolutions of yore, tiny UI elements had little room for error: icons had to be perfectly aligned to whole pixels, and ideally their stem weights would be whole pixel values. The atom of the display — the pixel — was the tiniest unit of information possible, and if it wasn’t mindfully placed it would blur and artifact under heavy antialiasing. In The Old World we had a term for this: pixel-perfect.

However, inside the retina display lives the halfpixel siren, and her song is alluring. She’ll make you forget about The Old World. She’ll draw tiny, tiny pixels for you, and she’ll make your designs feel pixel-perfect even when they’re not. It’s an irresistible New World.

David Pierce reviewing the new iMac 5K for The Verge:

I drove an Audi and never looked at my Saturn the same way again. Remember the first time you used a capacitive touchscreen, threw your 56k modem out the window and switched to broadband, or switched from standard-def TV to 1080p?

It only took about ten minutes of using Apple’s new iMac with Retina display to make me wonder how I’m ever supposed to go back. Back to a world where pixels are visible on any screen, even one this big.

It’s a good life in The New World. It’s a good life here in the first world. It’s so true: no one should have to endure the pin-pricking misery of looking at old-world 1x screens! (Actually, my 35-year-old eyes aren’t good enough to see pixels on my daughter’s Etch A Sketch, but I can still totally empathize with how completely unbearable the pain must be.)

It gets worse — are you sitting down? Here it comes: most screens aren’t retina. One wild guess puts the population of non-retina desktop users (or old-worlders as I call them) at 98.9%.

That’s not good enough, and it’s time to fight for the halfpixel. We’re at a fateful crucible in history, and I can see only two possible paths going forward. Either we start a “retina displays for oil” program to bring proper high resolutions to the overwhelming majority of people who even have computers, or we just have to live with ourselves knowing that these old-worlders will have to endure not only disgustingly low resolutions, but also all of the extra blur and artifacts that will result from future computer graphics being designed on retina screens and never even tested for crispness on 1x screens¹.

Oh well, I suppose there’s a third option. I suppose we can all wake up from this retina-induced bong haze and maybe, just once in a while, take one damn look at our graphics on an old-world display.

  1. Hey Medium… We need to talk.  

The One Platform Is Dead

I used to strongly believe the future of apps would be rooted in web technologies such as HTML5. Born cross-platform, they’d be really easy to build, and bold new possibilities were just around the corner. I still believe web apps will be part of the future, but recently I’ve started to think it’s going to be a bit more muddled than that. If you’ll indulge me, the explanation will be somewhat roundabout.

The mobile era in computing, more than anything, helped propel interface design patterns ahead much faster than decades of desktop operating systems did. We used to discuss whether your app should use native interface widgets or whether it was okay to style them. While keeping them unstyled is often still a good idea, dwelling on it would be navel-gazing, as it’s no longer the day-and-night indicator of whether an app is good or not. In fact, we’re starting to see per-app design languages that cross not only platforms but codebases too. Most interestingly, these apps don’t suck! You see it with Google rolling out Material Design across Android and web apps. Microsoft under Satya Nadella is rolling out their flatter-than-flat visual language across not only their own Windows platforms, but iOS and Android as well. Apple just redesigned OS X to look like iOS.

It feels like we’re at a point where traditional usability guidelines should be digested and analyzed for their intent, rather than taken at dogmatic face value. If it looks like a button, acts like a button, or both, it’s probably a button. What we’re left with is a far simpler arbiter for success: there are good designs and there are bad designs. It’s as liberatingly simple as not wearing pants.

dogma (noun)
a principle or set of principles laid down by an authority as incontrovertibly true

The dogma of interface design has been left by the wayside. Hired to take its place is a sense of good taste. Build what works for you and keep testing, iterating and responding to feedback. Remembering good design patterns will help you take shortcuts, but once in a while we have to invent something. It either works or it doesn’t, and then you can fix it.

It’s a bold new frontier, and we already have multiple tools to build amazing things. No single technology or platform will ever “win”, because there is no winning the platform game. The operating system is increasingly taking a back seat to the success of ecosystems that live in the cloud. Platform install numbers will soon become a mostly useless metric for divining who’s #winning this made-up war of black vs. white. The ecosystem is the new platform, and because of that it’s easier than ever to switch from Android to iOS.

It’s a good time to build apps. Come up with a great idea, then pick an ecosystem. You’ll be better equipped to decide what type of code you’ll want to write: does your app need only one platform, multiple platforms, or should it be cross-platform? It’s only going to become easier: in a war of ecosystems, the one that’s the most open and spans the most platforms will be the most successful. It’ll be in the interest of platform vendors to run as many apps as possible, whether through multiple runtimes or just simplified porting. It won’t matter if you wrote your app in HTML5, Java, or C#: on a good platform it’ll just work. Walled gardens will stick around, of course, but it’ll be a strategy that fewer and fewer companies can support.

No, dear reader, I have not forgotten about Jobs’ Thoughts on Flash. Jobs was right: apps built on Flash were bad. That’s why today is such an exciting time. People don’t care about the code behind the curtain.

If it’s good, it’s good.