Parenthood is a club. Not necessarily a prestigious or elitist club, just a club. Like nerds who are really into Settlers of Catan, parents share a profound, mystical understanding. It’s not that non-parents aren’t welcome in this club; it’s just that, like the Matrix, you cannot be told what it is like. You have to experience it for yourself. To clarify, by “parents” I also mean surrogate moms and dads, adopters, and sure, even pet-owners. It’s not about the blood; it’s about taking on responsibility for a life other than your own.
It’s the little things that set parents apart. Like spotting a passing baby-carriage and quieting down as you pass it by. It’s having been desensitized to diaper jokes. It’s going to bed early and honestly looking forward to the morning coffee at 05:19. Mostly, it’s carrying a void in your heart when you’re away from the little one for too long.
From an outsider’s perspective, parents are super annoying. They appear to be completely self-centered around their own little world. They bring their kids to grocery stores. And on flights, oh god, they bring kids on flights, make it stop. And they yell, and their children scream, and they lose their temper, and they should be bringing up their kids differently, I’d show them how I’d teach ’em good. And oh man, the topics they drone on about, on and on and on and on, hours on end. “Did you know the diapers are really cheap in that store you don’t normally shop in?” “Oh, you really should be using cotton diapers, those one-time diapers aren’t good for you.” “Selma’s teething now, it makes me look forward to the morning coffee at 05:19.” Terrible.
Bear with us. Becoming a parent does something to you. The sleep deprivation, combined with the intrinsic knowledge that failure won’t ever be an option, sprinkled with the occasional tiny smile you receive from the creature in your care. It’ll hit you like you haven’t been hit before. It may only be chemistry, but it’ll make you see through time and feel like you can punch through a wall. When I held my baby girl for the first time, it was the biggest moment of my life, and yet frankly bittersweet. The moment reminded me that everyone was once a cute little baby. That angry cat lady down the street who keeps yelling at you for no good reason. The sad homeless guy carrying an ominous sign. They were both once cute little babies, with a mother who nursed them and cared for them. Or, even more heartbreaking, who lost their mothers.
It makes you realise you have something to lose now. Like a chronic tristesse, it drastically widens your perspective. Life takes on new meaning. Yeah, it’ll likely take a while before you can watch the news again. Yeah, it’ll make you focus your complete attention on children in your vicinity — not only your own, but other children as well. And yes, doing so will make you seem completely self-centered to your peers. It’s a steep price and there are no returns. Fortunately it takes only one smile from the little creature and you’re willing to pay double.
Google I/O is wednesday, which traditionally means a peek at the next version of Android. Having used Android since version 2, I thought now would be a great time to reflect on how far Android has come.
Android has been in the works at Google since 2005, but it wasn’t until Android 2.0 (no unique dessert name, Android 1.6 was “Donut”) was released alongside the Droid phone that Android started its rise to some sort of smartphone dominance. Looking back, version 2 of Android was a pretty uninspired affair with very few good apps to brag about. Some apps were crashy, and copy and paste wasn’t available everywhere, nor particularly good where it was. The experience as a whole felt sluggish and laggy.
What made it worth getting instead of the iPhone, however, was the fact that everything synced as soon as you were logged in with your Google account. There was not a trace of iTunes, and did I mention the superior turn-by-turn navigation? Douchey hipsters would ask why anyone in their right mind would get an Android phone when they could buy an iPhone instead. Even back then, the answer was: sync and maps.
The Nexus Phones
While Android 2.0 started the rivalry between Apple and Google, Android 2.1 (“Eclair”), which coincided with the Nexus One, set the war ablaze. Pinch to zoom was omitted due to threats of thermonuclear war, but the phone itself was still the best Android to date.
Only, there was a problem: way too little internal storage. 256M if I remember correctly. That little space had to hold the entire operating system, the apps, and the application data. Which meant, of course, that you’d run out of space within days if you used the phone like you were presumably supposed to. Android 2.2 (“Froyo”) tried to mitigate this embarrassing hardware decision by allowing you to store apps on the SD card, but since application data was still stored on the system partition, this change did little to fix the situation. Visually, Eclair received relatively minor tweaks, Froyo likewise.
The Nexus S was released alongside Android 2.3 (“Gingerbread”) and it solved most of the problems that plagued the Nexus One. There was plenty of internal storage. Copy and paste was now unified across the operating system. There was a new, darker and flatter skin that made the experience a bit more elegant but the design felt weirdly half-baked. As a whole, the phone felt snappier, more coherent, and generally more pleasant.
Only, once again there was a problem. The stock Android browser bundled with the Nexus S was optimized for Snapdragon processors, not Hummingbird processors. The Nexus S had the latter, so browsing anything not mobile optimized was slower than it was on the Nexus One. You had to go out of your way to find an alternate (inferior) browser such as “Dolphin”. Not cool.
The Honeycomb Detour
We eventually found out what ailed the Nexus S. Google was busy making a tablet-friendly version of Android, and either didn’t have time to completely optimize the Nexus S, or simply chose to focus on the tablet instead. Matias Duarte, the original designer for WebOS, had been brought in to spearhead a strong visual direction for Android 3.0, “Honeycomb”. At the time, Gingerbread was just about ready to ship, and Honeycomb development was already underway. So the half-baked feeling that came with Gingerbread was due to the furious race toward the tablet.
For the very same reason, Andy Rubin had made the call that Honeycomb would be tablet only. There simply wouldn’t be time to scale the experience down to the phone form factor, that would have to happen in a later release. There was a lot to like about the end result, but arguably more to dislike. Regardless, a strong direction had been laid, and difficult structural decisions were in place.
Goodbye, Menu Button
Cue Android 4.0, “Ice Cream Sandwich”.
Just as sandwiches are combinations of things, Android 4 was for both phones and tablets. It drastically iterated on the Honeycomb UI. The spacey clock was now minimalist, and the pretty terrible Tron font had been replaced with a custom Helvetica-esque “Roboto” font. Applications, icons, even menu items were given a strong design direction, and the result for apps that used this new “Holo” theme was pretty gorgeous. Ice Cream Sandwich was released with the Samsung Galaxy Nexus, and later rolled out for the Nexus S (complete with a stock browser that was finally optimized for the Hummingbird processor).
Impressively, Ice Cream Sandwich managed to shed some of the legacy shackles that had held back earlier Androids. The Menu button, once a requirement on Android phones, was now frowned upon, and developers were asked not to rely on it. Every menu item would come with an icon and be shown directly in the action bar if there was room (landing in the Action Overflow menu if there wasn’t). The death of the menu button was welcome, since the button itself was the epitome of mystery meat navigation. Ironic, then, that toolbar items would be icon-only. Still, Ice Cream Sandwich was a huge release with fundamental and difficult changes to Android, necessary for the platform to stay competitive.
For every problem Android releases would solve, however, new problems would become apparent. Like a waltz — two steps forward, one step back — Ice Cream Sandwich was no different. While the menu button had been killed, the problems with the back button had become increasingly apparent. I’m not even going to try and explain how the back button works, but here’s a chart:
It’s not optimal. But it’s certainly fixable. Especially on the Galaxy Nexus, where buttons are software. If killing the back button is on the … menu… then it’s possible. If not, there has to be a way to make its behavior more predictable.
In a similar vein, now that Android is beautiful, it’s becoming increasingly clear that most developers don’t care about optimizing their apps for Android. Most apps aren’t using the new Holo theme (which is legitimately beautiful). There are notable exceptions — Tasks, Foursquare, Pocket — but even first-party apps like Google Listen haven’t been updated to the new 4.0 SDK level. If Google can’t eat their own dog food, how can they expect developers to?
Wednesday is Android 4.1 day and it’ll be interesting to see how Google intends to tackle the problems facing their platform. Perhaps it’s time to mimic Apple and create the “Android Design Awards”, showcasing well-designed Android 4 apps in the market. Might as well give a reason for developers to update the SDK level.
There’s also the problem with timely updates. As it turns out, an operating system running on an ARM processor is fundamentally different from one that runs on, say, an Intel processor. On the latter, you can make one OS distribution and install it on practically any Intel-based machine, because the hardware announces itself through standard firmware interfaces. ARM devices have no such standard, so the operating system has to be built specifically for each system-on-chip and device. Which incidentally explains why you won’t be able to install Windows RT (Windows 8 for ARM) yourself. So how can Apple do it? They control the entire hardware stack, so they only have a handful of devices to target.
Still, all of that is just software. Software is written by humans. We tell software what to do. If updates for Android are hard to do because there’s no generic interface for the ARM CPU, then make one. Whatever you do, Google, the big next challenge on your table is making Android easy to update.
Hey Google? One more thing. It would be nice if the Nexus phones you make aren’t so big they don’t fit in my pockets.
Hey if you’re visiting this site on a 2x screen (i.e. an expensive Apple product), check out how crisp the graphics are!
Smartphones don’t have permanently visible scrollbars. Neither does OS X Lion (unless you’re using a mouse, in which case they pop back in). On the phone, there’s a space issue, so the lack of scrollbars seems a good tradeoff. On the desktop, there’s no such space issue. So why the tradeoff?
If Microsoft’s vision for the future, Surface, holds any truth (and that remains to be seen), soon there will be no desktop. Fine, but tablets do still have room for scrollbars, so why not enable them there?
Let’s look at the pros and cons. On the list of reasons why hiding the scrollbar is a good thing, I have this (and feel free to augment this in the comments):
- It’s prettier. Less UI is often a good thing. If you don’t miss it, then you have a better experience for it.
- It’s consistent with phones and tablets (from the same vendor) and gives a sense of coherence.
- If the future is indeed touch-based (as in: your future desktop is a docked tablet or phone), developers should probably start now to yank out hover-induced menus and make their scrollpanes indicate overflow when no scrollbar is visible. Having a desktop OS that mimics this is, I suppose, a helpful reminder of what may be coming.
Still, the scrollbar has been around for a while. In fact I would argue it’s a cornerstone in modern GUIs. Such a thing should not be buried willy-nilly. Here are reasons to keep the scrollbar visible at all times:
- I can think of many ways to indicate that there’s more content to be seen, but none of them are as easy to understand as the scrollbar.
- A scrollbar doesn’t have to be 18px wide, opaque, with a huge inset gutter, so long as it looks like a scrollbar. In fact, if only Lion scrollbars didn’t fade out completely, this post would probably not have been written.
- A permanently visible scrollbar, by virtue of its relative height, will sit silently at the side of your view and cue you in on how much content remains to be seen. No bottom shadow or clipped content will indicate that. It’s like a minimap of your document.
It’s not that I love scrollbars. Most of them are pretty ugly. Scrollbars, as we’ve grown to know them, can be especially hideous when shown on dark designs. Still, I’m not entirely convinced the solution to this challenge is to hide them. That sounds like mystery meat navigation to me.
When you reach a certain age, that is, the age when you start sentences with “When you reach a certain age”, you start to think that kids today aren’t what they used to be. Which is of course an eternal falsetruth because kids both are, and are not what they used to be. And kids today say “fail”.
Actually, kids say many dumb things, including “if it ain’t broke, don’t fix it”, but the word “fail”, when used as a noun, makes me die a little inside. Like the sound frequency that breaks glass, the mere utterance of the word initiates an intellectual necrosis in my being. It makes me sad, tired, and a little on-edge. Instantly.
It’s not so much the meaning, I’m fine with failing. In fact, I do it all the time. Sometimes I even learn from my failures. That’s when experience is generated. Yay for that.
It’s when the word is used in its impoverished, truncated non-verb form. Fail. It makes me think of George Orwell and Idiocracy. It confirms my fears for the future and amplifies them. We’re dumbing down the language to a point where expression is becoming a scarce resource, and this at a time when the tools for publishing said expressions are increasingly numerous and easy to use. Yet time and again expressions are truncated, not even filling the 140-character limit. Poof. Gone with the wind in a cacophony of who-cares.
Go start a blog or something, write about your cat or the difficulty of the human condition. If you must use the word “fail”, use it in a sentence. On the other hand, if enough people use the noun form in a meaningful way, excruciating as it would be, one day “fail” would be canonized as a noun in the dictionary. What would really sanction the word is if Stephen Hawking used it to describe string theory. That would be the day I embraced Newspeak.
One of the features of the Mac that I’ve come to love is the ability to override/customize the keyboard shortcut of any menu-accessible command:
Go to System Preferences, search for “Keyboard shortcuts”, select “Application Shortcuts”, then click the + button. Now you can specify any menu item and keyboard shortcut for the app you select.
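For the terminal-inclined, as far as I can tell the same overrides land in the `NSUserKeyEquivalents` defaults key, so you can script them too. The bundle identifier and menu item below are just an example (the menu title has to match the app’s menu exactly):

```shell
# Remap TextEdit's "Make Lower Case" menu item to Cmd-Shift-L.
# Modifier symbols: @ = Command, $ = Shift, ~ = Option, ^ = Control.
defaults write com.apple.TextEdit NSUserKeyEquivalents \
  -dict-add "Make Lower Case" "@\$l"

# Relaunch the app for the change to take effect; inspect with:
defaults read com.apple.TextEdit NSUserKeyEquivalents
```

The upside of scripting it is that you can keep all your shortcut overrides in a dotfile and replay them on a fresh machine.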