Not when they look like this:
More at the original designer’s site. This is, by the way, an Android skin.
Google I/O is Wednesday, which traditionally means a peek at the next version of Android. Having used Android since version 2, I thought now would be a great time to reflect on how far Android has come.
The Android open source project has been around since 2005, but it wasn’t until Android 2.0 (no unique dessert name; Android 1.6 was “Donut”) was released alongside the Droid phone that Android started its rise to some sort of smartphone dominance. Looking back, version 2 of Android was a pretty uninspired affair with very few good apps to brag about. Some apps were crashy, and copy and paste wasn’t available everywhere and wasn’t particularly good where it was. The experience as a whole felt sluggish and laggy.
What made it worth getting instead of the iPhone, however, was the fact that everything synced as soon as you logged in with your Google account. There was not a trace of iTunes, and did I mention the superior turn-by-turn navigation? Douchey hipsters would ask why anyone in their right mind would get an Android phone when they could buy an iPhone instead. Even back then, the answer was: sync and maps.
While Android 2.0 started the rivalry between Apple and Google, Android 2.1 (“Eclair”), which arrived with the Nexus One, set the war ablaze. Pinch to zoom was omitted due to threats of thermonuclear war, but the phone itself was still the best Android to date.
Only, there was a problem: way too little internal storage. 256 MB, if I remember correctly. That little space had to hold the entire operating system, all your apps, and all their application data. Which meant, of course, that you’d run out of space within days if you used the phone the way you were presumably supposed to. Android 2.2 (“Froyo”) tried to mitigate this embarrassing hardware decision by allowing you to store apps on the SD card, but since application data was still kept on the system partition, the change did little to fix the situation. Visually, Eclair received relatively minor tweaks, and Froyo likewise.
The Nexus S was released alongside Android 2.3 (“Gingerbread”) and it solved most of the problems that plagued the Nexus One. There was plenty of internal storage. Copy and paste was now unified across the operating system. There was a new, darker and flatter skin that made the experience a bit more elegant, but the design felt weirdly half-baked. As a whole, the phone felt snappier, more coherent, and generally more pleasant.
Only, once again there was a problem. The stock Android browser bundled with the Nexus S was optimized for Snapdragon processors, not Hummingbird processors. The Nexus S had the latter, so browsing anything not mobile optimized was slower than it was on the Nexus One. You had to go out of your way to find an alternate (inferior) browser such as “Dolphin”. Not cool.
We eventually found out what ailed the Nexus S. Google was busy making a tablet-friendly version of Android, and either didn’t have time to completely optimize the Nexus S, or simply chose to focus on the tablet instead. Matias Duarte, the original designer for WebOS, had been brought in to spearhead a strong visual direction for Android 3.0, “Honeycomb”. At the time, Gingerbread was just about ready to ship, and Honeycomb development was already underway. So the half-baked feeling that came with Gingerbread was due to the furious race toward the tablet.
For the very same reason, Andy Rubin had made the call that Honeycomb would be tablet-only. There simply wouldn’t be time to scale the experience down to the phone form factor; that would have to happen in a later release. There was a lot to like about the end result, but arguably more to dislike. Regardless, a strong direction had been laid, and difficult structural decisions were in place.
Cue Android 4.0, “Ice Cream Sandwich”.
Like sandwiches are combinations of things, Android 4 was for both phones and tablets. It drastically iterated on the Honeycomb UI. The spacey clock was now minimalist, and the pretty terrible Tron font had been replaced with a custom Helvetica-esque “Roboto” font. Applications, icons, even menu items were given a strong design direction, and the result for apps that used this new “Holo” theme was pretty gorgeous. Ice Cream Sandwich was released with the Samsung Galaxy Nexus, and later rolled out for the Nexus S (complete with a stock browser that was finally optimized for the Hummingbird processor).
Impressively, Ice Cream Sandwich managed to shed some of the legacy shackles that had held back earlier Androids. The Menu button, once a requirement on Android phones, was now frowned upon, and developers were asked not to rely on it. Every menu item would come with an icon and be shown directly in the action bar if there was room (and land in the Action Overflow menu if there wasn’t). The death of the menu button was welcome, since the button itself was the epitome of mystery meat navigation. Ironic, then, that toolbar items would be icon-only. Still, Ice Cream Sandwich was a huge release with fundamental and difficult changes to Android, necessary for the platform to stay competitive.
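The “if there was room” rule is easy to picture as a tiny partitioning function. What follows is only a toy sketch of that placement logic, not Android’s actual implementation; the function name, the data shape, and the slot count are all made up for illustration, though the 'always'/'ifRoom'/'never' hints loosely mirror the showAsAction values the SDK exposes.

```python
def place_actions(items, slots):
    """Partition (name, hint) menu items into (action bar, overflow).

    hint is 'always', 'ifRoom', or 'never'. 'always' items claim
    bar slots first; 'ifRoom' items fill whatever slots remain;
    'never' items (and anything that didn't fit) go to the overflow.
    """
    bar = [name for name, hint in items if hint == "always"][:slots]
    for name, hint in items:
        if hint == "ifRoom" and len(bar) < slots:
            bar.append(name)
    overflow = [name for name, _ in items if name not in bar]
    return bar, overflow

actions = [("Share", "always"), ("Search", "ifRoom"),
           ("Settings", "never"), ("Help", "ifRoom")]
# With two slots, Share and Search make the bar; the rest overflow.
```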
For every problem Android releases would solve, however, new problems would become apparent. Like a waltz — two steps forward, one step back — Ice Cream Sandwich was no different. While the menu button had been killed, the problems with the back button had become increasingly apparent. I’m not even going to try and explain how the back button works, but here’s a chart:
It’s not optimal. But it’s certainly fixable. Especially on the Galaxy Nexus, where buttons are software. If killing the back button is on the … menu… then it’s possible. If not, there has to be a way to make its behavior more predictable.
In a similar vein, now that Android is beautiful, it’s becoming increasingly clear that most developers don’t care about optimizing their apps for Android. Most apps aren’t using the new Holo theme (which is legitimately beautiful). There are notable exceptions — Tasks, Foursquare, Pocket — but even first-party apps like Google Listen haven’t been updated to the new 4.0 SDK level. If Google can’t eat their own dog food, how can they expect developers to?
Wednesday is Android 4.1 day and it’ll be interesting to see how Google intends to tackle the problems facing their platform. Perhaps it’s time to mimic Apple and create the “Android Design Awards”, showcasing well-designed Android 4 apps in the market. Might as well give developers a reason to update the SDK level.
There’s also the problem of timely updates. As it turns out, an operating system running on an ARM processor is fundamentally different from one that runs on, say, an Intel processor. On the latter, you can simply make one OS distribution that installs on every Intel processor out there; ARM operating systems have to be built for the specific processor they run on. Which incidentally explains why you won’t be able to install Windows RT (Windows 8 for ARM) yourself. So how can Apple do it? Well, they build everything themselves, so they never have to target more than one processor.
Still, all of that is just software. Software is written by humans. We tell software what to do. If updates for Android are hard to do because there’s no generic interface for the ARM CPU, then make one. Whatever you do, Google, the big next challenge on your table is making Android easy to update.
Hey Google? One more thing. It would be nice if the Nexus phones you make aren’t so big they don’t fit in my pockets.
Smartphones are great. I can use them to read, browse, look up who that guy in that movie is, listen to podcasts, and even take photos. Supposedly they can also make calls, but I don’t know anyone who uses a smartphone for that anymore. Only, when my smartphone dings in the middle of the night because it found a new email and absolutely has to tell me right now, it’s not quite as smart as the prefix suggests.
Smartphones should know when to bug you and, most importantly, when not to. On the Android, I’ve fallen in love with Setting Profiles, a programmable context settings manager.
There’s a permanent shortcut in your windowshade showing which profiles are active. Click the shortcut and you’ll see all your profiles for easy access. Yup, it would look much nicer were the app updated to the new Ice Cream Sandwich look … developer, ping?
So essentially, the app is about profiles and contexts. For example, “Rotation lock” is simply a shortcut to a feature I’d otherwise have to dig up from deep within the settings panel; quite useful for when you’re lying down and reading. “Quiet time”, on the other hand, is auto-activated from 22:00 to 09:00 every day, i.e. night-time — it essentially mutes the ringer and disables email sync.
“Quiet time” is activated by a schedule context, but it could also have been activated by a location (as decided by GPS, Wi-Fi SSID or cell-tower ID), by docking your phone in a car, by plugging in a headset, by missing a call, or by any number of other contexts.
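The fiddliest part of a schedule context like that is that the window wraps past midnight. Here’s a minimal sketch of the check in Python, not anything Setting Profiles actually uses; the function name and default times are mine.

```python
from datetime import time

def in_quiet_window(now, start=time(22, 0), end=time(9, 0)):
    """True if `now` falls inside a daily window that may wrap midnight."""
    if start <= end:
        # Plain window, e.g. 13:00-17:00.
        return start <= now < end
    # Wrapping window, e.g. 22:00-09:00: late evening OR early morning.
    return now >= start or now < end

# 23:30 and 03:00 are quiet; noon is not.
```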
The end result is that I have to do a lot less managing of my smartphone. That’s really nice, and it’s certainly smarter than the phone was when I got it. Still, it requires you to set it up when in fact your phone should be able to handle a lot of these things itself. I bet that’s the next big thing: actually making smartphones smart.
Let’s hope they’ll iterate as fast on the Android version as they do on the desktop versions.
Yesterday, Samsung and Google announced an October 10th event, probably to unveil a long-rumored new Nexus phone running the new version of Android. Today, that new Android version was shakily demoed. Being a huge Android fan I follow this intently. I love Android because it’s so open that Amazon can go ahead and build something entirely different with it. Living in Gmail and Google Calendar, I love that everything syncs headache-free when I sign in to my phone. The Gmail app, specifically, is what makes Android my favorite dish among an increasingly diverse mobile marketplace. But despite my love for Android, I think Android’s next release, “Ice Cream Sandwich”, will be a make or break release for Google.
Make or break? Really? Well, make or break for the Google curated version of Android, yes. Obviously the Linux core is not going to disappear, but Android is at a crossroads. One path sees Android eventually showing a return on investment for Google, the other does not.
I like to pretend I understand the broken windows theory more thoroughly than I actually do, so I often invoke it outside of its criminological roots. The gist of the theory is that if you walk past an abandoned building with a couple of broken windows, you’re more likely to reach for a rock and break one of the remaining windows than if the building manager had repaired them before you got there. Evil you!
Android is under fire from all directions. Apple vehemently sues HTC and Samsung for stealing its look and feel, Microsoft is attacking over underlying Linux patents it claims to hold, and Oracle arguably has the upper hand in one high-profile lawsuit. If Android were a fortress in a desert, it would be under siege from all sides, and at some point the supplies would run out. Google appears unfazed by the attacks, but I bet it’s getting to them. Having recently bolstered their patent war chest with the purchase of Motorola, Google is better positioned to fend off the lawsuits. Heck, they might even turn Motorola around and have the company produce delicious, Google-curated Android devices. But by the time this happens, a year or so from now, it may already be too late. Right now, Android has a lot of broken windows.
The attacks against Android are reaching the public ear. “Google’s copying Apple”, “Android isn’t really open”, “Android users don’t buy apps”. It doesn’t even matter whether these stories are true or not — if they persist, they’re likely to make the customer walking into a Verizon store skip the Android phone and pick the platform he thinks is “going to be around”. (Or he’ll buy anything, but that’s not a business model.)
That’s a grim future which sees Android falter. But fortunately that’s just one potential outcome. Android still has a disruptive business model: it’s a free operating system with free top-shelf GPS navigation, and it gets users to use Google apps so there’s a halo effect. Now all Google needs is a decisive victory. They need a phone that just looks great, has a UI that’s responsive, fluid and extremely delightful to use. And Google needs this phone to sell like ice cream. Sandwiches.
I would assume Google knows this, and that it’s why they hired Matias Duarte to up the ante on the UI design. The Nexus S is a gorgeous device, all black like the night without the stars, so clearly Samsung can create beautiful hardware when they put their minds to it, wink wink. If the combination (which may be revealed October 10th) is user-friendly, snappy and delightful, it might just sell like those aforementioned treats. This’ll inspire HTC and Samsung to stick with Android. It’ll further Android’s reach, ensuring a larger portfolio of apps. It might even make an Android tablet a value proposition. Put simply, if Google can rally the forces behind a decisive platform release and instill renewed motivation in its partners, these partners might continue their legal fights with fresh energy as opposed to settling and picking other platforms.
On the other hand, if Ice Cream Sandwich is not the watershed release Google needs, the platform might slowly wither away. As stuffed as Google’s pockets are, they’re not going to keep throwing money at Android with no return on investment in sight. There’s no sense in being the number one smartphone platform if it’s not making you money. That would be a Pyrrhic victory.
“Android users don’t buy apps”, people will tell you. I have no idea whether that’s true, but I do know I switched to The Mac in part due to the presence of great apps, apps not present on Windows. I don’t think it’s a stretch to claim that a platform will gain in popularity by virtue of having great apps. Which makes launching new platforms difficult. Inherently, new platforms won’t have many apps at launch and unless some really good ones are written fast, your platform might never take off.
Let’s define a great app as being an app that’s simple, beautiful, solves a problem for you, and is fast and stable.
I like Windows. I’ve used it for a decade. There are window-management features I still miss, having switched. I hope Windows 8 will do great. But I can’t say Windows ever had great apps; Windows had good apps. I particularly miss Directory Opus, an over-the-top-powerful file management application with integrated FTP, a regex file renamer and too many nice features to mention. This was a good app, and I would love a Mac version. But it’s not a beautiful app. It’s got an uninspiring icon, the UI is cluttered by default, the bundled icons don’t look good, and the app itself is only as pretty as Windows’ native UI. But does it matter that an app isn’t beautiful?
My noodling on the matter says yes. During the formative months or years of a new operating system — case in point, OSX — the apps that come out will generally dictate what follows for that platform. If a slew of functional, great-looking apps come out, these apps will define where the bar is set. Once the platform, for a variety of reasons including the presence of the aforementioned apps, becomes popular enough, it will obviously attract a slew of crappy apps as well. But the higher the bar was set initially, the fewer crap apps will follow. There’s simply no need to look beyond the one app that filled a niche.
Back when I was still powerusing Windows, ALT-tabbing and generally working things to my liking, I was surprised by my Mac friends and their utter determination to make sure all their dock icons were pretty. Sure, I can appreciate good icon design, but an app can be good without a great icon, can’t it? This determination went further and involved criticising the lack of native UI in the Firefox browser, otherwise a tech-hipster darling at the time. I couldn’t have cared less. As Yogi Berra said: if the app is good, the app is good. Right?
Right. And also sometimes wrong. Windows has good apps, but few of them are beautiful. That’s how it’s always been. As the PC has grown from its DOS infancy, apps have improved in both features and looks. But Windows itself, although functional, was never particularly beautiful to look at. Almost reflecting this, neither were Windows apps. Still, it was the platform with the most apps by far, and probably still is. The downside is that most of them are crap. Google “windows video converter” and you’ll get more results than is funny. How are you going to find the one good one among them?
The Mac, on the other hand, made a clean break with OSX. Apps had to be rewritten from scratch, and the operating system itself had received a “lickable” design — it was very pretty to look at by yesteryear’s standards. The Mac was in a bad place at the time, marketshare-wise, so the trickle of new OSX-ready apps wasn’t overwhelming. Still, because of the clean break and the presence of a userbase, apps did appear. For some reason, these apps were simple, beautiful and user-friendly. Like the OS. You’d think the Mac developers at the time felt their apps should reflect the sense of taste the OS itself exuded. Whatever happened, a philosophy of building the one app to rule each niche seems to have been born at this time. Microsoft never made such a clean break with Windows, so there was never an opportunity for developers to stop and rethink their apps, and the standard for “pretty” was never very high. The result is a billion apps that all do the same thing, because no developer filled a niche in any significant fashion.
I sound like a long-time Apple lover, which I’m not. I switched to The Mac because of the UNIX commandline. Make no mistake about it, there are things about The Mac Way that I sincerely loathe. OSX Lion, for example, is the worst $29 I’ve spent in years. I’m also firmly entrenched with The Android; the Gmail app and seamless syncing are enough to ensure that.
But thinking about the weird voodoo necessary for a new platform to take off, it’s really hard to get around the Mac’s and the iPhone’s portfolios of apps and the standard they’ve set. While it’s all a bunch of evening noodling and gut feelings, it tells me that if you want great apps on your platform, you need to combine a beautiful UI with a clean break. It appears Microsoft may be taking this route. Android, take note.
Do the tablets in Kubrick’s 2001 movie constitute “prior art” to the iPad?
This question recently incited much heated discussion on Twitter. What piqued my interest in such a fashion is my love for science fiction, and in particular the works of Arthur C. Clarke. Many of his specific ideas came to fruition decades later. For example, in 1945 Arthur C. Clarke inadvertently invented satellites. He didn’t patent them; as he put it:
I’m often asked why I didn’t try to patent the idea of communications satellites. My answer is always, “A patent is really a license to be sued”.
Now Clarke merely described what would later become satellites. He didn’t build one, nor did he design how such a thing looks. And indeed satellites today come in all manner of configurations and designs, yet they are still, clearly, satellites.
These days Apple is busy suing Samsung for infringing on Apple’s look-and-feel patents with their Galaxy line of phones and tablets. Put simply, Galaxy S phones are too much like the iPhone, and the Galaxy Tab 10.1 is too much like the iPad. While the comparison photos in the suit filing appear to have been doctored, I’m not going to dispute that Samsung’s TouchWiz is inspired by Apple’s iOS (which it clearly is).
Focusing on what sparked this discussion — could the tablet devices seen in the 2001 movie constitute prior art for the iPad? — I do think that’s fair to say, and I’ll get to why. Whether or not they’re merely portable televisions, they are electronic devices, and their form factor is certainly strikingly similar to that of the iPad. But is it prior art?
Prior art [...], in most systems of patent law, constitutes all information that has been made available to the public in any form before a given date that might be relevant to a patent’s claims of originality. If an invention has been described in prior art, a patent on that invention is not valid.
To be specific, Apple is suing Samsung over four patents. Two of those relate to the iPhone form factor. One relates to how iOS works. The fourth patent covers the tablet form factor; here’s the illustration from the patent application:
If you explore the patent application itself (beware, TIFF file), you’ll note that no specific size is given. The tablet illustrated doesn’t necessarily have a 10-inch screen.
Samsung is in a tight spot. While I find it surprising (and disappointing) that these four patents were granted in the first place, they clearly appear to have been infringed upon. Were I in Samsung’s shoes (and if I were, I’d never have released TouchWiz in the first place), I’d be doing everything I could to defend against this suit. Certainly, if prior art could invalidate any of the four patents in question, I’d look for it wherever I could, even in my old sci-fi DVD collection. In the case of that one patent Apple has on the tablet form factor, I do see why Samsung would try to invoke prior art (though I’m surprised they didn’t pick Picard’s tablet instead). You see, if Samsung can convince the judge that patent #4 is invalid — that the slabs shown in 2001 anticipate the pencil sketch shown above — it would cut their woes by a fourth.
Samsung is not my favorite Android vendor. They’re not even my favorite hardware vendor. Perhaps it would be good for them to suffer a defeat at the hands of Apple.
But I do consider Arthur C. Clarke’s description of a satellite to be prior art. I consider Larry Niven’s description of a ringworld to be prior art to the ring shown in the Halo video game. And so, hearing Samsung cite Kubrick’s tablets as prior art to the iPad is not the dumbest thing I’ve ever heard. Apple’s tablet is a wonderful combination of a well-designed user experience and durable, delicious hardware. Even so, the form factor described in their tablet patent is not a unique snowflake, as countless sci-fi authors would have you know.