Ethical Adblocking

Apple just released iOS 9 yesterday, and with it allowed adblockers into the App Store. Since the mobile web is increasingly a Big Deal, this heralds a sea change for the web.

An article about adblocking made the rounds a few weeks ago. Here’s a pullquote:

If blocking becomes widespread, the ad industry will be pushed to produce ads that are simpler, less invasive and far more transparent about the way they’re handling our data — or risk getting blocked forever if they fail.

That’s a load of manure.

A big part of the problem is how slow the ad industry itself has been to adapt. To this day most ads are still big squares (300×250) or giant skyscrapers (120×600). They’re not hi-DPI, they’re not responsive, and they’re usually ugly blinking GIFs. With all the technology available to us today, you’d think we’d have better ads by now.

Ads don’t offend me. Well, some specific ads do, but the idea of exchanging my attention for a free service, such as reading news on the web, doesn’t offend me. I’m an adult; I can make an informed decision as to which services I leave my data with, whether those services are free through ads or entirely paid.

The problem crept up on us slowly: the more attention you could sell, the more money you could get. Ads became bigger and more plentiful. First came popups; then they were blocked. Now we’re dealing with full takeover ads, interstitials, lightbox ads, and if you dare browse the mobile web, you’ll be looking through blinds in the form of social sharing links at the top and “dismiss” buttons that don’t actually work. It’s pretty bad, and it makes browsing websites slower.

In the end, it only took a few horrible ads to poison the well and make adblocking inevitable. It’s like television, and Ghostery is the TiVo of the web. With iOS 9 content blockers, adblocking is going to be mainstream fast, and this is where the pullquote above falls apart: ad networks aren’t going to get better. Probably the opposite.
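
For the technically curious: a Safari content blocker isn’t a program in the usual sense. At its core it’s a JSON list of rules that the extension hands to Safari, which compiles and applies them natively. A minimal sketch of such a rule file, with a made-up ad host as the placeholder target:

    [
      {
        "trigger": { "url-filter": "ads\\.example\\.com" },
        "action": { "type": "block" }
      }
    ]

Because Safari enforces the rules itself and never tells the blocker what you’re browsing, the approach is fast and private, which only makes it easier to recommend to ordinary people.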

Today it’s possible to make a living running a site that’s free to read, solely because of ad revenue. Some can even make a good living. As adblocking grows more widespread, ads are going to get more intrusive to get around it, more guerrilla, and even bigger, all in a fight to make the same income off the dwindling flock that still isn’t blocking ads. It’ll happen to good people who run these sites. Despite their best intentions, their staff have families to feed, and if they just use this slightly larger ad and add an interstitial, things can stay the same for a while and no one has to be fired.

It would be unfair to blame them. It’s human nature: millions and millions of sites aren’t suddenly going to see the light at the same time and change their ways all at once. Even if they did, it’s unlikely everyone would suddenly stop using adblockers because of it. Once the adblocker is installed, once web ads have been poisoned by years of bad practices, ads aren’t coming back.

John Gruber tweeted about this.

I feel your pain, John. It’s the same pain GigaOm felt when they died this year. It’s not pretty. And I like Deck ads. They’re nice. I agree they shouldn’t be blocked. But they’re still ads, and adblockers block ads. It’s not your fault, it was that monkey ad, remember? Shoulder to shoulder, we stand. Love is a battlefield.

There is no ethical adblocker that blocks only the bad ads and leaves the “good” ones. I’d like to feel like an activist fighting for pure content when I install Marco Arment’s $2.99 “Peace” ad blocker. I want to believe that by blocking ads, I help force positive change on the advertising companies (and the livelihoods that depend on them), force them to adapt.

But that’s a beautiful illusion. What’s more likely is that web ads are going to get way worse, adblocking is going to go way up, and at some point in this arms race, after the death of many a media company, some will indeed have adapted. The big question is whether you’ll like the alternatives. It can be apps. It can be inside Apple’s Newsstand (featuring unblockable ads). It can be inside Facebook’s Instant Articles. It can be subversive native ads. It can be paywalls. Think in-app purchases: “Pay $1 for this article, or pay by watching a video.”

Nature will find a way. But we aren’t suddenly going to wake up to rainbows and unicorns. No matter how cool that would be.

A version of this post originally appeared on Google+. Yes, that ghost town you may have heard of. Bring chains and white blankets, let’s haunt things.

Apparently I Like Bad Movies

I watched Jupiter Ascending yesterday, and from the moment I saw flying roller blades, I was in love. The film is saturated with color, culture, style, fashion, and detail. It has layers and layers and layers; it’s creativity all the way down! Did you notice the wooden bill the robot servitor bribed the bureaucrat with? It had the slickest little design, and it was on screen for barely two seconds. The amount of work that went into this film was astounding, and apparently Rotten Tomatoes doesn’t care, and that makes me sad.

It’s not that I’d prefer everyone like the things I like. I’m routinely made fun of for thinking Timecop is a good movie, and for ranking Sky Captain and the World of Tomorrow close to Raiders of the Lost Ark on the list of my all-time favorites. It’s fine, we don’t all have to agree; I’m comfortable with my taste in movies.

What gets me is that we’ll probably never see another movie like Jupiter Ascending. We’ll certainly never get a sequel. Neither did Serenity, or John Carter, or A Series of Unfortunate Events. Or Ninja Assassin. Yet they made Transformers 2, Transformers 3, Transformers 4, and they’re making Transformers 5. That seems so wrong to me.

I understand how it works. The movies I mentioned either did badly at the box office, or critically, or both. Transformers 2, on the other hand, pulled in $836m on a $200m budget. Little did it matter that it is almost universally deemed bad. I did see the full thing, and to this day I regret not staring at drywall for two and a half hours instead. I don’t often criticize things — okay, actually I do that a lot — but Transformers 2 deserves it. You could cut it down to a 30-minute short, and not only would the film be better, but there might actually be enough story to warrant its runtime.

Jupiter Ascending really didn’t deserve the criticism it got. Even if the film wasn’t for you, it had so many other things going for it: the elaborate space backdrops, the castle-like spaceships, the dresses, the sets, hell, even the spacesuits that looked like they were ornately carved from wood. Did I mention the flying roller blades? Jupiter Ascending oozed creativity and worked on so many levels. I still can’t think of a single level Transformers 2 worked on, and I played 100+ levels’ worth of Desert Golfing.

Successful movies get sequels, and the Transformers franchise is like a piñata filled with money and shame. It’s only natural that studio execs want to keep whaling on it with 2-by-4s. It’s just so unfair.

Windows 11

I’m not sure Microsoft Windows will be around in a decade, and that makes me sad.

I used to pick Windows computers. I used to like the operating system and feel more productive on it. I’m sure the price point helped.
I still miss full-size arrow keys and a functional text-selection model, but today I’m decidedly a Mac user. I like that the terminal is a Unix terminal, and I like that I can uninstall an app by throwing it in the trash.
My phone runs Android, and I like how sharing information between apps works, enough that I’m willing to put up with phones that are too big and cameras that aren’t great.
But there’s no longer a place in my life for Windows. Sure, I run it in a virtual machine to test things, but that hardly counts.

Although Windows 8 was a nightmare hellride to actually use, I really liked how starkly new it felt compared to how operating systems have looked and functioned for decades. The Swiss design style [1] is something I never thought we’d see in computer interfaces. Going all in with this on Windows 8 was a ballsy and rather courageous move, even though it obviously didn’t pan out. Turns out you can’t just throw out decades of interface paradigms between versions, who knew?
Windows 8 was a glorious failure, but it did include a new application runtime that’s shared with Windows Phone, and it looks like Windows 10 will be fixing the UI wonkiness. I’m still left wondering if it’ll be enough to turn things around.

I’ve been a big fan of new CEO Satya Nadella’s work in the past year. He seems to be thinking what we’ve all been thinking for decades: it’s weird that Microsoft hasn’t been putting their apps on iOS and Android. Windows RT was stupid. No one is using Windows Phone.

But that last one is disconcerting to me. While I’m a happy Android user and a fan of iOS, a duopoly in smartphone platforms isn’t good for anyone. I would prefer Microsoft to have a semi-successful presence in the mobile space, if only to keep Google and Apple on their toes. Most developers aren’t going to voluntarily maintain an app for a platform that only has 3% of the market, and without apps, no one will adopt the platform. Recent news suggests Nadella understands this and is giving their mobile efforts one final shot. The hope is that by making Windows 10 a free upgrade, app developers might have more incentive to use the new app runtime so their apps run on desktop and mobile alike. If this strategy fails, Microsoft will likely be conceding the smartphone form factor entirely.

On the one hand, this seems like exactly the kind of tough choice a forward-looking CEO needs to make to ensure Microsoft has a future at all. On the other hand, it raises an even bigger question: where does that leave Windows for PCs if Microsoft concedes defeat on smartphones? While in the near term Windows for desktops and laptops is probably safe, in the longer term there are growing threats from Chrome OS, a potential Android on laptops, and apps running in the cloud. Even if Windows marketshare survives these challenges, the price, and therefore the revenue, of selling operating systems has been converging on zero for a while now. It’s only a matter of time.

So what’s Nadella’s plan? When Windows revenue eventually drops to zero, and Microsoft has no platform (and therefore no app store with a revenue cut) on smartphones, what will be their livelihood? In order for Microsoft to stay in the consumer space and not become the next dull IBM, they’ll need a source of income that is not Windows, and it’s probably not hardware either, no matter how good the Surface Pro 3 was.

What remains of Microsoft must be what Nadella bets on as the next source of income: Office, Xbox, various cloud services, and new things.

Microsoft has always been good at new things, but bad at productizing them. Nadella seems to have some skills in that area, so this will be an exciting space to watch in the next few years. But betting on new ideas is like buying a lottery ticket: buying one increases your chance of winning, but you might still not win.

The rest is tricky. The problem is that without owning the platform, it’ll be orders of magnitude harder for Microsoft to sell their services. Unlike Google, Microsoft has to broker deals to have their apps preinstalled on Android phones, and though Android is pretty open, since they don’t own the platform they’ll always be subject to changing terms and APIs. Apple is a closed country entirely: you’ll have to seek out and install their apps if you want them, and even if you do, Microsoft’s digital assistant will never be accessible from the home button. It’s a steep, uphill battle, but I really hope Microsoft finds new footing. Because, like a bird, if life in one ecosystem turns miserable, I want to be able to migrate to another one, ideally a flourishing one. Oh, and I want to see how Windows looks when Microsoft turns it up to eleven.

  1. I refuse to call it Flat Design™ because that’s a stupid term that suggests a flat sheet of color is somehow a recent invention.

Smart Watch

At the end of January I was lucky enough to get my hands on a Moto 360 smartwatch. Though I’ve never considered myself a watch person, I do enjoy tech, so naturally I’ve been wearing it since then. As Apple is about to launch their foray into the watch form factor, I thought I’d jot down some thoughts on how their competition is doing so far.

Android Wear is the umbrella term for the software that runs on the Moto 360 and other Android watches by Asus and LG. The 360 features a lovely round display, and unless you know what to look for, you’ll mistake it for a traditional watch when you see it on someone’s wrist. The battery conveniently lasts all day, and the screen is decently readable outside. It’s also off most of the time, but turns on with a wrist-flick. It’s an excellent first version, and that is a tremendously important milestone to pass.

I’m rarely an early adopter of potentially sea-change-inducing technology, but with this watch, I feel like I am. Give it a year or two, and the convenience level of these devices will have gone from that of a soft-close toilet seat to a full-on dishwasher. You’ll want one. But probably not today.

Android Wear does a few things well. It checks your heart rate, counts your steps, shows you all your phone notifications, and lets you act on them. It’s a remote control for the media you play, and it feels pretty magical to play/pause a movie cast from Plex on your phone to the television through the Chromecast. Oh, and it lets you set timers, read your agenda, create reminders, and see basic Google search results. Yes, there are flight notifications. It doesn’t yet speak Danish, so all watch replies to my wife are currently transcribed from an adopted Southern California accent. We have fun.

What gets me excited about the form factor is the potential hidden here. All the quantified-self health stuff is all but inevitable, and that’s cool, but another way in which smartwatches can be transformative is in letting you get rid of your smartphone. XKCD speaks of the brief period in which our wrists were free, but fails to mention that this glorious period happens to coincide with a time when everyone’s looking at their smartphones instead. I don’t know if the smartwatch will make us talk again, but I hope so. As a sidebar: please, dear Facebook, don’t put Instagram on the smartwatch.

The Android Switch

A little over two months ago, I switched to using an iPhone as my daily driver, having used various Android devices for the past half decade. I’m just about to switch back for a variety of reasons I’ll detail here.

I switched initially to get a deeper understanding of iOS, one that can only be had by committing fully. Given where things are going in UI design, it’s only prudent I know the platforms. If there is one single overwhelming conclusion I took away from this experiment, it’s this: you’re lucky to have either! It really is a remarkable time to be into gadgets — we carry little supercomputers in our pockets, designed in such a user-friendly way that more people than ever before can use them. The platform really doesn’t matter much anymore: smartphones are marvels of modern science that democratize technology in an unprecedented way. We are spoiled to live in an age where we can literally ask our phones questions and have answers presented based on the sum of human knowledge. For that reason I find it very hard, perhaps even petty, to criticise one platform over the other.

However, I also have an aversion to the words “if it ain’t broke, don’t fix it”. Nothing is ever perfect, and critical discourse is how we improve things. Neither iOS nor Android is perfect, but I’m still going back to the latter. Some of my reasons for doing so are bound to be due to muscle memory from using Android for a long time. Others, many will no doubt directly disagree with. Still, maybe some of the reasons I’m about to list are issues worth addressing in future versions of iOS.

Thank goodness you’re free to choose which platform you invest in.


iPhone switch observations, just a few days in

Just a few days ago, I made a temporary switch to an iPhone 5C as my daily driver, to get it under my skin. Here are some observations I’ve made so far:

  • Man, there are a lot of passwords to type in on a new phone.
  • The fact that I have to type in my password in the App Store all the time, even for free apps, is driving me crazy. I know the fingerprint scanner makes this a non-issue, but it still seems sub-optimal that there’s not even an off-by-default option to not ask for passwords.
  • The camera… Even this two-year-old tech takes better photos than most Android cameras I’ve used.
  • The 3rd-party keyboard implementation is so janky it’s almost not worth using a sliding keyboard. And on the stock keyboard, letters are capitalized even when what they output is not. That has to be the last vestigial skeuomorph in the ecosystem.
  • The physical mute switch is a stroke of genius, especially when the Settings > Sounds > Vibrate on Silent option is unchecked.
  • The decision to not let me pick a default browser to open links in feels completely arbitrary and archaic, especially since some apps like Digg Reader implement workarounds to give you the choice (see the sketch after this list).
  • The app situation is good in this ecosystem.
  • Notifications aren’t great. Clearing them even less so.
  • I miss the permanent Android back button in the bottom left. It seems every back button in the system is different. Some screens, but crucially not all, allow you to swipe left to go back. I bet this is an issue on the 6+.
  • I’ve missed this small form factor. Imagine if they removed all the bezels to make the screen larger; I bet they could fit a 4.5-inch screen without an increase in size.
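
About that default-browser item: as far as I can tell, apps like Digg Reader pull the workaround off by rewriting links to use Chrome’s URL scheme. Here’s a rough sketch in Swift of what I assume the mechanism looks like; the function names are mine, not Digg’s:

    import UIKit

    // A sketch of the assumed mechanics, not Digg's actual code:
    // Chrome for iOS registers the googlechrome:// and googlechromes://
    // URL schemes, so an app can rewrite an http(s) link and hand it off.
    func chromeURL(for url: URL) -> URL? {
        switch url.scheme {
        case "http":  return URL(string: "googlechrome" + url.absoluteString.dropFirst(4))
        case "https": return URL(string: "googlechromes" + url.absoluteString.dropFirst(5))
        default:      return nil
        }
    }

    // Offer Chrome when it's installed; otherwise fall back to Safari.
    // (From iOS 9 on, canOpenURL also requires whitelisting the schemes
    // under LSApplicationQueriesSchemes in Info.plist.)
    func open(link: URL) {
        if let chrome = chromeURL(for: link), UIApplication.shared.canOpenURL(chrome) {
            UIApplication.shared.open(chrome)
        } else {
            UIApplication.shared.open(link)
        }
    }

It’s a hack, and it only works for browsers that expose a scheme like this, which is exactly why a real default-browser setting would be nicer.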

Switching to iPhone for a bit

I’ve been a fan of Google’s products ever since I switched from AltaVista. So it felt like a natural fit to get an Android device back in the day, when it was time for me to upgrade from my dumbphone, and I’ve been using Android ever since. I wrote about ecosystems a while ago, and the ecosystem is exactly what’s kept me there: you sign in to your phone with your Google account, and mail, calendar, notes, contacts, and photos sync automatically. Also, there’s a really great maps application.

In my day job I make web apps that have to work on mobile first, and iOS is an important platform for me to know. Now, I’ve used iOS for years — it’s the phone I bought for my wife and recommended to my dad. We also have an iPad, and I’ve used an iPhone for testing for years. I’m no stranger to how things work there. But I feel like something special happens when you make a conscious switch to a platform and make it your daily driver. Phones have become such utterly personal devices; they’re always with us and we invest ourselves in them. Unless I jump in fully, I have a feeling there’s some bit I’m missing.

So starting today I’m an iPhone user. No, I wouldn’t call this a switch — call it a “soak test”. I fully expect to switch back to Android — I’m actually eyeing a Moto X 2014. That is, unless the experience of investing myself fully in the iPhone is so compelling that I have no desire to go back, which is entirely possible. I won’t know unless I give it a proper test, and since I’m in the fortunate position to be able to make this switch, there’s no good reason not to. I’ll be using my white iPhone 5C testing device. I expect to be impressed by the camera. I expect to enjoy the jank-free fluidity of the OS, even if I expect to turn off extraneous animation. I’m curious whether I’ll enjoy the homescreen and its lack of customizability compared to Android, and I can’t wait to see if the sliding keyboards in the App Store are as good as they are on Android. I should have some experiences to share on this blog in a month or so. Let me know any apps you want me to try!

Archive, Don’t Delete

I’m one of the lucky … actually I have no idea how many or few have Google Inbox. In any case, I was graciously sent an invite, and have been using it on the web and on my Android phone since then. I love almost everything about it. I particularly love the fact that Inbox seems to be able to divine what archetype an email has. Is it spam? Don’t show it to me. Is it travel-related? Bundle it up. Same with purchases, social network notifications, promos, etc. It even does a good job of prioritizing each bundle, and only showing notifications when it thinks it’s urgent — configurable of course. It’s pretty great.

I don’t love how hard it is to delete an item. You have to dive deep into an overflow menu on a particular email to find the “Trash” button. I wish it were more easily accessible — I don’t know, man, I guess I’m a deleter. I remember buying a 320 MB hard drive called “Bigfoot” because it was so humongous, but even then I had to manage my space in order to fit everything. So I can’t help but feel like this is a generational issue, and I’m now a relic of the past. It had to happen eventually, and I’m getting a really strong vibe that the ceremonial burial of the trash button was very much intentional. It’s behaviorism: teaching you not to delete, because archiving is faster and safer.

The crux of the Inbox app is its embrace of the idea that an email is a task. This runs contrary to the popular notion that you should keep those two paradigms as separate as you can, so it’s very interesting to see Google leaning into it. Combined with their concept of “bundles”, I think they make it work.

Let’s walk through it: it’s Monday morning and you’ve just arrived at the office to open up your email. You received a couple of promos from Spotify and Amazon in one bundle, an unbundled email from mom, 9 bundled Facebook notifications, and two shipping notifications in a bundle. The one email worth looking at is immediately obvious, so you can either tap “Done” on the “Promos”, “Purchases”, and “Social” bundles to end up with only that one email, or you can pin mom’s email and tap the “Sweep” button. Everything but the email that needs your attention is archived and marked “Done”, and it took seconds.

This is how Inbox is supposed to work. You archive tasks you’re done with, you don’t delete. If something important did happen to be in one of the tasks you quickly marked done, it’s still there, accessible via a quick search. If you get a lot of email, I really do believe that embracing Inbox will take away stress from your daily life. All it asks is that you let go of your desire to manage your archive. You have to accept that there are hundreds of useless Facebook notification emails in your archive, emails you’d previously delete. It’s okay, they’re out of sight, out of mind, and no you won’t run out of space because of them. Checking 9 boxes and then picking the delete button, as opposed to simply clicking one “Done” button — the time you spend adds up, and you need to let go.

I know this. I understand this. As a web designer myself, I think there are profound reasons for hiding the delete button. It’s about letting machines do the work for you, so you can do more important things instead, like spending time with your family. It’s the right thing to do. And I’m not quite ready for it yet. Can I have the trash button be a primary action again, please, Google?

Atheism is not a religion

Every once in a while, the topic of religion (or the lack thereof) comes up in discussion among my friends and me. I often try to explain what atheism is, or rather what it isn’t, and almost like clockwork comes the response: “sounds a lot like religion.” It’s an innocent statement, but it also means my explanation failed yet again. It’s a rousing topic full of nuance and passion, no matter the religion, agnosticism, or atheism of the participants. And it fascinates me so, because it’s supposed to be simple! After all, it’s just semantics:

atheism, noun
disbelief or lack of belief in the existence of God or gods

religion, noun
the belief in and worship of a superhuman controlling power, especially a personal God or gods.

Clearly just by looking at the dictionary, one seems incompatible with the other. All the delicious nuance stems from the fact that the term “god” is part of both definitions.

Quick intermezzo before we get into the weeds: I have many friends of a multitude of different religions, people whom I love and respect deeply. I’m not here to take anyone’s faith away. This is not about whether religion is a force for good or not; there are far more intelligent debates to be had elsewhere. I just like discussing semantic nuances.

What makes it so difficult to pin down is the fact that atheism is really just a word in the dictionary. We’re not even very protective of such words, so we change their meanings from time to time. New information comes to light! The term evolves and mutates and comes to include more meaning still. Looking broadly, though, the definition of atheism forks in two general directions. One direction has it defined mainly as a disbelief in a god or gods, while the other considers it a lack of belief in a god or gods. Did you catch the difference between the two? It’s quite subtle, yet substantial.

Disbelief means you believe there are no gods. You’ve put two and two together and decided, hey — it just doesn’t make sense to me. This is unlike religion in a couple of obvious ways, first of all the fact that there’s no holy text that describes what you must or must not believe. There’s no promise of an afterlife, or lack thereof, if you don’t, err, not believe in god. There’s no codex of laws you have to follow to be a “true” atheist. And there are no places you can go to meet other atheists to, uh, not pray with. (Actually, you can still say a prayer if you want to; it’s not like The Atheist Police come knocking on your door if you do.)

The absence of belief, on the other hand, is a bit trickier to pin down. If for whatever reason you never learned about god, well, then you are without belief in god. How could you believe in something you’ve never heard of? Take my daughter, for instance. She’s 3, and has only been talking for the past year or so. I don’t think anyone has told her about religion, not that I know of at least. So she is, by definition, without belief in god. Literally atheos — Greek for “without god(s)”. It wasn’t her choice; how could she even make one? I’m not even sure she’d understand what I was talking about if I tried — she’d probably ask for her juicebox and crayons. From this perspective, being an atheist is, in many ways, a default position. It’s what you’re born as. Even if you later in life find solace and happiness in religion, until you found that religion you were, for all intents and purposes, an atheist. There’s no shame in that; it’s just a word.

I half expect some readers (thanks for reading 737 words about semantics, by the way) to ask me: why so defensive, are you sure you’re not describing a religion? Sure, once in a while you’ll encounter someone who takes their atheism so seriously it borders on being a religious experience for them. But that’s fine; they can call themselves atheists too. It’s not like you get a badge at the door. Atheism isn’t organized behind a hashtag, and it’s not about ethics in games journalism.

You are an atheist until you choose not to be, and there’s room for all of us.