The real reason Android is (and has always been) in trouble

Over on Daring Fireball, John Gruber links to a Business Insider piece by Jay Yarow, called “Android Is Suddenly in a Lot of Trouble.”

Gruber responds:

It’s not that Android is suddenly in a lot of trouble — it’s that a lot of people are suddenly realizing that Android has been in trouble all along.

Exactly. But he doesn’t go on to mention why it’s been in trouble all along (though as I recall, he has in the past). I’ve seen plenty of reports, like this one from comScore showing that iPhones use WiFi networks significantly more than Android phones in the U.S. and U.K. This is one way of measuring the qualitative differences between how people use iPhones and how they use Android phones. You could also talk about app revenue, for instance.

All of these measurements and analyses point to one clear conclusion, especially when one considers how people end up walking out of a store with either an iPhone or an Android phone. Carriers are pushing Android because they can control the experience more. They’re giving away Android phones as stock upgrade models when customers’ contracts come up. People who don’t even care about owning a “smartphone” are bringing home Android phones because that’s just what the sales rep at the store recommended.

Android is in trouble because a lot of its users (the majority? the vast majority?) are just using it as a phone. It’s a commodity. A lot of the people buying it don’t really know or care what it is, and will never actively use its full potential. It’s just a phone. It may be capable of much more, but if it’s not being used for more, what difference does that make?

People who go into a store wanting to purchase a smartphone predominantly choose the iPhone. Not all of them, of course. Tech-savvy people do choose other smartphone platforms, including Android, especially those who want to tinker with the system. But the rest take whatever they are told to buy by their carriers’ sales reps.

This is the biggest reason Android tablets haven’t taken off, and it’s been discussed too. There’s a built-in market for the apathetic purchase of an Android smartphone. But no one (well, I hope) is walking into a cellular carrier’s store and saying “I want a tablet. What tablet do you recommend?” People who want a tablet don’t just want a tablet; overwhelmingly they want an iPad. Most people who don’t want an iPad don’t want a tablet at all. (Almost) everybody needs a phone.

The problem for the carriers, and the reason they’ve been promoting Android, has typically been that Apple retains too much control (from the carriers’ perspective) over the iPhone. That’s not likely to change, but with Windows Phone, suddenly the carriers have other options. Microsoft is definitely keeping a tighter rein on Windows Phone than Google does with Android, but it still gives the carriers options they don’t get with the iPhone. (Not that this lack of control has prevented them from selling millions of the things.)

If Verizon is serious about pushing Windows Phone (and given that they still sell huge numbers of iPhones), then we’ll soon begin to see just how Android was, as Gruber says, in trouble all along. The success it has achieved to date was largely dependent upon carriers pushing it on unsuspecting or indifferent customers. If they stop doing that…

Morning cup o’ links

Perhaps it would have been better to make a sausage analogy for these links, rather than a coffee-and-sausage one. But since one of the links is to a post written by Marco Arment, coffee seems appropriate. (Then again, a Google search reveals that I am far from the first person to use the phrase “morning cup o’ links” so maybe I should spend less time worrying about it being a non sequitur and instead worry that I am horribly unoriginal.)

Each morning I start the day by perusing the latest on Twitter and my RSS feeds, and I almost always find something interesting to read. But today was more interesting than most, and simply retweeting the links didn’t seem adequate. Also, some of these links may become topics for discussion on this week’s episode of The Undisciplined Room, so this is your homework.

First up, we have a post on The Verge discussing homeless hotspots at SXSW. This is a topic I’ve been reading about for the past few days, but this post was the first that made me think beyond my gut reaction that this was shameless exploitation.

Next, with an HT to Daring Fireball, and via Marco Arment, we have a look at Curator’s Code and why it’s a bad idea. The evidence has been mounting for me that Maria Popova’s 15 minutes of (borrowed) fame are almost over (especially when I’m reminded of her love of Ayn Rand and Malcolm Gladwell), and Marco helps solidify that thought.

Then we have type designer Mark Simonson (who designed the Proxima Nova font that I use in the Room 34 logo and branding materials) discussing font anachronisms in The Artist. As much as I enjoyed The Artist, issues with the fonts it used (especially straight quotes, and the fact that it used fonts in a lot of places where hand lettering would have been more appropriate) distracted even me, so I can’t imagine what it must be like for someone like Mark Simonson or Chank Diesel. (Full disclosure: I did development work on Chank’s mobile website.)

And finally… Chicago musician and multi-talent Joshua Wentz has just announced the release of the Side 2 EP by Absinthe and the Dirty Floors, one of the many musical projects with which he’s involved. He’s also made a video for each song on the EP, like this:

Mac App Store “sandboxing”: perspective of a long-time Mac “power user”

In the web world I reasonably qualify for the title of “developer” as I spend most of my work day writing server- or client-side code. In the iOS world I nominally qualify as a “developer” in that I have paid the $99 fee to join Apple’s iOS developer program, although I have yet to do anything with it besides download early betas of iOS 5 that I never even bothered to install because I don’t have an extra device to experiment with. But in the Mac world, I am nothing more than a “power user” at best. Aside from a couple of Automator scripts (which do not count), I’ve never created a Mac application and doubt I ever will.

Therefore I’ve kept a healthy distance from Mac app developers’ criticisms of the ongoing “iOS-ification” of the Mac, specifically where the Mac App Store is concerned. I don’t use most of the iOS-like features in Mac OS X Lion, and probably won’t in Mountain Lion either, but I do use the Mac App Store, occasionally. I think it works pretty well. It’s actually gotten me to buy some apps — games mostly (of course), plus a couple of utilities like Space Gremlin (yes, that’s a utility, not a game) — for the Mac that I probably would neither have known about nor bothered to pay money for otherwise.

But there are some things that, as a power user, I recognize as problems with the Mac App Store, specifically concerning the “sandboxing” requirement that will take effect on March 1. Essentially, in order to be carried in the Mac App Store, an app cannot access the Mac’s file system except in very limited ways dictated by the OS. It’s a huge step towards making the Mac behave like iOS, which is mainly beneficial in two ways: it improves security by reining in potentially malicious applications, and it simplifies the user experience for novice computer users in a tried-and-true way.

It may be reasonable to question at least one, if not both, of those benefits, however. Regarding the latter, I suspect that most would-be Mac owners who have too much difficulty understanding the Mac interface, and who would supposedly be better served by making the Mac more iOS-like, would really be better served by not buying a Mac at all, and just getting an iPad instead. Apple sold more iOS devices last year than it has sold Macs in the company’s entire history, so that seems like a no-brainer. For a large and growing majority of its customers, Apple is the company that makes iPhones and iPads, not the company that makes Macs.

But it’s the former issue — the security-enhancing measure known as “sandboxing” — that is more troublesome for established Mac users like me. A lot of Mac apps need unfettered access to the file system to do what they do. Especially for those of us “power users” who write code — for the web or otherwise — we need applications that not only have full access to the file system, but that can even let us see “invisible” files. (I always have Mac OS X set to show invisible files, since I frequently need to work with .htaccess files, for instance. Did you know that any filename that begins with a period is automatically hidden by the OS by default? It’s an old Unix thing, designed to help keep users from accidentally deleting critical system files.)
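The dotfile convention mentioned in that parenthetical is easy to see for yourself from the Terminal. Here’s a minimal sketch (the temp directory name is arbitrary; the Finder `defaults` key in the trailing comment is the commonly cited one, included here as an aside rather than something from this post):

```shell
# Demonstrate the Unix convention described above: a plain `ls`
# skips any filename that begins with a period.
mkdir -p /tmp/dotfile-demo
cd /tmp/dotfile-demo
touch visible.txt .htaccess

ls        # lists visible.txt only
ls -a     # lists .htaccess as well (plus . and ..)

# On Mac OS X, Finder honors the same convention; the widely used
# way to make it show hidden files is:
#   defaults write com.apple.finder AppleShowAllFiles -bool true
#   killall Finder
```

Note that the files are all still there either way; hiding happens in the listing tools, not in the file system itself.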

A real Mac app developer has weighed in on the problems posed by the sandboxing requirement for his company’s app, SourceTree, and how, as a result, Atlassian has had to remove SourceTree from the Mac App Store.

Fortunately, unlike with iOS, the Mac App Store is not the only way to get applications onto your Mac, and I think it’s a reasonable assumption (but by no means guaranteed) that the Mac will always allow power users to do things the “old fashioned way” and directly install applications that do not conform to the App Store’s strict requirements.

But a look at the features of the upcoming Mountain Lion version of OS X gives pause. Clearly Apple wants to move the Mac more and more in an iOS direction, and clearly even beyond the growing sibling relationship between the two platforms, just on the Mac itself, Apple is putting a lot of effort into streamlining and simplifying things, making things on the Mac work, as Gruber said yesterday, “closer to how things should be rather than simply how they always have been.”

But how things “should” be is subjective, especially when it comes to what “real” applications on a “real” computer can reasonably be expected by “power users” to be “able” to “do.” (Quotation marks indicate uncertainty.)

There’s been a lot of talk over the past few years about how we’re entering into a “post-PC” era. With the proliferation of smartphones and tablets (90-some percent of the latter being iPads, of course), new interaction methods and increasingly mobile-first web experiences, what is to become of the trusty old PC (Mac or Windows)? What surprises me most in all of this is that Apple and Microsoft are leading the charge to turn traditional computers into post-PC devices. Both Mac OS X Mountain Lion and Windows 8 are radically diverging from their traditional interfaces into new directions inspired by their mobile siblings.

In some ways this is an exciting and fascinating time. For about two decades now, Mac OS and Windows have been very similar, presenting nearly identical ways of interacting with computers. But iOS and Windows Phone (whose Metro interface is coming to Windows 8) are very different from each other. The next standard UI has not yet been established. We haven’t seen this kind of competition and variety since the early 1980s, before Microsoft’s desktop OS dominance was established.

But it’s not the early 1980s. We’re not living in a time of unprecedented invention and discovery. We have nearly 30 years of GUI-based computing experience under our collective belts, and the conventions of standard GUI interfaces (as represented mainly by Windows and Mac OS X) have become expectations of the “power users” who rely on them not just as an alternative form of entertainment while sitting on the couch in the evening, but to do real work — creative, technical, business-oriented work.

That’s not going to change overnight as February passes into March. And it may not change significantly in our lifetimes. But if Apple aggressively restricts what applications in the Mac App Store can do, in ways that prevent users from getting work done, while simultaneously pushing the Mac App Store as the primary (and eventually only) way to install applications on a Mac, they will effectively kill the Mac as a tool of creative professionals, which is pretty much the only thing that kept Apple alive through the ’90s in the first place. Sure, Apple doesn’t really need us now (and it would still have us anyway… I’m not giving up my iPhone anytime soon), but making the Mac too much like iOS doesn’t necessarily make it better. It just makes it… unnecessary.

Fiddly

I got up this morning and, like on most mornings, one of the first things I did was brush my teeth. It’s a simple process, just part of the minutiae of daily life. But as with so many of those little things we do every day, it’s a less-than-ideal experience. After fumbling to pull the toothbrush from the cup — where its bulbous, rubberized handle was wedged against the bulbous, rubberized handles of the other toothbrushes necessary for a household of four — and nearly dumping them all into the sink along the way, I took my frustration to Twitter:

It got me thinking about a recent post on Daring Fireball, where John Gruber expressed his frustrations that some people — even Apple Store “geniuses” — were telling iPhone owners that they need to occasionally force-quit all of the apps in their recently-used items tray. He followed up on that post on his podcast, The Talk Show, where he described the experience of operating systems where you are expected to manually monitor and adjust their states as being “fiddly.”

I’ve been thinking about that word, “fiddly,” a lot since then. I think it applies to a lot more than smartphone OSes. I’ve spent a great deal of my life dealing with overwhelming frustration at the clumsiness, the fiddliness, of everyday objects: cheap plastic toys that break easily, things that stick to other things when they shouldn’t or don’t when they should, tools that cannot adequately perform the tasks they are expressly intended for, etc.

As someone who’s not inclined to tinker with objects, much less invent solutions to their shortcomings, that frustration usually just burns off as simmering rage. But as I pondered the nature of fiddliness, and the ideal of the iPhone as a “non-fiddly” object, a couple of thoughts occurred to me:

1. It is the purpose of design to reduce the fiddliness in the world.

2. Very few makers of physical objects today follow #1.

Gary Hustwit’s documentary Objectified is focused on the design of everyday objects, and those who have excelled at creating objects that are, for lack of a better word, as non-fiddly as possible. Two people featured in the documentary are Dieter Rams, the legendary German designer who led Braun’s industrial design team in the 1950s and 1960s, and Jonathan Ive, the head of Apple’s industrial design team today. Both Rams and Ive share a passion for making objects that work. Form not only follows function, form is function. It’s a seamless integration of purpose and style that makes the objects a delight to use.

And that’s a very rare thing today, indeed.

iPad: Son of Newton

There’s some buzz going around concerning Apple’s new iPad commercial and its similarity to one Apple produced for the Newton two decades ago. Though I’m not the first to comment on this, I have a few thoughts of my own, so here goes…

First, let’s watch both commercials. I did not remember this (apparently) “classic” (in John Gruber’s words) ad for the Newton:

Now, watch Apple’s new iPad ad:

Wow. Homage indeed. I doubt very many people remember the Newton commercial, but the iPad commercial is stunningly similar. This had to be deliberate, but I’m wondering what exactly that deliberateness is supposed to mean.

Well, I’ll tell you this: watching the two ads back-to-back, I’m left feeling that a) the Newton really was way ahead of its time, and b) the Newton ad seems like one of those futuristic concept videos Apple (among other computer makers) seemed to love producing in the 1980s.

Newton was a vision of the future. iPad is the reality. That Newton actually became a shipping product says a lot about Apple’s ability to realize its vision (compared to the long line of never-to-be-made concepts that have come from Microsoft over the years, most recently… well… this). But the Newton was too far ahead of its time. Then again, it ushered in the PDA era, which ushered in the “smartphone” era, which led to the iPhone and now the iPad. So maybe Apple was really seeding (if you’ll pardon the pun) its own future with the Newton.

There are two key lines that for me define the difference between the two ads:

“Newton can receive a page and sends faxes and, soon, electronic mail.”

“(iPad is) 200,000 apps and counting. All the world’s websites in your hands.”

Granted, paging and faxing were still relevant technologies when the Newton was released, but they were already doomed, and the best Apple could say was that “soon” Newton could handle “electronic mail” (even then, using a soon-to-be-antiquated term). In contrast, the iPad hits the ground running, leveraging the existing success of the iPhone, and with forward momentum for future technologies. Newton was about what could be, but iPad is.