Form over Frolic: Jony Ive’s quest for boring perfection

Right now I’m sitting in front of a 27″ iMac. It’s the best computer I’ve ever owned, with a 5K display, wide color gamut, 24 gigs of RAM and 512 gigs of SSD storage. It’s beautiful and minimalist, just like every iMac they’ve released since they switched to aluminum in 2007.

It’s also the least modifiable desktop computer I’ve ever owned. This trend also goes back to that aluminum iMac, in which—like today’s—only the RAM is user-upgradeable. (Since 2012, even that’s no longer true of the smaller 21.5″ iMac.) It’s hard not to ask: why is thinness the priority in all of Apple’s designs?

You know the answer: Jony Ive. It’s clear by now that he would like everything Apple produces to look as close to a pure pane of glass as he can make it, with minimal, unadorned metallic frames, as close to unbroken and symmetrical as functionality allows. And Ive’s team is perfectly willing to sacrifice functionality in pursuit of this goal. A female Lightning port is fractionally thinner than a female USB-C port, and now you know why the iPhone will never get USB-C ports. Sorry. You’re lucky the one-port MacBook’s one port isn’t a Lightning port. (I have it on good authority that was under consideration.)

This often gets portrayed as a choice between staying chained to legacy hardware and forging ahead to the future. But if you were using Macs a decade ago, do you remember the way the power indicator light on a Mac, both desktop and laptop, used to slowly pulse when it was asleep, as if it were slowly breathing? Or the way batteries on laptops, both replaceable and permanent, used to let you check charge levels without turning on or waking up the machine? Or, as recently as last year, the way power plugs changed color to show charging state? All of that—along with the illuminated Apple logo and, now, the cheerful startup chime—has gone away.

All the price of progress, right?

A couple years ago, Shawn Blanc published a book about “how to make good things great” called Delight is in the Details. That phrase captures an essential paradox: we want our products to stay out of our way in everyday use, yet products that convert us from merely satisfied customers to fans have little touches that call attention to themselves in just the right way. When I start my Mazda, its display lights up with the words “Zoom Zoom” for just a few seconds. It’s stupid, but after six years it still makes me smile.

“Little touches that call attention to themselves” are the opposite of Ive’s guiding aesthetic. He creates beautiful objects you can appreciate as works of art. You can’t help but marvel at the lengths to which his team will go to make a perfect fusion of glass and metal, to craft UIs that appear to directly manipulate data, to make the hardware disappear while you’re using it. Under Ive’s direction, Apple delivers works which are closer to the science fiction future than any other major consumer electronics company. And yet his designs are relentlessly whimsy-free. There won’t be a moment that catches you off-guard and makes you smile. Ive’s work never aspires to make you giggle with delight.

Software doesn’t escape his penchant for austerity, either. The Ive era of software UX has been about flattening, removing, relentlessly stamping out skeuomorphism. The “traffic light” window controls are just circles now; the swirling barber pole progress bars are simple blue, with a subtle pulse; we don’t even get the little puff of smoke when we pull icons off the dock. I’m surprised the iOS icons still jiggle-dance when they’re in rearrangement mode. I’m not sure that it’s fair to say that we’re seeing a software analog to Apple’s quest for thinness, but I’m not sure it isn’t, either.

I’d hardly be the first one to complain about a perceived drop in software and UX quality, or to question whether Apple’s being a little too aggressive in dropping legacy ports. Yet it feels like that’s always been part of the deal, right? We’re taking away the floppy drive, or only giving you these weird USB ports, or sealing the battery in, but look at how cool we can make this thing now! It’s not like anything else on the market. It’s fun.

This iMac is the best computer I’ve ever owned, but nothing about it screams fun. The quirkiest thing about it is my mechanical keyboard, something Apple would never dream of making on their own these days. (So gauche.)

“Yes, but you keep talking about the Mac line. The future is in iOS!” Despite revealing myself in past posts as a Mac partisan, I think this is not only true but, overall, good. I’m a fan of that science fiction future, and it’s not one in which I see many people sitting down in front of 27″ monitors and keyboards for their computing needs—even if the monitors are holographic and the keyboards aren’t physical.

But man, talk about the “pure pane of glass” ideal, right?

The argument Apple is implicitly making is that computers—especially the computers of the future that the iPad typifies—are appliances. Appliances can be beautiful, but they shouldn’t exhibit frippery. They should be focused. We should prefer the KitchenAid stand mixer to the plastic knockoff that does twice as much at half the price, because the knockoff won’t do any of those things well and it’ll fall apart in a year. (Besides, you can do all those things with the KitchenAid, anyway; you’ll just need to buy some dongles.)

That’s all true. Maybe Ive knows best. But if you showed me a table with an iPad Pro, a Surface Pro, and a Surface Book on it and asked me to rank them in order of Cool Factor, I’d be hard-pressed to put the iPad at the head of the line. Microsoft isn’t trying for tiny-quirk delight, which is just as well (“It looks like you’re trying to add personality to your UX! Can I help?”), but they’re sweating small, thoughtful details. Apple sweats the details of manufacturing processes. That’s great, but it’s not the same thing.

Maybe—just maybe—a little frippery is okay, even if it adds a half-millimeter in depth to a product, or adds a touch of (gasp) skeuomorphism to the UI here and there, or allows a slightly less restrained, tasteful pigment on the anodized aluminum case. Injecting a bit of fun, even weirdness, into their computers in the late ’90s helped pull Apple back from the brink. It may be time for another injection.

Being KitchenAid is a fine goal, but you know what? They sell that stand mixer in nearly three dozen colors.


Originally published at Hacker Noon.


Bubbles, baseball and Mr. Marsh

On deep stories and ugly truths in American politics

When I went to high school in Florida, I was in one of the last years that had to take a class called “Americanism vs. Communism.”

I had the dumb luck to get a teacher named Bob Marsh, an iconoclastic sixty-something motorcyclist and science fiction fan who told those of us who were also sf fans about his friend Joe Haldeman. While there’s a common joke I hear even from today’s high school students about American history classes ending at World War II, we learned about the Cold War, about Korea, about Vietnam. We learned about Castro and Khrushchev and Mao, but also about Watergate and COINTELPRO and HUAC. It was, improbably, a pretty good class.

Despite the name, the class’s story wasn’t about how Americanism, whatever that was, opposed communism. It was about how liberal democracy opposed authoritarianism.

That sense of “liberal” has gotten conceptually muddled over the years, particularly in post-war America. (Call it “classical liberalism” if you must, although that phrase is even more easily co-opted.) The point is this: Bernie Sanders and Paul Ryan might not agree on much, but Ryan has never, to the best of my knowledge, advocated for a return to monarchy; Sanders has never once suggested outlawing private industry. They would both agree that the individual liberty and representative democracy thing is, on the whole, a pretty good idea. They are both pretty firmly standing for liberal democracy and against authoritarianism. That’s a foundational ideal of America. We’ve failed to hit it a lot through our history, but we’ve done better than a lot—a lot—of other countries.

The name “Americanism vs. Communism,” though, tells us another story, a story that’s been pervasive in America in the post-World War II era. This story tells us that if we want to oppose authoritarianism, we need only worry about “the left.” It doesn’t tell us that “the right” has its own kinds of authoritarians. To some people, it even implies that Nazis were socialists (it’s right in the name), and that fascists were liberals.

The name “Americanism vs. Communism” tells us, maybe, to let down our guard.


On John Gruber’s podcast “The Talk Show,” guest Merlin Mann said of the 2016 presidential election: “It’s not that my team didn’t win. It’s that maybe I just don’t understand baseball anymore.”

Merlin and I went to the same small Florida college at more or less the same time. (We all totally knew he was going to be a professional podcaster.) I’m pretty sure he also took an AvC class. We probably share a roughly similar, and from appearances similarly inadequate, understanding of baseball.

Before the election we were inundated with think pieces about how “the left” was wildly misinterpreting the appeal of nationalist populism. No no no, we were told, it’s not racism and misogyny and homophobia. It’s the rage, the deep story, the message to people who felt not merely that they were being left behind but that “the elites” were letting other people “cut in line” ahead of them on the way to the American Dream. We’re still constantly hammered with the idea that if you’re in a city you’re in a bubble, if you’re liberal you’re in a bubble, that we just need to get out of that bubble and listen to real, non-bubble America.

The deep story may be about all that. But it’s also about how gay marriage devalues “real” marriage. How letting transgender folk use public bathrooms puts “real” men and women in danger. How we should watch, register and deport immigrants and build a wall around our borders. The racism and misogyny and homophobia isn’t incidental. It’s not a byproduct. The deep story is about tribalism.

Here’s an ugly truth: some of the country doesn’t believe that America belongs to people who aren’t in their tribe. That tribe is white, straight (at least openly), and Christian. It’s gotten bigger over the years—it didn’t used to include the Irish, or Italians, or Catholics, or women—but every inch of expansion has been fought, bitterly and grudgingly. Other tribes can live in America, maybe, but theirs comes first, and everyone else is here at their sufferance.

Another ugly truth is this: some of the country considers not just welfare, not just social programs, but basic justice and legal protection to be a zero-sum game. Her marriage means less if you can get married. The sign on the restroom door means less if you can go through it. The police are here to protect me from you. And God knows I don’t want my tax dollars to go to the likes of you.

The third ugly truth is this: those people are in power now.

Despite my sarcastic streak, I’m a natural optimist. I’m not going to claim there’s much of a silver lining here, though. I believe that the oft-maligned millennials—and even we Generation Xers—will pull us back on track. I don’t think this is the end of the great American experiment, that representative democracy is at its end, that America is doomed to become a mashup of The Handmaid’s Tale and Idiocracy.

But I think it’s going to get worse before it gets better. And I don’t know how much worse.

I wonder what Mr. Marsh would have said about all this, back in that Americanism vs. Communism class. I think he might say the problem isn’t bubbles. It’s not who’s listening to whom in the present. It’s who’s listening to the past. America has always been at its worst when we’re encouraged to turn against one another, and at its best when we move toward ensuring that liberty and justice truly is for all.

I think he might also say this. Liberal democracies can vote themselves into authoritarianism. Voting themselves back out is much harder.

A digression on cyberpunk

I was listening to Fangs and Fonts’ most recent episode, on cyberpunk, and—

Okay, let me back up. “Fangs and Fonts” is a podcast about writing, hosted by four writers in (and out of) furry fandom: Roland Ferret, Yannarra, Tarl “Voice” Hoch and Ocean. So far the episodes have mostly come across as structure-free conversations about a given topic. There’s a lot of spontaneity and liveliness to it, although I suspect they’d benefit from spending an hour or so before recording making a list of Things To Cover.

Anyway. While it was fun listening in on the conversation, my impression was that none of the four hosts had read much of the genre past the Wikipedia entry. They’d seen movies with cyberpunk tropes to varying degrees, but… well. There’s no way to say this without an implied tsk tsk, but you guys, it’s a writing podcast!

So let me back up more. Specifically, to the early ’80s.

It’s pretty easy to discern the cyber of cyberpunk: ubiquitous networking that lets us always be “jacked in” to a sprawling virtual reality while also letting corporations and governments, which all too often seem to be one and the same, track us for a variety of purposes both nefarious and benign. But what about the punk? Well, it meant… punk. An anti-authoritarian, alienated and disaffected counterculture that neither fits in with the dominant culture nor has much interest in doing so. Heroes in cyberpunk weren’t necessarily punks—or necessarily heroic—but they tended to have a very edge-of-society vibe.

The problem with focusing almost exclusively on the cinematic aspect of cyberpunk is that you miss that the “punk” element wasn’t just in the relationship of the fictional characters to their settings. It was in the relationship of the writers to mainstream science fiction. William Gibson, Bruce Sterling, Pat Cadigan, Rudy Rucker—they were very deliberately responding to the more utopian, and by then rather dated, science fiction of the 1950s and before, the Heinleins and Clarkes and Asimovs.

Thematically, cyberpunk had little in common with the “New Wave” of science fiction of the late 1960s and early 70s, the works of Michael Moorcock and Thomas Disch and J. G. Ballard—but stylistically, it’s a closer relative. As Moorcock wrote rather snippishly, science fiction lacked “passion, subtlety, irony, original characterization, original and good style, a sense of involvement in human affairs.”

When we think about cyberpunk today, we mostly think about the visual trappings, the stuff that does look good on film—but those weren’t at all the point. A lot of these stories and novels were, in oft-bitter and backhanded fashion, deeply socially conscious. They had shit to say, and what’s more, they wanted to say it beautifully. The opening sentence of Neuromancer has become one of the most famous in science fiction:

The sky above the port was the color of television, tuned to a dead channel.

Of course, the New Wave of science fiction doesn’t seem much less dated now than what it railed against, and the same is true of cyberpunk. (Bruce Sterling declared it dead as long ago as 1985.) Its aesthetics were assimilated into the mainstream long ago, and the very act of mainstreaming guts any legitimate claims to counterculture, rather like buying a Dead Kennedys tee at Hot Topic. But its treatment of technology, of dystopia, of the contentious relationship between individual freedom, corporate power and state control had a tremendous and lasting influence on science fiction. Yes, cyberpunk might have said “put on this black overcoat and these sunglasses and you’ll look awesome, trust me,” but it also said “science fiction is relevant not just as meditations on our possible long-term futures, but as a mirror to show us things about our here and now.”

And that’s stuff that—mostly—didn’t come along for the ride when the glasses and overcoats went to Hollywood.

If you’re interested in reading more seminal cyberpunk stuff, here are a few things to potentially investigate:

  • Cyberpunk: Stories of Hardware, Software, Wetware, Evolution, and Revolution, an anthology that includes work by Sterling, Gibson, Cadigan, and others. Since Sterling’s original anthology Mirrorshades is long out of print, this is pretty much the collection to get for a broad overview of the important writers.
  • Neuromancer, William Gibson. Poetic and bleak, with memorable characters and a striking take on artificial intelligence, this was the first novel to win the Hugo, Nebula and Philip K. Dick Awards, and has been cited as one of the most important novels—not just science fiction novels, but novels, period—of the twentieth century’s latter half.
  • Snow Crash, Neal Stephenson. The main character is Hiro Protagonist, who delivers pizza for the Mafia in an armored supercar as his day job and is a warrior in the Metaverse by night. Either you already want to read it or I can’t help you. (The Metaverse as depicted here was a pretty direct influence on Second Life.)

For movies/TV, there are some interesting ones the Fangs and Fonts folks touched on—especially Ghost in the Shell, The Matrix (the first one) and, of course, Blade Runner—but I’d also suggest the underrated Strange Days, directed by Kathryn Bigelow (The Hurt Locker), and—if you can find it—the short-lived cult TV series “Max Headroom.”