Form over Frolic: Jony Ive’s quest for boring perfection

Right now I’m sitting in front of a 27″ iMac. It’s the best computer I’ve ever owned, with a 5K display, high color gamut, 24 gigs of RAM and 512 gigs of SSD storage. It’s beautiful and minimalist, just like every iMac they’ve released since they switched to aluminum in 2007.

It’s also the least modifiable desktop computer I’ve ever owned. This trend also goes back to that aluminum iMac, in which—like today’s—only the RAM is user-upgradeable. (Since 2012, even that’s no longer true of the smaller 21″ iMac.) It’s hard not to ask: why is thinness the priority in all of Apple’s designs?

You know the answer: Jony Ive. It’s clear by now that he would like everything Apple produces to look as close to a pure pane of glass as he can make it, with minimal, unadorned metallic frames, as close to unbroken and symmetrical as functionality allows. And Ive’s team is perfectly willing to sacrifice functionality in pursuit of this goal. A female Lightning port is fractionally thinner than a female USB-C port, and now you know why the iPhone will never get USB-C ports. Sorry. You’re lucky the one-port MacBook’s one port isn’t a Lightning port. (I have it on good authority that was under consideration.)

This often gets portrayed as a choice between staying chained to legacy hardware and forging ahead to the future. But if you were using Macs a decade ago, do you remember the way the power indicator light on a Mac, both desktop and laptop, used to slowly pulse when it was asleep, as if it were breathing? Or the way batteries on laptops, both replaceable and permanent, used to let you check charge levels without turning on or waking up the machine? Or, as recently as last year, the way power plugs changed color to show charging state? All of that—along with the illuminated Apple logo and, now, the cheerful startup chime—has gone away.

All the price of progress, right?

A couple years ago, Shawn Blanc published a book about “how to make good things great” called Delight is in the Details. That phrase captures an essential paradox: we want our products to stay out of our way in everyday use, yet products that convert us from merely satisfied customers to fans have little touches that call attention to themselves in just the right way. When I start my Mazda, its display lights up with the words “Zoom Zoom” for just a few seconds. It’s stupid, but after six years it still makes me smile.

“Little touches that call attention to themselves” are the opposite of Ive’s guiding aesthetic. He creates beautiful objects you can appreciate as works of art. You can’t help but marvel at the lengths to which his team will go to make a perfect fusion of glass and metal, to craft UIs that appear to directly manipulate data, to make the hardware disappear while you’re using it. Under Ive’s direction, Apple delivers work that comes closer to the science fiction future than anything from any other major consumer electronics company. And yet his designs are relentlessly whimsy-free. There won’t be a moment that catches you off-guard and makes you smile. Ive’s work never aspires to make you giggle with delight.

Software doesn’t escape his penchant for austerity, either. The Ive era of software UX has been about flattening, removing, relentlessly stamping out skeuomorphism. The “traffic light” window controls are just circles now; the swirling barber pole progress bars are simple blue, with a subtle pulse; we don’t even get the little puff of smoke when we pull icons off the dock. I’m surprised the iOS icons still jiggle-dance when they’re in rearrangement mode. I’m not sure that it’s fair to say that we’re seeing a software analog to Apple’s quest for thinness, but I’m not sure it isn’t, either.

I’d hardly be the first one to complain about a perceived drop in software and UX quality, or to question whether Apple’s being a little too aggressive in dropping legacy ports. Yet it feels like that’s always been part of the deal, right? We’re taking away the floppy drive, or only giving you these weird USB ports, or sealing the battery in, but look at how cool we can make this thing now! It’s not like anything else on the market. It’s fun.

This iMac is the best computer I’ve ever owned, but nothing about it screams fun. The quirkiest thing about it is my mechanical keyboard, something Apple would never dream of making on their own these days. (So gauche.)

“Yes, but you keep talking about the Mac line. The future is in iOS!” Despite revealing myself in past posts as a Mac partisan, I think this is not only true but, overall, good. I’m a fan of that science fiction future, and it’s not one in which I see many people sitting down in front of 27″ monitors and keyboards for their computing needs—even if the monitors are holographic and the keyboards aren’t physical.

But man, talk about the “pure pane of glass” ideal, right?

The argument Apple is implicitly making is that computers—especially the computers of the future that the iPad typifies—are appliances. Appliances can be beautiful, but they shouldn’t exhibit frippery. They should be focused. We should prefer the Kitchen-Aid stand mixer to the plastic knockoff that does twice as much at half the price, because it won’t do any of those things well and it’ll fall apart in a year. (Besides, you can do all those things with the Kitchen-Aid, anyway; you’ll just need to buy some dongles.)

That’s all true. Maybe Ive knows best. But if you showed me a table with an iPad Pro, a Surface Pro, and a Surface Book on it and asked me to rank them in order of Cool Factor, I’d be hard-pressed to put the iPad at the head of the line. Microsoft isn’t trying for tiny-quirk delight, which is just as well (“It looks like you’re trying to add personality to your UX! Can I help?”), but they’re sweating small, thoughtful details. Apple sweats the details of manufacturing processes. That’s great, but it’s not the same thing.

Maybe—just maybe—a little frippery is okay, even if it adds a half-millimeter in depth to a product, or adds a touch of (gasp) skeuomorphism to the UI here and there, or allows a slightly less restrained, tasteful pigment on the anodized aluminum case. Injecting a bit of fun, even weirdness, to their computers in the late ’90s helped pull Apple back from the brink. It may be time for another injection.

Being Kitchen-Aid is a fine goal, but you know what? They sell that stand mixer in nearly three dozen colors.


Originally published at Hacker Noon.

Bubbles, baseball and Mr. Marsh

On deep stories and ugly truths in American politics

When I went to high school in Florida, I was in one of the last years that had to take a class called “Americanism vs. Communism.”

I had the dumb luck to get a teacher named Bob Marsh, an iconoclastic sixty-something motorcyclist and science fiction fan who told those of us who were also sf fans about his friend Joe Haldeman. While there’s a common joke I hear even from today’s high school students about American history classes ending at World War II, we learned about the Cold War, about Korea, about Vietnam. We learned about Castro and Khrushchev and Mao, but also about Watergate and COINTELPRO and HUAC. It was, improbably, a pretty good class.

Despite the name, the class’s story wasn’t about how Americanism, whatever that was, opposed communism. It was about how liberal democracy opposed authoritarianism.

That sense of “liberal” has gotten conceptually muddled over the years, particularly in post-war America. (Call it “classical liberalism” if you must, although that phrase is even more easily coopted.) This is the point: Bernie Sanders and Paul Ryan might not agree on much, but Ryan has never, to the best of my knowledge, advocated for a return to monarchy; Sanders has never once suggested outlawing private industry. They would both agree that the individual liberty and representative democracy thing is, on the whole, a pretty good idea. They are both pretty firmly standing for liberal democracy and against authoritarianism. That’s a foundational ideal of America. We’ve failed to hit it a lot through our history, but we’ve done better than a lot—a lot—of other countries.

The name “Americanism vs. Communism,” though, tells us another story, a story that’s been pervasive in America in the post-World War II era. This story tells us that if we want to oppose authoritarianism, we need only worry about “the left.” It doesn’t tell us that “the right” has its own kinds of authoritarians. To some people, it even implies that Nazis were socialists (it’s right in the name), and that fascists were liberals.

The name “Americanism vs. Communism” tells us, maybe, to let down our guard.


On John Gruber’s podcast “The Talk Show,” guest Merlin Mann said of the 2016 presidential election: “It’s not that my team didn’t win. It’s that maybe I just don’t understand baseball anymore.”

Merlin and I went to the same small Florida college at more or less the same time. (We all totally knew he was going to be a professional podcaster.) I’m pretty sure he also took an AvC class. We probably share a roughly similar, and from appearances similarly inadequate, understanding of baseball.

Before the election we were inundated with think pieces about how “the left” was wildly misinterpreting the appeal of nationalist populism. No no no, we were told, it’s not racism and misogyny and homophobia. It’s the rage, the deep story, the message to people who felt not merely that they were being left behind but that “the elites” were letting other people “cut in line” ahead of them on the way to the American Dream. We’re still constantly hammered with the idea that if you’re in a city you’re in a bubble, if you’re liberal you’re in a bubble, that we just need to get out of that bubble and listen to real, non-bubble America.

The deep story may be about all that. But it’s also about how gay marriage devalues “real” marriage. How letting transgender folk use public bathrooms puts “real” men and women in danger. How we should watch, register and deport immigrants and build a wall around our borders. The racism and misogyny and homophobia isn’t incidental. It’s not a byproduct. The deep story is about tribalism.

Here’s an ugly truth: some of the country doesn’t believe that America belongs to people who aren’t in their tribe. That tribe is white, straight (at least openly), and Christian. It’s gotten bigger over the years—it didn’t used to include the Irish, or Italians, or Catholics, or women—but every inch of expansion has been fought, bitterly and grudgingly. Other tribes can live in America, maybe, but theirs comes first, and everyone else is here at their sufferance.

Another ugly truth is this: some of the country considers not just welfare, not just social programs, but basic justice and legal protection to be a zero-sum game. Her marriage means less if you can get married. The sign on the restroom door means less if you can go through it. The police are here to protect me from you. And God knows I don’t want my tax dollars to go to the likes of you.

The third ugly truth is this: those people are in power now.

Despite my sarcastic streak, I’m a natural optimist. I’m not going to claim there’s much of a silver lining here, though. I do believe that the oft-maligned millennials—and even we Generation Xers—will pull us back on track. I don’t think this is the end of the great American experiment, that representative democracy is at its end, that America is doomed to become a mashup of The Handmaid’s Tale and Idiocracy.

But I think it’s going to get worse before it gets better. And I don’t know how much worse.

I wonder what Mr. Marsh would have said about all this, back in that Americanism vs. Communism class. I think he might say the problem isn’t bubbles. It’s not who’s listening to whom in the present. It’s who’s listening to the past. America has always been at its worst when we’re encouraged to turn against one another, and at its best when we move toward ensuring that liberty and justice truly is for all.

I think he might also say this. Liberal democracies can vote themselves into authoritarianism. Voting themselves back out is much harder.

A digression on cyberpunk

I was listening to Fangs and Fonts’ most recent podcast on cyberpunk, and—

Okay, let me back up. “Fangs and Fonts” is a podcast about writing, hosted by four writers in (and out of) furry fandom: Roland Ferret, Yannarra, Tarl “Voice” Hoch and Ocean. So far the episodes have mostly come across as structure-free conversations about a given topic. There’s a lot of spontaneity and liveliness to it, although I suspect they’d benefit from spending an hour or so before recording making a list of Things To Cover.

Anyway. While it was fun listening in on the conversation, my impression was that none of the four hosts had read much of the genre past the Wikipedia entry. They’d seen movies with cyberpunk tropes to varying degrees, but… well. There’s no way to say this without an implied tsk tsk, but you guys, it’s a writing podcast!

So let me back up more. Specifically, to the early ’80s.

It’s pretty easy to discern the cyber of cyberpunk: ubiquitous networking that lets us always be “jacked in” to a sprawling virtual reality while also letting corporations and governments, which all too often seem to be one and the same, track us for a variety of purposes both nefarious and benign. But what about the punk? Well, it meant… punk. An anti-authoritarian, alienated and disaffected counterculture that neither fits in with the dominant culture nor has much interest in doing so. Heroes in cyberpunk weren’t necessarily punks—or necessarily heroic—but they tended to have a very edge-of-society vibe.

The problem with focusing almost exclusively on the cinematic aspect of cyberpunk is that you miss that the “punk” element wasn’t just in the relationship of the fictional characters to their settings. It was in the relationship of the writers to mainstream science fiction. William Gibson, Bruce Sterling, Pat Cadigan, Rudy Rucker—they were very deliberately responding to the more utopian, and by then rather dated, science fiction of the 1950s and before, the Heinleins and Clarkes and Asimovs.

Thematically, cyberpunk had little in common with the “New Wave” of science fiction of the late 1960s and early 70s, the works of Michael Moorcock and Thomas Disch and J. G. Ballard—but stylistically, it’s a closer relative. As Moorcock wrote rather snippishly, science fiction lacked “passion, subtlety, irony, original characterization, original and good style, a sense of involvement in human affairs.”

When we think about cyberpunk today, we mostly think about the visual trappings, the stuff that does look good on film—but those weren’t at all the point. A lot of these stories and novels were, in oft-bitter and backhanded fashion, deeply socially conscious. They had shit to say, and what’s more, they wanted to say it beautifully. The opening sentence of Neuromancer has become one of the most famous in science fiction:

The sky above the port was the color of television, tuned to a dead channel.

Of course, the New Wave of science fiction doesn’t seem much less dated now than what it railed against, and the same is true of cyberpunk. (Bruce Sterling declared it dead as long ago as 1985.) Its aesthetics were assimilated into the mainstream long ago, and the very act of mainstreaming guts any legitimate claims to counterculture, rather like buying a Dead Kennedys tee at Hot Topic. But its treatment of technology, of dystopia, of the contentious relationship between individual freedom, corporate power and state control had a tremendous and lasting influence on science fiction. Yes, cyberpunk might have said “put on this black overcoat and these sunglasses and you’ll look awesome, trust me,” but it also said “science fiction is relevant not just as meditations on our possible long-term futures, but as a mirror to show us things about our here and now.”

And that’s stuff that—mostly—didn’t come along for the ride when the glasses and overcoats went to Hollywood.

If you’re interested in reading more seminal cyberpunk stuff, here are a few things worth investigating:

  • Cyberpunk: Stories of Hardware, Software, Wetware, Evolution, and Revolution, an anthology that includes work by Sterling, Gibson, Cadigan, and others. Since Sterling’s original anthology Mirrorshades is long out of print, this is pretty much the collection to get for a broad overview of the important writers.
  • Neuromancer, William Gibson. Poetic and bleak, with memorable characters and a striking take on artificial intelligence, this was the first novel to win the Hugo, Nebula and Philip K. Dick Awards, and has been cited as one of the most important novels—not just science fiction novels, but novels, period—of the twentieth century’s latter half.
  • Snow Crash, Neal Stephenson. The main character is Hiro Protagonist, who delivers pizza for the Mafia in an armored supercar as his day job and is a warrior in the Metaverse by night. Either you already want to read it or I can’t help you. (The Metaverse as depicted here was a pretty direct influence on Second Life.)

For movies/TV, there are some interesting ones the Fangs and Fonts folks touched on—especially Ghost in the Shell, The Matrix (the first one) and, of course, Blade Runner—but I’d also suggest the underrated Strange Days, directed by Kathryn Bigelow (The Hurt Locker), and—if you can find it—the short-lived cult TV series “Max Headroom.”

On reviewing in a small community

So here’s the thing: bad reviews are fun.

Sure, good reviews can be fun, too. But let’s face it—stuff you hate gives you more occasion for zingers. Roger Ebert opened his review of one infamous movie with “Battlefield Earth is like taking a bus trip with someone who has needed a bath for a long time.” (My favorite review opener, though, is from Mary Pols of TIME: “More than 24 hours has passed since I watched the new Adam Sandler movie Jack and Jill and I am still dead inside.”)

But a good review can’t be just zingers, and the point of a review is not to show off how witty the reviewer is. Ebert explained—without rancor—just what it is that made “Battlefield Earth” suck. He didn’t accuse the movie of being an assault on all that is good and holy; the movie’s creators and stars needed a thick skin to deflect the barbs, but they weren’t personal attacks. No one was writing, say, “This is a steaming pile of shit.”

“That may be vulgar, but it’s not a personal attack.”

Well, see, that’s kinda the heart of the matter.

When you’re just talking trash to your friends about something, you can get away with that defense. In a printed or filmed review, saying that becomes considerably nastier. And if that review isn’t of a movie but is of something that a single person created—like a book—the review is personal, because the work is personal.

I’ve avoided mentioning the reviewer—and specific review—that inspired this, but if I reveal that it’s a furry story, some of you may quickly guess both. When it comes to writers and publishers, this is still a small community. The phrase “steaming pile of shit” comes from that review, as does the assertion that the book under review somehow “tricked” the reviewer into thinking it would be good, except that it really isn’t. It tricked him! Then he recovered from its evil spell and realized it was shit. Shit shit shit shit shit. (I suspect I’m undercounting the number of “shits” he used.)

Even without knowing the book in question, chances are you’re already thinking gosh, even if this is self-published fanfic spewed out by a fever-gripped teenager who left no comma unspliced, you’re making the reviewer sound a little unhinged. Well, he comes across as a little unhinged. To some degree that’s clearly a schtick, but it’s still startlingly vicious.

This is, in fact, a book I saw in draft form. It’s well-written. You could definitely make the case—as the reviewer did, with stentorian profanity—that the protagonist isn’t sympathetic. Neither is the influence character. (He’s charismatic, but not sympathetic.) They’re both con men. They make bad choices. I wanted to slap both of them at multiple points. Some readers might genuinely hate both main characters.

[Image: comic from The Oatmeal]

But a badly written book—a “steaming pile of shit,” to wit—would hardly be powerful enough to make anyone angry with it. Whether you like a character or a setting has little to do with the quality of the work. The problem isn’t that this is a negative review. It’s that it’s an unfair review.

I mentioned before that the furry writing community is small, and bluntly, it’s small enough that this edges past merely irritating toward flat-out irresponsible. I doubt it’s going to hurt this particular book’s author, but public viciousness can be genuinely damaging at the scale we’re still at. Also, keep in mind reviewers earn—and lose—reputation currency as well. Authors and publishers do talk. And I can assure you I’m not the only one who’s saying, “Hey, can you believe this guy thinks this is an appropriate way to review a book?”

Let me underline that I’m not suggesting we never say negative things. Furry truly needs good criticism to advance, and we have a history of denying glaring problems in work by our community. But good criticism is well-reasoned. It distinguishes between “this has objective problems in its storytelling” and “this story just isn’t my cup of tea.”

And if you really don’t think something has any redeeming value at all—whether it’s competently written but just makes you want to pluck out your eyeballs, or it really is self-published fanfic spewed out by a fever-gripped teenager who left no comma unspliced—then you need to stop and ask yourself what your intention is in reviewing it. I’m betting the honest answer is “I want to mock this so everyone can laugh at my witty zingers, and I can be a capital-P Personality.”

If so, my advice is don’t do it. Because your review will probably be a steaming pile of shit.

(Nothing personal.)

Why a raccoon? Who cares?

I’ve been watching and silently judging another flareup of a very old debate in anthropomorphic fandom, this time happening on Twitter and in the comments on Flayrah’s review of Roar #4. Does there need to be a justification for having characters in a story be animal people?

We can illustrate the divide with two comments. First, from “Crossaffliction,” in his usual diplomatic style:

For the record, the entire premise of “let’s make talking animals the main characters and leave that completely unexplained in [a] way that offers nothing to the story” is a flawed premise, and also makes people wonder why you’d do that (we suspect the writer has some sort of freaky weird animal people fetish, and, you know what, we’d probably [be] right in this case).

Then “Sparf”:

I agree with some of the most prolific writers in our fandom when I say that we have moved on past that need. This is furry fiction as a meta genre unto itself. Every story does not need an explanation of where furries came from or why they exist. If it is germane to the story being told, sure, it can be revealed in the narrative, but usually it is trite or feels wedged in.

While I can quibble with both (if a genre is “meta” it definitionally doesn’t exist “unto itself,” and at times Crossie has an unhealthy hangup about the fandom’s unhealthy hangups), I have three observations:

  1. These actually aren’t two sides of the same argument. One position is that it’s a flaw in a story to use anthropomorphic animals for no reason more profound than “I like them, neener neener.” The other position is that there’s no need to explain why a story has anthropomorphic animals. These positions aren’t mutually exclusive.
  2. In fact, they’re probably both right.
  3. The most prolific writers in our fandom have moved past the need to keep rehashing this argument. Christ on a pogo stick.

My own old story “A Gift of Fire, a Gift of Blood” features a vampire bat; the story would be substantively different were she a vampire in the Dracula sense. Yet there’s no real explanation for why the Empire of Ranea (the world that story and others of mine are set in) has fox, wolf and cat people instead of dwarves, elves and hobbits. Effin’ hobbits, how do they work? Tolkien has a lot of unexplained hobbits running around, yet as impossible as this may be to believe, the story seems to work anyway!

When we say that a furry story is better when it’s not “humans in fursuits,” what does that actually mean? Take Kyell Gold’s Out of Position. Few if any aspects of Dev and Lee’s story require them to be tiger and fox. Their world is our world, just with animal people. This is about as “because furries” as one can get. Yet we’re shown a myriad of subtle ways in which their animal nature changes things: ergonomics, body language, stereotypes, laws, social mores. You could tell an equivalent story with an all-human cast, but it would be a different story.

“Why is she a raccoon?” That’s not an interesting question. “How do you show how her being a raccoon matters?” That’s an interesting question.