Feed housekeeping

I finally remembered that I’d been using FeedPress for, er, feed stuff (remember RSS?) on the original Coyote Tracks, and I’ve updated it to pick up the feed from the new site instead. So if you were one of the couple hundred people who’d been reading posts that way, hi!

Since the new blog merges what was Coyote Tracks and my rarely-updated writing blog, Coyote Prints, you may get more than what you want here. If you only want tech posts, you can subscribe to the tech category feed. If you only want writing posts, you can subscribe to the writing category feed. (You can get a feed for any of the categories by going to the category page itself, if you really insist.)

And, if you’re following this as a link from Goodreads (hi?), note that the Goodreads blog is only pulling from the writing category.

Last but not least, if you’re subscribed to the Tumblr feed directly somehow (tracks.ranea.org/rss), you’re going to get what’s cross-posted to Tumblr, which is going to be mostly tech but probably kind of random.

Cotton, hay, and rags: giving bias the veneer of rationality

As you’ve surely heard by now, a mid-level engineer at Google—he’s anonymous, so I’ll call him Mr. Rationalface—wrote a memo called “Google’s Ideological Echo Chamber” in which he argued that “differences in distributions of traits between men and women may in part explain why we don’t have 50% representation of women in tech and leadership. Discrimination to reach equal representation is unfair, divisive, and bad for business.” (His words, not mine.) In response, recently former Google engineer Yonatan Zunger wrote the simply-titled “About this Googler’s manifesto,” in which he argues it’s manifest bullshit. (My words, not Zunger’s).

Between the time I started writing this and now, news has come out that Mr. Rationalface has been fired. I’ll come back to that.

I’ve been thinking about responses I saw on Hacker News to Zunger’s piece. The most common defense of Mr. Rationalface’s thesis was to restate its core premise: This whole drive for diversity rests on the premise that there’s no difference between men and women, but the falsehood of that is apparent to even the most casual of observers.

This is a common rhetorical trick I see in this particular corner of the internet (i.e., rationalists who want to rationally prove that PC SJW WTFery is irrational): restate the opposing premise incorrectly, then commence a full frontal assault on the restatement. Of course there are biological differences between men and women; who claimed otherwise? Mr. Rationalface proceeds from here to assert the following totally objective, non-sexist truths:

  • Women are more open toward feelings and aesthetics, while men are more open to ideas.
  • Women have more empathy than men, while men have more interest in systematizing.
  • Women are gregarious and agreeable; men are assertive!
  • Women are more neurotic, with higher anxiety and lower stress tolerance.
  • Women are irrational, that’s all there is to that! Their heads are full of cotton, hay, and rags!

Whoops! While the first four are from Mr. Rationalface, that last bullet point was from noted academic rationalist Henry Higgins.

A fairer way to state the “pro-diversity” case is more like, some perceived differences between men and women used to justify associating higher-paying professions with men are rooted in dubious stereotypes. And we can test whether there’s prima facie evidence for that by looking at the actual history of software engineering. In the early days, it was women’s work: it was seen as more like filing and typing than math and logic—the hard stuff was the hardware. By the mid-1970s, though, it was men’s work, yet the work itself hadn’t changed. What changed was the perception of the work: society started to consider it high-status white-collar work rather than low.

I know that—irony of ironies—I’m trying to rationally analyze an argument that is, at its heart, not about rationality at all. It’s about reclaiming ground in the Great Culture War. If the gender disparity in the engineering workforce at Google reflects something broken in their culture, it demands a solution that involves taking action one might call “affirmative.” PC! SJW! Cthulhu fhtagn! So don’t even allow the possibility that the problem is in the culture. If the problem isn’t in the culture, it must be in women. The solutions offered must involve working with and around Essential Feminine Nature.

But it’s the argument style that leaves me fascinated, the same style employed by many of his defenders, and a style that echoes through GamerGate, the Sad Puppies and other geeky outposts in the Great Culture War. If I may engage in some stereotyping myself, it’s an argument style beloved of folks who are mostly white, mostly male, mostly under 30, and mostly a little too sure of their razor-sharp logic. I don’t think this kind of guy gets redpilled because of deep-rooted anxiety over losing white male privilege—I think they get redpilled because it’s just effin’ cool to be told you’re one of the few people smart enough to see reality as it is, rather than buying into the conventional wisdom that traps all the other sheeple. This is why so many fringers, from anti-vaxxers to white supremacists, construct elaborate, nearly-logical theories built on a stack of unexamined premises. This is obvious to the most casual of observers, so let’s move on, they say, while the rest of us sheeple are making the time-out signal and saying wait, what?

Isn’t it obvious when premises are false? Isn’t this willful—and malicious—ignorance? Sometimes. If we’re honest with ourselves, more often than not. But the more boxes you tick on the cis-het-white-male line, the more advantages you get for no actual work on your part. You have, if I might be so bold, a rational self-interest in supporting arguments that those advantages are immutable nature, and attacking arguments that they’re uncomfortably squishy social constructs. To paraphrase Upton Sinclair, “It is difficult to get a man to understand something when his social status depends on his not understanding it.”

So about Mr. Rationalface’s firing. If I were his manager, would I have canned him? I admit I’m not comfortable with hey, it’ll only chill the speech we don’t want; you can’t know that only the “right” group of people will take away exactly the message you intend to send. (Exhibit A: Hacker News.) But as Yonatan Zunger noted, a substantial number of Mr. R’s (former) coworkers were likely furious; he might as well have scrawled Does Not Work Well With Others, Especially Wimmen across his face with a Sharpie. From a—dare I say it—coldly rational standpoint, Google HR gets a firestorm no matter what, but keeping him risks a second, bigger firestorm when he shoots his mouth (or text editor) off to a coworker again.

I looked back at Hacker News briefly on the day of his firing and saw, well, what I expected. This is an outrage! This proves all the author’s points! This was not the anti-diversity manifesto the SJWs are claiming it is, it’s a well-written, polite, logical argument! It definitely had the appearance of logic, and it was debatably civil. But from its mischaracterization of the “pro-diversity” arguments through its “you’d agree with me if bias wasn’t blinding you to my truth” conclusion, it was precisely what its critics claimed it was. It’s easy to say Mr. Rationalface lost his job for not kowtowing to liberal groupthink, but sometimes a burning bridge is just a burning bridge.

Two Painkillers

The Painkiller is a semi-classic tiki drink. I say semi- because one glance will tell you it’s a pretty close relative of the Piña Colada. With all respect to Rupert Holmes of “Escape” fame, the Piña Colada is kind of loathed by tiki aficionados. (From what I gather, it’s loathed by Holmes, too, who sadly quipped, “No matter what else I do, my tombstone will be a giant pineapple.”) But for some reason, the tiki gods have smiled on the Painkiller.

So, curiously, have the trademark lawyers: Pusser’s, a sort of funky “British navy style” rum from the Virgin Islands, has trademarked Painkiller and insists that it must be made with their rum. (Never mind that the original recipe from the Soggy Dollar Bar actually predates Pusser’s.) Here’s their recipe:


1 oz. coconut cream
1 oz. orange juice
4 oz. pineapple juice
2–4 oz. Pusser’s rum

Shake with crushed ice and pour unstrained into a sufficiently large glass. Garnish with grated nutmeg and a cinnamon stick.

So: stop pretending it isn’t a piña colada with orange juice. A relatively funky, dry navy rum brings a different flavor profile to the party than low-end Bacardi, but this is not a “spirit-forward” drink. (Unless you double the rum and make a “Painkiller 4,” which is an idea of dubious wisdom.) Also, despite Pusser’s attempted rewrite of pop culture history here, their rum wasn’t used in the original recipe, which was—as far as I’ve been able to tell—a mix of dark rums from Barbados and Jamaica.

As it turns out, though, screwing with the recipe is harder than it looks. I wanted to cut back on the pineapple juice to reduce the inherent Colada-ness, but cutting back on the sweetness too much took it too far from its roots. After a few attempts, I decided the simplest route was the best: slice the pineapple juice content in half, leaving the coconut cream and orange juice at their original levels. This makes the drink more like a refined cousin of its orange-free relative.

Also, why not get back to the original rum mix? It’d be more authentic, blending rums is fun, and screw Pusser’s trademark. So. In my version, I’ve gone with R. L. Seale’s 10 Year for the Barbados. For the Jamaican, I chose Smith & Cross: like Pusser’s, it’s “navy style,” meaning it’s a bit funky. (It’s also a rather formidable 57% ABV; the Seale’s is a respectable 43%. I’m just saying, do not do a “Painkiller 4” version of mine unless you want to make friends with the floor.) At my local Total Wine & More, the Seale’s was $24 and the Smith & Cross was $27; if you’d like a cheaper pairing, go with Mount Gay Eclipse and Appleton Estate Signature Blend.

Don’t skimp on the juices—use the freshest you can find. This is good advice for all drinks. I used Coco Lopez for the coconut cream, which seems to be the one most tiki bars use; I don’t know how much of a difference other brands make, but don’t use coconut milk. They’re not the same thing.

Coyote Painkiller

1 oz. coconut cream
1 oz. fresh orange juice
2 oz. pineapple juice (not from concentrate)
1 oz. R. L. Seale’s 10 Year rum
1 oz. Smith & Cross rum

Shake with cracked ice and pour unstrained into a double Old Fashioned glass. Add more ice if necessary, stir, and garnish with grated nutmeg and cinnamon.

A new home

Over the years, I’ve ended up with multiple “presences” online:

  • The original Coyote Tracks, hosted at Tumblr
  • “Coyote Prints,” an attempt at a writing news-ish weblog, generated with Jekyll
  • My Ranea.org website, made with a hacky homebrew static site generator
  • The occasional foray onto Medium

That’s not even inclusive of earlier attempts at this, like a LiveJournal and, before that, a very simple bloggy thing that worked by putting files with names like 1999-01-01-entry.txt in a specific directory that were picked up by a small PHP script. (That was back in the days when PHP was just used to embed bits of interactivity in HTML pages, just like that, which is something it’s pretty good at. I’m pretty sure I was doing that in early 1998, which by some measure might make me one of the earliest bloggers, or would if there had been just one damn person reading my home page.)
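For the curious, that mechanism is simple enough to sketch. Here’s an illustrative re-creation in Python (the original was a PHP script I no longer have; the names `render_blog` and `ENTRY_PATTERN` are mine, not anything from it): scan a directory for files named like `1999-01-01-entry.txt` and emit them as HTML, newest first.

```python
# Illustrative sketch of a dated-file blog: entries live in one directory
# as plain-text files named like 1999-01-01-entry.txt, and the script
# lists them newest-first as a simple HTML fragment.
import html
import re
from pathlib import Path

# Filenames must start with an ISO date, then a slug, then .txt.
ENTRY_PATTERN = re.compile(r"^(\d{4}-\d{2}-\d{2})-.+\.txt$")

def render_blog(entry_dir):
    """Return an HTML fragment of all entries in entry_dir, newest first."""
    entries = []
    for path in Path(entry_dir).iterdir():
        match = ENTRY_PATTERN.match(path.name)
        if match:
            entries.append((match.group(1), path.read_text()))
    parts = []
    # Sorting the (date, body) tuples in reverse puts newer dates first.
    for date, body in sorted(entries, reverse=True):
        parts.append(f"<h2>{date}</h2>\n<p>{html.escape(body.strip())}</p>")
    return "\n".join(parts)
```

The appeal of the scheme is that the filesystem is the database: no admin interface, no markup beyond what you type, and adding a post is just dropping a file in a directory.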

While this hodgepodge of bloglike objects had good intentions—separation of concerns, trying new platforms, keeping up with the cool kids—it’s become too unwieldy. The decision where to post is sometimes kind of arbitrary. Many of the people who read about my writing are interested in tech; while the reverse isn’t as true, I’d actually kinda like to expose some of my tech audience to my writing, especially stories that involve techy things.

A bigger concern, though, comes down to fully controlling my own content.

This isn’t a new concern; Marco Arment was writing about owning your identity back in 2011. Some blogging services let you bring your own domain—Tumblr does it for free, which is why you go to tracks.ranea.org instead of chipotle.tumblr.com—and others, like WordPress.com, let you do it for a modest charge. Medium makes it possible, but only for publications (and at a fairly high cost); many other services don’t offer this at all.

So: Welcome to coyotetracks.org.

But while owning your online identity is necessary, it’s not sufficient: you need to own your content, too. I don’t mean that in a legal sense—despite the headless chicken dance the internet goes through every time somebody changes their legal boilerplate, no reputable service ever has tried or ever will try to steal your copyright. I mean it in an existential sense.

I still like Tumblr, despite its foibles, but as far as I know it was never profitable on its own, it was never profitable for Yahoo, and it’s on track to never be profitable for Verizon. As for Medium, I love what it’s trying to do, or maybe I love what it was trying to do last business model and not so much now, or maybe vice-versa, or maybe it was three or four business models ago. What other businesses call pivots, Medium calls Tuesdays.

I’ll circle back to that, but the upshot is that I decided I needed a POSSE: “publish (on your) own site, syndicate elsewhere.” (Look, I didn’t make it up.) And that brings me to…WordPress.

I’ll be blunt: I don’t like WordPress. Internally it’s a dumpster fire, full of arcanely formatted non-OO code, bloated HTML, and a theming engine designed by bipolar squirrels.

So I looked at other things. I know there are ways to make static site generators quasi-automatic, and that Matt Gemmell swears it’s faster to blog from his iPad with Jekyll. I’ve done it, with a system not too dissimilar from the one he describes. It works, but I don’t love it. I’m comfortable at a shell prompt, but I don’t want one to be necessary for blogging, especially if I’m on an iPad. (I’m moving back to the Mac for portable writing, but that’s another post.)

I also looked at Ghost, which started with some fanfare a couple years ago as a modern take on WordPress that focused back on blogging essentials rather than shoehorning in a content management system. Now they’re a “professional publishing platform,” and all their messaging is we are not for you, casual blogger, pretty much the opposite of their original ideology.

But I can publish to WordPress right from Ulysses. Or MarsEdit. Or the WordPress web interface, desktop app, or iOS app. The WordPress API is, at least for me, a killer feature. And its ecosystem is unmatched: I have access to thousands of plugins, at least six of which are both worth using and actively maintained.

So: I’m still finding my way. I’ve added a cross-poster which can theoretically post everywhere I want, although I’m not sure if I’m going to use its Medium functionality—I want to be able to vet what it’s posting before it goes live there, so I’ll probably just use Medium’s post importer. And I don’t want to syndicate everything everywhere: I want to syndicate selectively. (This post probably won’t even go to Medium, for instance.)

The semi-ironic footnote: I don’t know if this is really going to make me post more, when all is said and done. I’ve always been guilty of being more interested in building things than running them. But we’ll see.


My first full-length novel, Kismet, was published by Argyll Productions in January 2017. Here’s the back cover blurb:

The River: a hodgepodge of arcologies and platforms in a band around Ceres full of dreamers, utopians, corporatists—and transformed humans, from those with simple biomods to the exotic alien xenos and the totemics, remade with animal aspects. Gail Simmons, an itinerant salvor living aboard her ship Kismet, has docked everywhere totemics like her are welcome…and a few places they’re not.

But when she’s accused of stealing a databox from a mysterious wreck, Gail lands in the crosshairs of corporations, governments and anti-totemic terrorists. Finding the real thieves is the easy part. To get her life back, Gail will have to confront a past she’s desperate not to face—and what’s at stake may be more than just her future.

Kismet takes place about ten years after the short story “Tow” and shares the same protagonist. While I aimed for a hard science fiction feel, it’s a character-driven story about identity, transhumanism and what defines home. It came out of the CSSF Novel Writers Workshop, a residential workshop at the University of Kansas led by Kij Johnson, so it has a pretty good pedigree. I could tell you that if you like Kim Stanley Robinson’s work or Jennifer Foehner Wells’ Fluency, you’ll probably like Kismet. I could also tell you that its high concept is “The Expanse meets Zootopia,” which is not entirely wrong.

You can get previews of the ebook from most of those links, but you can also get a four-chapter preview PDF on my “buy my books” page.

The fantastic cover art and design is by Teagan Gavet. (The featured art here is a clip from the book’s alternate cover.)

Form over Frolic: Jony Ive’s quest for boring perfection

Right now I’m sitting in front of a 27″ iMac. It’s the best computer I’ve ever owned, with a 5K display, high color gamut, 24 gigs of RAM and 512 gigs of SSD storage. It’s beautiful and minimalist, just like every iMac they’ve released since they switched to aluminum in 2007.

It’s also the least modifiable desktop computer I’ve ever owned. This trend also goes back to that aluminum iMac, in which—like today’s—only the RAM is user-upgradeable. (Since 2012, even that’s no longer true of the smaller 21″ iMac.) It’s hard not to ask: why is thinness the priority in all of Apple’s designs?

You know the answer: Jony Ive. It’s clear by now that he would like everything Apple produces to look as close to a pure pane of glass as he can make it, with minimal, unadorned metallic frames, as close to unbroken and symmetrical as functionality allows. And Ive’s team is perfectly willing to sacrifice functionality in pursuit of this goal. A female Lightning port is fractionally thinner than a female USB-C port, and now you know why the iPhone will never get USB-C ports. Sorry. You’re lucky the one-port MacBook’s one port isn’t a Lightning port. (I have it on good authority that was under consideration.)

This often gets portrayed as a choice between staying chained to legacy hardware and forging ahead to the future. But if you were using Macs a decade ago, do you remember the way the power indicator light on a Mac, both desktop and laptop, used to slowly pulse when it was asleep, as if it were slowly breathing? Or the way batteries on laptops, both replaceable and permanent, used to let you check charge levels without turning on or waking up the machine? Or, as recently as last year, the way power plugs changed color to show charging state? All of that—along with the illuminated Apple logo and, now, the cheerful startup chime—has gone away.

All the price of progress, right?

A couple years ago, Shawn Blanc published a book about “how to make good things great” called Delight is in the Details. That phrase captures an essential paradox: we want our products to stay out of our way in everyday use, yet products that convert us from merely satisfied customers to fans have little touches that call attention to themselves in just the right way. When I start my Mazda, its display lights up with the words “Zoom Zoom” for just a few seconds. It’s stupid, but after six years it still makes me smile.

“Little touches that call attention to themselves” are the opposite of Ive’s guiding aesthetic. He creates beautiful objects you can appreciate as works of art. You can’t help but marvel at the lengths to which his team will go to make a perfect fusion of glass and metal, to craft UIs that appear to directly manipulate data, to make the hardware disappear while you’re using it. Under Ive’s direction, Apple delivers works which are closer to the science fiction future than any other major consumer electronics company. And yet his designs are relentlessly whimsy-free. There won’t be a moment that catches you off-guard and makes you smile. Ive’s work never aspires to make you giggle with delight.

Software doesn’t escape his penchant for austerity, either. The Ive era of software UX has been about flattening, removing, relentlessly stamping out skeuomorphism. The “traffic light” window controls are just circles now; the swirling barber pole progress bars are simple blue, with a subtle pulse; we don’t even get the little puff of smoke when we pull icons off the dock. I’m surprised the iOS icons still jiggle-dance when they’re in rearrangement mode. I’m not sure that it’s fair to say that we’re seeing a software analog to Apple’s quest for thinness, but I’m not sure it isn’t, either.

I’d hardly be the first one to complain about a perceived drop in software and UX quality, or to question whether Apple’s being a little too aggressive in dropping legacy ports. Yet it feels like that’s always been part of the deal, right? We’re taking away the floppy drive, or only giving you these weird USB ports, or sealing the battery in, but look at how cool we can make this thing now! It’s not like anything else on the market. It’s fun.

This iMac is the best computer I’ve ever owned, but nothing about it screams fun. The quirkiest thing about it is my mechanical keyboard, something Apple would never dream of making on their own these days. (So gauche.)

Yes, but you keep talking about the Mac line. The future is in iOS! Despite revealing myself in past posts as a Mac partisan, I think this is not only true but, overall, good. I’m a fan of that science fiction future, and it’s not one in which I see many people sitting down in front of 27″ monitors and keyboards for their computing needs—even if the monitors are holographic and the keyboards aren’t physical.

But man, talk about the “pure pane of glass” ideal, right?

The argument Apple is implicitly making is that computers—especially the computers of the future that the iPad typifies—are appliances. Appliances can be beautiful, but they shouldn’t exhibit frippery. They should be focused. We should prefer the KitchenAid stand mixer to the plastic knockoff that does twice as much at half the price, because the knockoff won’t do any of those things well and it’ll fall apart in a year. (Besides, you can do all those things with the KitchenAid, anyway; you’ll just need to buy some dongles.)

That’s all true. Maybe Ive knows best. But if you showed me a table with an iPad Pro, a Surface Pro, and a Surface Book on it and asked me to rank them in order of Cool Factor, I’d be hard-pressed to put the iPad at the head of the line. Microsoft isn’t trying for tiny-quirk delight, which is just as well (“It looks like you’re trying to add personality to your UX! Can I help?”), but they’re sweating small, thoughtful details. Apple sweats the details of manufacturing processes. That’s great, but it’s not the same thing.

Maybe—just maybe—a little frippery is okay, even if it adds a half-millimeter in depth to a product, or adds a touch of (gasp) skeuomorphism to the UI here and there, or allows a slightly less restrained, tasteful pigment on the anodized aluminum case. Injecting a bit of fun, even weirdness, to their computers in the late ’90s helped pull Apple back from the brink. It may be time for another injection.

Being KitchenAid is a fine goal, but you know what? They sell that stand mixer in nearly three dozen colors.

Originally published at Hacker Noon.

Bubbles, baseball and Mr. Marsh

On deep stories and ugly truths in American politics

When I went to high school in Florida, I was in one of the last years that had to take a class called “Americanism vs. Communism.”

I had the dumb luck to get a teacher named Bob Marsh, an iconoclastic sixty-something motorcyclist and science fiction fan who told the students like me who were also sf fans about his friend Joe Haldeman. While there’s a common joke I hear even from today’s high school students about American history classes ending at World War II, we learned about the Cold War, about Korea, about Vietnam. We learned about Castro and Khrushchev and Mao, but also about Watergate and COINTELPRO and HUAC. It was, improbably, a pretty good class.

Despite the name, the class’s story wasn’t about how Americanism, whatever that was, opposed communism. It was about how liberal democracy opposed authoritarianism.

That sense of “liberal” has gotten conceptually muddled over the years, particularly in post-war America. (Call it “classical liberalism” if you must, although that phrase is even more easily coopted.) This is the point: Bernie Sanders and Paul Ryan might not agree on much, but Ryan has never, to the best of my knowledge, advocated for a return to monarchy; Sanders has never once suggested outlawing private industry. They would both agree that the individual liberty and representative democracy thing is, on the whole, a pretty good idea. They are both pretty firmly standing for liberal democracy and against authoritarianism. That’s a foundational ideal of America. We’ve failed to hit it a lot through our history, but we’ve done better than a lot—a lot—of other countries.

The name “Americanism vs. Communism,” though, tells us another story, a story that’s been pervasive in America in the post-World War II era. This story tells us that if we want to oppose authoritarianism, we need only worry about “the left.” It doesn’t tell us that “the right” has its own kinds of authoritarians. To some people, it even implies that Nazis were socialists (it’s right in the name), and that fascists were liberals.

The name “Americanism vs. Communism” tells us, maybe, to let down our guard.

On John Gruber’s podcast “The Talk Show,” guest Merlin Mann said of the 2016 presidential election: “It’s not that my team didn’t win. It’s that maybe I just don’t understand baseball anymore.”

Merlin and I went to the same small Florida college at more or less the same time. (We all totally knew he was going to be a professional podcaster.) I’m pretty sure he also took an AvC class. We probably share a roughly similar, and from appearances similarly inadequate, understanding of baseball.

Before the election we were inundated with think pieces about how “the left” was wildly misinterpreting the appeal of nationalist populism. No no no, we were told, it’s not racism and misogyny and homophobia. It’s the rage, the deep story, the message to people who felt they were being not merely left behind but that “the elites” were letting other people “cut in line” ahead of them on the way to the American Dream. We’re still constantly hammered with the idea that if you’re in a city you’re in a bubble, if you’re liberal you’re in a bubble, that we just need to get out of that bubble and listen to real, non-bubble America.

The deep story may be about all that. But it’s also about how gay marriage devalues “real” marriage. How letting transgender folk use public bathrooms puts “real” men and women in danger. How we should watch, register and deport immigrants and build a wall around our borders. The racism and misogyny and homophobia isn’t incidental. It’s not a byproduct. The deep story is about tribalism.

Here’s an ugly truth: some of the country doesn’t believe that America belongs to people who aren’t in their tribe. That tribe is white, straight (at least openly), and Christian. It’s gotten bigger over the years—it didn’t used to include the Irish, or Italians, or Catholics, or women—but every inch of expansion has been fought, bitterly and grudgingly. Other tribes can live in America, maybe, but theirs comes first, and everyone else is here at their sufferance.

Another ugly truth is this: some of the country considers not just welfare, not just social programs, but basic justice and legal protection to be a zero-sum game. Her marriage means less if you can get married. The sign on the restroom door means less if you can go through it. The police are here to protect me from you. And God knows I don’t want my tax dollars to go to the likes of you.

The third ugly truth is this: those people are in power now.

Despite my sarcastic streak, I’m a natural optimist. I’m not going to claim there’s much of a silver lining here, but I do believe that the oft-maligned millennials—and even us Generation Xers—will pull us back on track. I don’t think this is the end of the great American experiment, that representative democracy is at its end, that America is doomed to become a mashup of The Handmaid’s Tale and Idiocracy.

But I think it’s going to get worse before it gets better. And I don’t know how much worse.

I wonder what Mr. Marsh would have said about all this, back in that Americanism vs. Communism class. I think he might say the problem isn’t bubbles. It’s not who’s listening to who in the present. It’s who’s listening to the past. America has always been at its worst when we’re encouraged to turn against one another, and at its best when we move toward ensuring that liberty and justice truly is for all.

I think he might also say this. Liberal democracies can vote themselves into authoritarianism. Voting themselves back out is much harder.

A digression on cyberpunk

I was listening to Fangs and Fonts’ most recent podcast on cyberpunk, and—

Okay, let me back up. “Fangs and Fonts” is a podcast about writing, hosted by four writers in (and out of) furry fandom: Roland Ferret, Yannarra, Tarl “Voice” Hoch and Ocean. So far the episodes have mostly come across as structure-free conversations about a given topic. There’s a lot of spontaneity and liveliness to it, although I suspect they’d benefit from spending an hour or so before recording making a list of Things To Cover.

Anyway. While it was fun listening in on the conversation, my impression was that none of the four hosts had read much of the genre past the Wikipedia entry. They’d seen movies with cyberpunk tropes to varying degrees, but… well. There’s no way to say this without an implied tsk tsk, but you guys, it’s a writing podcast!

So let me back up more. Specifically, to the early ’80s.

It’s pretty easy to discern the cyber of cyberpunk: ubiquitous networking that lets us always be “jacked in” to a sprawling virtual reality while also letting corporations and governments, which all too often seem to be one and the same, track us for a variety of purposes both nefarious and benign. But what about the punk? Well, it meant… punk. An anti-authoritarian, alienated and disaffected counterculture that neither fits in with the dominant culture nor has much interest in doing so. Heroes in cyberpunk weren’t necessarily punks—or necessarily heroic—but they tended to have a very edge-of-society vibe.

The problem with focusing almost exclusively on the cinematic aspect of cyberpunk is that you miss that the “punk” element wasn’t just in the relationship of the fictional characters to their settings. It was in the relationship of the writers to mainstream science fiction. William Gibson, Bruce Sterling, Pat Cadigan, Rudy Rucker—they were very deliberately responding to the more utopian, and by then rather dated, science fiction of the 1950s and before, the Heinleins and Clarkes and Asimovs.

Thematically, cyberpunk had little in common with the “New Wave” of science fiction of the late 1960s and early 70s, the works of Michael Moorcock and Thomas Disch and J. G. Ballard—but stylistically, it’s a closer relative. As Moorcock wrote rather snippishly, science fiction lacked “passion, subtlety, irony, original characterization, original and good style, a sense of involvement in human affairs.”

When we think about cyberpunk today, we mostly think about the visual trappings, the stuff that does look good on film—but those weren’t at all the point. A lot of these stories and novels were, in oft-bitter and backhanded fashion, deeply socially conscious. They had shit to say, and what’s more, they wanted to say it beautifully. The opening sentence of Neuromancer has become one of the most famous in science fiction:

The sky above the port was the color of television, tuned to a dead channel.

Of course, the New Wave of science fiction doesn’t seem much less dated now than what it railed against, and the same is true of cyberpunk. (Bruce Sterling declared it dead as long ago as 1985.) Its aesthetics were assimilated into the mainstream long ago, and the very act of mainstreaming guts any legitimate claims to counterculture, rather like buying a Dead Kennedys tee at Hot Topic. But its treatment of technology, of dystopia, of the contentious relationship between individual freedom, corporate power and state control had a tremendous and lasting influence on science fiction. Yes, cyberpunk might have said “put on this black overcoat and these sunglasses and you’ll look awesome, trust me,” but it also said “science fiction is relevant not just as meditations on our possible long-term futures, but as a mirror to show us things about our here and now.”

And that’s stuff that—mostly—didn’t come along for the ride when the glasses and overcoats went to Hollywood.

If you’re interested in reading more seminal cyberpunk stuff, here’s a few things to potentially investigate:

  • Cyberpunk: Stories of Hardware, Software, Wetware, Evolution, and Revolution, an anthology that includes work by Sterling, Gibson, Cadigan, and others. Since Sterling’s original anthology Mirrorshades is long out of print, this is pretty much the collection to get for a broad overview of the important writers.
  • Neuromancer, William Gibson. Poetic and bleak, with memorable characters and a striking take on artificial intelligence, this was the first novel to win the Hugo, Nebula and Philip K. Dick Awards, and has been cited as one of the most important novels—not just science fiction novels, but novels, period—of the twentieth century’s latter half.
  • Snow Crash, Neal Stephenson. The main character is Hiro Protagonist, who delivers pizza for the Mafia in an armored supercar as his day job and is a warrior in the Metaverse by night. Either you already want to read it or I can’t help you. (The Metaverse as depicted here was a pretty direct influence on Second Life.)

For movies/TV, there are some interesting ones the Fangs & Fonts folks touched on—especially Ghost in the Shell, The Matrix (the first one) and, of course, Blade Runner—but I’d also suggest the underrated Strange Days, directed by Kathryn Bigelow (The Hurt Locker), and—if you can find it—the short-lived cult TV series “Max Headroom.”

On reviewing in a small community

So here’s the thing: bad reviews are fun.

Sure, good reviews can be fun, too. But let’s face it—stuff you hate gives you more occasion for zingers. Roger Ebert opened his review of one infamous movie with “Battlefield Earth is like taking a bus trip with someone who has needed a bath for a long time.” (My favorite review opener, though, is from Mary Pols of TIME: “More than 24 hours has passed since I watched the new Adam Sandler movie Jack and Jill and I am still dead inside.”)

But a good review can’t be just zingers, and the point of a review is not to show off how witty the reviewer is. Ebert explained—without rancor—just what it is that made “Battlefield Earth” suck. He didn’t accuse the movie of being an assault on all that is good and holy; the movie’s creators and stars needed a thick skin to deflect the barbs, but they weren’t personal attacks. No one was writing, say, “This is a steaming pile of shit.”

“That may be vulgar, but it’s not a personal attack.”

Well, see, that’s kinda the heart of the matter.

When you’re just talking trash to your friends about something, you can get away with that defense. In a printed or filmed review, saying that becomes considerably nastier. And if that review isn’t of a movie but is of something that a single person created—like a book—the review is personal, because the work is personal.

I’ve avoided mentioning the reviewer—and specific review—that inspired this, but if I reveal that it’s a furry story, some of you may quickly guess both. When it comes to writers and publishers, this is still a small community. The phrase “steaming pile of shit” comes from that review, as does the assertion that the book under review somehow “tricked” the reviewer into thinking it would be good, except that it really isn’t. It tricked him! Then he recovered from its evil spell and realized it was shit. Shit shit shit shit shit. (I suspect I’m undercounting the number of “shits” he used.)

Without knowing the book in question, the chances are you’re already thinking gosh, even if this is self-published fanfic spewed out by a fever-gripped teenager who left no comma unspliced, you’re making the reviewer sound a little unhinged. Well, he comes across as a little unhinged. To some degree that’s clearly a schtick, but it’s still startlingly vicious.

This is, in fact, a book I saw in draft form. It’s well-written. You could definitely make the case—as the reviewer did, with stentorian profanity—that the protagonist isn’t sympathetic. Neither is the influence character. (He’s charismatic, but not sympathetic.) They’re both con men. They make bad choices. I wanted to slap both of them at multiple points. Some readers might genuinely hate both main characters.

[Image: comic from The Oatmeal]

But a badly written book—a “steaming pile of shit,” to wit—would hardly be powerful enough to make anyone angry with it. Whether you like a character or a setting has little to do with the quality of the work. The problem isn’t that this is a negative review. It’s that it’s an unfair review.

I mentioned before that the furry writing community is small, and bluntly, it’s small enough that this edges past merely irritating toward flat-out irresponsible. I doubt it’s going to hurt this particular book’s author, but public viciousness can be genuinely damaging at the scale we’re still at. Also, keep in mind reviewers earn—and lose—reputation currency as well. Authors and publishers do talk. And I can assure you I’m not the only one who’s saying, “Hey, can you believe this guy thinks this is an appropriate way to review a book?”

Let me underline that I’m not suggesting we never say negative things. Furry truly needs good criticism to advance, and we have a history of denying glaring problems in work by our community. But good criticism is well-reasoned. It distinguishes between this has objective problems in its storytelling and this story just isn’t my cup of tea.

And if you really don’t think something has any redeeming value at all—whether it’s competently written but just makes you want to pluck out your eyeballs, or it really is self-published fanfic spewed out by a fever-gripped teenager who left no comma unspliced—then you need to stop and ask yourself what your intention is in reviewing it. I’m betting the honest answer is “I want to mock this so everyone can laugh at my witty zingers, and I can be a capital-P Personality.”

If so, my advice is don’t do it. Because your review will probably be a steaming pile of shit.

(Nothing personal.)

Why a raccoon? Who cares?

I’ve been watching and silently judging another flareup of a very old debate in anthropomorphic fandom, this time happening on Twitter and in the comments on Flayrah’s review of Roar #4. Does there need to be a justification for having characters in a story be animal people?

We can illustrate the divide with two comments. First, from “Crossaffliction,” in his usual diplomatic style:

For the record, the entire premise of “let’s make talking animals the main characters and leave that completely unexplained in [a] way that offers nothing to the story” is a flawed premise, and also makes people wonder why you’d do that (we suspect the writer has some sort of freaky weird animal people fetish, and, you know what, we’d probably [be] right in this case).

Then “Sparf”:

I agree with some of the most prolific writers in our fandom when I say that we have moved on past that need. This is furry fiction as a meta genre unto itself. Every story does not need an explanation of where furries came from or why they exist. If it is germane to the story being told, sure, it can be revealed in the narrative, but usually it is trite or feels wedged in.

While I can quibble with both (if a genre is “meta” it definitionally doesn’t exist “unto itself,” and at times Crossie has an unhealthy hangup about the fandom’s unhealthy hangups), three observations:

  1. These actually aren’t two sides of the same argument. One position is that it’s a flaw in a story to use anthropomorphic animals for no reason more profound than “I like them, neener neener.” The other position is that there’s no need to explain why a story has anthropomorphic animals. These positions aren’t mutually exclusive.
  2. In fact, they’re probably both right.
  3. The most prolific writers in our fandom have moved past the need to keep rehashing this argument. Christ on a pogo stick.

My own old story “A Gift of Fire, a Gift of Blood” features a vampire bat; the story would be substantively different were she a vampire in the Dracula sense. Yet there’s no real explanation for why the Empire of Ranea (the world that story and others of mine are set in) has fox, wolf and cat people instead of dwarves, elves and hobbits. Effin’ hobbits, how do they work? Tolkien has a lot of unexplained hobbits running around, yet as impossible as this may be to believe, the story seems to work anyway!

When we say that a furry story is better when it’s not “humans in fursuits,” what does that actually mean? Take Kyell Gold’s Out of Position. Few if any aspects of Dev and Lee’s story require them to be tiger and fox. Their world is our world, just with animal people. This is about as “because furries” as one can get. Yet we’re shown a myriad of subtle ways in which their animal nature changes things: ergonomics, body language, stereotypes, laws, social mores. You could tell an equivalent story with an all-human cast, but it would be a different story.

“Why is she a raccoon?” That’s not an interesting question. “How do you show how her being a raccoon matters?” That’s an interesting question.