This is the second in a series of blog posts about bike sheds. In part one, we explored the history of the bike shed story; you can read part one here.
Why do software developers find so much resonance in the bike shed story?
What’s important? What’s really important? How do you know it’s important? How do you know whether this thing is more or less important than that thing?
Parkinson mined a great deal of satire from what he saw as a kind of logical puzzle: judging by cost alone, he suggests, the committee ought to have spent more time on the higher price-tag proposals instead of dwelling on relatively minor line items like a bike shed or a coffee budget.
So, is that the lesson we should take? That more expensive things are more important?
This framing doesn’t strike me as terribly realistic—regardless of its relative price, I might have valid reasons for being very not okay with a nuclear power plant going up in my backyard; in fact, if it were a bargain-basement nuclear reactor, my concerns would seem even more valid. But it’s hard to argue with cold, hard numbers, so using price as a proxy for importance can provide an attractive heuristic.
The modern version of this story, where caring about color is our benchmark for “trivial details”, is stranger still. Kamp’s mailing list wasn’t literally debating the choice of color; the whole color thing was just a, ahem, colorful metaphor. For the metaphor to work at all, we have to share an understanding of what’s important and what’s trivial. Kamp is saying (1) color does not matter, and (2) the topic they are debating matters as little as if they were debating color.
For decades, software developers have been fine with this. And yet… color is an amazingly deep topic! There are books on the history of color. There are fascinating stories about how colors got their names, how they were made, how they impact fashion, how they tell stories… Until software emits smells, color will remain one of the most important aspects for developers to understand when considering how human beings interact with our software.
An idea like “color doesn’t matter” should have been dismissed out of hand as ridiculous—a visual artist might spend months fine-tuning a combination of colors—and yet, software developers have generally been fine with this line of reasoning for decades.
This might be merely myopic if it weren’t part of a larger, more troubling pattern. Software developers, along with other professionals oriented around quantitative thinking, have a tendency to dismiss more qualitative disciplines such as design, marketing, and management, which also turn out to be exactly the disciplines best suited to mitigating the kinds of dead-end discussions the bike shed legend is supposedly meant to address.
(Incidentally, in my experience, this is almost always a one-sided problem: I don’t see designers devaluing mathematics or physics, but I do see engineers and financiers regularly treating designers, marketing folks, and managers with disdain, and the stories we tell about bike sheds are a part of that.)
One problem I have with the bike shed story is how tempting it is to use it to stop a discussion cold. “Nuclear power plants are important, talk about that, stop talking about bike sheds.”
In conference rooms and in online discussions, I have frequently seen software developers deploy the bike shed myth to minimize a topic they see as unimportant and to label that discussion a trivial distraction.
When someone says “oh, bikeshedding again”, or “who cares what color the bike shed is?”, they are not only seeking to cut off discussion of whatever topic is at hand, they are also reinforcing a bias that frames entire categories of conversation as being of lesser value. Labeling a conversation a bike shed sends the message that we shouldn’t consider something that others have expressed interest in discussing. It is often used as a tool for signaling, “what I’m worried about really matters, not what they are worried about,” without coming right out and saying that.
One of the hardest parts of making software is the enormous complexity of the endeavor. Software developers regularly wrangle syntax, semantics, and semiotics, and the slightest misstep can bring the whole project crashing down around us. A stray semicolon is not likely to ruin a novel, but it will definitely crash your program. Spend enough time programming, or hanging around where programmers congregate, and you’ll encounter load-bearing comments, stylesheets that kill websites, and punctuation errors that crashed rockets.
If a stray punctuation mark can ruin your day, you are going to spend a lot of time sweating the small stuff. As developers, the fact that any of a million factors can lead us to catastrophic, trauma-inducing failure gives us a topsy-turvy and often arbitrary system by which to judge what’s important and what isn’t. We tend to have a lot of anxiety about things that might seem trivial in a different light, and the bike shed story is often our way of reassuring ourselves that (unlike other people) we’re worrying about the right things.
What’s missing from the bike shed?
What worries me most about this is: what if the bike shed story makes it harder for software developers to experience empathy?
Way back in Parkinson’s original story, an imaginary committee, an all-male team of scientists, engineers, and financiers, gets together to discuss how to spend its money, then spends too much time talking about low-cost projects. Parkinson seems to just throw up his hands and admit a kind of defeat: let’s just label the problematic behavior and move on. Bad meetings are inevitable… what can you do but laugh?
And yet, a well-organized, focused committee meeting isn’t impossible, any more than a rocket ship flying safely to the moon or a skyscraper withstanding an earthquake or a software project finishing on time. Humans do all sorts of hard things, all the time, by bringing skillful people together and empowering them to do their best work.
What if endless debates about what appear to be minutiae weren’t an unavoidable flaw of human psychology, but instead indicated a deficiency of expertise?
What if, all along, the “bike shed” has actually been a story about how committees of scientists, engineers, and financiers are not truly well equipped to address the wide array of challenges that lie outside of their hard-won expertise? What if the “caucus-style”, or “open floor”, meetings often preferred by STEM folks have a paradoxical tendency to further silence already marginalized voices?
What if someone in that meeting had said something like:
“Well, it looks like we’re past the time we’d originally allocated on our agenda for the bike shed discussion, and we’re no closer to a decision. Since it’s clearly a deeper topic than anticipated, I propose we spin off an exploratory project to evaluate bike shed options and present their findings at our next meeting. Can the project tolerate postponing a decision another month?”
“I’m noticing that no one from the design team is here today, so it’s not surprising to me that we’re having trouble deciding on a color. I’d propose we table the bike shed discussion until our next meeting, and I’ll be sure to ask our design lead if someone can join us then.”
“Since I knew we’d be discussing the bike shed today, I conducted a survey of bike riders in our community, and spent a few days categorizing their responses into three broad categories…”
What does it look like when software developers sideline critical voices as “not important”? Remember when Google’s Photos app automatically tagged photos of black people as “gorillas”? Or how it took Apple’s HealthKit five years to add menstruation tracking? Or how “smart home” devices keep being used by domestic abusers to further terrorize their victims?
Could discussions that included and amplified marginalized voices and perspectives have avoided these problems? I don’t know, but I’m pretty sure that these problems aren’t “trivial”. If it’s true that software developers generally know what’s important and worthy of consideration and what isn’t, then why do we see terrible-but-ultimately-avoidable outcomes as often as we do?
What if this is actually a story about blind spots?
What if a group of scientists, engineers, and financiers got together, and because they weren’t able to admit to themselves or others that they didn’t collectively know enough about design or color or how to structure a meeting in a way that facilitates productive discussion, they found themselves in an endless argument?
Thank you to Kaleb Lape for collaborating on this post.