Artificial Guilt

Great piece in the New York Times Magazine about our Dr. Frankenstein-like quest to play God, subvert sin, and build a better artificial sweetener.

The science on these questions is inconclusive at best. There’s no clear evidence that artificial sweeteners cause cancer or obesity, at least in human beings. But the fear of artificial sweeteners was never quite a function of the scientific evidence — or never of just that. It stems as much from a sense that every pleasure has its consequences: that when we try to hack our taste buds in the lab — to wrench the thrill of sugar from its ill effects — we’re cheating at a game we’ll never win.

Just beware the first paragraph, which may spoil aspects of Breaking Bad for those who have not finished it.

Caught Like Insects in a Web

I’d estimate that the New Yorker has published more than 50,000 cartoons since its first issue in 1925 (I couldn’t find a precise number in a cursory Google search). So it’s surprising to learn that the single most reprinted cartoon of that nearly 90-year history is the one by Peter Steiner from 1993 that says, “On the internet, nobody knows you’re a dog.”

In an interview in 2000, Steiner said the line didn’t feel that profound to him when he wrote it. “I guess, though, when you tap into the zeitgeist you don’t necessarily know you’re doing it.” But the idea quickly caught on as shorthand for the internet’s spirit of anonymity, especially in chatrooms and message boards—a spirit that lives on in sites like Reddit, where “doxing” someone is one of the worst crimes you can commit.

In those early days, the internet felt like an ocean made for skinny-dipping; instead of doffing your clothes, you doffed your identity. You could read about, look at, discuss, and eventually purchase just about anything that interested you, without fear of anyone looking at you funny. This lack of identity could be used for nefarious purposes, of course, and it could lead people down any number of self-destructive paths. But for many, it was liberating to find that, on the web, you could explore your true nature and find fellow travelers without shame.

But as paranoia grows about the NSA reading our emails and Google tapping into our home thermostats, it’s increasingly clear that — rather than providing an identity-free playground — the web can just as easily capture and preserve aspects of our identities we would have preferred to keep hidden. What started as a metaphor to describe the complexly interconnected network has come to suggest a spider’s sticky trap.

I thought of this listening to a recent episode of WTF with Marc Maron. The comedian Artie Lange was telling the story of how he came into his own as a stand-up by exploring, with brutal honesty, the darkness of his personal life. Then he stopped himself for a second to explain that he would never have been able to achieve that level of honesty onstage if he’d worried about his sets appearing on the internet.

Lange: It was before every jerk off had a cellphone taping you. Remember when it was midnight at a club in Cincinnati. It was just you and those people! That was it….Now it’s you and everyone in the fucking world.

Maron: And on Twitter…you can’t do anonymous sets anymore.

Lange: Exactly. An anonymous set is what makes you…The comics are going to get worse man, ’cause they’re gonna check themselves…They’re not gonna wanna see themselves bombing on Instagram or wherever the fuck it is and they’re never gonna take risks.

Where the internet used to encourage risk, now it seems to inhibit it, because it turns out the web can capture anything. What you say in front of friends, or even in front of an audience, can blow away with the wind. On the web, your words can stick around, can be passed around. Celebrities may have been the early victims, but now anyone is fair game. Millions of people are potentially watching you, ready to descend in a feeding frenzy of judgment. In the New Yorker’s Page Turner blog, Mark O’Connell writes about the phenomenon of Twitter users deleting their tweets, something he has seen happen in real time.

It’s a rare and fleeting sight, this emergency recall of language, and I find it touching, as though the person had reached out to pluck his words from the air before they could set about doing their disastrous work in the world, making their author seem boring or unfunny or ignorant or glib or stupid.

Maybe we should treat the web like a public place, with certain standards of behavior. Maybe those who engage in disorderly conduct, posting creepshots and racist tweets, should be exposed and held to account. Perhaps our expectation of anonymity on the internet never made sense. The problem is that the digital trails we leave on the web can blur the line between speech and thought, between imagination and intent.

It’s that blurred line Robert Kolker explores in his piece for New York magazine about the so-called “Cannibal Cop,” Gilberto Valle, who never kidnapped, raped, murdered, or cannibalized anyone, but who chatted about it obsessively online. And even though there was little evidence that he took any steps to make his fantasies a reality, his online discussions served to convict him of conspiracy to do so. Kolker writes:

The line between criminal thoughts and action is something the courts have pondered for decades…What’s changed in recent years are the tools used to detect intent—namely, a person’s online activity. “We’ve always said you can’t punish for thoughts alone, but now we really know what the thoughts are,” says Audrey Rogers, a law professor at Pace University. [emphasis mine]

I’m reminded of a recent Radiolab episode about Ötzi, the 5,000-year-old Iceman discovered in the Alps in 1991. For more than two decades, archaeologists have pored over the details of his clothing, his possessions, his tattoos, and the arrowhead lodged in his back, evidence he was murdered. From the contents of his stomach, they’ve even determined what he ate for his final meal. I wonder if there will someday be archaeologists who sift through our hard drives, tracing out the many digital trails we’ve left in the web, trying to determine not what we were eating, but what we were thinking. Will their findings be accurate?

To paraphrase John Keats, most lives used to be writ in water. Now they’re writ in code. As much as our digital lives are only partial documents, they often seem more real to strangers simply because they are what has been documented. Maybe the internet doesn’t know you’re a dog, but it doesn’t care. In the eyes of strangers, you are that which the web reveals you to be, because the web is where the evidence is.

Songs about Songs

I love what Shawn Blanc said about what Stephen Hackett said about what John Roderick said about focusing one’s attention on creating “primary source material,” rather than mere commentary.

By saying so, however, I fear that I’m engaging in mere commentary — in what Roderick calls, “This chattering sort of criticism and culture digestion that is so much of I guess what we call content — Internet content, which is just like, ‘Oh, this just came out and now I’m talking about it and now I’m talking about this other guy who was talking about it.’”

But I’m not sure I would draw such a qualitative distinction between primary and secondary source material. Songs are not empirically better than linked list blog posts. I’d rather read a brief but beautifully crafted post on Kottke or Daring Fireball than listen to a lot of the songs currently on the radio. What matters is the intention, the craft, the effort behind what you make. A close reading of Roderick’s words suggests he might agree.

You know, if you’re making a song, or if you’re writing a story, that is source material. It’s primary. It’s the thing that did not exist before. You’re not commenting. Presumably, your song is not commenting on some earlier song, or if it is, it’s doing it in an inventive way.

The French writer Michel de Montaigne has long been considered the inventor of the essay. The original meaning of the word “essay” was “stab” or “attempt,” because he would take an idea and poke at it from as many different angles as he could think of. He’s more recently been called the godfather of blogging, because he didn’t just write down his own thoughts. He constantly quoted from the authors he was reading and then reflected upon how their ideas comported with his own. He was a great writer but also a great reader and a great commentator. It’s a tradition carried on by bloggers like Kottke, whose work Tim Carmody once described as “watching an agile mind at work, one attached to a living, breathing person, and feeling like you were tapped into a discussion that was bringing together the most vital parts of the web.”

In a piece called Trapped by tl;dr (via Shawn Blanc again), Seth Godin wrote:

“There are thousands of times as many things available to read as there were a decade ago. It’s possible that in fact there are millions of times as many.”

That’s precisely why we need people who are great readers, people who can sort through the best of what’s out there, who can, in their own way, write songs about the songs we’re all inundated with. And do it in an inventive way.

What the First App Says about Us

MG Siegler believes the first app you open in the morning says something about you.

I see the first app you turn to in the morning as the new homepage. Some might argue it’s your entire homescreen of apps, but I don’t think that’s right. It’s the one service you care most about, no matter the reason, and want to load immediately upon hitting the web. The delivery device has changed, but the concept has not.

What I find interesting is not which app people are choosing to open first thing in the morning, but the fact that apps are the first thing so many of us choose. Siegler traces back his own first app from Twitter to Path to Facebook to Email. For me it would be Twitter to Reeder to Email.

And I think Siegler’s right that before smartphones, it would have been a favorite website on my laptop, something like Slate or Pitchfork or the New York Times. And if I go further back (like a hypnotist regressing the patient to remember former lives), before we even had the internet, it would have been a book, or a copy of The New Yorker, or (even further back) cartoons on TV.

The difference between apps and everything that came before is that the apps we choose now (Twitter, or Facebook, Flipboard, RSS readers) tend to gather and serve up content from myriad, disparate sources. Before apps, we had to choose one source at a time. What I find intoxicating about the apps I open in the morning is the possibility of surprise. As Ben Thompson says, it’s so much more delightful to get the gift I didn’t know I wanted.

But, like Rands and Alexis Madrigal, I agree that this stream of brief interestingness might not be entirely good for me. Perhaps it’s time to try a new first app.

A Software World

At the end of his takedown of an article that calls 2013 “A Lost Year for Tech” (neatly summed up as “a sad pile of piss-on-everything cynicism”), John Gruber writes:

There’s a nihilistic streak in tech journalism that I just don’t see in other fields. Sports, movies, cars, wristwatches, cameras, food — writers who cover these fields tend to celebrate, to relish, the best their fields have to offer. Technology, on the other hand, seems to attract enthusiasts with no actual enthusiasm.

Rene Ritchie followed up on that point, wondering where the nihilism comes from.

It could just be that computer technology is still relatively new and tech journalists - and tech readers, we feed each other - lack the maturity of older, more established industries and pursuits. It could be that tech writers sometimes seem to care more about being perceived as cool than in being good.

I think this nihilistic streak could be a symptom of the deep-seated suspicion of technology in Western culture, even among those of us who claim to love it. We all use technology, but we don’t trust it. We fear the power of its “reality distortion field.” We tend to see the experiences it enables as inauthentic, alien, perhaps corrupting, and certainly inferior to “real” experiences. This theme of technology’s malevolent influence is obvious in a lot of science fiction, from Frankenstein to The Matrix, but you can even see it in the training montage from Rocky IV.

The Russian might have a fancy weight room, fancy exercise equipment, and fancy synthetic muscles, but he’ll never triumph over Rocky, who can lift real wagons and real bags of rocks and run to the top of a real mountain.

I thought of that montage when I saw this post from Seth Godin, which makes the fairly reasonable case that our pursuit of productivity (through apps and blogs and devices) often makes us less productive in the end. I take his point until he gives this example:

Isaac Asimov wrote more than 400 books, on a manual typewriter, with no access to modern productivity tools. I find it hard to imagine they would have helped him write 400 more.

First, “400 more” is a pretty high bar. But wouldn’t Asimov have derived some benefit from modern productivity tools? Like, say, a computer? If not Asimov, I imagine there were countless people born before the advent of “modern productivity tools” who would have benefited enormously from them. Of course these tools can’t write 400 books for you, but they can reduce the friction just enough to get the ball rolling.

To give just two examples, I was a remarkably disorganized person for most of my life, because I insisted on trying to keep my deadlines, appointments, and to-do items in my head. Now, with apps like OmniFocus and Due, I’m not only much more organized but also remarkably less anxious about what I might forget. I’ve also been somewhat overweight for most of my adult life, but in the last four years I’ve lost about 50 pounds, mainly due to changes in exercise and diet. But those changes were the direct result of tracking my calories and exercise through the app Lose It. (I had no idea, for instance, that running five miles only burns the equivalent calories of one big sandwich.) Those are the two biggest impacts on my life, but software has, in a variety of ways, also helped me create better lessons for my students, grade papers more thoroughly, capture innumerable moments of my children’s lives, stay in touch with people I love, write a novel, and start this blog. My life is significantly better as a result of this technology.

Which brings me back to the nihilism of tech journalists. Few, if any, of these small improvements to the daily life of one person would merit a headline in a tech publication. We tend to expect, and tend only to notice, the big revolutions in technology: the personal computer, the iPod, the smartphone, the tablet. It’s no coincidence that these are all hardware products. Hardware feels more “real” to us. Maybe the reason tech journalists are so often depressed about the state of technology is that hardware revolutions are extremely hard to come by. Dozens of hardware makers get up on stages and set up booths at CES every year touting their new attempts at hardware revolutions. And most of them fall completely flat.

Software doesn’t get the same attention, because it’s less substantial, less trustworthy, and because it’s behind a screen. But software is the real story. Frank Chimero’s fantastic web essay What Screens Want makes this point by citing a video clip of the British documentary program Connections, in which the host describes how plastic has completely permeated our world. Chimero then rewrites the script, replacing the word “plastic” with the word “software.”

It’s a software world. And because of software, it’s a soft world in a different sense, in the original sense of the word: it changes its shape easily.

Software is the magic that makes our devices “indistinguishable from magic”. Many of us think of it as an art form, and yet it’s a strange sort of art form. Most art forms don’t remind you to take out the recycling or help you lose fifty pounds. But the things software can do are almost limitless. Maybe tech journalists would be less cynical about the advances of technology if they wrote more about software than hardware, and more about the how than the what — how software is not only changing its shape, but changing our shape, in more ways than one. That is the real, ongoing technological revolution.

Why (I Hope) Blogs Still Matter in 2014

I started this blog less than six months ago, and for the first three months, I had fewer than a hundred page views. But my readership grew in fits and starts, with a retweet here and a link there, even an occasional block quote, until finally, I arrived home after work a couple weeks ago to find a link to something I wrote on Kottke.org.

Kottke-fucking-dot-org (the New Yorker of blogs, as far as I’m concerned).

My page views went up to 12,000 in a single day, small potatoes for some I’m sure, but a big deal to me. People were starting to follow me on Twitter, sending me messages about how much they enjoyed my writing. After nearly a decade of working in public radio, and then several years writing and struggling to publish a novel, I felt as though blogging was finally giving me a platform and an audience I could call my own.

So imagine my surprise when, just a few days later, in a piece for the Nieman Journalism Lab, Kottke himself announced that, “The blog is dead.” He hedged a bit in a post on his blog, but stood by his main point:

Instead of blogging, people are posting to Tumblr, tweeting, pinning things to their board, posting to Reddit, Snapchatting, updating Facebook statuses, Instagramming, and publishing on Medium. In 1997, wired teens created online diaries, and in 2004 the blog was king. Today, teens are about as likely to start a blog (over Instagramming or Snapchatting) as they are to buy a music CD. Blogs are for 40-somethings with kids.

I’m not quite forty, but I do have kids, so I found this unbearably depressing. Apparently, I have found the right creative platform for myself at precisely the moment it’s fallen out of fashion.

Except I don’t really believe that. And Kottke doesn’t seem to either. The footnote in his blog post about Tumblr (and whether Tumblr blogs are actually blogs) bears this out.

If you asked a typical 23-year-old Tumblr user what they called this thing they’re doing on the site, I bet “blogging” would not be the first (or second) answer. No one thinks of posting to their Facebook as blogging or tweeting as microblogging or Instagramming as photoblogging. And if the people doing it think it’s different, I’ll take them at their word. After all, when early bloggers were attempting to classify their efforts as something other than online diaries or homepages, everyone eventually agreed. Let’s not fight everyone else on their choice of subculture and vocabulary.

So it’s the terminology that’s changing rather than the impulse. And while these alternative services are undoubtedly siphoning off casual users of what used to be blogs, the reason those users are leaving is that blogging platforms don’t provide the easiest access to the intended audience. My wife and I once used personal blogs to share pictures of and stories about our kids. Now we do that on Facebook because Facebook is where the friends and relatives are. If you want to communicate with your social group, you go to the service where your social group congregates, where your message will be conveyed to the largest number of people you know.

But I want to communicate with people I don’t know. And that’s why I think blogs, or personal websites, or single author web publications, or whatever-the-fuck-you-want-to-call-them, still matter.

Rewind about four years. I had just quit a terrible associate producer job in public radio and was failing to make it as a freelancer. That fall, I went to a public radio conference and got to meet one of the Kitchen Sisters (I was so star-struck, I didn’t even know which one she was) and other amazing producers like Kara Oehler and Joe Richman (when he asked me how freelancing was going, I said, “Teaching at a technical college is going pretty well.”) But the most interesting conversation I had that night was with a guy who had been working behind the scenes for most of his career, helping different radio shows find their own unique production style.

I was telling him how I wasn’t sure I could sell the kinds of stories I had been making before the economy crashed, stories about ordinary life with no real news hook. The only show that still had a large freelance budget was Marketplace, and I didn’t want to change my style to suit them. This guy’s advice? Start a podcast. Just make the kinds of stories I wanted to make and put them out in the world, and if the stories were good, the audience would eventually come to me. He cited Jesse Thorn of MaximumFun as a model.

I didn’t follow that guy’s advice (a podcast seemed like too much work), but he did. That guy was Roman Mars. He went on to create and host the amazing show 99% Invisible. Not only did the audience come to him, but he recently raised more than $375,000 on Kickstarter to fund the fourth season of the show. His experience echoes the words of Radiolab co-host Robert Krulwich, who offered similar advice in a commencement address to the Berkeley Graduate School of Journalism.

Suppose, instead of waiting for a job offer from the New Yorker, suppose next month, you go to your living room, sit down, and just do what you love to do. If you write, you write. You write a blog. If you shoot, find a friend, someone you know and like, and the two of you write a script. You make something. No one will pay you. No one will care. No one will notice, except of course you and the people you’re doing it with. But then you publish, you put it on line, which these days is totally doable, and then… you do it again.

I had those words in mind when I started my blog six months ago, and I’ve had them in mind whenever I think I should be pitching one of my blog posts to an online publication like Slate or Salon or The Magazine. I’d like to get paid for what I write, but there’s something wonderfully satisfying about owning and controlling my own work. I also don’t want to wait to see if someone will publish it. I want to publish, and see if the audience comes to me.

This is what blogs still offer.

When I first read Kottke’s post on the death of blogs, my knee-jerk fear was that it meant fewer and fewer people would be reading blogs in the near future. What I now think he means is that fewer and fewer people will be writing blogs in the near future. And maybe that’s a good thing. Maybe the rise of social networking platforms will function like a brush fire, clearing out the forest for those of us who want to do more than share a picture, video, or link—those of us who actually want to read, analyze, reflect on, argue with, and write thoughtfully about the stream of information we’re all trying to navigate. Those are the blogs I want to be reading in 2014, and beyond.

Misunderstood or Double-edged?

A lot of people are writing about Apple's latest commercial for the iPhone. Gruber thinks it's their best ad of the year; Kottke calls it one of their best ever. Nick Heer compares it to Don Draper's carousel pitch for the slide projector. But Ben Thompson's take is my favorite because he responds to the ad's critics, who say that Apple is "promoting recording your family over actually spending time with your family."

This criticism is indicative of the recent conventional wisdom that these devices are making us stupid, lonely, and disconnected from the real world. Thompson sees the ad as an attempt to bridge the technological/generational divide, to say the reason we're so obsessed with our gadgets is that they can actually do amazing things.

On the flipside, how many young people – including, I’d wager, many reading this blog – have parents who just don’t get us, who see technology as a threat, best represented by that “can-you-put-that-damn-thing-down-and-join-us-in-the-real-world!?” smartphone in our hands, without any appreciation that it’s that phone and the world it represents that has allowed us to find ourselves and become the person we know they wanted us to be?

In the first half of the ad, the kid is portrayed as self-absorbed, antisocial, even rude in his attention to his iPhone. But why? Would we have seen him in such a negative light if he had been reading a copy of The Catcher in the Rye, or writing in a journal, or drawing in a sketchpad, or noodling on a guitar? The magical, revolutionary thing about an iPhone (and I say this unironically) is that it can become a novel, a journal, a sketchbook, a musical instrument, or a video camera/video editor (with apps like iBooks, Day One, Paper, GarageBand, and iMovie among many others).

And yet, we've all seen people ignoring the real world in favor of their device, and they were not involved in a heartwarming creative pursuit. I have looked at Twitter more than once while my children were trying to have a conversation with me. I have even checked Facebook, surreptitiously, while my son, who'd just learned to read, struggled to read a new book to me (not the first book he read to me, but still). I'm not proud of this. And I worry that my kids, who love these devices as much as (if not more than) I do, will soon be acting just like the teenager in this ad. And instead of making a beautiful home video during family gatherings, they'll be sexting with Russian oligarchs, selling their non-essential organs, and ordering designer brain implants on some future version of the Silk Road.

That's the double edge of the technology we now have in our pockets. It gives us access to boundless information, and enables all kinds of interactions with that information, but it doesn't distinguish between empty and nourishing information, or help us determine the right uses of that information. We have to make those distinctions and those choices.

One of my favorite pictures of my kids was taken with, edited on, and sent to me from my wife's iPhone. It shows my son and daughter, cuddled together in the dark, their radiant, smiling faces lit from beneath by an unearthly glow. You can't see the object making the glow in the picture, but it's the screen of an iPad. It's also the source of the joy on their faces.

That screen is not going away anytime soon, but we don't have to be passive viewers of it, merely consuming and feeling vaguely guilty about what we consume from it. There's immense creative power behind the screen. Instead of worrying about it, lamenting it, and disparaging it, we should focus on learning how best to use it: to gather, understand, shape, and share the information around us.

Placebo-philes

Audiophiles have gotten a lot of bad press recently, what with the apparently silly Pono music player (which plays much higher quality audio files despite almost no one being able to hear the difference) and the news from Wired magazine that "burning in" your headphones has no discernible effect on sound quality. Reading about the truly insane things audiophiles will do in pursuit of the perfect sound, I can't help reflecting back on that unfortunate period in my life when I almost fell down the same rabbit hole.

For me it started with a simple search for better headphones. I think I typed "best headphones under $50" into Google, and what came back was a series of lists, like this one or this one, ranking the best headphones at a series of price ranges. I settled on a pair pretty quickly, and when they arrived I loved them, but those lists had planted their hooks in my brain. How much better would my music sound if I were willing to spend just a little bit more?

I decided to research what headphones I would buy the next time I had saved up a decent amount of money, and my research led me to my first (and really only) foray into Internet forums: a website called Head-Fi, where enthusiasts gather to discuss, argue, and bond over their love of headphones and headphone-related accessories. It was a remarkably friendly place for people who enjoyed tuning out the world, but darkness lurked at the edges. People would post glowing reviews of the headphones they just bought, and others would weigh in about how much they loved those headphones too, but inevitably someone would say how those headphones would sound even better if connected to a decent headphone amplifier. Or a decent digital audio converter. Or how those headphones didn't even compare to these other headphones that just cost a little more.

The perfect headphone setup always cost just a little bit more. Audio nirvana was always just out of reach.

Over the course of three years, I wound up buying one pair of headphones that cost about $100, then another that cost about $150, then a headphone amplifier that cost about $100, then another headphone amplifier that cost several hundred, then a special iPod that had been rewired for better sound, then several more pairs of headphones, each more expensive than the last. It helped that the internet made it easy to resell my previous purchases in order to fund my new purchases.

But that was nothing compared to the equipment bought (and prices paid) by many. The most money I ever paid for headphones was about $300. But the best headphones were going for more than $1000, and the best amplifiers and related devices were many times that.

I don't think it's an accident that this period in my life was the same period in which I had two children in diapers and an extremely stressful job. After putting the kids to bed, if I didn't have any more work to do, and if my wife wanted to watch TV, I would find a quiet spot in the house and get lost in the increasingly detailed soundstage my gear supplied.

But the specter that loomed over everything was the idea that this was all some big placebo effect. I would occasionally spend an evening listening to a song on my new set of headphones and then on my old set, or with my new amplifier and then my old amplifier. I would make my wife listen to see if she heard a difference. Sometimes she did, sometimes she didn't. Sometimes I didn't. Every once in a while, I'd read a post on Head-Fi about someone who was selling everything he'd bought because he realized he was listening to his equipment rather than music. I finally had the same realization and made the same decision. At the time, I felt like a recovering addict, or a victim of a con artist, reformed but slightly ashamed.

I got a new perspective on that period, however, when I read this recent piece by Felix Salmon (via Kottke) about converting money into happiness. Salmon is also interested in placebo effects, specifically in the world of wine tasting, where experiments have frequently shown that very few people can tell the difference between cheap and expensive wine, or even the difference between red and white wine. When I first read about those studies, they reminded me of the scene in Brideshead Revisited when a couple guys get drunk and describe the wine they're tasting with increasingly absurd metaphors:

"….It is a little shy wine like a gazelle." "Like a leprechaun." "Dappled in a tapestry window." "and this is a wise old wine." "A prophet in a cave." "and this is a necklace of pearls on a white neck." "Like a swan." "Like the last unicorn."

I had moments almost as absurd with my headphones, when I heard things inside songs I swore I'd never heard before, when I felt as if parts of the music were two-dimensional backdrops and then three-dimensional shapes would leap out of the picture towards me, or the music would drizzle over my head, or crackle like lightning, or I'd swear I could smell the studio where the song had been recorded, or something.

In other words, I was an idiot. Because on other nights, usually after I'd owned that same set of gear for a little while, I wouldn't hear those things any more, and I'd start thinking that I needed better gear. I needed a new placebo effect.

It's easy to sneer at the placebo effect, or to feel ashamed of it when you're its victim. And that's precisely why I found Felix Salmon's piece revelatory, because instead of sneering at the placebo effect of fancy wine, its marketing, and its slightly higher prices, he thinks we should take advantage of it. If the placebo effect makes us happy, why not take advantage of that happiness?

The more you spend on a wine, the more you like it. It really doesn’t matter what the wine is at all. But when you’re primed to taste a wine which you know a bit about, including the fact that you spent a significant amount of money on, then you’ll find things in that bottle which you love ... After all, what you see on the label, including what you see on the price tag, is important information which can tell you a lot about what you’re drinking. And the key to any kind of connoisseurship is informed appreciation of something beautiful.

This idea of "informed appreciation" reminds me of another area of modern life beset by placebo effects: the world of alternative medicine. In a recent article for the Atlantic, David H. Freedman argues that there's virtually no scientific evidence that alternative medicine (anything from chiropractic care to acupuncture) has any curative benefit beyond a placebo effect. And so many scientists are outraged that anyone takes alternative medicine seriously. However, there is one area where alternative medicine often trumps traditional medicine: stress reduction. And stress reduction can, of course, make a huge impact on people's health. The Atlantic article quotes Elizabeth Blackburn, a biologist at the University of California at San Francisco and a Nobel laureate.

“We tend to forget how powerful an organ the brain is in our biology,” Blackburn told me. “It’s the big controller. We’re seeing that the brain pokes its nose into a lot of the processes involved in these chronic diseases. It’s not that you can wish these diseases away, but it seems we can prevent and slow their onset with stress management.” Numerous studies have found that stress impairs the immune system, and a recent study found that relieving stress even seems to be linked to slowing the progression of cancer in some patients.

Perhaps not surprisingly, a trip to the chiropractor or the acupuncturist is much more likely to reduce your stress than a trip to the doctor. If anything, a trip to the doctor makes you more anxious.

Maybe each of these activities (listening to high end audio gear, drinking high end wine, having needles inserted into your chakras) is really about ritualizing a sensory experience. By putting on headphones you know are high quality, or drinking expensive wine, or entering the chiropractor's office, you are telling yourself, "I am going to focus on this moment. I am going to savor this." It's the act of savoring, rather than the savoring tool, that results in both happiness and a longer life.

Of course, you don't need ultra high end gear to enjoy your music, or ultra high end wine to enjoy your evening, just as you shouldn't solely use acupuncture to treat your cancer. It might be as effective to learn how to meditate. But maybe we all just need to meditate in different ways.

Love the GoldieBlox Video, but Not the Toys

I once paid my daughter a dollar to not buy a princess backpack. And when she recently announced, "I don't need to know math, because I'm going to grow up to be a supermodel," my brain nearly melted out of my ears. I spent the next thirty minutes (or so) half-ranting, half-explaining to her how she has to fight against a culture that believes the most important thing about a girl is being pretty. So I'm totally down with the sentiment of this video.

But then I get to the end and see that the product being advertised, GoldieBlox, is apparently just a set of blocks "for girls." Which makes me wonder, do girls need blocks "for girls"? Why can't they just use blocks?

The company's website has an answer:

Our founder, Debbie, spent a year researching gender differences to develop a construction toy that went deeper than just "making it pink" to appeal to girls. She read countless articles on the female brain, cognitive development and children's play patterns...Her big "aha"? Girls have strong verbal skills. They love stories and characters. They aren't as interested in building for the sake of building; they want to know why. GoldieBlox stories replace the 1-2-3 instruction manual and provide narrative-based building, centered around a role model character who solves problems by building machines. Goldie's stories relate to girls' lives, have a sense of humor and make engineering fun.

I find this somewhat persuasive but also strange. The idea behind the video is that girls just need to be taught to see themselves as more than princesses, that they just need to overcome the culture's brainwashing to become the builders they were meant to be. And yet, the message above seems to be that girls are naturally more in tune with verbal skills than spatial skills. So apparently the only way to lure them into using a spatial toy is with verbal bait? I'm not buying it.

After a brief search for studies on gendered toy preference, I stumbled on Jeffrey Trawick-Smith, a professor at the Center for Early Childhood Education at Eastern Connecticut State University, who has been conducting a toy study for the past few years, trying to determine which types of toys tend to elicit the most complex forms of play. And he made some interesting findings about toy preference among boys and girls.

What set the highest-scoring toys apart was that they prompted problem solving, social interaction, and creative expression in both boys and girls. Interestingly, toys that have traditionally been viewed as male oriented—construction toys and toy vehicles, for example—elicited the highest quality play among girls.

He also says,

One trend that is emerging from our studies can serve as a guide to families as they choose toys: Basic is better. The highest-scoring toys so far have been quite simple: hardwood blocks, a set of wooden vehicles and road signs, and classic wooden construction toys. These toys are relatively open-ended, so children can use them in multiple ways.

The accompanying video for the study bears this out (study findings begin at 3:10).

One of the most basic toys they studied, a set of multicolored wooden blocks in the shape of stick figures called Rainbow People, turned out to be among the most successful in encouraging open-ended play. Children created trains of Rainbow People, made towers of Rainbow People, made up stories about Rainbow People, and so on. Even the teachers admitted that they didn't expect these simple toys to be so popular. And indeed, the children weren't interested in them at first. But once they started exploring the toys, they played with them longer and more creatively than with much flashier options.

The implication of the video is that both children and adults are often drawn to toys that don't actually foster the most creative play. Toy companies do a lot of research on kids, but their research is geared toward getting kids to want their toys, not necessarily getting kids to play extensively with those toys. Lego's sales went through the roof after they started making highly gendered building sets based on Star Wars, etc. But as any parent can tell you, building these Lego sets tends to follow a linear trajectory: follow the instructions until the X-wing fighter is complete, put on shelf. Not a lot of open-ended play there.

All of which goes to say that while I love the GoldieBlox video, I think the GoldieBlox toys are misguided, just as I think Lego's attempt to woo girls was misguided. You don't teach kids how to think and build creatively by giving them ready-made narratives to build around. And you don't encourage open-ended play by making your building toys highly gendered. The best building toys are the simplest. What we need to do is advocate for a new aisle in the toy section where boys and girls can each find creative toys to play with, side by side, and build things we've never seen out of their own imaginations.

Worse than Regression

I greatly enjoyed Stephen Hackett and Federico Viticci's rant against iWork on The Prompt this past week. Stephen Hackett followed up with a blog post making a similar point about the unsettling trend of Apple regressing features whenever it tries to revamp a major software offering. The previous examples given were iMovie '08 and Final Cut Pro X, each of which caused an outcry from users upon release because of their missing features, and each of which Apple slowly improved by adding back lost functionality. Gruber claimed a couple weeks ago that this is just another example of Apple valuing "simplicity over functionality."

But I would argue that the iWork redesign is not just another example of this trend, but something much worse.

iMovie '08 and Final Cut Pro X are both, in fact, examples of Apple starting over from scratch in order to reimagine how the app should work. In the case of iMovie, my understanding is that the app was designed to help people make better home videos by allowing them to easily scrub through raw footage and select just the clips they wanted to use in the final product. In the case of Final Cut Pro X, the app was redesigned around the magnetic timeline, which would allow for much more intuitive editing for less experienced users.

You can argue about whether either of these redesigns was actually successful. In my opinion, iMovie '08 was a mess when it first came out, but Final Cut Pro X was actually amazing for a first-time user. But what you can't dispute is that each was a bold, radical departure from the past that laid a new foundation for the app's future.

iWork has none of that boldness. All it does is clip the wings of the desktop versions so as not to embarrass the iPad's feature set. The radical new paradigm is that you'll be able to do the same things on your iPad as you do on your desktop, even if this means ruining the experience you used to have on the desktop.

The reason I find this so depressing is that Apple software designers have a lot of exciting ideas about the unique opportunities of the iPad in their "life" apps. GarageBand, iPhoto, and (to a lesser extent) iMovie offer completely different and in many ways superior experiences on the iPad. There's no desire to make them identical to the desktop versions. It's obvious that the tablet offers a different experience. So why, in God's name, should the iWork apps be identical on both platforms?

If you want to offer a radical new version of word processing or spreadsheets or slide decks that's specifically tailored to the iPad experience, make it fucking different from the desktop experience. Don't just make the desktop experience worse! To give just one example, couldn't they at least try to reimagine a way to end the nightmare that is iOS text selection? Couldn't they make interaction with text more intimate, more intuitive, through touch somehow? Apparently not.

Nilay Patel made a great point on the most recent episode of the Vergecast. He was talking about how iOS 7 still doesn't feel like it's been optimized for the iPad, which suggests that even Apple doesn't quite know what the iPad is for. But if they want the iPad to replace most computers, they have to figure that out.

Apple is definitely making great strides with its iLife applications, GarageBand being the best example. But there's a difference between creative applications and straightforward work applications. Many more people use their computers for the latter than the former, and Apple has yet to demonstrate that it understands "work" as well as it understands "life".

The Gorgon Stare

There's a line in Matthew Power's GQ article "Confessions of a Drone Warrior" that perfectly captures the disquieting, if slightly irrational, sense that drones are harbingers of a nightmarish, dystopian future.

Even their shape is sinister: the blunt and featureless nose cone, like some eyeless creature that has evolved in darkness.

I read this article with equal parts fascination and horror, as I did Mark Bowden's article "The Killing Machines" in the Atlantic a couple months back. Bowden is definitely the more pro-drone of the two, making the case that, even though drone strikes do result in some collateral damage, ground operations often cause many more civilian casualties. He gives the example of the Delta Force raid in Mogadishu, which he wrote about in Black Hawk Down.

The objective, which was achieved, was to swoop in and arrest Omar Salad and Mohamed Hassan Awale, two top lieutenants of the outlaw clan leader Mohammed Farrah Aidid...We were not officially at war with Somalia, but the ensuing firefight left 18 Americans dead and killed an estimated 500 to 1,000 Somalis—a number comparable to the total civilian deaths from all drone strikes in Pakistan from 2004 through the first half of 2013, according to the Bureau of Investigative Journalism’s estimates.

It's a statistic that puts the death toll of drone strikes in startling perspective. Bowden also counters the notion that drones distance soldiers from the realities and consequences of war, quoting a Predator pilot who used to fly B-1 bombers:

"There is a very visceral connection to operations on the ground," Dan says. "When you see combat, when you hear the guy you are supporting who is under fire, you hear the stress in his voice, you hear the emotions being passed over the radio, you see the tracers and rounds being fired, and when you are called upon to either fire a missile or drop a bomb, you witness the effects of that firepower." He witnesses it in a far more immediate way than in the past, and he disdains the notion that he and his fellow drone pilots are like video gamers, detached from the reality of their actions. If anything, they are far more attached.

In some ways, I came away from these articles more disturbed by the surveillance possibilities of drones than by their potential as vehicles for assassination. Bowden mentions something known as the Gorgon Stare: a system of cameras and artificial intelligence for analyzing camera footage (named for the mythical beast that can turn its victims to stone).

Instead of one soda-straw-size view of the world with the camera, we put essentially 10 cameras ganged together, and it gives you a very wide area of view of about four kilometers by four kilometers—about the size of the city of Fairfax, [Virginia]—that you can watch continuously. Not as much fidelity in terms of what the camera can see, but I can see movement of cars and people—those sorts of things. Now, instead of staring at a small space, which may be, like, a villa or compound, I can look at a whole city continuously for as long as I am flying that particular system.

In the more recent GQ article, Matthew Power details how the burden of this surveillance, and the fatal power that comes along with it, can weigh on the person behind the camera. The article is a profile of one such drone pilot, Brandon Bryant, who describes his experience flying drones as that of an almost god-like witness to the mass of humanity he had under surveillance.

He watched the targets drink tea with friends, play with their children, have sex with their wives on rooftops, writhing under blankets. There were soccer matches, and weddings too. He once watched a man walk out into a field and take a crap, which glowed white in infrared.

In the end, it's not so much the violence that Bryant finds traumatizing, but the simultaneous sense of powerlessness (because he wasn't the one choosing his targets) and the way the technology, rather than distancing him, gave him an intimate, invasive view of his victims' final moments on earth, watching the heat of their bodies dissipating through infrared.

The smoke clears, and there’s pieces of the two guys around the crater. And there’s this guy over here, and he’s missing his right leg above his knee. He’s holding it, and he’s rolling around, and the blood is squirting out of his leg, and it’s hitting the ground, and it’s hot. His blood is hot. But when it hits the ground, it starts to cool off; the pool cools fast. It took him a long time to die. I just watched him. I watched him become the same color as the ground he was lying on.

Even if drones cause less collateral damage, I can't help but fear the greater psychological damage, both to our enemies and to ourselves. Both articles are well worth reading.

Appropriate Indignation

Listening to the recent episode of Rene Ritchie's Vector podcast with Ben Thompson, I found myself infected by Thompson's righteous indignation at Apple's iPad strategy. After all the boneheaded doomsaying about Apple in recent months, it was refreshing to hear someone criticize Apple out of intelligent concern rather than laziness.

Thompson's argument, which he has also written about at length on his website, is that Apple itself actually seems unaware of the appeal, the potential, the "why" of the iPad. The power of the iPad is not its ability to merely replicate a computer's utility through touch input, but to actually enable new possibilities by putting users in a more immediate relationship to digital information.

If you are a musician, the iPad is your instrument, your studio. If you are an artist, the iPad is your paint brush, your easel. If you are a student, the iPad is your textbook. If you are a child, the iPad is your storybook, or your entertainment. If you are a grandma, the iPad is your connection to your family.

I had no idea what I might use the iPad for when I first bought it, and at first I remember thinking of it as more of a luxury item than anything essential to my life. But as third-party developers innovated and imagined new uses for it, apps like Instapaper, Reeder, Flipboard, iThoughts HD, and Day One became important parts of my daily life. Each of these apps does or could exist in a desktop version, but it was their iPad versions that appealed to me most directly.

More recently, I've found apps that I use in my job teaching developmental writing that I simply could not live without. To give just one example, there is an app called Doceri that essentially turns my iPad into a mini Smartboard, enabling me to annotate anything on my computer screen while it's attached to a projector. So, for example, I can put a section of a student's paper up on the screen and have the whole class talk about how to improve it, annotating the writing on the fly as we talk. This is the kind of app that allows a new, more direct interaction with digital content, helping me accomplish something that simply was not possible before the iPad.

And yet what software did Apple demo onstage last week? The new iWork suite. A trio of apps that already worked better on the desktop than they ever could on the iPad. And what was the big update to these apps? Instead of adding functionality that only the iPad could provide, Apple chose to remove functionality from the desktop versions to bring them in line with their lesser tablet incarnations.

I am a longtime user of iWork. I grade all my students' papers in Pages. To do this, I make extensive use of the "comment" function. When I'm finished commenting on a student's paper, I can easily convert the document to a PDF for emailing to the student or printing a hard copy. In the new Pages, this ability is gone. You can comment on a document, but neither printing nor PDF export preserves the comments. They are only viewable within the Pages version. This is one of many features that have been removed in the rewrite of the iWork apps.

Gruber downplayed the controversy over the rewrite on Daring Fireball with a link to Matthew Panzarino, who wrote about the new iWork at TechCrunch.

Lots of folks are getting all worked up about iWork being “dumbed down,” but it feels like a reset to me. I can see this playing out pretty much like Apple’s recent Final Cut Pro X re-thinking. That app was introduced in a radically simplified and streamlined form that caused immediate outcry. Over time, Apple has steadily added back features that were missing from the early dramatic redesign of the pro video-editing suite.

The difference is the motivation. Apple rewrote Final Cut from the ground up because they thought they could provide a better, more intuitive way of editing video. I'd argue that the magnetic timeline actually was a feature that deserved to have the app rewritten around it. By contrast, iWork on the iPad clearly was and still is an inferior experience to the desktop versions. Choosing to rewrite the apps around those inferior versions suggests that Apple misunderstands the strengths and weaknesses of both its platforms, which is disheartening to say the least.

OS Design as Culture

This summer, I had a conversation with an American Sign Language interpreter about the concept of Deaf Culture. Turns out, many people who are born deaf don't consider themselves "handicapped" or even "hearing impaired". Instead, they consider themselves part of a different culture, as if deafness were an ethnicity. I found this surprising the first time I heard about it, since ethnicity is a biological legacy whereas deafness seems like a biological accident. Of course, I was wrong. As the sign language interpreter pointed out, one of the primary characteristics of a culture is not necessarily ethnicity but rather a shared language.

I thought of this reading Farhad Manjoo's piece in Slate about trying to switch from an iPhone to an Android phone. His primary objection to the Android phone was how the phone manufacturers and the carriers intruded on the experience of the operating system with "skins" and extra apps. He prefers the iPhone's purer experience. Predictably, his comments section is full of invective from both Apple and Android adherents. The people who engage in these battles, on blogs and in the comment threads of tech news sites, are often mocked by the tech writers who stand above the fray. The terms used to describe them (fanboy, neck beard, faithful, sheep) suggest that caring too much about the operating system of your smartphone is at best juvenile and at worst cultish. Matt Honan captured the absurdity of the turf war in Wired this past spring.

Nobody cares what kind of smartphone you believe in. It’s not a religion. It’s not your local sports team even. Stop being a soldier. You are not a soldier. You are just wrong. Shut up.

It's a first world problem, no doubt, and trolls are trolls, but sometimes I fear that we sell people short for caring too much about these things. Many of us are using these devices as our primary tools to capture, digest, share, and communicate information about the world. Is it any wonder that we care deeply about the way these tools work and how they allow us to interact with that information? An operating system is not itself a language, but it is a tool for expression of language, not unlike speaking, writing, listening, or signing. It's certainly not a religion, but there is a sense in which it can form the basis of a culture, or at least a cultural understanding of digital information.

When my father got laid off from his job and decided to switch careers, one of the changes he made was to go from a Windows PC to a Mac. I did the same a few years later when I changed jobs. And as cheesy as it sounds, the new computer changed me. It changed how I wrote, how I made radio stories, how I consumed the news, took photographs, and made home movies of my children. And now it affects how I teach my students every single day.

Nobody doubts that computing devices are a part of daily life, but I think we sometimes forget how much more they are than mere "gadgets," "tools," or "toys." Even tech writers and podcasters sometimes act ashamed, calling themselves nerds or geeks, as if they shouldn't care about this stuff as much as they do. But we should care. This is the stuff of modern life. Computing devices are the windows through which we interact with our increasingly digital world. We have to care about them, and think about them, if we're ever going to understand them and the world we're living in.

Reclaiming My Lifeforce with Keyboard Maestro

I moved to New Jersey in the second grade, and my new teacher told me I had the ugliest handwriting she'd ever seen. I blame her for the fact that I have hated writing by hand ever since. Luckily, two years later, my fourth grade teacher showed me how to type on a computer. I never looked back, turning in as many assignments by typing as I could. The facility with which I could edit, change, and rearrange words on the screen felt magical. As I've gotten older, and nerdier, I'm always looking for new ways to re-experience that feeling when computers actually reduce the friction of my life, when I can coast on Steve Jobs's metaphorical bicycle.

Listening to a recent episode of Mac Power Users, possibly the nerdiest of the nerdy podcasts I listen to, I was inspired to create my first screencast about one such friction-reducing tool: Keyboard Maestro. This application saves me time every single day, and I've long wanted to share its tricks with the world.
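To give a sense of what I mean by "tricks," here's a minimal sketch in Python: Keyboard Maestro macros can be triggered from outside the app by asking the Keyboard Maestro Engine, via a line of AppleScript, to run a macro by name. (The macro name below is just a hypothetical placeholder for whatever you've built.)

    #!/usr/bin/env python3
    # Minimal sketch: trigger a Keyboard Maestro macro from a script by asking
    # the Keyboard Maestro Engine (via AppleScript) to run it by name.
    import subprocess

    MACRO_NAME = "Paste as Plain Text"  # hypothetical example macro

    subprocess.run(
        [
            "osascript",
            "-e",
            f'tell application "Keyboard Maestro Engine" to do script "{MACRO_NAME}"',
        ],
        check=True,  # raise if osascript reports an error
    )

Hook that up to a launcher or a build script and the macro fires without your hands ever leaving what you were doing, which is the kind of friction reduction I'm talking about.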

This was my first screencast, so forgive the video quality and the filming location (my closet). If you're interested, Keyboard Maestro has a free trial on its website. Check it out.

Gruber, Vesper, Technology and Excellence

I first heard John Gruber's name four years ago, at a time of great anxiety in my life. I had just quit my job as a public radio producer in hopes of doing my own creative work, but it was a leap of fear more than faith. My job had been a soul-deadening mix of writing promotional copy and fact-checking, with a hyperperfectionist boss, and I had spent weeks waking up at 3:00 AM every night, unable to get back to sleep, dreading each new day to come.

So I quit, even though I had no new job lined up, even though I had two kids to support, even though my wife's pre-tenure job as a middle-school teacher was hardly secure. I didn't know whether to feel brave or crazy. Mostly, I felt as though I'd saved myself from drowning, broken through the surface of the water and taken my first deep breath of air, only to realize I was nowhere near land.

It was around this time that I stumbled upon the now-famous talk that Gruber did with Merlin Mann: Obsession Times Voice. The ostensible subject of the talk was "blogging," a medium I'd toyed with at various times over the years, but the real subject was authenticity. John and Merlin talked about how being true to your voice and your obsession, in whatever field you chose, might not make you rich, but it would certainly make you proud of the work you did. I took that idea to heart; it partly inspired me to begin writing the novel that occupied the next four years of my life.

I also began reading John Gruber's posts about technology, among other things, at Daring Fireball. I never would have thought I'd be interested in a "tech blog," but I found Gruber's remarkably assured and confident voice, the painstaking way he savored details and dismantled jackasses, to be addictive and strangely comforting.

Flash forward four years, and I have finished the novel I started, I'm approaching tenure at the college where I teach, and I have read virtually every word John Gruber's published. I find this slightly embarrassing. I feel I should have spent more of the past four years reading about things of greater import, like politics, science, or literature. Why technology?

In "Why I Take Good Care of My Macintosh," the poet Gary Snyder gives as one of his reasons, "Because my mind flies into it through my fingers." Ever since I began devoting more time to my own creative work, I've taken more and more pleasure in the tools I use to fling my mind into my computer, to the point where the tools seem almost like works of art themselves. I think I'm drawn to Gruber's writing, and all the best writing about technology, because I get the sense that he feels the same way. A post last summer at the Atlantic's website, celebrating the 10 year anniversery of Daring Fireball, pointed out that technology in general, and Apple in particular, is really just John Gruber's MacGuffin. The true subject of his blog is "excellence."

And so, when I learned that Gruber had helped design (or rather "directed," along with developer Brent Simmons and designer David Wiskus) an iPhone app called Vesper, I couldn't wait to see how his attention to excellence would translate when applied to his own product rather than someone else's.

At first glance, Vesper is the simplest of note-taking apps, so simple that it hardly feels necessary. It's just notes, with the ability to add tags and images. It has very few features, limited export options, no sync with any other app or service, and zero user settings. In his oft-quoted assessment of Apple's design process, Gruber said,

They take something small, simple, and painstakingly well considered. They ruthlessly cut features to derive the absolute minimum core product they can start with. They polish those features to a shiny intensity.

Vesper is that minimum core product, and the shiny intensity resides in its details. First, it's remarkably fast, faster to launch than any comparable app on my iPhone. The transitions between individual notes and the list of notes are beautiful fades (here's a GIF from the MacStories review), and if you want to see the tagging structure underneath, you just slide the list of notes aside with your thumb. The app avoids any metaphorical gestures, like swiping to reveal a delete button, employing only direct manipulation of the thing itself. As Gruber has said in an interview, it's details like these that make the app feel nice.

I'd go further and say it's details like these that reduce anxiety in the user, that create a quiet, comforting space to live in for a moment. You know what's going to happen. You feel in control.

The one aspect of Vesper that's been widely criticized is its lack of sync, which some assume will change with further iterations. And maybe it will. But I'd argue that even this is a feature, not a flaw. I love the app Notesy because I can throw an infinite number of notes in there and all those notes are synced to my other devices, becoming magically searchable and findable whenever I need them. I don't even have to think about them, and the result is that I often forget they're even there. I use Vesper to capture my most important ideas precisely because I want to think about them. Instead of throwing these ideas in the same place I put everything else, I want to put them someplace special. Vesper has become that place.

Which brings me back to John Gruber and excellence. I now think it's a bit disingenuous to say that his blog isn't really about technology. At its best, his blog is about technology precisely because technology allows for a certain kind of striving toward excellence. No other field (thanks to Moore's Law) is accelerating at quite the same pace toward new possibilities of excellence. Software in particular, unbound by the limits of the physical world, is providing tools that allow us to make things that are more perfect, more precise, more useful, more beautiful. In many ways, technology itself is both the means and the end of striving toward excellence.

Vesper is a product of that striving, as well as a great way to store and catalogue ideas for writing about it.

Also, I don't care what anyone says, the icon is totally fucking awesome.