Songs about Songs

I love what Shawn Blanc said about what Stephen Hackett said about what John Roderick said about focusing one’s attention on creating “primary source material,” rather than mere commentary.

By saying so, however, I fear that I’m engaging in mere commentary — in what Roderick calls “this chattering sort of criticism and culture digestion that is so much of I guess what we call content — Internet content, which is just like, ‘Oh, this just came out and now I’m talking about it and now I’m talking about this other guy who was talking about it.’”

But I’m not sure I would draw such a qualitative distinction between primary and secondary source material. Songs are not inherently better than linked-list blog posts. I’d rather read a brief but beautifully crafted post on Kottke or Daring Fireball than listen to a lot of the songs currently on the radio. What matters is the intention, the craft, the effort behind what you make. A close reading of Roderick’s words suggests he might agree.

You know, if you’re making a song, or if you’re writing a story, that is source material. It’s primary. It’s the thing that did not exist before. You’re not commenting. Presumably, your song is not commenting on some earlier song, or if it is, it’s doing it in an inventive way.

The French writer Michel de Montaigne has long been considered the inventor of the essay. The original meaning of the word “essay” was “stab” or “attempt,” because he would take an idea and poke at it from as many different angles as he could think of. He’s more recently been called the godfather of blogging, because he didn’t just write down his own thoughts. He constantly quoted from the authors he was reading and then reflected upon how their ideas comported with his own. He was a great writer but also a great reader and a great commentator. It’s a tradition carried on by bloggers like Kottke, whose work Tim Carmody once described as “watching an agile mind at work, one attached to a living, breathing person, and feeling like you were tapped into a discussion that was bringing together the most vital parts of the web.”

In a piece called “Trapped by tl;dr” (via Shawn Blanc again), Seth Godin wrote,

“There are thousands of times as many things available to read as there were a decade ago. It’s possible that in fact there are millions of times as many.”

That’s precisely why we need people who are great readers, people who can sort through the best of what’s out there, who can, in their own way, write songs about the songs we’re all inundated with. And do it in an inventive way.

What the First App Says about Us

MG Siegler believes the first app you open in the morning says something about you.

I see the first app you turn to in the morning as the new homepage. Some might argue it’s your entire homescreen of apps, but I don’t think that’s right. It’s the one service you care most about, no matter the reason, and want to load immediately upon hitting the web. The delivery device has changed, but the concept has not.

What I find interesting is not which app people are choosing to open first thing in the morning, but the fact that apps are the first thing so many of us choose. Siegler traces back his own first app from Twitter to Path to Facebook to Email. For me it would be Twitter to Reeder to Email.

And I think Siegler’s right that before smartphones, it would have been a favorite website on my laptop, something like Slate or Pitchfork or the New York Times. And if I go further back (like a hypnotist regressing the patient to remember former lives), before we even had the internet, it would have been a book, or a copy of The New Yorker, or (even further back) cartoons on TV.

The difference between apps and everything that came before is that the apps we choose now (Twitter, or Facebook, Flipboard, RSS readers) tend to gather and serve up content from myriad, disparate sources. Before apps, we had to choose one source at a time. What I find intoxicating about the apps I open in the morning is the possibility of surprise. As Ben Thompson says, it’s so much more delightful to get the gift I didn’t know I wanted.

But I agree with Rands and Alexis Madrigal that this stream of brief interestingness might not be entirely good for me. Perhaps it’s time to try a new first app.

A Software World

At the end of his takedown of an article that calls 2013 “A Lost Year for Tech” (neatly summed up as “a sad pile of piss-on-everything cynicism”), John Gruber writes:

There’s a nihilistic streak in tech journalism that I just don’t see in other fields. Sports, movies, cars, wristwatches, cameras, food — writers who cover these fields tend to celebrate, to relish, the best their fields have to offer. Technology, on the other hand, seems to attract enthusiasts with no actual enthusiasm.

Rene Ritchie followed up on that point, wondering where the nihilism comes from.

It could just be that computer technology is still relatively new and tech journalists - and tech readers, we feed each other - lack the maturity of older, more established industries and pursuits. It could be that tech writers sometimes seem to care more about being perceived as cool than in being good.

I think this nihilistic streak could be a symptom of the deep-seated suspicion of technology in Western Culture, even among those of us who claim to love it. We all use technology, but we don’t trust it. We fear the power of its “reality distortion field.” We tend to see the experiences it enables as inauthentic, alien, perhaps corrupting, and certainly inferior to “real” experiences. This theme of technology’s malevolent influence is obvious in a lot of science fiction, from Frankenstein to The Matrix, but you can even see it in the training montage from Rocky IV.

The Russian might have a fancy weight room, fancy exercise equipment, and fancy synthetic muscles, but he’ll never triumph over Rocky, who can lift real wagons and real bags of rocks and run to the top of a real mountain.

I thought of that montage when I saw this post from Seth Godin, which makes the fairly reasonable case that our pursuit of productivity (through apps and blogs and devices) often makes us less productive in the end. I take his point until he gives this example:

Isaac Asimov wrote more than 400 books, on a manual typewriter, with no access to modern productivity tools. I find it hard to imagine they would have helped him write 400 more.

First, “400 more” is a pretty high bar. But wouldn’t Asimov have derived some benefit from modern productivity tools? Like, say, a computer? If not Asimov, I imagine there were countless people born before the advent of “modern productivity tools” who would have benefited enormously from them. Of course these tools can’t write 400 books for you, but they can reduce the friction just enough to get the ball rolling.

To give just two examples, I was a remarkably disorganized person for most of my life, because I insisted on trying to keep my deadlines, appointments, and todo items in my head. Now, with apps like OmniFocus and Due, I’m not only much more organized but also remarkably less anxious about what I might forget. I’ve also been somewhat overweight for most of my adult life, but in the last four years I’ve lost about 50 pounds, mainly due to changes in exercise and diet. But those changes were the direct result of tracking my calories and exercise through the app Lose It. (I had no idea, for instance, that running five miles burns only about as many calories as one big sandwich.) Those are the two biggest impacts on my life, but software has, in a variety of ways, also helped me create better lessons for my students, grade papers more thoroughly, capture innumerable moments of my children’s lives, stay in touch with people I love, write a novel, and start this blog. My life is significantly better as a result of this technology.

Which brings me back to the nihilism of tech journalists. Few, if any, of these small improvements to the daily life of one person would merit a headline in a tech publication. We tend to expect, and tend only to notice, the big revolutions in technology: the personal computer, the iPod, the smartphone, the tablet. It’s no coincidence that these are all hardware products. Hardware feels more “real” to us. Maybe the reason tech journalists are so often depressed about the state of technology is that hardware revolutions are extremely hard to come by. Dozens of hardware makers get up on stages and set up booths at CES every year touting their new attempts at hardware revolutions. And most of them fall completely flat.

Software doesn’t get the same attention, because it’s less substantial, less trustworthy, and because it’s behind a screen. But software is the real story. Frank Chimero’s fantastic web essay What Screens Want makes this point by citing a video clip of the British documentary program Connections, in which the host describes how plastic has completely permeated our world. Chimero then rewrites the script, replacing the word “plastic” with the word “software.”

It’s a software world. And because of software, it’s a soft world in a different sense, in the original sense of the word: it changes its shape easily.

Software is the magic that makes our devices “indistinguishable from magic”. Many of us think of it as an art form, and yet it’s a strange sort of art form. Most art forms don’t remind you to take out the recycling or help you lose fifty pounds. But the things software can do are almost limitless. Maybe tech journalists would be less cynical about the advances of technology if they wrote more about software than hardware, and more about the how than the what — how software is not only changing its shape, but changing our shape, in more ways than one. That is the real, ongoing technological revolution.

Why (I Hope) Blogs Still Matter in 2014

I started this blog less than six months ago, and for the first three months, I had fewer than a hundred page views. But my readership grew in fits and starts, with a retweet here and a link there, even an occasional block quote, until finally, I arrived home after work a couple weeks ago to find a link to something I wrote on Kottke.org.

Kottke-fucking-dot-org (the New Yorker of blogs, as far as I’m concerned).

My page views went up to 12,000 in a single day, small potatoes for some I’m sure, but a big deal to me. People were starting to follow me on Twitter, sending me messages about how much they enjoyed my writing. After nearly a decade of working in public radio, and then several years writing and struggling to publish a novel, I felt as though blogging was finally giving me a platform and an audience I could call my own.

So imagine my surprise when, just a few days later, in a piece for the Nieman Journalism Lab, Kottke himself announced that, “The blog is dead.” He hedged a bit in a post on his blog, but stood by his main point:

Instead of blogging, people are posting to Tumblr, tweeting, pinning things to their board, posting to Reddit, Snapchatting, updating Facebook statuses, Instagramming, and publishing on Medium. In 1997, wired teens created online diaries, and in 2004 the blog was king. Today, teens are about as likely to start a blog (over Instagramming or Snapchatting) as they are to buy a music CD. Blogs are for 40-somethings with kids.

I’m not quite forty, but I do have kids, so I found this unbearably depressing. Apparently, I have found the right creative platform for myself at precisely the moment it’s fallen out of fashion.

Except I don’t really believe that. And Kottke doesn’t seem to either. The footnote in his blog post about Tumblr (and whether Tumblr blogs are actually blogs) bears this out.

If you asked a typical 23-year-old Tumblr user what they called this thing they’re doing on the site, I bet “blogging” would not be the first (or second) answer. No one thinks of posting to their Facebook as blogging or tweeting as microblogging or Instagramming as photoblogging. And if the people doing it think it’s different, I’ll take them at their word. After all, when early bloggers were attempting to classify their efforts as something other than online diaries or homepages, everyone eventually agreed. Let’s not fight everyone else on their choice of subculture and vocabulary.

So it’s the terminology that’s changing rather than the impulse. And while these alternative services are undoubtedly siphoning off casual users of what used to be blogs, the reason those users are leaving is that blogging platforms don’t provide the easiest access to the intended audience. My wife and I once used personal blogs to share pictures of and stories about our kids. Now we do that on Facebook because Facebook is where the friends and relatives are. If you want to communicate with your social group, you go to the service where your social group congregates, where your message will be conveyed to the largest number of people you know.

But I want to communicate with people I don’t know. And that’s why I think blogs, or personal websites, or single author web publications, or whatever-the-fuck-you-want-to-call-them, still matter.

Rewind about four years. I had just quit a terrible associate producer job in public radio and was failing to make it as a freelancer. That fall, I went to a public radio conference and got to meet one of the Kitchen Sisters (I was so star-struck, I didn’t even know which one she was) and other amazing producers like Kara Oehler and Joe Richman (when he asked me how freelancing was going, I said, “Teaching at a technical college is going pretty well.”) But the most interesting conversation I had that night was with a guy who had been working behind the scenes for most of his career, helping different radio shows find their own unique production style.

I was telling him how I wasn’t sure I could sell the kinds of stories I had been making before the economy crashed, stories about ordinary life with no real news hook. The only show that still had a large freelance budget was Marketplace, and I didn’t want to change my style to suit them. This guy’s advice? Start a podcast. Just make the kinds of stories I wanted to make and put them out in the world, and if the stories were good, the audience would eventually come to me. He cited Jesse Thorn of MaximumFun as a model.

I didn’t follow that guy’s advice (a podcast seemed like too much work), but he did. That guy was Roman Mars. He went on to create and host the amazing show 99% Invisible. Not only did the audience come to him, but he recently raised more than $375,000 on Kickstarter to fund the fourth season of the show. His experience echoes the words of Radiolab co-host Robert Krulwich, who offered similar advice in a commencement address to the Berkeley Graduate School of Journalism.

Suppose, instead of waiting for a job offer from the New Yorker, suppose next month, you go to your living room, sit down, and just do what you love to do. If you write, you write. You write a blog. If you shoot, find a friend, someone you know and like, and the two of you write a script. You make something. No one will pay you. No one will care. No one will notice, except of course you and the people you’re doing it with. But then you publish, you put it on line, which these days is totally doable, and then… you do it again.

I had those words in mind when I started my blog six months ago, and I’ve had them in mind whenever I think I should be pitching one of my blog posts to an online publication like Slate or Salon or The Magazine. I’d like to get paid for what I write, but there’s something wonderfully satisfying about owning and controlling my own work. I also don’t want to wait to see if someone will publish it. I want to publish, and see if the audience comes to me.

This is what blogs still offer.

When I first read Kottke’s post on the death of blogs, my knee-jerk fear was that it meant fewer and fewer people would be reading blogs in the near future. What I now think he means is that fewer and fewer people will be writing blogs in the near future. And maybe that’s a good thing. Maybe the rise of social networking platforms will function like a brush fire, clearing out the forest for those of us who want to do more than share a picture, video, or link—those of us who actually want to read, analyze, reflect on, argue with, and write thoughtfully about the stream of information we’re all trying to navigate. Those are the blogs I want to be reading in 2014, and beyond.

Misunderstood or Double-edged?

A lot of people are writing about Apple's latest commercial for the iPhone. Gruber thinks it's their best ad of the year; Kottke calls it one of their best ever. Nick Heer compares it to Don Draper's carousel pitch for the slide projector. But Ben Thompson's take is my favorite because he responds to the ad's critics, who say that Apple is "promoting recording your family over actually spending time with your family."

This criticism is indicative of the recent conventional wisdom that these devices are making us stupid, lonely, and disconnected from the real world. Thompson sees the ad as an attempt to bridge the technological/generational divide, to say the reason we're so obsessed with our gadgets is that they can actually do amazing things.

On the flipside, how many young people – including, I’d wager, many reading this blog – have parents who just don’t get us, who see technology as a threat, best represented by that “can-you-put-that-damn-thing-down-and-join-us-in-the-real-world!?” smartphone in our hands, without any appreciation that it’s that phone and the world it represents that has allowed us to find ourselves and become the person we know they wanted us to be?

In the first half of the ad, the kid is portrayed as self-absorbed, antisocial, even rude in his attention to his iPhone. But why? Would we have seen him in such a negative light if he had been reading a copy of The Catcher in the Rye, or writing in a journal, or drawing in a sketchpad, or noodling on a guitar? The magical, revolutionary thing about an iPhone (and I say this unironically) is that it can become a novel, a journal, a sketchbook, a musical instrument, or a video camera/video editor (with apps like iBooks, Day One, Paper, GarageBand, and iMovie among many others).

And yet, we've all seen people ignoring the real world in favor of their device, and they were not involved in a heartwarming creative pursuit. I have looked at Twitter more than once while my children were trying to have a conversation with me. I have even checked Facebook, surreptitiously, while my son, who'd just learned to read, struggled to read a new book to me (not the first book he read to me, but still). I'm not proud of this. And I worry that my kids, who love these devices as much as (if not more than) I do, will soon be acting just like the teenager in this ad. And instead of making a beautiful home video during family gatherings, they'll be sexting with Russian oligarchs, selling their non-essential organs, and ordering designer brain implants on some future version of the Silk Road.

That's the double edge of the technology we now have in our pockets. It gives us access to boundless information, and enables all kinds of interactions with that information, but it doesn't distinguish between empty and nourishing information, or help us determine the right uses of that information. We have to make those distinctions and those choices.

One of my favorite pictures of my kids was taken with, edited on, and sent to me from my wife's iPhone. It shows my son and daughter, cuddled together in the dark, their radiant, smiling faces lit from beneath by an unearthly glow. You can't see the object making the glow in the picture, but it's the screen of an iPad. It's also the source of the joy on their faces.

That screen is not going away anytime soon, but we don't have to be passive viewers of it, merely consuming and feeling vaguely guilty about what we consume from it. There's immense creative power behind the screen. Instead of worrying about it, lamenting it, and disparaging it, we should focus on learning how best to use it: to gather, understand, shape, and share the information around us.

Placebo-philes

Audiophiles have gotten a lot of bad press recently, what with the apparently silly Pono music player (which plays much higher quality audio files despite almost no one being able to hear the difference) and the news from Wired magazine that "burning in" your headphones has no discernible effect on sound quality. Reading about the truly insane things audiophiles will do in pursuit of the perfect sound, I can't help reflecting back on that unfortunate period in my life when I almost fell down the same rabbit hole.

For me it started with a simple search for better headphones. I think I typed "best headphones under $50" into Google, and what came back was a series of lists, like this one or this one, ranking the best headphones at a series of price ranges. I settled on a pair pretty quickly, and when they arrived I loved them, but those lists had planted their hooks in my brain. How much better would my music sound if I were willing to spend just a little bit more?

I decided to research what headphones I would buy the next time I had saved up a decent amount of money, and my research led me to my first (and really only) foray into Internet forums: a website called Head-Fi, where enthusiasts gather to discuss, argue, and bond over their love of headphones and headphone-related accessories. It was a remarkably friendly place for people who enjoyed tuning out the world, but darkness lurked at the edges. People would post glowing reviews of the headphones they just bought, and others would weigh in about how much they loved those headphones too, but inevitably someone would say how those headphones would sound even better if connected to a decent headphone amplifier. Or a decent digital audio converter. Or how those headphones didn't even compare to these other headphones that just cost a little more.

The perfect headphone set up always cost just a little bit more. Audio nirvana was always just out of reach.

Over the course of three years, I wound up buying one pair of headphones that cost about $100, then another that cost about $150, then a headphone amplifier that cost about $100, then another headphone amplifier that cost several hundred, then a special iPod that had been rewired for better sound, then several more pairs of headphones, each more expensive than the last. It helped that the internet made it easy to resell my previous purchases in order to fund my new purchases.

But that was nothing compared to the equipment many others owned, and the prices they paid. The most money I ever paid for headphones was about $300. But the best headphones were going for more than $1000, and the best amplifiers and related devices were many times that.

I don't think it's an accident that this period in my life was the same period in which I had two children in diapers and an extremely stressful job. After putting the kids to bed, if I didn't have any more work to do, and if my wife wanted to watch TV, I would find a quiet spot in the house and get lost in the increasingly detailed soundstage my gear supplied.

But the specter that loomed over everything was the idea that this was all some big placebo effect. I would occasionally spend an evening listening to a song on my new set of headphones and then on my old set, or with my new amplifier and then my old amplifier. I would make my wife listen to see if she heard a difference. Sometimes she did, sometimes she didn't. Sometimes I didn't. Every once in a while, I'd read a post on Head-Fi about someone who was selling everything he'd bought because he realized he was listening to his equipment rather than music. I finally had the same realization and made the same decision. At the time, I felt like a recovering addict, or a victim of a con artist, reformed but slightly ashamed.

I got a new perspective on that period, however, when I read this recent piece by Felix Salmon (via Kottke) about converting money into happiness. Salmon is also interested in placebo effects, specifically in the world of wine tasting, where experiments have frequently shown that very few people can tell the difference between cheap and expensive wine, or even the difference between red and white wine. When I first read about those studies, they reminded me of the scene in Brideshead Revisited when a couple of guys get drunk and describe the wine they're tasting with increasingly absurd metaphors:

"….It is a little shy wine like a gazelle." "Like a leprechaun." "Dappled in a tapestry window." "and this is a wise old wine." "A prophet in a cave." "and this is a necklace of pearls on a white neck." "Like a swan." "Like the last unicorn."

I had moments almost as absurd with my headphones, when I heard things inside songs I swore I'd never heard before, when I felt as if parts of the music were two dimensional backdrops and then three dimensional shapes would leap out of the picture towards me, or the music would drizzle over my head, or crackle like lightning, or I'd swear I could smell the studio where the song had been recorded, or something.

In other words, I was an idiot. Because on other nights, usually after I'd owned that same set of gear for a little while, I wouldn't hear those things any more, and I'd start thinking that I needed better gear. I needed a new placebo effect.

It's easy to sneer at the placebo effect, or to feel ashamed of it when you're its victim. And that's precisely why I found Felix Salmon's piece revelatory, because instead of sneering at the placebo effect of fancy wine, its marketing, and its slightly higher prices, he thinks we should take advantage of it. If the placebo effect makes us happy, why not take advantage of that happiness?

The more you spend on a wine, the more you like it. It really doesn’t matter what the wine is at all. But when you’re primed to taste a wine which you know a bit about, including the fact that you spent a significant amount of money on, then you’ll find things in that bottle which you love ... After all, what you see on the label, including what you see on the price tag, is important information which can tell you a lot about what you’re drinking. And the key to any kind of connoisseurship is informed appreciation of something beautiful.

This idea of "informed appreciation" reminds me of another area of modern life beset by placebo effects: the world of alternative medicine. In a recent article for the Atlantic, David H. Freedman argues that there's virtually no scientific evidence that alternative medicine (anything from chiropractic care to acupuncture) has any curative benefit beyond a placebo effect. And so, many scientists are outraged that anyone takes alternative medicine seriously. However, there is one area where alternative medicine often trumps traditional medicine: stress reduction. And stress reduction can, of course, make a huge impact on people's health. The Atlantic article quotes Elizabeth Blackburn, a biologist at the University of California at San Francisco and a Nobel laureate.

“We tend to forget how powerful an organ the brain is in our biology,” Blackburn told me. “It’s the big controller. We’re seeing that the brain pokes its nose into a lot of the processes involved in these chronic diseases. It’s not that you can wish these diseases away, but it seems we can prevent and slow their onset with stress management.” Numerous studies have found that stress impairs the immune system, and a recent study found that relieving stress even seems to be linked to slowing the progression of cancer in some patients.

Perhaps not surprisingly, a trip to the chiropractor or the acupuncturist is much more likely to reduce your stress than a trip to the doctor. If anything, a trip to the doctor makes you more anxious.

Maybe each of these activities (listening to high end audio gear, drinking high end wine, having needles inserted into your chakras) is really about ritualizing a sensory experience. By putting on headphones you know are high quality, or drinking expensive wine, or entering the chiropractor's office, you are telling yourself, "I am going to focus on this moment. I am going to savor this." It's the act of savoring, rather than the savoring tool, that results in both happiness and a longer life.

Of course, you don't need ultra high end gear to enjoy your music, or ultra high end wine to enjoy your evening, just as you shouldn't solely use acupuncture to treat your cancer. It might be as effective to learn how to meditate. But maybe we all just need to meditate in different ways.

Love the GoldieBlox Video, but Not the Toys

I once paid my daughter a dollar to not buy a princess backpack. And when she recently announced, "I don't need to know math, because I'm going to grow up to be a supermodel," my brain nearly melted out of my ears. I spent the next thirty minutes (or so) half-ranting, half-explaining to her how she has to fight against a culture that believes the most important thing about a girl is being pretty. So I'm totally down with the sentiment of this video.

But then I get to the end and see that the product being advertised, GoldieBlox, is apparently just a set of blocks "for girls." Which makes me wonder, do girls need blocks "for girls"? Why can't they just use blocks?

The company's website has an answer:

Our founder, Debbie, spent a year researching gender differences to develop a construction toy that went deeper than just "making it pink" to appeal to girls. She read countless articles on the female brain, cognitive development and children's play patterns...Her big "aha"? Girls have strong verbal skills. They love stories and characters. They aren't as interested in building for the sake of building; they want to know why. GoldieBlox stories replace the 1-2-3 instruction manual and provide narrative-based building, centered around a role model character who solves problems by building machines. Goldie's stories relate to girls' lives, have a sense of humor and make engineering fun.

I find this somewhat persuasive but also strange. The idea behind the video is that girls just need to be taught to see themselves as more than princesses, that they just need to overcome the culture's brainwashing to become the builders they were meant to be. And yet, the message above seems to be that girls are naturally more in tune with verbal skills than with spatial skills. So apparently the only way to lure them into using a spatial toy is with verbal bait? I'm not buying it.

After a brief search for studies on gendered toy preference, I stumbled on Jeffrey Trawick-Smith, a professor at the Center for Early Childhood Education at Eastern Connecticut State University, who has been conducting a toy study for the past few years, trying to determine which types of toys tend to elicit the most complex forms of play. And he has reported some interesting findings about toy preferences among boys and girls.

What set the highest-scoring toys apart was that they prompted problem solving, social interaction, and creative expression in both boys and girls. Interestingly, toys that have traditionally been viewed as male oriented—construction toys and toy vehicles, for example—elicited the highest quality play among girls.

He also says,

One trend that is emerging from our studies can serve as a guide to families as they choose toys: Basic is better. The highest-scoring toys so far have been quite simple: hardwood blocks, a set of wooden vehicles and road signs, and classic wooden construction toys. These toys are relatively open-ended, so children can use them in multiple ways.

The accompanying video for the study bears this out (study findings begin at 3:10).

One of the most basic toys they studied, a set of multicolored wooden blocks in the shape of stick figures called Rainbow People, turned out to be among the most successful in encouraging open-ended play. Children created trains of Rainbow People, made towers of Rainbow People, made up stories about Rainbow People, and so on. Even the teachers admitted that they didn't expect these simple toys to be so popular. And indeed, the children weren't interested in them at first. But once they started exploring the toys, they played with them longer and more creatively than with much flashier options.

The implication of the video is that both children and adults are often drawn to toys that don't actually foster the most creative play. Toy companies do a lot of research on kids, but their research is geared toward getting kids to want their toys, not necessarily getting kids to play extensively with those toys. Lego's sales went through the roof after they started making highly gendered building sets based on Star Wars, etc. But as any parent can tell you, building these Lego sets tends to follow a linear trajectory: follow the instructions until the X-wing fighter is complete, put on shelf. Not a lot of open-ended play there.

All of which goes to say that while I love the GoldieBlox video, I think the GoldieBlox toys are misguided, just as I think Lego's attempt to woo girls was misguided. You don't teach kids how to think and build creatively by giving them ready-made narratives to build around. And you don't encourage open-ended play by making your building toys highly gendered. The best building toys are the simplest. What we need to do is advocate for a new aisle in the toy section where boys and girls can each find creative toys to play with, side by side, and build things we've never seen out of their own imaginations.

Worse than Regression

I greatly enjoyed Stephen Hackett and Federico Viticci's rant against iWork on The Prompt this past week. Stephen Hackett followed up with a blog post making a similar point about the unsettling trend of Apple regressing features whenever it tries to revamp a major software offering. The previous examples given were iMovie '08 and Final Cut Pro X, each of which caused an outcry from users upon release because of their missing features, and each of which Apple slowly improved by adding back lost functionality. Gruber claimed a couple weeks ago that this is just another example of Apple valuing "simplicity over functionality."

But I would argue that the iWork redesign is not just another example of this trend, but something much worse.

iMovie '08 and Final Cut Pro X are both, in fact, examples of Apple starting over from scratch in order to reimagine how the app should work. In the case of iMovie, my understanding is that the app was designed to help people make better home videos by allowing them to easily scrub through raw footage and select just the clips they wanted to use in the final product. In the case of Final Cut Pro X, the app was redesigned around the magnetic timeline, which would allow for much more intuitive editing for less experienced users.

You can argue about whether either of these redesigns was actually successful. In my opinion, iMovie '08 was a mess when it first came out, but Final Cut Pro X was actually amazing for a first-time user. But what you can't dispute is that each was a bold, radical departure from the past that laid a new foundation for the app's future.

iWork has none of that boldness. All it does is clip the wings of the desktop versions so as not to embarrass the iPad's feature set. The radical new paradigm is that you'll be able to do the same things on your iPad as you do on your desktop, even if this means ruining the experience you used to have on the desktop.

The reason I find this so depressing is that Apple software designers have a lot of exciting ideas about the unique opportunities of the iPad in their "life" apps. GarageBand, iPhoto, and (to a lesser extent) iMovie offer completely different and in many ways superior experiences on the iPad. There's no desire to make them identical to the desktop versions. It's obvious that the tablet offers a different experience. So why, in God's name, should the iWork apps be identical on both platforms?

If you want to offer a radical new version of word processing or spreadsheets or slide decks that's specifically tailored to the iPad experience, make it fucking different from the desktop experience. Don't just make the desktop experience worse! To give just one example, couldn't they at least try to reimagine a way to end the nightmare that is iOS text selection? Couldn't they make interaction with text more intimate, more intuitive, through touch somehow? Apparently not.

Nilay Patel made a great point on the most recent episode of the Vergecast. He was talking about how iOS 7 still doesn't feel like it's been optimized for the iPad, which suggests that even Apple doesn't quite know what the iPad is for. But if Apple wants the iPad to replace most computers, it has to figure that out.

Apple is definitely making great strides with its iLife applications, GarageBand being the best example. But there's a difference between creative applications and straightforward work applications. Many more people use their computers for the latter than the former, and Apple has yet to demonstrate that it understands "work" as well as it understands "life".

Sifting Lo-Fi Memories

The other day I was listening to an episode of Back to Work while taking a break from my actual work, and Merlin Mann made an offhand reference to smart playlists, a vastly underrated feature of iTunes he has written about extensively on his old blog. Listening to him, it suddenly occurred to me that I hadn't made a smart playlist of my own in quite a while. Autumn was in the air, and something about autumn always makes me think of adolescence, so I decided to build a playlist of the lo-fi indie rock (Pavement, Guided By Voices, Sebadoh, Superchunk, etc.) that I used to listen to when I was 18.


In the process of playlist building, I began to consider what a strange and strangely powerful influence smart playlists have had on my life. The first time I ever used a CD player, the shuffle function struck me as its most powerful feature. As much as I like the album as an art form, I also love the idea of laying different songs against each other, creating new combinations of sound. Eventually, I got a 5 disc CD player, and I made extensive use of its shuffle function as well, but I wanted more. I wanted to shuffle all my music, and when MP3 players first came on the scene, I sensed this was finally possible. I shied away from the pricey iPod, though, until I heard about smart playlists.

The idea, if you aren't already familiar with it, is simple. iTunes tracks all kinds of data about the songs you listen to. It knows the name of the artist, the name of the album, the genre, when you added it to your library, when you last listened to it, how often you've listened to it, how many times you've skipped it, whether you've given it a rating of 1 to 5 stars, and so on. A smart playlist makes it possible to leverage all of this data to serve up exactly the kind of music you want to listen to at this moment.

The concept of smart playlists blew my mind. I knew next to nothing about computers, programming, or any software really other than Microsoft Office. But I understood at once that with smart playlists, I could not only shuffle all my music, but do so in as many different ways as I could possibly imagine. I could shuffle just songs I hadn't heard in the past year, just songs I'd rated five stars that I hadn't heard in the last year, just indie rock songs I'd rated five stars that I hadn't heard in the last year, and so on. The possibilities were endless.
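To make the idea concrete, here's a minimal sketch of what a smart playlist boils down to in code: a handful of rules, each one a simple test against a song's metadata, applied to the whole library. The song fields and sample entries below are made up for illustration; they aren't iTunes' actual library format.

```python
from datetime import datetime, timedelta

# Each song is just a bag of metadata, the kind of data iTunes tracks.
# These field names and entries are illustrative, not the real iTunes schema.
songs = [
    {"title": "Gold Soundz", "artist": "Pavement", "genre": "Indie Rock",
     "rating": 5, "last_played": datetime(2012, 9, 1)},
    {"title": "I Am a Scientist", "artist": "Guided By Voices", "genre": "Indie Rock",
     "rating": 5, "last_played": datetime(2013, 10, 20)},
    {"title": "Some Radio Hit", "artist": "Whoever", "genre": "Pop",
     "rating": 3, "last_played": datetime(2013, 11, 1)},
]

one_year_ago = datetime.now() - timedelta(days=365)

# A smart playlist is just the set of songs that pass every rule.
rules = [
    lambda s: s["genre"] == "Indie Rock",       # only indie rock
    lambda s: s["rating"] == 5,                 # only five-star songs
    lambda s: s["last_played"] < one_year_ago,  # nothing heard in the past year
]

playlist = [s for s in songs if all(rule(s) for rule in rules)]

for song in playlist:
    print(song["artist"], "-", song["title"])
```

Add or remove a rule and you get a different playlist, which is really all the "smart" part is.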

It was the lure of smart playlists that got me to buy an iPod, which got me to buy my first MacBook, which got me interested in indie software, which got me interested in things like Quicksilver and OmniFocus, which got me interested in blogs like 43 Folders and podcasts like Back to Work, which led me to a Saturday afternoon near the end of autumn in 2013, listening to a podcast with Merlin Mann talking briefly about smart playlists on my iPhone.

The playlist I made that afternoon consisted of music that hadn't been easy to come by in the suburban New Jersey town where I grew up; I had to make special trips to record stores in Hoboken and New York City to track down albums like Slanted and Enchanted and Bee Thousand when they first came out. But I carried those CDs with me through college and grad school and eventually ripped them into mp3s, which I stored on the hard drives of several successive computers. I don't even own the discs any more. What amazes me is that I now have the power to write a mini-software program that will sift through those scratchy lo-fi music files, as one might sift through scratchy lo-fi memories, to find songs I'd forgotten I even knew I loved.

Which is a long way of saying that software is magic.

The Gorgon Stare

There's a line in Matthew Power's GQ article "Confessions of a Drone Warrior" that perfectly captures the disquieting, if slightly irrational, sense that drones are harbingers of a nightmarish, dystopian future.

Even their shape is sinister: the blunt and featureless nose cone, like some eyeless creature that has evolved in darkness.

I read this article with equal parts fascination and horror, as I did Mark Bowden's article "The Killing Machines" in the Atlantic a couple months back. Bowden is definitely the more pro-drone of the two, making the case that, even though drone strikes do result in some collateral damage, ground operations often cause many more civilian casualties. He gives the example of the Delta Force raid in Mogadishu, which he wrote about in Black Hawk Down.

The objective, which was achieved, was to swoop in and arrest Omar Salad and Mohamed Hassan Awale, two top lieutenants of the outlaw clan leader Mohammed Farrah Aidid...We were not officially at war with Somalia, but the ensuing firefight left 18 Americans dead and killed an estimated 500 to 1,000 Somalis—a number comparable to the total civilian deaths from all drone strikes in Pakistan from 2004 through the first half of 2013, according to the Bureau of Investigative Journalism's estimates.

It's a statistic that puts the death toll of drone strikes in startling perspective. Bowden also counters the notion that drones distance soldiers from the realities and consequences of war, quoting a Predator pilot who used to fly B-1 bombers:

"There is a very visceral connection to operations on the ground," Dan says. "When you see combat, when you hear the guy you are supporting who is under fire, you hear the stress in his voice, you hear the emotions being passed over the radio, you see the tracers and rounds being fired, and when you are called upon to either fire a missile or drop a bomb, you witness the effects of that firepower." He witnesses it in a far more immediate way than in the past, and he disdains the notion that he and his fellow drone pilots are like video gamers, detached from the reality of their actions. If anything, they are far more attached.

In some ways, I came away from these articles more disturbed by the surveillance possibilities of drones than by their potential as vehicles for assassination. Bowden mentions something known as the Gorgon Stare: a system of cameras and artificial intelligence for analyzing camera footage (named for the mythical creature whose gaze turns its victims to stone).

Instead of one soda-straw-size view of the world with the camera, we put essentially 10 cameras ganged together, and it gives you a very wide area of view of about four kilometers by four kilometers—about the size of the city of Fairfax, [Virginia]—that you can watch continuously. Not as much fidelity in terms of what the camera can see, but I can see movement of cars and people—those sorts of things. Now, instead of staring at a small space, which may be, like, a villa or compound, I can look at a whole city continuously for as long as I am flying that particular system.

In the more recent GQ article, Matthew Power details how the burden of this surveillance, and the fatal power that comes along with it, can weigh on the person behind the camera. The article is a profile of one such drone pilot, Brandon Bryant, who describes his experience flying drones as that of an almost god-like witness to the mass of humanity he had under surveillance.

He watched the targets drink tea with friends, play with their children, have sex with their wives on rooftops, writhing under blankets. There were soccer matches, and weddings too. He once watched a man walk out into a field and take a crap, which glowed white in infrared.

In the end, it's not so much the violence that Bryant finds traumatizing, but the simultaneous sense of powerlessness (because he wasn't the one choosing his targets) and the way the technology, rather than distancing him, gave him an intimate, invasive view of his victims' final moments on earth, watching the heat of their bodies dissipating through infrared.

The smoke clears, and there’s pieces of the two guys around the crater. And there’s this guy over here, and he’s missing his right leg above his knee. He’s holding it, and he’s rolling around, and the blood is squirting out of his leg, and it’s hitting the ground, and it’s hot. His blood is hot. But when it hits the ground, it starts to cool off; the pool cools fast. It took him a long time to die. I just watched him. I watched him become the same color as the ground he was lying on.

Even if drones cause less collateral damage, I can't help but fear the greater psychological damage, both to our enemies and to ourselves. Both articles are well worth reading.

Appropriate Indignation

Listening to the recent episode of Rene Ritchie's Vector podcast with Ben Thompson, I found myself infected by Thompson's righteous indignation at Apple's iPad strategy. After all the boneheaded doomsaying about Apple in recent months, it was refreshing to hear someone criticize Apple out of intelligent concern rather than laziness.

Thompson's argument, which he has also written about at length on his website, is that Apple itself actually seems unaware of the appeal, the potential, the "why" of the iPad. The power of the iPad is not its ability to merely replicate a computer's utility through touch input, but to actually enable new possibilities by putting users in a more immediate relationship to digital information.

If you are a musician, the iPad is your instrument, your studio. If you are an artist, the iPad is your paint brush, your easel. If you are a student, the iPad is your textbook. If you are a child, the iPad is your storybook, or your entertainment. If you are a grandma, the iPad is your connection to your family.

I had no idea what I might use the iPad for when I first bought it, and at first I remember thinking of it as more of a luxury item than anything essential to my life. But as third-party developers innovated and imagined new uses for it, apps like Instapaper, Reeder, Flipboard, iThoughts HD, and Day One became important parts of my daily life. Each of these apps exists or could exist in a desktop version, but it was their iPad versions that appealed to me most directly.

More recently, I've found apps that I use in my job teaching developmental writing that I simply could not live without. To give just one example, there is an app called Doceri that essentially turns my iPad into a mini Smartboard, enabling me to annotate anything on my computer screen while it's attached to a projector. So, for example, I can put a section of a student's paper up on the screen and have the whole class talk about how to improve it, annotating the writing on the fly as we talk. This is the kind of app that allows a new, more direct interaction with digital content, helping me accomplish something that simply was not possible before the iPad.

And yet what software did Apple demo onstage last week? The new iWork suite. A trio of apps that already worked better on the desktop than they ever could on the iPad. And what was the big update to these apps? Instead of adding functionality that only the iPad could provide, Apple chose to remove functionality from the desktop versions to bring them in line with their lesser tablet incarnations.

I am a longtime user of iWork. I grade all my students' papers in Pages. To do this, I make extensive use of the "comment" function. When I'm finished commenting on a student's paper, I can convert the document to a PDF for easy emailing to the student or for printing a hard copy. In the new Pages, this ability is gone. You can comment on a document, but neither printing nor PDF export preserves the comments. They are only viewable within Pages itself. This is one of many features that have been removed in the rewrite of the iWork apps.

Gruber downplayed the controversy over the rewrite on Daring Fireball with a link to Matthew Panzarino, who wrote about the new iWork at TechCrunch.

Lots of folks are getting all worked up about iWork being “dumbed down,” but it feels like a reset to me. I can see this playing out pretty much like Apple’s recent Final Cut Pro X re-thinking. That app was introduced in a radically simplified and streamlined form that caused immediate outcry. Over time, Apple has steadily added back features that were missing from the early dramatic redesign of the pro video-editing suite.

The difference is the motivation. Apple rewrote Final Cut from the ground up because it thought it could provide a better, more intuitive way of editing video. I'd argue that the magnetic timeline actually was a feature that deserved to have the app rewritten around it. By contrast, iWork on the iPad clearly was and still is an inferior experience to the desktop versions. Choosing to rewrite the apps around those inferior versions suggests that Apple misunderstands the strengths and weaknesses of both its platforms, which is disheartening to say the least.

Paying for Apps

It seems odd that I felt so angry and depressed reading Marco Arment's recent blog post about why the market for "paid-up-front iOS apps" is dead. I'm not a developer. The market forces affecting software development have no effect on my livelihood. But I am a lover of software, and the flowering of my love for software (if I may be so melodramatic) coincided almost precisely with the flowering of the very software market Arment is now declaring dead. It feels like he's killing, by his very pronouncement, the thing I love. And I want him to be wrong.

The irony is that Instapaper may be the first iPhone app I ever paid money for. In the early days of the App Store, I remember consciously resisting the urge to buy apps because it felt like a waste of money, like they were glorified video games or something. As a result, most of my early exposure to independent iOS development was a free but sad parade of bubble popping, cat chasing, fish poking, funny face making, and farting apps.

But then I read about Instapaper. For years, I had been copying and pasting articles from the web into Word documents so that I could read them later. Instapaper sounded like my dream app. And yet the $10 price tag still felt like too much. I couldn't shake the irrational fear that it would somehow be a rip-off, a swindle. So I used the free version instead. Arment has detailed why free versions actually don't help sales, and he's right. I put up with "good enough" for at least a year after that, far too long. It was finally John Gruber's post about Instapaper, and how much Arment sweated the details, that got me to fork over my cash. I can't possibly estimate how much Instapaper has meant to my reading life since, how much value those ten dollars have yielded.

Instapaper was the gateway drug. Between my iPhone and iPad, I've since purchased countless todo apps, notes apps, mindmapping apps, RSS readers, text editing apps, journaling apps, reminders apps, weather apps, calendar apps, photography apps, drawing apps, music making apps, and Twitter clients. These apps have helped me organize my life, exercise more, lose weight, map out my hopes and dreams, capture moments of my children's lives, and enrich my knowledge of the world. I regret almost none of my purchases, even those I don't use anymore. If anything, I wish there were more great apps I could try.

In a previous blog post, I quoted Gary Snyder's poem "Why I Take Good Care of My Macintosh," where he gives as one of his answers, "Because my mind flies into it through my fingers." I feel the same way about my iPhone, except that it's no longer a one-way flight. These devices are my primary tools to capture, digest, share, and communicate information about the world, and it's the apps that make that experience possible. And what makes truly great iOS apps (like Day One, or Reeder, or Vesper, or Tweetbot, or Clear, or Paper, or Instapaper) stand apart is the sense that they create a unique space to live in for a moment, a place where all the details have been considered and savored and honed for a purpose by another human being.

I fear that Arment is saying the money for these spaces has dried up because there aren't enough people like me willing to pay for them, that it's only us outliers in the "upscale-geek world" who care about these things enough to pay anything up front, let alone a decent price.

But then I think about what I used to spend most of my spare money on before I spent it on apps. I spent it on music. And not music by mainstream bands, but bands on independent labels. These were bands who made a decent living from touring and selling merch, rarely from record royalties, but who still made some of the most indelible music of my life. And they didn't make that music to get rich. They made it because they loved making music.

So maybe it's true. Maybe the possibility of an independent developer making it rich or even making a living primarily off an iOS app simply by selling it to users has passed. I don't know if that's the case. It's possible future generations of users will see the value where most adults (software newbies) currently do not.

But I have hope that developers will continue to make and design the apps that I want to live inside, and that they'll sell those apps to me, simply because I hope they love designing them as much as I love using them.

The Clear Kerfuffle

The iOS development world is up in arms over the backlash to Realmac Software's Clear update. Many users complained about having to pay for an update, and then many developers and bloggers scolded the complainers. Joseph Keller wrote in iMore,

"It's important to remember that developers don't do this for fun. They do this to make a living. Making an app takes time, and making a good app takes even more. That's time they're not spending doing another job. To suggest that you should receive continual updates, no matter how substantial they are, after paying a small amount once, is to devalue the hard work that developers do."

Stephen Hackett at 512 Pixels said,

"The bottom line is this: developers should be able to work on their product in a sustainable way. Realmac are some of the good guys, and to have to backtrack on a business decision is a damn shame, especially in a world where people pour money into IAP-based games day and night."

The typically blunt Marco Arment simply tweeted,

"Upset about @realmacsoftware asking for another 3 dollars for a big update to Clear and going universal? You should be ashamed of yourself."

That last one hit home because I'm a huge fan of Arment's work, I'm a huge fan of software in general, I fully support developers charging for updates, and yet I was one of the people annoyed by Realmac's upgrade strategy.

The reason? There was virtually no sign that they had actually updated the iPhone app. Take a look at the video below for a comparison.

Almost every other app that got updated at least changed its design, its functionality, its something. Those that made substantial changes (e.g., OmniFocus) rightfully charged for their updates. An article in the Verge says the Clear update utilizes new "UI Dynamics." But if users can't see those new dynamics (I can't see anything in the video above) and don't see value in them, you can't expect them to understand why the update costs more. The one substantial change was making the app universal, but again, does it do anything substantially new on the iPad that it didn't already do on the phone just as well? And how many users were clamoring for an iPad version?

I don't envy Realmac's situation. The fact is, they were on the cutting edge of the flat design that the software world is now catching up with, and they made (in my opinion) possibly the best app in the world for simple todo lists. Maybe the reason they didn't change it much for the update is that it's nearly perfect as is. But customers will only tolerate paying for updates if they see the value in those updates.

Some folks are now criticizing Realmac for caving in the face of pressure. Matt Gemmell tweeted,

"So, poor form, Realmac. Not helping at all. Price complainers aren’t ever going away, and giving in makes it harder for everybody."

I think they've made exactly the right decision. Charge for the iPad version, which is definitely new but may not appeal to all users, and charge for updates when and only when you can clearly communicate how you've added new value for your customers.

Chili Pepper + Video Game Review = Brilliant Insanity

The two elements of this piece of internet performance art seem completely unrelated. Why, after all, would one eat a spicy chili pepper just before delivering a video game review into a camera? And yet the result is so utterly compelling. It's like watching someone teeter on the lip of a volcano while wearing a clown suit. Or like watching someone pretend to play a joyful game of Twister on a Twister mat full of broken glass. Or like watching someone try to tell a joke while being burned alive.

None of these analogies does it justice. Just watch the truly harrowing, insane 3 minutes of it.

My favorite moments:

  • The way reviewer Erin Schmalfeld responds to the experience of eating the pepper (not once but twice) by whispering fearfully, "No!"
  • The way her increasingly pained voice and teary eyes suggest despair in the face of not only the absurdity of the game she's reviewing but the absurdity of existence itself.
  • The way her pain hardens into anger, which she takes out on the game's cheesy Disney dialogue about following your heart: "Who cares? WHO CARES??"
  • The truly heroic crawl to the end.

Via Kottke

OS Design as Culture

This summer, I had a conversation with an American Sign Language interpreter about the concept of Deaf Culture. Turns out, many people who are born deaf don't consider themselves "handicapped" or even "hearing impaired". Instead, they consider themselves part of a different culture, as if deafness were an ethnicity. I found this surprising the first time I heard about it, since ethnicity is a biological legacy whereas deafness seems like a biological accident. Of course, I was wrong. As the sign language interpreter pointed out, one of the primary characteristics of a culture is not necessarily ethnicity but rather a shared language.

I thought of this reading Farhad Manjoo's piece in Slate about trying to switch from an iPhone to an Android phone. His primary objection to the Android phone was how the phone manufacturers and the carriers intruded on the experience of the operating system with "skins" and extra apps. He prefers the iPhone's purer experience. Predictably, his comments section is full of invective from both Apple and Android adherents. The people who engage in these battles, on blogs and in the comment threads of tech news sites, are often mocked by the tech writers who stand above the fray. The terms used to describe them (fanboy, neck beard, faithful, sheep) suggest that caring too much about the operating system of your smartphone is at best juvenile and at worst cultish. Matt Honan captured the absurdity of the turf war in Wired this past spring.

Nobody cares what kind of smartphone you believe in. It’s not a religion. It’s not your local sports team even. Stop being a soldier. You are not a soldier. You are just wrong. Shut up.

It's a first world problem, no doubt, and trolls are trolls, but sometimes I fear that we sell people short for caring too much about these things. Many of us are using these devices as our primary tools to capture, digest, share, and communicate information about the world. Is it any wonder that we care deeply about the way these tools work and how they allow us to interact with that information? An operating system is not itself a language, but it is a tool for the expression of language, not unlike speaking, writing, listening, or signing. It's certainly not a religion, but there is a sense in which it can form the basis of a culture, or at least a cultural understanding of digital information.

When my father got laid off from his job and decided to switch careers, one of the changes he made was to go from a Windows PC to a Mac. I did the same a few years later when I changed jobs. And as cheesy as it sounds, the new computer changed me. It changed how I wrote, how I made radio stories, how I consumed the news, took photographs, and made home movies of my children. And now it affects how I teach my students every single day.

Nobody doubts that computing devices are a part of daily life, but I think we sometimes forget how much more they are than mere "gadgets," "tools," or "toys." Even tech writers and podcasters sometimes act ashamed, calling themselves nerds or geeks, as if they shouldn't care about this stuff as much as they do. But we should care. This is the stuff of modern life. Computing devices are the windows through which we interact with our increasingly digital world. We have to care about them, and think about them, if we're ever going to understand them and the world we're living in.

The Monster Is the MacGuffin

The Conjuring is the latest horror film by James Wan, the guy who created the Saw franchise. It's a haunted house, paranormal investigation, exorcism movie with great production values and actors. My expectations were high. The movie has gotten great press. The theater was packed. I sat next to a young couple, and the woman kept wrapping herself tighter and tighter around her boyfriend, whispering at every creepy moment, "Ohfuckohfuckohfuck!" I found myself literally on the edge of my seat more than once. But by the end, I felt let down, as I so often do by scary movies. I made this video essay to explore why.

The Joy and Darkness of Calvin and Hobbes

I was excited to learn that a documentary about the comic strip Calvin and Hobbes is in the works. My daughter got into the comic strip a while back and just recently my son has been asking me to read it to him. The rereading experience has been strange because the comics, which I first read when I was about ten years old, read so differently to me now.

Many cultural products for children are intended to appeal, often on the sly, to adults, and it can sometimes feel dishonest, all this talking over kids' heads. I remember a New Yorker cartoon that showed two kids walking out of a movie theater, one saying to the other, "It was okay, but there were too many wisecracks for the grownups." I worried that Calvin and Hobbes might come off that way. A lot of the jokes seemed to be over my son's head.

But I've since come to think that the comic was written almost like an optical illusion, with two different shapes visible depending on how you choose to see it. Kids see more of the antics, the joy, the frustration of being a kid, and grown-ups see more of the cultural commentary and the struggle of Calvin's hilarious, exhausted parents. What I love is how Watterson manages to make Calvin so much more than just a wisecracking, trouble-making kid, but someone with equal parts joy and darkness. He captures something I've seen in my own children but rarely see in portrayals of children: that mixture of curiosity, wonder, defiance, narcissism, and outright nihilism.

Calvin and Hobbes Strip

But what makes the comic an enduring work of genius, of course, is the character of Hobbes the tiger, serving both as Calvin's conscience and a stand-in for the wisdom of nature in opposition to human folly. And the genius is never knowing whether Hobbes is a part of Calvin or his own separate magical being. I wrote about Bill Watterson for The Writer's Almanac years ago, and I'll never forget Watterson's explanation for why he never licensed his creation for merchandise: "My strip is about private realities, the magic of imagination, and the specialness of certain friendships. [No one] would believe in the innocence of a little kid and his tiger if they cashed in on their popularity to sell overpriced knickknacks that nobody needs."

Anxiety and the Marc Maron Interview

I was never good at hitting news pegs when I worked in public radio, and so it's no surprise that I'm writing my thoughts about Marc Maron and his podcast WTF weeks after the show's 400th episode. But seeing various websites compile their favorite episodes of the show, I wanted to write about how important Maron's voice was to my own life when I first discovered it.

I have Ira Glass to thank for the discovery. He recommended WTF back in 2010, describing it as "the New York Times of comedy podcasts." In retrospect, as much as I love Ira Glass, this is a terrible analogy. WTF is neither the New York Times of anything nor the anything of comedy podcasts. While WTF contains elements of journalism and comedy, it is wholly its own thing, some new hybrid art form, mixing comedy, interview and audio-memoir. Though Maron comes from the world of standup comedy and primarily interviews standup comedians, I don't even think his show is primarily about comedy. It's just that comedy, and the practice of comedy, offers the best lens for examining Maron's most enduring subject, which is anxiety.

The origin story of the podcast is legendary. After ending his second marriage, losing his job at Air America, failing to break through as a mainstream standup comic, and (by his own account) approaching the verge of suicide, Maron started interviewing fellow comedians in his garage, partly to examine why he hadn't achieved more success in his life. The critic David Haglund has argued in Slate that it was precisely this exploration of failure that made the show resonate with listeners.

When your work life is not going well—especially if you’re professionally ambitious and even more so if your chosen field is the sort that has an audience—it is easy to isolate yourself, to shy away from friends and from peers and avoid acknowledging that things are not going as you hoped they would. Much of the power of WTF comes from Maron doing the exact opposite of that. He said to the world, “My life is a disaster right now. Help me understand what the fuck is going on.” (I’m paraphrasing.) It is maybe not a coincidence that his podcast became a phenomenon at a time when many Americans were losing their jobs.

It may not be a coincidence that I started listening to the show after I had quit my own job and then failed to find the next thing I wanted to do. I was teaching part time at a technical college, but on my days off, I would struggle to work on a novel or a number of radio stories that never came to fruition. In the afternoons, I would go for a jog or a bike ride, exercise providing the only part of the day that I could be sure I hadn't wasted. I no longer felt the excessively sharp teeth of anxiety that had plagued me in my previous job, but I still felt dizzy from the view of the precipice between this life and that one, not sure if I'd made the right choice for myself or my family, and it was into the midst of that doubt that Marc Maron's sandpapery voice entered and served as a remarkably comforting companion.

The appeal of those early episodes is the hunger with which Maron pursues wisdom for the purpose of improving his own life, and his quest is especially intoxicating if you identify with his creative struggles. But what's revelatory, even if it should be obvious, is to hear Maron uncover not the secrets of happiness but how hard even his most successful peers have struggled and continue to struggle with stress, anxiety, rage, and depression, just like he does.

Almost everyone agrees that the pinnacle of WTF is the interview with Louis CK, partly because he and Maron share so much history. The interview itself becomes a kind of excavation of a friendship that had died as a result of Maron's resentment over CK's success. What I love is how baldly CK lays out the history of his own moments of abject failure and desperation as he struggled to make it as a comic. One of my favorite parts of the interview is what might be called the parable of the trumpet and the peep show, excerpted below.

I retold this story to my wife recently, and instead of being amused, she was mortified. "I'm not sure what you want me to make of that," she said. "Are you planning to spend $1400 on something or what?" I couldn't quite figure out at the time what made it so important to me, but thinking of it now, the story reminds me of something I read in an essay (subscription required, sadly) by Andrew Solomon, describing his own bouts of anxiety-fueled depression. He offered the following analogy.

If you trip or slip, there is a moment, before your hand shoots out to break your fall, when you feel the earth rushing up at you and you cannot help yourself—a passing, fraction-of-a-second horror. I felt this way for days.

In his story of the trumpet and the peep show, Louis CK is describing his own desperate attempt to stave off this kind of anxiety, his own hand shooting out, and what it grabbed to break his fall. Maybe if I grab this trumpet I won't fall. Maybe if I grab this orgasm I won't fall. Maybe if I can just distract myself for a second from this sensation of horror, I can catch my breath and things will be okay. I know this sensation well. Often, in those days after I quit my job, what I grabbed was a two hour jog with Marc Maron in my ears.

Maron returns to the theme of anxiety again and again in his interviews with comedians. Of course, it's a truism that most comedians are troubled, depressive, addictive personalities. But it hadn't ever occurred to me that so many of them would suffer specifically from anxiety. Then again, standup comedy is actually the perfect venue for someone to confront their anxiety and conquer it publicly. As Maron frequently puts it, comedians are often drawn to comedy because it's the one place where they can control why people are laughing at them. Maron made this point in another of my favorite episodes of WTF, an interview with writer/director/producer Judd Apatow.

I love that so much. "Is there any point where I get enough approval? And I've realized, there is no point." It's such a naked admission. This is just one example of why WTF, at its best, towers over all other comedy podcasts. Because it's not a comedy podcast. It's an exploration of the failure all creative people fear, no matter how successful they become, how that fear can both paralyze and inspire, and how we can get through it by telling each other stories and occasionally laughing.

Reclaiming My Lifeforce with Keyboard Maestro

I moved to New Jersey in the second grade, and my new teacher told me I had the ugliest handwriting she'd ever seen. I blame her for the fact that I have hated writing by hand ever since. Luckily, two years later, my fourth grade teacher showed me how to type on a computer. I never looked back, turning in as many assignments by typing as I could. The facility with which I could edit, change, and rearrange words on the screen felt magical. As I've gotten older, and nerdier, I'm always looking for new ways to re-experience that feeling when computers actually reduce the friction of my life, when I can coast on Steve Jobs's metaphorical bicycle.

Listening to a recent episode of Mac Power Users, possibly the nerdiest of the nerdy podcasts I listen to, I was inspired to create my first screencast about one such friction-reducing tool: Keyboard Maestro. This application saves me time every single day, and I've long wanted to share its tricks with the world.

This was my first screencast, so forgive the video quality and the filming location (my closet). If you're interested, Keyboard Maestro has a free trial at its website. Check it out.

Creepiness, the Uncanny, and Mama

Growing up, I never questioned why I loved scary movies or what I loved about them. But as I’ve gotten older (and more pretentious), I feel compelled to justify my taste by asserting that it’s creepiness (rather than mere gore or shock value) that I love. Somehow creepiness is more respectable, less likely to implicate me as a potential sociopath.

But what distinguishes creepiness from mere horror? Stephen King laid out his own hierarchy of the genre in his book Danse Macabre:

“I recognize terror as the finest emotion and so I will try to terrorize the reader. But if I find that I cannot terrify, I will try to horrify, and if I find that I cannot horrify, I’ll go for the gross-out. I’m not proud.”

I would place creepiness above even terror because terror is still just a reaction to perceived danger. Creepiness is something else, not fear so much as a mysterious unease, often accompanied by goosebumps “creeping” across the skin. I would argue that it comes from glimpses of the uncanny, a concept defined by Freud among others as something that combines elements of the familiar and the grotesque. The most common example of this, now incorporated into the concept of the “uncanny valley,” is the nearly-human automaton that is somehow definitely not fucking human.

It’s no accident that some of the creepiest scenes in the last decade or so have come from Asian horror films, which often feature human figures who look slightly off, from the child who mews like a cat in The Grudge, to the girl with the hair-obscured face in The Ring. One technique Hollywood has borrowed extensively from these movies is the often digitally enhanced herky-jerky movements of ghosts. Click the gif below to animate.

It can be overdone, but it can also be incredibly effective. After watching one such scene at the height of the Asian horror boom, the film critic Mike D’Angelo wrote in Time Out New York that he was “more frightened than I’ve ever been in a movie theater.”

The scene he's referring to is from Kiyoshi Kurosawa’s film Pulse. I saw the movie several years ago, and I remember almost nothing about it (something to do with a haunted website) except this scene, which I also consider one of the creepiest moments of any movie I've ever watched.

I don't think the scene in isolation does it justice; essentially it consists of a woman walking toward the camera in a strange slow motion, almost as if she doesn't quite know how to walk. The moment that makes the scene indelible, unforgettable, for me is the way the ghost woman, as she walks forward, suddenly stumbles. Her walking was already strange, automaton-like, but that stumble is even worse, as if she wants badly to pass herself off as one of the living, but in the time since she died, she's forgotten how.

Mike D’Angelo writes of this moment,

That’s the point where I started to lose it, and I can only conclude that what freaked me out was that I no longer had any idea what I was looking at, which meant that I didn’t know what would happen next. This woman’s intentions were not clear. This woman’s freakin’ movements were not clear. She was silent, she was implacable, she was mysteriously clumsy. Looking at the stumble again (and again), it almost seems like a glitch—not as some lame “she’s a computer program” reveal, but in the much more vaguely menacing sense of just plain Does Not Belong.

This is the uncanny, the thing that we both recognize as familiar and yet instinctively recognize as not belonging. And it gives us (or some of us) a shivery thrill of both dread and wonder. We both want to look away and yet can’t.

And so it is by the creepiness factor that I can highly recommend Mama, which came out months ago in the theater but that I (like many, I assume) can finally enjoy on DVD and Blu-ray. If by some unlikely chance you are reading this before you watch the movie and you’d prefer not to know anything more, I implore you to just go watch it. Few modern scary movies are genuinely satisfying, but this one is.

It’s not a perfect film, and in some ways, I wish the director had stuck with the more low-budget approach to special effects utilized in the short film that was the germ of the full-length movie. You can watch that short film on YouTube, with an intro by Guillermo del Toro; it certainly packs an uncanny wallop, though it has only the barest hint of a story.

The full-length Mama is a ghost story, and like most ghost stories, the plot is essentially a mystery, built around the unraveling of the ghost’s identity. A man and his girlfriend take in his two young nieces after they’ve been living alone in the woods for years, only to find that the girls have brought a ghost with them: the same ghost who cared for and protected them while they survived in the forest. Unfortunately, the ghost is a jealous ghost and does not appreciate being supplanted by new caretakers.

What makes Mama unique, and uniquely dramatic as a ghost story, is not how the main characters are affected by the haunting. In fact, the two adult characters’ backgrounds and motivations are sketchy and underwritten at best. What makes it worth watching is the two little girls. They carry the movie, both as characters and as actresses. The movie’s dramatic center turns out not to be unraveling the mystery of the ghost’s identity, but rather the struggle between (and within) the girls of whether to continue their relationship with the ghost.

This struggle is underscored repeatedly by the uncanny visual effects of the girls talking to, interacting with, and even playing with the ghost. Every one of these scenes is creepy, moving, and riveting. The girls radiate wildness and danger, from the moment they are first discovered in the woods to the way the younger girl sits in the corner and sleeps under her sister’s bed. I also love how the ghost could be read metaphorically as an embodied and totally pissed-off case of post-traumatic stress disorder.

And unlike so many recent scary movies, this one manages to reach a conclusion without a gimmicky, false, or surprise ending. Rather than just hoping the innocents would survive or the monsters would be vanquished by the end, I actually cared about the decisions these little girls were making. This is what makes a work of suspense (supernatural or otherwise) rise above mere formula. Does it elicit emotions other than fear along with the fear? Mama certainly does. It’s that rare thing: a genuinely creepy movie that a grownup can be proud to have enjoyed.