The Decline of Reading?

Alexis Madrigal on the sad state of reading in the US:

Remember the good old days when everyone read really good books, like, maybe in the post-war years when everyone appreciated a good use of the semi-colon? Everyone's favorite book was by Faulkner or Woolf or Roth. We were a civilized civilization. This was before the Internet and cable television, and so people had these, like, wholly different desires and attention spans. They just craved, craved, craved the erudition and cultivation of our literary kings and queens.

Well, that time never existed.

Playing at Programming

I know next to nothing about programming, but I love apps that make me feel like a programmer. Some of my favorite examples on the Mac have been Keyboard Maestro, TextExpander, Hazel, and Automator, each of which allows you to create what are essentially mini-programs without knowing a bit of code.

The most recent exciting entry in this category is Workflow, which has been called Automator for iOS, and which has gotten some great reviews. As soon as I downloaded it, I set out to see if I could build a workflow that solved a problem I’ve had for a long time: creating a link post directly from within Instapaper on my iPhone. I made a video to demonstrate how I built it.

Update: I've since played around a lot more with Workflow and created this much more robust version of the link post action. The new version creates a Markdown-formatted link post from Instapaper (it may also work with other apps that use the system share sheet) and sends it to Drafts.

First copy some text for the block quote, then run the workflow from the share sheet. You will have the option to input the "link text," which is the text that goes inside the brackets, and then the "post link text," which is anything that comes after the inline link. The workflow then creates a Markdown-formatted link post in Drafts.
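To make the output format concrete, here is a rough sketch in Python of the kind of text the workflow assembles. The function and parameter names are mine, chosen for illustration; they aren't part of the actual Workflow action.

```python
# A minimal sketch (in Python, not Workflow) of the Markdown the workflow assembles.
# The function and parameter names here are illustrative assumptions, not part of
# the actual Workflow action.

def make_link_post(link_text, url, quote, post_link_text=""):
    """Build a Markdown link post: an inline link, optional trailing text,
    and the copied text rendered as a block quote."""
    heading = f"[{link_text}]({url}) {post_link_text}".rstrip()
    block_quote = "\n".join("> " + line for line in quote.splitlines())
    return heading + "\n\n" + block_quote

if __name__ == "__main__":
    print(make_link_post(
        link_text="The Decline of Reading?",
        url="https://example.com/article",
        quote="Remember the good old days\nwhen everyone read really good books?",
        post_link_text="on the sad state of reading:",
    ))
```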

Download the workflow here.

Technology as Exoskeleton

I've always thought Aliens (with an s) was a more entertaining movie than the original Alien, but Tim Carmody makes a great argument for why Aliens is also a brilliant movie about technology.

That’s what technology is. It’s the world of things, some impossibly stupid, some smarter than we are, we have assembled around ourselves to cover over our fundamental weaknesses as a species. The strength we have, the advantage this gives us, is our ability to stand apart from the things we’ve made: to use them and set them aside; to make them prosthetic extensions of ourselves and to let them go.

Government-Funded Innovation

One point leapt out at me in Michael Hanlon's piece for Aeon about why so many scientific and technological innovations occurred during the period he calls the "Golden Quarter," between 1945 and 1971, and why the pace of innovation has slowed since. The surprising fact is just how many of those innovations came not from private enterprise but from publicly funded research.

The first electronic computers came not from the labs of IBM but from the universities of Manchester and Pennsylvania. (Even the 19th-century analytical engine of Charles Babbage was directly funded by the British government.) The early internet came out of the University of California, not Bell or Xerox. Later on, the world wide web arose not from Apple or Microsoft but from CERN, a wholly public institution. In short, the great advances in medicine, materials, aviation and spaceflight were nearly all pump-primed by public investment. But since the 1970s, an assumption has been made that the private sector is the best place to innovate.

Printing Flesh

Jerome Groopman on the growing use of 3-D printing in medicine:

Almost every day, I receive an e-mail from my hospital’s press office describing how yet another colleague is using a 3-D printer to create an intricately realistic surgical model—of a particular patient’s mitral valve, or finger, or optic nerve—to practice on before the actual operation. Surgeons are implanting 3-D-printed stents, prosthetics, and replacement segments of human skull. The exponents of 3-D printing contend that the technology is making manufacturing more democratic; the things we are choosing to print are becoming ever more personal and intimate. This appears to be even more true in medicine: increasingly, what we are printing is ourselves.

Artisanal Memory Banks

Jody Rosen’s piece on The Knowledge (the legendary test London cabbies have to take to prove they’ve memorized every street, corner, landmark, and business location in their entire labyrinthine city) is one of the most compelling works of journalism I've read in a long time. It teems with fascinating details about the history of London, the history of this test, the people who take it, and the effect it has on their lives, down to the cellular level.

Eleanor Maguire, a neuroscientist at University College London, has spent 15 years studying cabbies and Knowledge boys. She has discovered that the posterior hippocampus, the area of the brain known to be important for memory, is bigger in London taxi drivers than in most people, and that a successful Knowledge candidate’s posterior hippocampus enlarges as he progresses through the test. Maguire’s work demonstrates that the brain is capable of structural change even in adulthood. The studies also provide a scientific explanation for the experiences of Knowledge students, the majority of whom have never pursued higher education and profess shock at the amount of information they are able to assimilate and retain.

Rosen makes the case that storing this vast array of knowledge inside the brain cells of living human beings is a noble pursuit even if technology can do it for us.

The Knowledge should be maintained because it is good for London’s soul, and for the souls of Londoners. The Knowledge stands for, well, knowledge — for the Enlightenment ideal of encyclopedic learning, for the humanist notion that diligent intellectual endeavor is ennobling, an end in itself. To support the Knowledge is to make the unfashionable argument that expertise cannot be reduced to data, that there’s something dystopian, or at least depressing, about the outsourcing of humanity’s hard-won erudition to gizmos, even to portable handheld gizmos that themselves are miracles of human imagination and ingenuity. London’s taxi driver test enshrines knowledge as — to use the au courant term — an artisanal commodity, a thing that’s local and homespun, thriving ideally in the individual hippocampus, not the digital hivemind.

Everything Is Contaminated

If I were a scientist, articles like this one from Ed Yong would terrify me:

You’ve got a group of people with a mysterious disease, and you suspect that some microbe might be responsible. You collect blood and tissue samples, you extract the DNA from them using a commonly used kit of chemicals, and you sequence the lot. Eureka! You find that every patient has the same microbe—let’s say Bradyrhizobium, or Brady for short. Congratulations, you have discovered the cause of Disease X.

Don’t celebrate yet.

You run the exact same procedure on nothing more than a tube of sterile water and… you find Brady. The microbe wasn’t in your patients. It was in the chemical reagents you used in your experiments. It’s not the cause of Disease X; it’s a contaminant.

Apparently, this is turning out to be the case in a lot of experiments.

Consciousness-Free Intelligence

Kevin Kelly writing for Wired about how artificial intelligence will differ from our concept of human intelligence:

In fact, this won’t really be intelligence, at least not as we’ve come to think of it. Indeed, intelligence may be a liability—especially if by “intelligence” we mean our peculiar self-awareness, all our frantic loops of introspection and messy currents of self-consciousness. We want our self-driving car to be inhumanly focused on the road, not obsessing over an argument it had with the garage. The synthetic Dr. Watson at our hospital should be maniacal in its work, never wondering whether it should have majored in English instead. As AIs develop, we might have to engineer ways to prevent consciousness in them—and our most premium AI services will likely be advertised as consciousness-free.

The Invention of the Future

From The 10 greatest changes of the past 1,000 years by Ian Mortimer:

There can be no doubt that technology hugely changed the ways in which we lived and died in the 20th century. However, it also masks changes that are arguably even more profound. In 1900 few people seriously considered the future. William Morris and a few socialists wrote utopian visions of the world they wanted to see, but there was little serious consideration of where we were going as a society. Today we predict almost everything: what the weather will be, what housing we will need, what our pensions will be worth, where we will dispose of our rubbish for the next 30 years and so on. The UN predicts world population levels up to the year 2300. Global warming reports are hot news. Novels about the future are 10 a penny. Newspapers and online newsfeeds are increasingly full of stories of what will happen, not what has happened.

(via Kottke)

Time Shifting

Stacey D’Erasmo, writing for the New Yorker online about how the internet is changing our relationship to time:

If it’s an anxious moment concerning time, it’s also a playful and expansive one. All temporal bets are off, including, given climate change, the seasons. It’s still one earth, but it is now subtended by a layer of highly elastic non-time, wild time, that is akin to a global collective unconscious wherein past, present, and future occupy one unmediated plane.

I’ve long thought that one of the reasons adult life feels like it passes much more quickly is that most adults no longer have the signposts of the school year to mark the passage of time. But this article makes me realize that many other signposts have gone away as well.

With the advent of time-shifted television, we no longer pin days of the week to certain TV programs. With the advent of email, we no longer feel the passing of time between sending a letter and receiving a response. With the advent of digital photography, we no longer have to get our photos developed. With the advent of Facebook, we no longer have to wait for Christmas cards to see pictures of the people we used to know, and their children.

It's not an original observation, but there is a downside to eliminating all this waiting. I once told a co-worker that whenever I was forced to restart my computer, it always felt like the least productive two minutes of my entire life. It's perhaps inevitable that the more digital technology reduces the time we used to spend waiting for things, the less patient we become.

A Different Kind of Technology Podcast

Take a moment, right now, to look around the room you’re sitting in: your living room, your bedroom, your office at work, or maybe a public place like a coffee shop. Wherever you are, take note of the sheer number of objects in the room that were fashioned by human technology.

As I write this, almost everything within my view is a product of human design: there’s the computer in my lap, of course, and the cellphone in my pocket, but there’s also the room itself, with its floor, walls, windows, and ceiling. The room is lit by two lamps, plugged into outlets that lead to electrical lines, which themselves snake off to a power generator, the location of which I don’t even know. Several works of art hang on my walls, including a photograph I took in another country almost twenty years ago. Finally, there’s the furniture: the couch on which I sit, two chairs, and a table scattered with books, a pair of reading glasses, and a pouch holding the nebulizer my son needs to soothe the occasional asthma attack.

Most of these objects are mundane, but some are transformative, even life-saving. Maybe this is obvious. We live in the modern world after all, and technology is the engine of modernity. When we use the word “technology,” we often use it only to refer to the most recent forms of technology, especially anything related to computers. But the broadest definition of technology includes any tool created by humans, from hand axes to poetry, from stone tablets to tablet computers.

Our species has been using tools for our entire history. Technology, though it is sometimes viewed in opposition to humanity, is actually what makes us human. And as we’ve used it to change the world around us, it has, in turn, changed us.

The Podcast

There’s no lack of podcasts about technology, especially podcasts about computers, cellphones, apps, and the people who make them. But I think there is a lack of podcasts, and journalism in general, about how technology affects us. My goal in launching my own podcast is to try to capture those stories: the many ways, both amazing and infuriating, that technology intertwines with how we live.

I’m starting with people I know. The first two episodes are a two-part conversation with my older brother, who’s spent the last fourteen years delivering pizzas for Domino’s Pizza, and has no plans to do anything else. In Episode 1, he describes his life-long, and somewhat problematic, love affair with automobiles. In Episode 2, he explains why he hates computers, and especially the internet, even though the internet helped answer a question he’d had since the day he was born.

Future episodes will include interviews with a chemist who uses X-rays to examine the nature of organization in crystals, a woman who decided after her divorce to spend several years living without television, a Navy wife who struggled to use communication technologies to keep in touch with her husband, and a nurse who learned over a forty-five-year career that the most powerful tool in her nursing arsenal was conversation. I also plan future episodes about email, the microbiome, the early days of IBM, the technology of taxi cabs, and the life-saving tools of a neonatal intensive care unit.

Future Interviews

At some point, I’d like to incorporate the voices of experts, the people who create the technology we use and the journalists who write about it. But I am especially interested in the people whose lives have been directly affected by these tools, devices, and machines. If you have a story to share about how technology has impacted your life, get in touch. My email is anxiousmail@gmail.com, the show’s Twitter handle is @anxiousmachine, and I’m on Twitter @robmcmyers.

I’m hoping to post a new episode every two weeks, so please subscribe in iTunes or your favorite podcast app. Just search for Anxious Machine.

Unread, Updated

My most anticipated app update for the iPhone 6 Plus arrived this week. I wrote about why Unread is my favorite RSS app for the iPad on The Sweet Setup, and I made a video showing why I love the interface. The truth is that Unread is possibly my favorite app on iOS, period. It’s been painful to have to use that beautiful interface, with its lovely Whitney font, in scaled-up form. Getting this update makes my iPhone 6 Plus experience almost complete (looking in your direction, OmniFocus).

The app is now free to try for 50 articles, with an in-app purchase to unlock unlimited reading. If you're already an Unread user, the upgrade to unlimited is free (though I kicked in a tip for the hard work).

Downloading the update, however, is slightly complicated. Supertop purchased the app from Jared Sinclair and had to jump through a number of hoops to make it their own. They’ve written a blog post about the experience and also talked about it on a recent episode of Inquisitive. The main thing to understand is that you have to download the new version of the app separately; the old and new apps then communicate with each other to unlock the full features.

But the important thing is that we now have Unread, the single best iOS app for the 6 and 6 Plus (thanks to sloppy swiping), "finally" optimized for those devices. The addition of share sheet support via iOS extensions is gravy on an already amazing app. My only real complaint about the old Unread was that I couldn’t tag links I sent to Pinboard. Now I can.

It was so depressing this past summer to read about Jared Sinclair’s disappointing experience making money from one of the most beautifully designed apps on iOS. Maybe RSS isn’t popular enough to sustain revenue. Maybe the pay upfront model really is dead. All I can say is that I’m enormously happy Supertop decided to keep the app running. Let’s hope this second life is a long one.

Go download it. Seriously.

Anxiety Apps

I recently got an email from Dr. Anne Hallward, the host of a community radio show in Portland, Maine called Safe Space Radio, which has been airing a series about living with anxiety. She had run across my blog and wondered if I’d be interested in producing a radio piece reviewing apps designed for anxiety treatment.

I hadn’t made a radio story since 2009, when I left my job at American Public Media, but Dr. Hallward had caught me at exactly the right time. I’m actually in the process of producing my own podcast (more on that soon), and so I had all the equipment and software I needed to produce a piece for her.

The most surprising part of the experience was that I actually found an app I liked for anxiety management. But more than that, I just had a lot of fun making radio again, even on this somewhat serious topic, and I’m quite happy with how the final piece turned out. If you’re interested in anxiety, apps, or anxiety apps, you can listen to the episode Angst, Introversion, and Apps, which includes an interview with a philosopher about existential angst and a conversational book review about introversion, and ends with my piece about apps.

Perils of Digital Intimacy

A somewhat terrifying (for a parent) piece by Hanna Rosin about the culture of sexting in Louisa County, Virginia, where police discovered an Instagram account with hundreds of nude photos of local teen girls:

Within an hour, the deputies realized just how common the sharing of nude pictures was at the school. “The boys kept telling us, ‘It’s nothing unusual. It happens all the time,’ ” Lowe recalls. Every time someone they were interviewing mentioned another kid who might have naked pictures on his or her phone, they had to call that kid in for an interview. After just a couple of days, the deputies had filled multiple evidence bins with phones, and they couldn’t see an end to it. Fears of a cabal got replaced by a more mundane concern: what to do with “hundreds of damned phones. I told the deputies, ‘We got to draw the line somewhere or we’re going to end up talking to every teenager in the damned county!’ ”

What I love is how Rosin goes way beyond the scandal to show that kids aren't being corrupted by technology as much as they are using technology to explore new avenues of intimacy. Of course there's a huge risk in sharing nude photos of yourself (especially if the state then charges you with distribution of child pornography), but there's a risk to any form of intimacy, sexual or otherwise. We assume the teenagers doing this simply aren't assessing that risk, but it's fascinating to hear them explain their own understanding of the tradeoffs.

Why do kids sext? One recent graduate told me that late at night, long after dinner and homework, her parents would watch TV and she would be in her room texting with her boyfriend. “You have a beautiful body,” he’d write. “Can I see it?” She knew it would be hard for him to ever really see it. She had a strict curfew and no driver’s license yet, and Louisa County is too spread out for kids to get anywhere on their own without a car.

“I live literally in the middle of nowhere,” the girl told me. “And this boy I dated lived like 30 minutes away. I didn’t have a car and my parents weren’t going to drop me off, so we didn’t have any alone time. Our only way of being alone was to do it over the phone. It was a way of kind of dating without getting in trouble. A way of being sexual without being sexual, you know? And it was his way of showing he liked me a lot and my way of saying I trusted him.”

Clothing “Naked Capitalism” with Safety Nets

Great piece by Ben Thompson on Uber and why social safety nets and entrepreneurism should go hand in hand.

It’s not that Uber is bad for not hiring workers and giving them attendant benefits, it’s that said benefits shouldn’t be Uber’s – or any employer’s – responsibility at all. It’s employer-based health care that is the problem, and in ways that go beyond the economic benefits of universal health care (the most obvious of which is the broadest possible risk pool, not to mention unmatched buying power). It’s that people are afraid to leave or lose their jobs because they lack the most basic of safety nets.

Free marketeers and proponents of universal healthcare tend not to be on the same side of the political divide, but they should be. You're unlikely to reap the benefits of free markets if most people are too afraid to be free.

Sustaining Optimism

In regard to my piece about Apple keynotes yesterday, Matt Haughey wrote a similar piece back in 2007 about how he saw Apple taking over the role NASA had once played in sustaining his optimism about the future:

When I was a kid, the future was filled with optimism. The year 2000 was 10-20 years away and it was this magical goal we were working towards. I was obsessed with astronauts, especially those in NASA that got to ride in the space shuttle. While I never made it to spacecamp, I envied the kids that did.

Then the shuttle blew up, the year 2000 passed without flying cars, and 9/11 sparked another world war. Leaders talked about the past, not the future. Optimism was dead.

How Apple's Keynotes Sell Us the Future

On the morning of the latest Apple keynote, announcing new iPhone models and the new Apple Watch, I made a joke to my wife: "Are you as excited about today as I am?" She wasn't, of course. She couldn't care less. And I was being somewhat ironic about my own excitement. If anything, I felt a mild shame about my excitement, as if admitting to it implicated me both as a nerd and as a consumerist zombie, brainwashed into lusting after each new technological gadget Apple releases.

But I was excited. And I wasn't alone. Several of my students, at the college where I teach writing, told me they were excited too. One of them started watching the keynote on her phone as soon as class was done. I stayed after class, helping other students, and the student with her phone began to narrate to us what she was hearing from the keynote. "There's two new phones! The iPhone 6 and the 6+! They're beautiful! They're gonna be amazing!" Across town, my wife was having a similar experience at the middle school where she teaches, students walking up to each other in the hallway asking, "Did you hear about the new iPhones? They just announced new iPhones!"

After class, I rushed down to my office to catch the second half of the keynote while I ate lunch. I learned a little more about the two new phone models, Apple's new payment service Apple Pay, and of course the new Apple Watch. I didn't have an afternoon class, so when the keynote was done, I packed up my bag and walked out to my car.

I felt giddy with excitement at everything I'd learned, but I still couldn't shake an instinctive skepticism at that excitement. Wasn't it a bit weird to be watching a company's announcement of new products with bated breath? I am a grownup, yet these keynotes put me in the position of a child on Christmas morning, mad with anticipation at the unwrapping of presents.

Apple has been making these kinds of product announcements since the introduction of the Macintosh, but I don't think they became cultural phenomena until the announcement of the original iPhone. That was the first keynote I ever watched, and I've watched nearly every one since. But why? Steve Jobs was a great showman, of course, and it was a delight to watch him work, but I continue to enjoy the unveiling of new products without him. Is it merely my Apple fanboy-ism that keeps me coming back for more? Or is there something unique about the way Apple makes theater out of their products?

Getting into my car, I pulled out my two-year-old iPhone 5, which now looked excessively old and scuffed around the edges, ready for replacement. My podcast app had automatically downloaded a new episode of Radiolab, which concerned a book called In the Dust of This Planet, "an academic treatise about the horror humanity feels as we realize that we are nothing but a speck in the universe."

Radiolab's host, Jad Abumrad, was exploring the idea of nihilism and whether it is beginning to permeate modern culture. At one point, he tied the modern allure of nihilism to the growing fear of threats like terrorism, ebola, and especially global warming, which increasingly feels like an inevitable, looming tsunami of environmental disasters we're powerless to stop. Abumrad mentioned a survey that asked whether people felt the future would be better than the past, and more than 70 percent of respondents said no. They believed the future would be worse.

Suddenly, it occurred to me that one of the things Apple is selling in its keynotes (and part of what makes those keynotes so compelling) is its vision of the future. Where else in the world can we look and say that things are definitively getting better? We have an African American president, but we also have Ferguson. We've ended the war in Iraq, except Iraq is falling apart. There are a few isolated pockets of progress here and there, maybe. But one of the few places of clear progress is technology, and Apple is at the forefront of selling us on the betterness of that technology: the endless progress toward thinner, lighter, more beautiful and more powerful devices.

A vision of that progress can be incredibly comforting. Indeed, Apple's products have been a comfort to me at some of the darkest moments of my life.

I remember in the fall of 2008, as the economy was crashing, I was in the midst of my first real experience of depression. Every night, I woke up around 3 am in a panic. I worried about my job, whether I was making too many mistakes, whether I would forget something, whether I had already forgotten something. I worried about what would happen if I got laid off, or worse yet, fired. I worried what our friends would think if they heard I'd been fired. I worried what my wife would think. I worried about finding a new job in the midst of the economic crisis.

In the middle of all this, I had to plan my family's Thanksgiving trip, a long drive from Saint Paul, Minnesota to Lafayette, Indiana. The week before the trip, our son broke out in hives. Then the laundry machine broke. When I opened the pump, there was so much water that it spilled over onto the floor, and I had to get two towels to wipe it up. I ran the drain cycle again, and listened as the pump made a rattling sound, like a lost tooth was caught inside it. The next day, I noticed my car's brakes were squeaking. Fearing some sort of Biblical curse, I called the nearby Tires Plus to have the brakes checked before our drive. I had to take my kids with me to drop the car off, because my wife wasn't home from work yet. The kids and I walked home in the first real snow shower of the season.

When we got back to the house, I realized that I had left all my keys at the Tires Plus, so my children and I were locked out of the house. The sun was setting, it was thirty degrees and snowing, and I felt utterly defeated. To entertain my children while we waited for my wife to get home, I dug into my pocket and pulled out a small rectangular device. For the next fifteen minutes, as the sky darkened and the snowflakes fell, my children's faces were lit by the glow of a scene from Disney's Fantasia, a vision of fairies dancing to the Nutcracker Suite, playing as if by magic in the palm of my hand.

It's one of the few memories I have of that autumn in color. Everything else in my life felt broken, but this iPhone, the original iPhone, was still amazing.

I'm not so cynical as to believe these devices are merely a distraction from the horrors of the real world. But I think it's worth considering what we're longing for when we watch these keynotes. We want a future where the incredible technological advances of human civilization have made the world a better place. I believe these devices have the power to help us do that. But they also have the power to distract us. I hope we can channel our longing for a better world, so that our vision of the future doesn't remain confined to devices, but actually exists outside those devices as well.

The Fine Line Between Digital and Public

In the wake of the recent theft (not leak, but theft) of photos of female celebrities, I keep thinking about how the persistence of digital information makes this sort of theft possible. The crime committed is just as shameful as (or even more shameful than) a peeping tom peering in through a bedroom window, but the digital nature of the crime magnifies the effect at both ends, multiplying both the number of bedroom windows and the number of peeping tom eyes.

Nearly 80 years ago, Walter Benjamin wrote in "The Work of Art in the Age of Mechanical Reproduction":

In even the most perfect reproduction, one thing is lacking: the here and now of the work of art—its unique existence in a particular place. It is this unique existence—and nothing else—that bears the mark of the history to which the work has been subject.

The problem with digital information, like the photos in this case, is that it lacks the here and now of the human being to whom it's attached, the unique humanity of a particular person. This is privacy in the age of digital reproduction.

I'm reminded of something I wrote a few months back, in the wake of revelations about NSA spying, on the unforeseen stickiness of the web:

In those early days, the internet felt like an ocean made for skinny-dipping; instead of doffing your clothes, you doffed your identity. You could read about, look at, discuss, and eventually purchase just about anything that interested you, without fear of anyone looking at you funny. This lack of identity could be used for nefarious purposes, of course, and it could lead people down any number of self-destructive paths. But for many, it was liberating to find that, on the web, you could explore your true nature and find fellow travelers without shame.

But as paranoia grows about the NSA reading our emails and Google tapping into our home thermostats, it’s increasingly clear that — rather than providing an identity-free playground — the web can just as easily capture and preserve aspects of our identities we would have preferred to keep hidden. What started as a metaphor to describe the complexly interconnected network has come to suggest a spider’s sticky trap.

It's especially depressing, though not surprising, that women in particular are the targets of this crime, since women historically have been afforded so much less privacy than men. It should go without saying that searching for or looking at these pictures makes you part of the problem.

Clickbait and the Nature of Suspense

On Wednesday, Vox.com posted a story that purported to give a definitive answer to what really happened at the infamous ending of The Sopranos. The tweet that announced the story attracted the attention of the Twitter account @SavedYouAClick, which takes great pleasure in spoiling clickbaity headlines. Naturally, @SavedYouAClick tweeted the answer to the headline's question: "Did Tony die at the end of The Sopranos?"

Nilay Patel wrote a furious response in The Verge.

Because the headline was phrased in the form of a question — the question of the entire series — Jake Beckman, who runs the Twitter account @savedyouaclick, decided that it wasn't worth it. He "saved you a click" and tweeted the reveal.

This is bullshit.

It is bullshit because he didn't save anyone a click at all — he stole an experience. That story is great. It is absolutely worth the click.

The first response I read to Patel's rant was Nick Heer's at Pixel Envy:

If the article is so dependent on the teaser headline that a single tweet can bust the whole thing up, then Beckman did save people a click.

This was echoed by Jake Beckman himself, who told the New York Observer:

I’m one person with a Twitter account... If I can disrupt your content distribution strategy from my iPhone, then maybe something is wrong with your content distribution strategy.

The whole controversy reminds me of Alfred Hitchcock's old adage about the nature of suspense. Lesser storytellers often assume that suspense comes from a lack of knowledge. And it's true that a lack of knowledge can lead an audience forward. It's a lack of knowledge that lies at the heart of every cheap cliffhanger ending and the plot developments of TV shows like Lost. But lack of knowledge is ultimately a distraction, because instead of human drama, the audience becomes obsessed with information, and once the audience gets that information, they usually find it unsatisfying. Because information is, in the end, not dramatic.

Hitchcock said the secret of real suspense is giving the audience more knowledge. Show the audience that there's a bomb under the table that none of the characters can see. Then let the audience watch in horror as the characters go blindly about their business, with no idea they're about to be killed. Rather than trying to solve some ultimately pointless mystery, the audience now feels deeply invested in the heart of the story: these characters.

The problem with clickbaity headlines is not that they're a cheap method of enticing the reader. The problem is that, by using a lack of information as the enticement, they distract the reader from what might be truly valuable.

I clicked on the Vox.com link to find out what really happened to Tony Soprano, and I was so enticed by the headline that all I did was skim the article for the answer. Once I found it, I felt completely unsatisfied, and didn't even finish reading. If Patel is right and the article is a brilliant analysis of the legacy of the Sopranos, that experience was ruined for me.

By the headline.