Rat Cages

Fascinating critique of the idea that drugs cause drug addiction (I'm quoting at length because this section sets up the whole rest of the piece):

The experiment is simple. Put a rat in a cage, alone, with two water bottles. One is just water. The other is water laced with heroin or cocaine. Almost every time you run this experiment, the rat will become obsessed with the drugged water, and keep coming back for more and more, until it kills itself.

The advert explains: “Only one drug is so addictive, nine out of ten laboratory rats will use it. And use it. And use it. Until dead. It’s called cocaine. And it can do the same thing to you.” But in the 1970s, a professor of Psychology in Vancouver called Bruce Alexander noticed something odd about this experiment. The rat is put in the cage all alone. It has nothing to do but take the drugs. What would happen, he wondered, if we tried this differently? So Professor Alexander built Rat Park. It is a lush cage where the rats would have colored balls and the best rat-food and tunnels to scamper down and plenty of friends: everything a rat about town could want. What, Alexander wanted to know, will happen then?

In Rat Park, all the rats obviously tried both water bottles, because they didn’t know what was in them. But what happened next was startling. The rats with good lives didn’t like the drugged water. They mostly shunned it, consuming less than a quarter of the drugs the isolated rats used. None of them died. While all the rats who were alone and unhappy became heavy users, none of the rats who had a happy environment did.

The whole piece is well worth a read, especially for the results of a similar experiment on heroin addicts in Portugal. The author also has a book.

The Decline of Reading?

Alexis Madrigal on the sad state of reading in the US:

Remember the good old days when everyone read really good books, like, maybe in the post-war years when everyone appreciated a good use of the semi-colon? Everyone's favorite book was by Faulkner or Woolf or Roth. We were a civilized civilization. This was before the Internet and cable television, and so people had these, like, wholly different desires and attention spans. They just craved, craved, craved the erudition and cultivation of our literary kings and queens.

Well, that time never existed.

Everything Is Contaminated

If I were a scientist, articles like this one from Ed Yong would terrify me:

You’ve got a group of people with a mysterious disease, and you suspect that some microbe might be responsible. You collect blood and tissue samples, you extract the DNA from them using a commonly used kit of chemicals, and you sequence the lot. Eureka! You find that every patient has the same microbe—let’s say Bradyrhizobium, or Brady for short. Congratulations, you have discovered the cause of Disease X.

Don’t celebrate yet.

You run the exact same procedure on nothing more than a tube of sterile water and… you find Brady. The microbe wasn’t in your patients. It was in the chemical reagents you used in your experiments. It’s not the cause of Disease X; it’s a contaminant.

Apparently, this is turning out to be the case in a lot of experiments.

Confessing for a Living

Amazing story by Hanna Rosin about her former friend and colleague Stephen Glass, who became infamous for inventing many of the stories he wrote for The New Republic. My favorite detail is about Glass's current job at a personal injury law firm, where he works under a man named Paul Zuckerman:

When clients come in, Steve helps the firm get them ready for trial. The first thing he does is tell them who he is. He says he worked at a magazine and he lied and made up stories and covered them up. He says he got caught, that Hollywood made a movie about it and that there are many people “who dislike me and rightly so.” He has done this about a dozen times a month, for the last decade, meaning that the conference room in the firm’s modern, exposed-brick office has become his equivalent of Zuckerman’s dingy room, where Steve confesses, over and over again.

Zuckerman has Steve do this so the clients won’t find out about his history themselves and because he has to explain why Steve can never appear in court. But there is a deeper reason. In the firm’s lore, personal-injury work is like evangelism. “We are dealing with people who have not only been injured; they’ve been broken and need to be made whole,” says Zuckerman. In order to do that, the lawyers need to know the whole truth about a person, even secrets they’ve never confessed to anyone. But the clients are often afraid to disclose the truth because they fear it will hurt their case. So the lawyers have to work on them. “You can lie to your priest and lie to your wife,” Zuckerman says, “but you can’t lie to us.”

Incredible that this man, made famous for lying, now has to tell the truth about himself over and over again, in order to get others to do the same.

Time Shifting

Stacey D’Erasmo, writing for the New Yorker online about how the internet is changing our relationship to time:

If it’s an anxious moment concerning time, it’s also a playful and expansive one. All temporal bets are off, including, given climate change, the seasons. It’s still one earth, but it is now subtended by a layer of highly elastic non-time, wild time, that is akin to a global collective unconscious wherein past, present, and future occupy one unmediated plane.

I’ve long thought that one of the reasons adult life feels like it passes much more quickly is that most adults no longer have the signposts of the school year to keep track of time passing. But it occurs to me, reading this article, that many other signposts have gone away.

With the advent of time-shifted television, we no longer pin days of the week to certain TV programs. With the advent of email, we no longer feel the passing of time between sending a letter and receiving a response. With the advent of digital photography, we no longer have to get our photos developed. With the advent of Facebook, we no longer have to wait for Christmas cards to see pictures of the people we used to know, and their children.

It's not an original observation, but there is a downside to eliminating all this waiting. I once told a co-worker that whenever I was forced to restart my computer, it always felt like the least productive two minutes of my entire life. It's perhaps inevitable that the more digital technology reduces the time we used to spend waiting for things, the less patient we become.

Anxiety Apps

I recently got an email from Dr. Anne Hallward, the host of a community radio show in Portland, Maine, called Safe Space Radio, which has been airing a series about living with anxiety. She had run across my blog and wondered if I’d be interested in producing a radio piece reviewing apps designed for anxiety treatment.

I hadn’t made a radio story since 2009, when I left my job at American Public Media, but Dr. Hallward had caught me at exactly the right time. I’m actually in the process of producing my own podcast (more on that soon), and so I had all the equipment and software I needed to produce a piece for her.

The most surprising part of the experience was that I actually found an app I liked for anxiety management. But more than that, I just had a lot of fun making radio again, even on this somewhat serious topic, and I’m quite happy with how the final piece turned out. If you’re interested in anxiety, apps, or anxiety apps, you can listen to the episode Angst, Introversion, and Apps, which includes an interview with a philosopher about existential angst and a conversational book review about introversion, and ends with my piece about apps.

Perils of Digital Intimacy

A somewhat terrifying (for a parent) piece by Hanna Rosin about the culture of sexting in Louisa County, Virginia, where police discovered an Instagram account with hundreds of nude photos of local teen girls:

Within an hour, the deputies realized just how common the sharing of nude pictures was at the school. “The boys kept telling us, ‘It’s nothing unusual. It happens all the time,’ ” Lowe recalls. Every time someone they were interviewing mentioned another kid who might have naked pictures on his or her phone, they had to call that kid in for an interview. After just a couple of days, the deputies had filled multiple evidence bins with phones, and they couldn’t see an end to it. Fears of a cabal got replaced by a more mundane concern: what to do with “hundreds of damned phones. I told the deputies, ‘We got to draw the line somewhere or we’re going to end up talking to every teenager in the damned county!’ ”

What I love is how Rosin goes way beyond the scandal to show that kids aren't being corrupted by technology as much as they are using technology to explore new avenues of intimacy. Of course there's a huge risk in sharing nude photos of yourself (especially if the state then charges you with distribution of child pornography), but there's a risk to any form of intimacy, sexual or otherwise. We assume the teenagers doing this simply aren't assessing that risk, but it's fascinating to hear them explain their own understanding of the tradeoffs.

Why do kids sext? One recent graduate told me that late at night, long after dinner and homework, her parents would watch TV and she would be in her room texting with her boyfriend. “You have a beautiful body,” he’d write. “Can I see it?” She knew it would be hard for him to ever really see it. She had a strict curfew and no driver’s license yet, and Louisa County is too spread out for kids to get anywhere on their own without a car.

“I live literally in the middle of nowhere,” the girl told me. “And this boy I dated lived like 30 minutes away. I didn’t have a car and my parents weren’t going to drop me off, so we didn’t have any alone time. Our only way of being alone was to do it over the phone. It was a way of kind of dating without getting in trouble. A way of being sexual without being sexual, you know? And it was his way of showing he liked me a lot and my way of saying I trusted him.”

How Apple's Keynotes Sell Us the Future

On the morning of the latest Apple keynote, announcing new iPhone models and the new Apple Watch, I made a joke to my wife: "Are you as excited about today as I am?" She wasn't, of course. She couldn't care less. And I was being somewhat ironic about my own excitement. If anything, I felt a mild shame about my excitement, as if admitting to it implicated me both as a nerd and as a consumerist zombie, brainwashed into lusting after each new technological gadget Apple releases.

But I was excited. And I wasn't alone. Several of my students, at the college where I teach writing, told me they were excited too. One of them started watching the keynote on her phone as soon as class was done. I stayed after class, helping other students, and the student with her phone began to narrate to us what she was hearing from the keynote. "There's two new phones! The iPhone 6 and the 6+! They're beautiful! They're gonna be amazing!" Across town, my wife was having a similar experience at the middle school where she teaches, students walking up to each other in the hallway asking, "Did you hear about the new iPhones? They just announced new iPhones!"

After class, I rushed down to my office to catch the second half of the keynote while I ate lunch. I learned a little more about the two new phone models, Apple's new payment service Apple Pay, and of course the new Apple Watch. I didn't have an afternoon class, so when the keynote was done, I packed up my bag and walked out to my car.

I felt giddy with excitement at everything I'd learned, but I still couldn't shake an instinctive skepticism at that excitement. Wasn't it a bit weird to be watching a company's announcement of new products with bated breath? I am a grownup, yet these keynotes put me in the position of a child on Christmas morning, mad with anticipation at the unwrapping of presents.

Apple has been making these kinds of product announcements since the introduction of the Macintosh, but I don't think they became cultural phenomena until the announcement of the original iPhone. That was the first keynote I ever watched, and I've watched nearly every one since. But why? Steve Jobs was a great showman, of course, and it was a delight to watch him work, but I continue to enjoy the unveiling of new products without him. Is it merely my Apple fanboy-ism that keeps me coming back for more? Or is there something unique about the way Apple makes theater out of their products?

Getting into my car, I pulled out my two-year-old iPhone 5, which now looked excessively old and scuffed around the edges, ready for replacement. My podcast app had automatically downloaded a new episode of Radiolab, which concerned a book called In the Dust of This Planet, "an academic treatise about the horror humanity feels as we realize that we are nothing but a speck in the universe."

Radiolab's host, Jad Abumrad, was exploring the idea of nihilism and whether it is beginning to permeate modern culture. At one point, he tied the modern allure of nihilism to the growing fear of threats like terrorism, Ebola, and especially global warming, which increasingly feels like an inevitable, looming tsunami of environmental disasters we're powerless to stop. Abumrad mentioned a survey that asked whether people felt the future would be better than the past, and more than 70 percent of respondents said no. They believed the future would be worse.

Suddenly, it occurred to me that one of the things Apple is selling in its keynotes (and part of what makes those keynotes so compelling) is its vision of the future. Where else in the world can we look and say that things are definitively getting better? We have an African American president, but we also have Ferguson. We've ended the war in Iraq, except Iraq is falling apart. There are a few isolated pockets of progress here and there, maybe. But one of the few places of clear progress is technology, and Apple is at the forefront of selling us on the betterness of that technology: the endless progress toward thinner, lighter, more beautiful and more powerful devices.

A vision of that progress can be incredibly comforting. Indeed, Apple's products have been a comfort to me at some of the darkest moments of my life.

I remember in the fall of 2008, as the economy was crashing, I was in the midst of my first real experience of depression. Every night, I woke up around 3 am in a panic. I worried about my job, whether I was making too many mistakes, whether I would forget something, whether I had already forgotten something. I worried about what would happen if I got laid off, or worse yet, fired. I worried what our friends would think if they heard I'd been fired. I worried what my wife would think. I worried about finding a new job in the midst of the economic crisis.

In the middle of all this, I had to plan my family's Thanksgiving trip, a long drive from Saint Paul, Minnesota to Lafayette, Indiana. The week before the trip, our son broke out in hives. Then the laundry machine broke. When I opened the pump, there was so much water that it spilled over onto the floor, and I had to get two towels to wipe it up. I ran the drain cycle again, and listened as the pump made a rattling sound, like a lost tooth was caught inside it. The next day, I noticed my car's brakes were squeaking. Fearing some sort of Biblical curse, I called the nearby Tires Plus to have the brakes checked before our drive. I had to take my kids with me to drop the car off, because my wife wasn't home from work yet. The kids and I walked home in the first real snow shower of the season.

When we got back to the house, I realized that I had left all my keys at the Tires Plus, so my children and I were locked out of the house. The sun was setting, it was thirty degrees and snowing, and I felt utterly defeated. To entertain my children while we waited for my wife to get home, I dug into my pocket and pulled out a small rectangular device. For the next fifteen minutes, as the sky darkened and the snowflakes fell, my children's faces were lit by the glow of a scene from Disney's Fantasia, a vision of fairies dancing to the Nutcracker Suite, playing as if by magic in the palm of my hand.

It's one of the few memories I have of that autumn in color. Everything else in my life felt broken, but this iPhone, the original iPhone, was still amazing.

I'm not so cynical as to believe these devices are merely a distraction from the horrors of the real world. But I think it's worth considering what we're longing for when we watch these keynotes. We want a future where the incredible technological advances of human civilization have made the world a better place. I believe these devices have the power to help us do that. But they also have the power to distract us. I hope we can channel our longing for a better world, so that our vision of the future doesn't remain confined to devices, but actually exists outside those devices as well.

Against Stranger Danger

According to this piece in the New York Times, talking to strangers is good for you, partly because you tend to act happier around strangers than you actually feel.

When one of us, Liz, was in graduate school, she noticed that her boyfriend, Benjamin, felt free to act grumpy around her. But if he was forced to interact with a stranger or acquaintance, he would perk right up. Then his own pleasant behavior would often erase his bad mood.

One of the perks of being a behavioral scientist is that when your partner does something annoying, you can bring dozens of couples into the laboratory and get to the bottom of it. When Liz tested her hypothesis in a lab experiment, she discovered that most people showed the “Benjamin Effect”: They acted more cheerful around someone they had just met than around their own romantic partner, leaving them happier than they expected.

The Fine Line Between Digital and Public

In the wake of the recent theft (not leak, but theft) of photos of female celebrities, I keep thinking about how the persistence of digital information makes this sort of theft possible. The crime committed is just as shameful as (or even more shameful than) a peeping tom peering in through a bedroom window, but the digital nature of the crime magnifies the effect at both ends, multiplying both the number of bedroom windows and the number of peeping tom eyes.

Nearly 80 years ago, Walter Benjamin wrote in "The Work of Art in the Age of Mechanical Reproduction":

In even the most perfect reproduction, one thing is lacking: the here and now of the work of art – its unique existence in a particular place. It is this unique existence – and nothing else – that bears the mark of the history to which the work has been subject.

The problem with digital information, like the photos in this case, is that it lacks the here and now of the human to which it's attached, the unique humanity of a particular person. This is privacy in the age of digital reproduction.

I'm reminded of something I wrote a few months back, in the wake of revelations about NSA spying, on the unforeseen stickiness of the web:

In those early days, the internet felt like an ocean made for skinny-dipping; instead of doffing your clothes, you doffed your identity. You could read about, look at, discuss, and eventually purchase just about anything that interested you, without fear of anyone looking at you funny. This lack of identity could be used for nefarious purposes, of course, and it could lead people down any number of self-destructive paths. But for many, it was liberating to find that, on the web, you could explore your true nature and find fellow travelers without shame.

But as paranoia grows about the NSA reading our emails and Google tapping into our home thermostats, it’s increasingly clear that — rather than providing an identity-free playground — the web can just as easily capture and preserve aspects of our identities we would have preferred to keep hidden. What started as a metaphor to describe the complexly interconnected network has come to suggest a spider’s sticky trap.

It's especially depressing, though not surprising, that women in particular are the targets of this crime, since women historically have been afforded so much less privacy than men. It should go without saying that searching for or looking at these pictures makes you part of the problem.

Some Measure of Innocence

In a piece for the New York Times Magazine, Mark O'Connell writes beautifully about how having children can give you a whole new perspective on the world and its dangers:

Having a child feels like returning some measure of innocence to the world, and this is wonderful in its way; but we are talking here about a world with an exceptionally poor track record in its dealings with innocence. Unforgivably, perhaps, I think of this much more frequently now than I ever did before deciding to bring a child — this particular child — into the world.

Reading this piece, I was reminded of a conversation I had with my best friend from high school not long after my second child was born. My friend did not have kids yet, and wasn't even in a serious relationship, so I was trying to explain to him how it felt, and I remember saying that it's like you've had this clock running your whole life, counting down the days until the next thing happens and then the next thing, the years of school, getting a driver's license, going to college, getting a job, getting married, and so on. And all the while, you're imagining your future.

But when you have a child, suddenly you start a new clock, and you begin to re-experience and re-anticipate all those same experiences. And the worst part of it is that you begin to imagine this new future, not your future but your child's future, and all the precarious possibilities that future could bring that your child doesn't even know about yet, from war to pandemics to global warming.

O'Connell's essay also reminded me of something the writer George Saunders once said in a radio interview (which I've been unable to track down). He was talking about the day one of his children was born, probably his first, and he remembered looking down at this infinitely innocent, infinitely helpless being in his hands and thinking about how all human beings on this planet were once that innocent and that helpless, and maybe if people could remember that, remember the innocence and helplessness we're all born with, the world wouldn't be such a cruel place.

No Permission Necessary

Love this bit from Kevin Kelly's piece You Are Not Late:

Right now, today, in 2014 is the best time to start something on the internet. There has never been a better time in the whole history of the world to invent something. There has never been a better time with more opportunities, more openings, lower barriers, higher benefit/risk ratios, better returns, greater upside, than now.

(via Shawn Blanc)

Kelly's main point is that the future of the internet still holds many surprises and innovations to come, but really, his statement would be true at any point in time. It reminds me of what Ira Glass said at the end of his recent Lifehacker interview:

Don't wait for permission to make something that's interesting or amusing to you. Just do it now. Don't wait.

And Glass's advice further echoes the advice of Radiolab co-host Robert Krulwich, who said in a commencement address years ago:

Suppose, instead of waiting for a job offer from the New Yorker, suppose next month, you go to your living room, sit down, and just do what you love to do. If you write, you write. You write a blog. If you shoot, find a friend, someone you know and like, and the two of you write a script. You make something. No one will pay you. No one will care. No one will notice, except of course you and the people you’re doing it with. But then you publish, you put it on line, which these days is totally doable, and then… you do it again.

I wrote about this a while back in a blog post about the future of blogging.

I had [Krulwich's] words in mind when I started my blog six months ago, and I’ve had them in mind whenever I think I should be pitching one of my blog posts to an online publication like Slate or Salon or The Magazine. I’d like to get paid for what I write, but there’s something wonderfully satisfying about owning and controlling my own work. I also don’t want to wait to see if someone will publish it. I want to publish, and see if the audience comes to me.

The remarkable thing about the internet is that you don't have to wait; you don't need anyone's permission to put your creative work out in the world. You can just do it.

So do it.

Why Teaching Innovations Don’t Spread in the US

I was initially turned off by the shame-inducing headline of this article by Elizabeth Green in the New York Times, "Why Do Americans Stink at Math?" But the answer to her question is actually surprising. The innovations in math education that have spread around the world, and that have shown remarkable success, actually started here in the United States. So why have those innovations failed to spread here? Because we choose not to invest in the professional development of our teachers.

In Finland and Japan, where students perform at or near the top in math assessments, teachers spend only about 600 hours a year in the classroom, using the rest of their time to prepare lessons, observe other teachers, and receive feedback on their own teaching. American teachers, by contrast, spend more than 1000 hours a year in the classroom, and have traditionally received almost no feedback from fellow teachers (though this is starting to change).

My wife taught middle school and high school for about ten years, and I have taught at the college level for the last five, and I'm consistently frustrated with the carrot-and-stick approach to improving our country's schools, as if bribing teachers with merit pay or threatening them with firing are the best ways to motivate them. In fact, most teachers I know are always striving to do better, even if they're already amazing teachers. What they need is the time and the support to actually improve their skills.

As Green writes:

Most policies aimed at improving teaching conceive of the job not as a craft that needs to be taught but as a natural-born talent that teachers either decide to muster or don’t possess. Instead of acknowledging that changes like the new math are something teachers must learn over time, we mandate them as “standards” that teachers are expected to simply “adopt.” We shouldn’t be surprised, then, that their students don’t improve.

(Via @stevenstrogatz)

What Tech Offices Tell Us about the Future of Work

Kate Losse at Aeon magazine on the insidious effect of high-end, handcrafted office design in modern tech culture:

Of course, the remaking of the contemporary tech office into a mixed work-cum-leisure space is not actually meant to promote leisure. Instead, the work/leisure mixing that takes place in the office mirrors what happens across digital, social and professional spaces. Work has seeped into our leisure hours, making the two tough to distinguish. And so, the white-collar work-life blend reaches its logical conclusion with the transformation of modern luxury spaces such as airport lounges into spaces that look much like the offices from which the technocrat has arrived. Perhaps to secure the business of the new moneyed tech class, the design of the new Centurion Lounge for American Express card members draws from the same design palette as today’s tech office: reclaimed-wood panels, tree-stump stools, copious couches and a cafeteria serving kale salad on bespoke ceramic plates. In these lounges, the blurring of recreation and work becomes doubly disconcerting for the tech employee. Is one headed out on vacation or still at the office – and is there a difference?

Zoo Animals and Their Discontents

Alex Halberstadt, writing for the New York Times Magazine about how modern zoo animals, despite being given better enclosures and more "enrichment," still suffer from mental health disorders:

I wondered, too, why disorders like phobias, depression and OCD, documented at zoos, don’t appear to have analogues among animals living in the wild. Irene Pepperberg, a comparative psychologist at Harvard who is known for her work with African gray parrots, told me that she thought one reason had to do with survival. “An animal in the wild can’t afford to be depressed,” Pepperberg said. “It will simply be killed or starve, since its environment requires constant vigilance. The situation kind of reminds me of my Jewish grandparents, whose lives were a lot harder than mine. They never seemed depressed, because I don’t think it ever occurred to them.”

In other words, we'd all be a lot happier if lions were actually trying to eat us.

Better Living (and Less Anxiety) through Software

It was truly a pleasure to be a guest on Brett Terpstra's podcast Systematic this week. He's had some amazingly interesting folks on the show lately, so I just hope I measure up. We talked about my background in radio and then segued into the topic of anxiety and technology.

Fittingly, I began feeling anxious almost as soon as we finished the Skype call. Not that it wasn't a good conversation, but there was one part where I felt I could have explained myself a lot better. I had been talking about a turning point in my life, when I started my second and last job in public radio.

My first job in radio was writing for a show called The Writer's Almanac, and I was good at it, despite the fact that the show's host was notoriously demanding. In my first three years writing for the show, three producers quit, along with several other writers who either quit or got fired. I was finally the only one left standing, so I became the sole writer and producer, and I persisted for two more years. The day I left, they said I should get a plaque for lasting as long as I did. I thought this constituted evidence of my competence.

And yet, when I moved to a different job on a different radio show, I suddenly felt like the least competent person in the world. This was especially confusing because the new job should have been easier. I was no longer the sole writer and producer of a show, I was just one associate producer within a team. I only had to write bits and pieces of script, do occasional research, write the occasional blog post, answer listener emails, book guests, and help edit audio. None of these jobs was high stakes. It should have been a breeze. But it nearly killed me.

Part of the problem was multitasking. At my previous job, I'd been doing one thing at a time. Write this script. Now write that script. I did most of my work from home in a quiet room. I was allowed to focus.

At my new job, I was always juggling multiple projects: researching the next guest, proofreading the latest script, writing a promo, editing tape. I had always relied on my memory to keep track of my to-do list (I rarely wrote down homework assignments in high school or even studied for tests, and still did well), but my memory completely failed me in this new work environment. I began to worry all the time about whether I had forgotten something. Had I booked that guest for the right time? Had I checked the time zone? Did I fact check that script sufficiently? Should I read it through one more time?

Another problem was the office environment. I worked in a cubicle, with team members all around me. There was little space or time to focus deeply on anything. We were all expected to be on email all the time, injecting our thoughts into one another's brains at will. One of my tasks was to respond to listener email, and every Monday we got a flood of responses to our show, both tremendously positive and viciously negative. And if there had been any factual errors in the show, the listeners would take us to task, and the host would not be happy. I began to dread the weekend, imagining the army of potential attackers amassing and hurling their spears into cyberspace, each blow landing in my inbox on Monday morning.

The result of all this anxiety was that I found it harder and harder to concentrate. I began to make the mistakes I so feared making. Which only made me worry more. I started waking up every night at 3:00 AM, unable to get back to sleep, my mind racing with everything I needed to worry about. Then I started waking up at 2:00 AM. Then 1:00 AM. Then midnight. If this had continued, I would have started waking up before I went to sleep.

If you have not experienced severe depression or anxiety, you might find it hard to understand how physical an illness it really is. I did not just feel sick in my head. Every cell in my body felt scraped out and raw. I had no patience for my children. I had no energy to help my wife around the house. Imagine how you feel when you realize something horrible is about to happen: you forgot the essential thing you need for that important meeting, your car is sliding on the ice, or your child is falling head first off the jungle gym in slow motion. Now imagine feeling that kind of dread every waking moment for weeks on end.

That was me at my lowest point. I kept asking myself, "Why can't I do this? This shouldn't be so hard. What's wrong with me?"

In the interview with Brett, I alluded to something I read once that compared depression to a fever (unfortunately, the author was the now-discredited Jonah Lehrer, but I still find the article persuasive). In response to an infection, the body raises its own temperature as a way of killing off the infection. Depression, likewise, raises the frequency of negative "ruminative" thoughts. Psychiatrists have typically seen these kinds of thoughts as part of the problem, but some believe depression may be the body's way of forcing you to focus on what's wrong in your life in order to change it.

Imagine, for instance, a depression triggered by a bitter divorce. The ruminations might take the form of regret (“I should have been a better spouse”), recurring counterfactuals (“What if I hadn’t had my affair?”) and anxiety about the future (“How will the kids deal with it? Can I afford my alimony payments?”). While such thoughts reinforce the depression — that’s why therapists try to stop the ruminative cycle — Andrews and Thomson wondered if they might also help people prepare for bachelorhood or allow people to learn from their mistakes. “I started thinking about how, even if you are depressed for a few months, the depression might be worth it if it helps you better understand social relationships,” Andrews says. “Maybe you realize you need to be less rigid or more loving. Those are insights that can come out of depression, and they can be very valuable.”

Of course, it's important to note that while a fever can help rid your body of germs, it can also kill you. I don't know what might have happened to me if I hadn't talked to a doctor at the time. Medication was definitely part of my recovery. It helped reduce my symptoms so that I could see the root cause of the problem: this was not the right job for me.

So I quit, and took a couple months off before I started my next job. In that time, I realized two things. First, I wanted to learn how to be more organized. Second, I wanted to make time for the kind of deep focus creative work that gave my life real meaning. That was five years ago, and I've managed to accomplish both of those goals, largely with the help of software.

There's been some talk lately about whether software tools actually provide any benefit, and whether software design is solving real problems. But for me, every time I dump my mind into OmniFocus, or add an event to Fantastical, or forward an email with attachments to Evernote, or set a reminder in Due, I feel a little more in control of my life. I can much more easily manage my job as a college writing teacher, juggling multiple projects, multiple classes, lesson planning, grading, committee meetings, department responsibilities, and so on.

Keeping my life more organized also makes it possible to have a clear head when I want to focus on something important. One of my goals after quitting my job was to write a novel, and I finally made time for it. The app Scrivener helped me break the novel down into manageable pieces, and for the first time in my life, writing fiction felt enjoyable rather than fraught. More recently, I was inspired by the power of the app Editorial to start writing this website (and have written almost every post with it).

Of course, there's a danger here. Buying a new notebook and a fancy pen does not make you a writer. Making a to-do list is not an actual accomplishment. Tools are not the end goal, and using a tool, no matter how well-designed, does not make hard work any easier. But the right tool can provide an important cue to help create a habit or build a ritual for doing the actual work.

Software has improved my life by making the work feel more possible, creating virtual spaces where I feel less anxious. And the less anxious I feel, the more I feel capable of doing the work that matters, and the more I feel alive.

The Problem with the "Chosen One"

I should start by saying that I loved "The Lego Movie." I laughed with almost inappropriate volume while watching it, nearly cried at the emotional climax (which I will not spoil here), came out of the theater singing the theme song "Everything Is Awesome," and spent dinner with my wife and kids recounting our favorite parts. Moment by moment, it was probably the most entertaining movie I've seen in years.

And yet, something about it did not sit quite right with me, something having to do with the prophecy and the "chosen one."

The theme of the "chosen one" feels so interwoven with the movies of my youth that it's almost hard to pin down its source. The original, for me, was probably Luke Skywalker in "Star Wars," chosen to become the first Jedi in a generation and to defeat the empire. But there was also Jen, the Gelfling chosen to fulfill the prophecy and repair the titular "Dark Crystal" in Jim Henson's masterpiece. I was introduced to T.H. White's version of the story by Disney's "The Sword in the Stone," about a bumbling young squire named Arthur, chosen to be the new king when he inadvertently pulls a sword out of a rock. You can follow the various permutations of this "chosen one" theme over at tvtropes.org, but it should be obvious to anyone paying attention to popular culture that this theme keeps gaining traction, from "The Matrix" to "Harry Potter" to "Kung Fu Panda" to, most recently, "The Lego Movie."

It's obvious why the theme is so appealing. The hero begins most of these stories as utterly ordinary, or even less than ordinary: a farmer in a podunk town, a cook in a noodle restaurant, an office worker in a cubicle, a half-abused kid living under the stairs. And yet, by the end of the story, this utterly ordinary person will learn to wield extraordinary powers, will in fact be the only one who can save the world. Who among the utterly ordinary masses watching these movies doesn't want to dream that we too could become extraordinary?

It's also obvious why this story resonates so strongly in Western culture. It's essentially the story of Jesus, the apparent (but possibly illegitimate) son of a carpenter, born so poor that his mom gave birth in a pen for farm animals. But it turns out he too is the subject of a prophecy, chosen to become (in the words of John the Baptist) "the lamb of God, who takes away the sins of the world." Jesus Christ superhero.

But the Christian overtones of the "chosen one" trope are not what I find disturbing. What I do find disturbing is that so many of the most prominent "chosen ones" in modern popular culture (with only one major exception I can think of) are boys. Of course, it's an old criticism that too many of the heroes in popular culture are male. It's something Pixar and Disney have been working on lately, but sexism is endemic to Hollywood, etc. This is not news.

What is new, or at least new to me, is the realization that so many of these "chosen one" stories are about boys who go through a transformation from ordinary to extraordinary, from the least significant to the most significant person in the universe, all while accompanied by a sidekick who is already extraordinary to begin with. And what's the gender of that extraordinary sidekick? Why, she's a girl of course.

Take Luke Skywalker. While he's helping out his uncle on the farm, buying and fixing junky droids, what's his sister doing? She's a princess, already locked in battle with Darth Vader, already good with a gun, and she even has an inkling of the Force. But is she the one picked to wield a lightsaber to face down her father? No. Instead, Obi Wan and Yoda take their chances on that awkward kid from the farm who knows nothing.

Then there's Harry Potter. While he's busy slacking on exams, playing sports, and sneaking off for snogging and butterbeer, what's Hermione Granger doing? Just becoming the best student in the history of Hogwarts, knowing the right answer to virtually every question, better at magic than anyone else her age. But is she the one who faces down the bad guy? Of course not. She wasn't "chosen".

The same goes for Neo in "The Matrix." The movie starts with a woman in a skin-tight leather suit performing incredible feats of Kung Fu agility and power. She can leap between tall buildings. She actually knows what the Matrix is! Can Neo do any of this? Does he know any of this? No. He has to learn it. But he'll be better than that girl ever was. And he won't even break a sweat in his final fight. Because he's the chosen one.

The troubling aspect of this trope becomes especially clear in "Kung Fu Panda" and "The Lego Movie," partly because each movie pokes fun at the very idea of a chosen one. In "Kung Fu Panda," Tigress (voiced by Angelina Jolie) naturally expects to be picked as the Dragon Warrior because she's the best, hardest-training Kung Fu student in Master Shifu's dojo. But instead, a clumsy, overweight, slacker Panda gets the job by accident. "The Lego Movie" enacts the exact same scenario, in which "Wyldstyle" (voiced by Elizabeth Banks) expects to become the chosen one because she's the best "Master Builder" training under her teacher Vitruvius, and she possesses ninja-level improvisatory Lego construction powers. Instead, the job goes to Emmet, the utterly ordinary construction worker, king of mediocrity and conformity.

Tigress and Wyldstyle aren't happy to learn the true identity of the chosen one. In fact, they're pissed, and rightfully so. They've been working their asses off to be exceptional, and these guys saunter in and take the top spot without nearly the same qualifications, experience, or know-how.

Sound familiar? What kind of message is this sending? Stories about chosen ones are really stories about who gets to be, and what it takes to be, exceptional. They're stories about privilege. And I don't just object to the gender imbalance. The problem isn't so much who gets to be chosen but the fact that we're so obsessed with being chosen at all.

When our culture celebrates business leaders like Steve Jobs, Mark Zuckerberg, and Jeff Bezos, or examines politicians like Ronald Reagan, Bill Clinton, or Barack Obama, it rarely holds them up as exemplars of hard work. Instead, they're brilliant, innovative, visionary, charismatic. They possess (were "chosen" to receive) great gifts. But when women reach similar levels of achievement, they're usually praised (or ridiculed) for their dedication and pluck. Working hard has somehow become a feminine, and not especially admirable, trait.

There is evidence that women are working harder than men in the United States. They've been outperforming men in a number of categories, especially education, for years now. And yet they still struggle to reach top positions in business and government. These are the real-world Hermiones and Wyldstyles, standing in the shadows of their "chosen" male counterparts.

If we keep telling these stories about what it takes to be successful, stories that are also prophecies about who gets to be successful, who gets to be "chosen," those prophecies will be self-fulfilling. It's time we changed the story. I, for one, want my kids to grow up in a world where the Trinitys, Hermiones, Tigresses, and Wyldstyles are the real heroes, where the prophecy of some old guy in a white beard means nothing in the face of hardworking badassery.

UPDATE: Several people on Twitter have pointed out that The Lego Movie is ironically playing on this trope rather than reinforcing it.

I agree to an extent. The movie's treatment of Emmet as hero is certainly ironic, and reminiscent of my favorite episode of The Simpsons, but the ultimate message still rings slightly false. That message (spoiler alert): Emmet isn't any more "special" than anyone else. The prophecy isn't even real. Anyone can be the most amazing, interesting person in the universe as long as they believe in themselves.

The problem: this also means Emmet isn't any less special than anyone else, namely Wyldstyle. Which he clearly is. (Though I fear that makes me sound like Ayn Rand.)

Artificial Guilt

Great piece in the New York Times Magazine about our Dr. Frankenstein-like quest to play God, subvert sin, and build a better artificial sweetener.

The science on these questions is inconclusive at best. There’s no clear evidence that artificial sweeteners cause cancer or obesity, at least in human beings. But the fear of artificial sweeteners was never quite a function of the scientific evidence — or never of just that. It stems as much from a sense that every pleasure has its consequences: that when we try to hack our taste buds in the lab — to wrench the thrill of sugar from its ill effects — we’re cheating at a game we’ll never win.

Just beware the first paragraph, which may spoil aspects of Breaking Bad for those who have not finished it.

Caught Like Insects in a Web

I’d estimate that the New Yorker has published more than 50,000 cartoons since its first issue in 1925 (I couldn’t find a precise number in a cursory Google search). So it’s surprising to learn that the single most reprinted cartoon of that nearly 90-year history is the one by Peter Steiner from 1993 that says, “On the internet, nobody knows you’re a dog.”

In an interview in 2000, Steiner said the line didn’t feel that profound to him when he wrote it. “I guess, though, when you tap into the zeitgeist you don’t necessarily know you’re doing it.” But the idea quickly caught on as shorthand for the internet’s spirit of anonymity, especially in chatrooms and message boards—a spirit that lives on in sites like Reddit, where “doxing” someone is one of the worst crimes you can commit.

In those early days, the internet felt like an ocean made for skinny-dipping; instead of doffing your clothes, you doffed your identity. You could read about, look at, discuss, and eventually purchase just about anything that interested you, without fear of anyone looking at you funny. This lack of identity could be used for nefarious purposes, of course, and it could lead people down any number of self-destructive paths. But for many, it was liberating to find that, on the web, you could explore your true nature and find fellow travelers without shame.

But as paranoia grows about the NSA reading our emails and Google tapping into our home thermostats, it’s increasingly clear that — rather than providing an identity-free playground — the web can just as easily capture and preserve aspects of our identities we would have preferred to keep hidden. What started as a metaphor to describe the complexly interconnected network has come to suggest a spider’s sticky trap.

I thought of this listening to a recent episode of WTF with Marc Maron. Comedian Artie Lange was telling the story of how he came into his own as a stand-up comedian by exploring, with brutal honesty, the darkness of his personal life. Then he stopped himself for a second to explain that he would never have been able to achieve that level of honesty onstage if he’d worried about his sets appearing on the internet.

Lange: It was before every jerk off had a cellphone taping you. Remember when it was midnight at a club in Cincinnati. It was just you and those people! That was it….Now it’s you and everyone in the fucking world.

Maron: And on Twitter…you can’t do anonymous sets anymore.

Lange: Exactly. An anonymous set is what makes you…The comics are going to get worse man, ’cause they’re gonna check themselves…They’re not gonna wanna see themselves bombing on Instagram or wherever the fuck it is and they’re never gonna take risks.

Where the internet used to encourage risk, now it seems to inhibit it, because it turns out the web can capture anything. What you say in front of friends, or even in front of an audience, can blow away with the wind. On the web, your words can stick around, can be passed around. Celebrities may have been the early victims, but now anyone is fair game. Millions of people are potentially watching you, ready to descend in a feeding frenzy of judgment. In the New Yorker’s Page Turner blog, Mark O’Connell writes about the phenomenon of Twitter users deleting their tweets, something he has seen happen in real time.

It’s a rare and fleeting sight, this emergency recall of language, and I find it touching, as though the person had reached out to pluck his words from the air before they could set about doing their disastrous work in the world, making their author seem boring or unfunny or ignorant or glib or stupid.

Maybe we should treat the web like a public place, with certain standards of behavior. Maybe those who engage in disorderly conduct, posting creepshots and racist tweets, should be exposed and held to account. Perhaps our expectation of anonymity on the internet never made sense. The problem is that the digital trails we leave on the web can blur the line between speech and thought, between imagination and intent.

It’s that blurred line Robert Kolker explores in his piece for New York magazine about the so-called “Cannibal Cop,” Gilberto Valle, who never kidnapped, raped, murdered, or cannibalized anyone, but who chatted about it obsessively online. And even though there was little evidence that he took any steps to make his fantasies a reality, his online discussions served to convict him of conspiracy to do so. Kolker writes:

The line between criminal thoughts and action is something the courts have pondered for decades…What’s changed in recent years are the tools used to detect intent—namely, a person’s online activity. “We’ve always said you can’t punish for thoughts alone, but now we really know what the thoughts are,” says Audrey Rogers, a law professor at Pace University. [emphasis mine]

I’m reminded of a recent Radiolab episode about Ötzi, the 5,000-year-old Iceman discovered in the Alps in 1991. For more than two decades, archaeologists have pored over the details of his clothing, his possessions, his tattoos, and the arrowhead lodged in his back, evidence he was murdered. From the contents of his stomach, they’ve even determined what he ate for his final meal. I wonder if there will someday be archaeologists who sift through our hard drives, tracing out the many digital trails we’ve left in the web, trying to determine not what we were eating, but what we were thinking. Will their findings be accurate?

To paraphrase John Keats, most lives used to be writ in water. Now they’re writ in code. As much as our digital lives are only partial documents, they often seem more real to strangers simply because they are what has been documented. Maybe the internet doesn’t know you’re a dog, but it doesn’t care. In the eyes of strangers, you are that which the web reveals you to be, because the web is where the evidence is.