The Best Podcasting Apps for iPhone

When Marco Arment announced that he was making a podcast app, he was deliberately entering a crowded market. He said that he wanted to rid himself of his irrational fear of direct competition.

The crowded market doesn't seem to be hurting him. The reviews of his app Overcast in MacStories, Macworld, and elsewhere have all been great. But it's also important to note that the crowded market itself is not diluted by the number of independent developers entering it. On the contrary, the diversity of apps, and all the different approaches to the problem of podcast delivery, can serve to improve the quality of all those apps.

There's a post on the Supertop blog (the developers of the podcast app Castro) about why they welcome the competition from Overcast.

From our perspective, a user trying any third party app is good for all third party apps. If a user is persuaded to download one alternative they should be more likely to consider others in the future, especially given the variety of apps that are available...I encourage you to try Overcast. In fact, if you really love podcasts, I encourage you to try all the others too.

I decided to do just that, purchasing and trying out Instacast, Downcast, Pocket Casts, Pod Wrangler, Castro, Mocast, and Overcast. I made a comparison video below to share what I observed.

I have no background in design, but perhaps because I have a degree in comparative literature, I find it endlessly fascinating to compare the interface of apps and savor the details of what each one offers. None of these apps is perfect, but it's inspiring to see how a group of talented humans use their skills in different ways, through the endlessly plastic medium of software, to approach the same problems.

The Genesis of Goodnight Moon

I loved Aimee Bender's appreciation of Goodnight Moon (via Kottke). She does a great job of describing how the book's genius lies in how it creates a structure and then deviates from it in surprising ways:

For writers, this is all such a useful reminder. Yes, move around in a structure. But also float out of that structure. “Goodnight nobody” is an author’s inspired moment that is inexplicable and moving and creates an unknown that lingers. How wonderful that this oddly compassionate moment, where even nobody gets a good night, shows up in the picture book that is the most popular! There is no template, ever.

I wrote a bit about Margaret Wise Brown years ago for The Writer's Almanac, and I was especially interested in how, in writing Goodnight Moon, she drew on her knowledge of the way children learn language.

Brown wanted to become a writer as a young woman, and she once took a creative writing class from Gertrude Stein. But she had a hard time coming up with story ideas, so she went into education. She got a job at an organization called the Bureau of Educational Experiments, researching the way that children learn to use language. What she found was that children in the earliest stage of linguistic development relish language with patterns of sound and fixed rhythms. She also found that young children have a special attachment to words for objects they can see and touch, like shoes and socks and bowls and bathtubs.

She eventually began to write books for children based on her research, and in 1938 she became the editor of a publishing house called William R. Scott & Company, which specialized in new children's literature. The Great Depression had made children's books into luxury items, and most other publishing houses had phased out children's literature. Margaret Wise Brown helped make children's books profitable, because she understood that children experience books as sensual objects. She invested in high quality color illustrations, and she printed her books on strong paper with durable bindings, so that children could grab, squeeze, and bite their books the way they did with all their toys.

Brown had been a fairly successful writer and editor for almost ten years when, one morning, she woke up and wrote a poem, listing the items in a house, and then saying goodnight to each item, including the famous lines “Goodnight room / Goodnight moon / Goodnight cow jumping over the moon … / Goodnight comb / And goodnight brush / Goodnight nobody / Goodnight mush. / And goodnight to the old lady whispering 'hush' / Goodnight stars / Goodnight air / Goodnight noises everywhere.” She thought the poem could be made into a book, so she sent it off to her publisher, and it was published in 1947 as Goodnight Moon.

The influential New York Public Library gave it a terrible review, and it didn't sell as well as some of Brown's other books in its first year. But parents were amazed at the book's almost hypnotic effect on children, its ability to calm them down before bed. Brown thought the book was successful because it helped children let go of the world around them piece by piece, just before turning out the light and falling asleep.

Parents recommended the book to each other, and it slowly became a word-of-mouth best-seller. It sold about 1,500 copies in 1953, 4,000 in 1955, 8,000 in 1960, and 20,000 in 1970; by 1990 the total number of copies sold had reached more than four million.

You Can Delete But You Can’t Forget

Jacqui Shine made a rash decision to delete all the email from her late mother; in a piece for the Atlantic, she writes about why the digital nature of that deletion is uniquely haunting.

In another way, though, those deleted emails do survive—or, at least, the data that Google has extracted from them in order to build your user profile has...Every time I get served an ad for a fawning book about the Founding Fathers or for a deviled egg tray, it’s a kind of tiny haunting: a palimpsest of what once was, stripped of what made it really meaningful. And those tiny traces may be the problem—not because they can’t allow us to recover the things we’ve lost, but because they allow us to believe that we can.

What Tech Offices Tell Us about the Future of Work

Kate Losse at Aeon magazine on the insidious effect of high-end, handcrafted office design in modern tech culture:

Of course, the remaking of the contemporary tech office into a mixed work-cum-leisure space is not actually meant to promote leisure. Instead, the work/leisure mixing that takes place in the office mirrors what happens across digital, social and professional spaces. Work has seeped into our leisure hours, making the two tough to distinguish. And so, the white-collar work-life blend reaches its logical conclusion with the transformation of modern luxury spaces such as airport lounges into spaces that look much like the offices from which the technocrat has arrived. Perhaps to secure the business of the new moneyed tech class, the design of the new Centurion Lounge for American Express card members draws from the same design palette as today’s tech office: reclaimed-wood panels, tree-stump stools, copious couches and a cafeteria serving kale salad on bespoke ceramic plates. In these lounges, the blurring of recreation and work becomes doubly disconcerting for the tech employee. Is one headed out on vacation or still at the office – and is there a difference?

Zoo Animals and Their Discontents

Alex Halberstadt, writing for The New York Times Magazine, about how modern zoo animals, despite being given better enclosures and more "enrichment," still suffer from mental health disorders.

I wondered, too, why disorders like phobias, depression and OCD, documented at zoos, don’t appear to have analogues among animals living in the wild. Irene Pepperberg, a comparative psychologist at Harvard who is known for her work with African gray parrots, told me that she thought one reason had to do with survival. “An animal in the wild can’t afford to be depressed,” Pepperberg said. “It will simply be killed or starve, since its environment requires constant vigilance. The situation kind of reminds me of my Jewish grandparents, whose lives were a lot harder than mine. They never seemed depressed, because I don’t think it ever occurred to them.”

In other words, we'd all be a lot happier if lions were actually trying to eat us.

Punctuated Equilibrium

Joe Pinsker, writing for the Atlantic about the fate of the apostrophe in the 21st century, points out how computers are actually preserving aspects of language we might otherwise be willing to let atrophy:

Autocorrect, the now-ubiquitous software that’s always reading over our shoulders, tends to put apostrophes in when we omit them—which means they might remain a feature of informal writing for longer than they otherwise would. The software may also prop up other formal conventions, among them capitalization and “silent” letters (like the u, g, and h that drop out as though becomes tho). “Autocorrect is acting like a language preservative,” says Alexander Bergs, a linguistics professor at Germany’s Osnabrück University. “Which is funny, as usually new media like cellphones and computers are blamed for language decay.”

Feeling More Comfortable with Computers

Tom Jacobs writing for Pacific Standard about a study of how patients feel when describing symptoms to a computer instead of a human:

The result: People disclosed information more honestly and openly when they were told they were speaking exclusively to the computer. The participants also “reported significantly lower fear of self-disclosure” under those circumstances. These results were reiterated by the analysis of their facial expressions, which found they “allowed themselves to display more intense expressions of sadness” when they believed no human was watching them.

This makes perfect sense to me. I can't remember a time when I felt too embarrassed to tell my doctor something, but I've definitely felt judged by doctors, who acted as if I'd come to them for minor ailments. The feeling of judgment certainly affected how much I told them, and I know I'm not alone in this experience. I know a woman who once went to a doctor because she had recently experienced some weight loss and was having dizzy spells. Her impression was that the doctor assumed she was anorexic. He gave her a prescription for anti-dizziness pills, as if that was the problem she was trying to solve.

In a piece for the New Yorker about how doctors make decisions, Jerome Groopman wrote,

Doctors typically begin to diagnose patients the moment they meet them. Even before they conduct an examination, they are interpreting a patient’s appearance: his complexion, the tilt of his head, the movements of his eyes and mouth, the way he sits or stands up, the sound of his breathing. Doctors’ theories about what is wrong continue to evolve as they listen to the patient’s heart, or press on his liver. But research shows that most physicians already have in mind two or three possible diagnoses within minutes of meeting a patient, and that they tend to develop their hunches from very incomplete information.

Perhaps using computers for patient intake could improve both sides of the equation: putting the patient more at ease to share all the relevant information, and giving the doctor a fuller picture of that information before they start forming a premature diagnosis.

Too Much Delight?

Interesting take from Sean Madden at Wired on why the Amazon Fire Phone may be too delightful for its own good.

The average smartphone user interacts with his or her mobile device over 100 times per day, and the majority of those interactions fall into just a few categories: opening an app, selecting from a list, bringing up a keyboard, and so on. If each of them is imbued with too much visual whiz-bang, using your phone becomes the digital equivalent of eating birthday cake for every meal.

I would argue that it's not so much the frequency of the effect but the utility that matters. If the effect slows down the experience without offering anything other than eye candy, it's bad design. “Whiz-bang” is a sparkly coat of paint on the surface of the interface. Delight is the spark of life that lives inside the app, coded deep into its DNA.

Welcome Cracks in the Walled Garden

The first good sign was the opening video. Last year's video was a visually pleasing but somewhat abstract statement of purpose about Apple's design principles. The message seemed to be, "We're Apple. We know design. Learn from us." This year, the video focused on people talking about apps and how they improve people's lives. The content wasn't amazing, but the contrast was stark. Apple often takes a moment to nod toward the importance of developers, but this felt bigger than that. Rather than focusing inward on their own expertise, Apple was focusing outward on the people who build on their platforms and use their products. The video ended with the words "thank you" addressed directly to developers. I'm not sure how this went over in the room, but as a user who feels deep gratitude for the apps I use every day, I felt like that thank you was long overdue. And that was just the beginning. Apple spent the rest of the keynote demonstrating this new outward focus by tearing down walls.

Critics of the company love to toss around terms like "walled garden," in reference to Apple's paternal approach to interoperability. It's a fair criticism, especially when it comes to iOS. The App Store, sandboxing, and iCloud each put their own restrictions on how users can access software and data. But another way to see it is that Apple has always been a device-centric rather than a data-centric company.

Other players in the computer industry always saw a sharp divide between the hardware and the software, but Apple has always tried to take a holistic view, controlling as much of both the hardware and the software as possible. This approach only increased with iOS, which gave Apple even greater control of what software could be loaded onto the device, how applications could communicate with each other, and what users could (and couldn't) customize about their experience. That level of control made iPhones and iPads more approachable than any computing devices had ever been before. And Apple's device-centric approach filtered down to developers, who made apps that themselves felt like mini-virtual devices, each with their own unique designs, powers, and solutions.

But over time, that device-centric approach has felt more and more limiting. Every time you tap on a notification and get pulled sideways into a new app, or you tap "Open In" and find yourself flung in a different direction, you feel your head bump against the walls of the walled garden. Apple wants to hide the file system because ordinary users find it confusing, but is it really less confusing to open a photo in a photo editing app, make changes, and then have to export the result as an entirely new photo to the Photos app?

Apple has finally decided to tackle these problems, making the walls of its many walled gardens rather more porous. The most obvious of these changes is a new iCloud document picker, which will allow iOS apps to select a file and then save it without creating a second copy. This is the closest Apple has come to introducing a real file system to iOS, and without a demo, it remains to be seen what it will actually look like, but the keynote mentioned that iCloud will not be the only storage option for this document picker. Theoretically, customers could choose Google Drive, OneDrive, or even Dropbox.

Other changes include interactive notifications, such as the ability to accept appointment requests, respond directly to messages, and apparently take action on third party notifications (though the only example was Facebook). So instead of having to bounce over to the app in question, entering its little garden, you can just interact with the information itself wherever you are. Another example is third party widgets in the Today view of Notification Center (something that carries over to the Mac). Again, you'd be interacting with the data of an app or the feature of an app without having to enter the app itself. And HealthKit and HomeKit, which were touted as rumors in the run-up to the keynote, were described as aggregators of data from other apps. The data, liberated from its silos, can be collected, examined, and transformed with new meaning.

Apple also pulled down the walls between iOS devices and the Mac. There's a new feature called "Continuity," which gives devices a variety of ways to share data more fluidly. You will be able to use AirDrop to send data between Mac and iOS. You can also "hand off" tasks from one device to the next. Writing an email on your phone? Easily switch to writing it on your Mac. Get a call or an SMS on your phone? View the call or the message on your Mac. Each of these features frees the computing task at hand from its confinement to one specific app or one specific device.

But finally, the feature on almost everyone's iOS wish list came true. Apple introduced "Extensibility," a kind of inter-app communication that would allow apps to open up instances of each other's UI to take certain actions. The example Apple showed was of opening a photo in the Photos app and being able to use a filter from another installed app without leaving Photos. It isn't clear yet whether one third party app will be able to interact with another third party app, but that was the implication.

The larger implication is that developers can now begin to think about apps as either stand-alone powerful pieces of software or as extensions of other pieces of software. I don't really want to buy any more apps that let me add filters to my photos, but I might buy an extension to my main photo editing app that gives me extra features.

Power users are no doubt cheering all of these additions. For me, what's really exciting is not the features in themselves (though I am excited to try them) but the apparent change in philosophy, the willingness to trust the users and the developers. With iOS 7, Apple seemed to be saying that people are comfortable enough with touch interfaces that they don't need skeuomorphic designs anymore to make them feel comfortable. With iOS 8, Apple seems to be saying that people are comfortable enough with the various data they manage through their devices and their apps. That data can now begin to move more fluidly between those devices and apps.

Recently, in "Sharing the Ecosystem" I wrote,

I find it fitting that the number one request on most people’s lists for iOS 8 is better sharing of information between apps. What Apple needs is better sharing, period. Healthy ecosystems are all about sharing. “Better can’t be better if it doesn’t consider everything.” Just as Tim Cook sees the value in sustaining the world’s ecosystem, he needs to see the value in sustaining the developer ecosystem. It’s those developers who can provide the real return on investment, making both his products, and the world, better.

I came away from the keynote feeling that Tim Cook understands this. He chose to begin the keynote with a thank you to developers, and he ended it by asking all the Apple employees in the audience to stand up to receive recognition. For the last two decades, Apple was defined by one man's vision, even if there were many people behind that vision. Tim Cook wants to celebrate all the people working to make Apple better. I have rarely felt more hopeful about the company.

The Origin of "Don't Be Evil"

When my wife was in graduate school to get a master's degree in education, she took a class about how to teach students of different cultures without racial bias. Near the end of the class, one of her classmates said of the textbook they'd been reading, "You know, this book should just be called, 'Don't Be a Dick.' And all the pages could be blank."

I thought of that story recently while reading Steven Levy's book about Google, In the Plex, which includes the origin story of Google's infamous company motto, "Don't Be Evil." It's common these days for bloggers and journalists to point out all the ways in which Google falls short of the ideal expressed in that motto. So it was surprising, for me at least, to learn that the motto actually started as a kind of joke, not unlike the joke my wife's classmate made about not being a dick.

According to Steven Levy, Google held a meeting in 2001 to try to nail down its corporate values. Stacy Sullivan, the head of human resources, stood at the front of the room with a giant notepad, writing down platitudes like, "Google will strive to honor all its commitments." But engineer Paul Buchheit thought the whole thing was absurd.

Levy writes,

Paul Buchheit was thinking, This is lame. Jawboning about citizenship and values seemed like the kind of thing you do at a big company. He’d seen enough of that at his previous job at Intel. At one point the chipmaker had given employees little cards with a list of values you could attach to your badge. If something objectionable came up you were to look at your little corporate values card and say, “This violates value number five.” Lame. “That whole thing rubbed me the wrong way,” Buchheit later recalled. “So I suggested something that would make people feel uncomfortable but also be interesting. It popped into my mind that ‘Don’t be evil’ would be a catchy and interesting statement. And people laughed. But I said, ‘No, really.’”

The slogan made Stacy Sullivan uncomfortable. It was so negative. “Can’t we phrase it as ‘Do the right thing’ or something more positive?” she asked. Marissa and Salar agreed with her. But the geeks—Buchheit and Patel—wouldn’t budge. “Don’t be evil” pretty much said it all, as far as they were concerned. They fought off every attempt to drop it from the list.

“They liked it the way it was,” Sullivan would later say with a sigh. “It was very important to engineering that they were not going to be like Microsoft, they were not going to be an evil company.”

I just love the fact that the motto did not originate out of some wide-eyed idealism. Instead, it was an attempt to cut through the whole bullshit concept of "corporate values." It's no wonder the company has had trouble living up to that ideal. "Don't Be Evil" is the implicit motto of every idealistic company before it gets mired in the messy, morally compromised world of actually making money.

Better Living (and Less Anxiety) through Software

It was truly a pleasure to be a guest on Brett Terpstra's podcast Systematic this week. He's had some amazingly interesting folks on the show lately, so I just hope I measure up. We talked about my background in radio and then segued into the topic of anxiety and technology.

Fittingly, I began feeling anxious almost as soon as we finished the Skype call. Not that it wasn't a good conversation, but there was one part where I felt I could have explained myself a lot better. I had been talking about a turning point in my life, when I started my second and last job in public radio.

My first job in radio was writing for a show called The Writer's Almanac, and I was good at it, despite the fact that the show's host was notoriously demanding. In my first three years writing for the show, three producers quit, along with several other writers who either quit or got fired. I was finally the only one left standing, so I became the sole writer and producer, and I persisted for two more years. The day I left, they said I should get a plaque for lasting as long as I did. I thought this constituted evidence of my competence.

And yet, when I moved to a different job on a different radio show, I suddenly felt like the least competent person in the world. This was especially confusing because the new job should have been easier. I was no longer the sole writer and producer of a show, I was just one associate producer within a team. I only had to write bits and pieces of script, do occasional research, write the occasional blog post, answer listener emails, book guests, and help edit audio. None of these jobs was high stakes. It should have been a breeze. But it nearly killed me.

Part of the problem was multitasking. At my previous job, I'd been doing one thing at a time. Write this script. Now write that script. I did most of my work from home in a quiet room. I was allowed to focus.

At my new job, I was always juggling multiple projects: researching the next guest, proofreading the latest script, writing a promo, editing tape. I had always relied on my memory to keep track of my to-do list (I rarely wrote down homework assignments in high school or even studied for tests, and still did well), but my memory completely failed me in this new work environment. I began to worry all the time about whether I had forgotten something. Had I booked that guest for the right time? Had I checked the time zone? Did I fact check that script sufficiently? Should I read it through one more time?

Another problem was the office environment. I worked in a cubicle, with team members all around me. There was little space or time to focus deeply on anything. We were all expected to be on email all the time, injecting our thoughts into one another's brains at will. One of my tasks was to respond to listener email, and every Monday we got a flood of responses to our show, both tremendously positive and viciously negative. And if there had been any factual errors in the show, the listeners would take us to task, and the host would not be happy. I began to dread the weekend, imagining the army of potential attackers amassing and hurling their spears into cyberspace, each blow landing in my inbox on Monday morning.

The result of all this anxiety was that I found it harder and harder to concentrate. I began to make the mistakes I so feared making. Which only made me worry more. I started waking up every night at 3:00 AM, unable to get back to sleep, my mind racing with everything I needed to worry about. Then I started waking up at 2:00 AM. Then 1:00 AM. Then midnight. If this had continued, I would have started waking up before I went to sleep.

If you have not experienced severe depression or anxiety, you might find it hard to understand how physical an illness it really is. I did not just feel sick in my head. Every cell in my body felt scraped out and raw. I had no patience for my children. I had no energy to help my wife around the house. Imagine how you feel when you realize something horrible is about to happen: you forgot the essential thing you need for that important meeting, your car is sliding on the ice, or your child is falling head first off the jungle gym in slow motion. Now imagine feeling that kind of dread every waking moment for weeks on end.

That was me at my lowest point. I kept asking myself, "Why can't I do this? This shouldn't be so hard. What's wrong with me?"

In the interview with Brett, I alluded to something I read once that compared depression to a fever (unfortunately, the author was the now-discredited Jonah Lehrer, but I still find the article persuasive). In response to an infection, the body raises its own temperature as a way of killing off the infection. Depression, likewise, raises the frequency of negative "ruminative" thoughts. Psychiatrists have typically seen these kinds of thoughts as part of the problem, but some believe depression may be the body's way of forcing you to focus on what's wrong in your life in order to change it.

Imagine, for instance, a depression triggered by a bitter divorce. The ruminations might take the form of regret (“I should have been a better spouse”), recurring counterfactuals (“What if I hadn’t had my affair?”) and anxiety about the future (“How will the kids deal with it? Can I afford my alimony payments?”). While such thoughts reinforce the depression — that’s why therapists try to stop the ruminative cycle — Andrews and Thomson wondered if they might also help people prepare for bachelorhood or allow people to learn from their mistakes. “I started thinking about how, even if you are depressed for a few months, the depression might be worth it if it helps you better understand social relationships,” Andrews says. “Maybe you realize you need to be less rigid or more loving. Those are insights that can come out of depression, and they can be very valuable.”

Of course, it's important to note that while a fever can help rid your body of germs, it can also kill you. I don't know what might have happened to me if I hadn't talked to a doctor at the time. Medication was definitely part of my recovery. It helped reduce my symptoms so that I could see the root cause of the problem: this was not the right job for me.

So I quit, and took a couple months off before I started my next job. In that time, I realized two things. First, I wanted to learn how to be more organized. Second, I wanted to make time for the kind of deep focus creative work that gave my life real meaning. That was five years ago, and I've managed to accomplish both of those goals, largely with the help of software.

There's been some talk lately about whether software tools actually provide any benefit, and whether software design is solving real problems. But for me, every time I dump my mind into OmniFocus, or add an event to Fantastical, or forward an email with attachments to Evernote, or set a reminder in Due, I feel a little more in control of my life. I can much more easily manage my job as a college writing teacher, juggling multiple projects, multiple classes, lesson planning, grading, committee meetings, department responsibilities, and so on.

Keeping my life more organized also makes it possible to have a clear head when I want to focus on something important. One of my goals after quitting my job was to write a novel, and I finally made time for it. The app Scrivener helped me break the novel down into manageable pieces, and for the first time in my life, writing fiction felt enjoyable rather than fraught. More recently, I was inspired by the power of the app Editorial to start writing this website (and have written almost every post with it).

Of course, there's a danger here. Buying a new notebook and a fancy pen does not make you a writer. Making a to-do list is not an actual accomplishment. Tools are not the end goal, and using a tool, no matter how well-designed, does not make hard work any easier. But the right tool can provide an important cue to help create a habit or build a ritual for doing the actual work.

Software has improved my life by making the work feel more possible, creating virtual spaces where I feel less anxious. And the less anxious I feel, the more I feel capable of doing the work that matters, and the more I feel alive.

The Illusion of Power

I love this Rolling Stone interview with George R.R. Martin, which goes a long way toward explaining why the Game of Thrones books (i.e., A Song of Ice and Fire) are so much more than escapist fiction. I read them as a sword and sorcery version of The Wire, with a Hobbesian view of power as the central theme. As Martin says,

One of the central questions in the book is Varys' riddle: The rich man, the priest and the king give an order to a common sellsword. Each one says kill the other two. So who has the power? Is it the priest, who supposedly speaks for God? The king, who has the power of state? The rich man, who has the gold? Of course, doesn't the swordsman have the power? He's the one with the sword – he could kill all three if he wanted. Or he could listen to anyone. But he's just the average grunt. If he doesn't do what they say, then they each call other swordsmen who will do what they say. But why does anybody do what they say? This is the fundamental mystery of power and leadership and war through all history....It's all based on an illusion.

Most people familiar with the books or the TV show remember the dramatic deaths of various characters best, but for me, one of the most powerful scenes in any of the books (mild spoiler from book/season 1), was the moment Ned Stark and Cersei face each other down after the death of the king. Ned holds the king's seal, which he claims gives him the power to rule. Cersei claims the power belongs to her and her son, the heir to the throne. The room is filled with armed guards, who have to decide whom to follow. What makes the scene so dramatic is that Ned and Cersei have no real power. They have no weapons to wield but words. All their power flows from the people around them who choose to believe they have power.

For some reason, that scene lays bare the illusion of power better than almost anything I've ever read. I think about it all the time, in department meetings at the college where I teach, at campus events when the president of my college gives a speech, even when I watch the President of the United States on TV. It reminds me of something the physicist and author Janna Levin said on a radio show where I used to work, about how her cosmological view of the universe sometimes gives her a strange perspective on our race of primates and the ways we organize ourselves on this tiny planet:

You know, for me, it's so absurd, because it's so small and it's so — this funny thing that this one species is acting out on this tiny planet in this huge, vast cosmos. Of course, I take very seriously our voting process and I'm, you know, very, try to be politically conscious. But sometimes, when I think about it, I have to laugh that we're all just agreeing to respect this agreement that this person has been elected for something. And that is really a totally human construct that we could turn around tomorrow and all choose to behave differently. We're animals that organize in a certain way. So it's not that I completely dismiss it or don't take it seriously, but I think a lot of the things we are acting out are these animalistic things that are consequences of our instincts. And they aren't, in some sense, as meaningful to me as the things that will live on after our species comes and goes.

Sharing the Ecosystem

Tim Cook got a lot of attention back in February when he was challenged at a shareholder meeting to explain Apple’s commitment to green energy initiatives. A conservative group of shareholders had put forward a proposal asking Apple to focus only on initiatives that had a clear ROI (return on investment). According to a report in Mac Observer, Tim Cook grew visibly angry at the suggestion:

When we work on making our devices accessible by the blind, I don’t consider the bloody ROI….If you want me to do things only for ROI reasons, you should get out of this stock.

Cook underlined his commitment to the environment again this past week by providing the voiceover for Apple’s promotional video Better, about Apple’s use of solar and wind energy, among other environmentally friendly practices. But it’s worth noting the difference in the message. At the shareholder meeting, Cook seemed to be saying that he doesn’t care about return on investment – doesn’t care about profits – when it comes to doing things that are simply right. But in the video he keeps repeating the word “better” in reference both to Apple’s products and Apple’s commitment to the environment. It’s not that he doesn’t care about return on investment; it’s that he’s enlarging the very meaning of the term.

Better. It’s a powerful word and a powerful ideal. It makes us look at the world and want more than anything to change it for the better, to innovate, improve, to reinvent, to make it better. It’s in our DNA. And better can’t be better if it doesn’t consider everything. Our products. Our values.

If Tim Cook hadn’t gotten so angry at that guy at the shareholder meeting, he might have explained that profits are only one return on the investment. If you’re the most valuable company in the world, and you’re not concerned about the impact of your company on the environment, you’re not playing the long game. We all share the same ecosystem. Investing in that ecosystem is investing in the future. It might not look like a profitable investment, but it could yield immeasurable returns.

So I’m heartened by Apple’s apparent commitment to the environmental ecosystem, but I wish they had the same attitude toward their software ecosystem.

I know the history of that ecosystem from my vantage point as a user. I switched to a Mac in 2007, and as much as I loved the hardware, I discovered pretty quickly that the real advantage was the software, and not just the software made by Apple. Independent developers who cared about design, who wanted to make software for individuals rather than the enterprise, had been using Macs and writing Mac software for years. Using those applications for the first time, I began to see software as a kind of art form in and of itself.

The iPhone and the App Store brought that art form to the masses. By creating a place where customers could easily, and without fear, download any number of apps, Apple made software mainstream. Before that, most customers only bought software when they bought their computers, preloaded with an office suite and maybe one or two more apps. The iPhone, and later the iPad, provided both the perfect delivery and the perfect medium for software, because the entire device itself changed based on whatever software had just been launched.

The result was that Apple managed to cultivate what I’d argue was the richest software ecosystem in the history of computing. Which is why it’s so strange that Apple now seems to be on the cusp of letting that ecosystem wither. It’s no secret that the App Store is broken, that developers are having a harder and harder time making good money. Marco Arment has been talking about this for a long time, most clearly when he made his case this past fall that paid-upfront apps are dead. Ben Thompson wrote a series of blog posts around the same time at Stratechery, laying out the reasons why Apple is motivated to drive down the cost of apps (and why it's a big-picture mistake).

Apple makes money on hardware. It’s in their interest that said hardware be sold for as much of a premium as the market will bear. However, it’s equally in their interest that the complements to that hardware are sold as cheaply as possible, and are preferably free….In the case of apps, the current app store, full of a wide variety of inexpensive apps, is perfect from Apple’s perspective. It’s a reason to buy Apple hardware, and that’s all that matters. Anything that on the surface makes the store less desirable for hardware buyers – such as more expensive apps – is not in Apple’s interest.

This is bloody ROI thinking. In its latest commercials, with iPads on mountaintops and iPhones on motorcycles, Apple wants to trade on the power of its devices to do amazing things. But software is what gives those devices their power. And Apple is doing very little to help sustain the people who create that software, let alone give them the respect and the gratitude they deserve. As Justin Williams recently said about his trip to the Microsoft developer conference:

What’s different though is that it feels like Microsoft is more interested in working with us as a partner whereas Apple has always given off a vibe of just sort of dealing with us because they have to.

I find it fitting that the number one request on most people’s lists for iOS 8 is better sharing of information between apps. What Apple needs is better sharing, period. Healthy ecosystems are all about sharing. “Better can’t be better if it doesn’t consider everything.” Just as Tim Cook sees the value in sustaining the world’s ecosystem, he needs to see the value in sustaining the developer ecosystem. It’s those developers who can provide the real return on investment, making both his products, and the world, better.

Screens Aren't Evil

Matt Honan's recent post on Wired about parenting in the age of ubiquitous screens has gotten people like Shawn Blanc and Stephen Hackett talking.

But the ever-present touchscreens make me incredibly uneasy—probably because they make parenting so easy. There is always one at hand to make restaurants and long drives and air travel much more pleasant. The tablet is the new pacifier.

For a much more in-depth look at this subject, check out Hanna Rosin's piece for The Atlantic, “The Touch Screen Generation,” which I expected to be another hand-wringing exposé on the decline of western civilization by way of modern technology. But it's quite the opposite. One of the myths she busts is the idea that screens are inherently less stimulating than something like reading.

An early strain of research claimed that when we watch television, our brains mostly exhibit slow alpha waves—indicating a low level of arousal, similar to when we are daydreaming. These findings have been largely discarded by the scientific community, but the myth persists that watching television is the mental equivalent of, as one Web site put it, “staring at a blank wall.” These common metaphors are misleading, argues Heather Kirkorian, who studies media and attention at the University of Wisconsin at Madison. A more accurate point of comparison for a TV viewer’s physiological state would be that of someone deep in a book, says Kirkorian, because during both activities we are still, undistracted, and mentally active.

I distinctly remember hearing about that alpha wave research, and I had taken it as gospel ever since that screens make us more passive than books. But it's false! Most of the articles on children and screen time assume the medium is the problem, but the medium is neutral. In fact, research has shown that interactive content can be more educational than passive content. And yet we still feel more comfortable with our kids reading books for hours than we do with them playing video games for hours.

What we need to focus on is the content. I live in Minnesota, and this past winter was miserable, so I let my kids have more screen-time than usual. One of the things I noticed was the dramatically different effects of different video games. When they played Flappy Bird, they fell into a hypnotic trance. When they played Wii Sports, they were jumping all over the room, actually winding themselves and breaking a sweat. And when they played Minecraft, they came mentally alive, building incredibly complex virtual structures, collaborating with each other, giving each other tips, and talking all the time.

Of course, kids shouldn't spend all day with screens. Devoting yourself to only one activity is always a problem. If your kid played basketball every waking moment and never learned to read, that would be a problem. If they did nothing but read Shakespeare and never moved a muscle, that would also be cause for concern. Moderation is the obvious solution.

But we need to get beyond worrying about whether “screens” are melting our kids' brains. What we need to be conscious of is encouraging our kids, and ourselves, to engage in activities that enrich us. Sometimes that's interacting with each other, sometimes that's a hike in the forest, sometimes that's a great book, and sometimes that's an incredible video game. It's not the medium that matters, but what we take from it.

Stephen Hackett wrote a follow-up post that addresses some of the things I'm talking about.

Not Remembering Ourselves

In a piece about memory for National Geographic Magazine last month – “Remember This” – Joshua Foer profiles both a man who cannot form new memories and a woman who can’t stop remembering every single thing that happens to her. In the middle of the piece, Foer takes a detour to discuss technology:

We’ve gradually replaced our internal memory with what psychologists refer to as external memory, a vast superstructure of technological crutches that we’ve invented so that we don’t have to store information in our brains. We’ve gone, you might say, from remembering everything to remembering awfully little. We have photographs to record our experiences, calendars to keep track of our schedules, books (and now the Internet) to store our collective knowledge, and Post-it notes for our scribbles. What have the implications of this outsourcing of memory been for ourselves and for our society? Has something been lost?

I thought of this when I read about and then listened to a podcast discussion of the controversy around Greg Knauss’s iPhone app Romantimatic. Brief recap: Knauss wrote the app to help people remember to contact their “sweetheart” every so often by text or phone. Knauss included a few prefab text messages you could send to your sweetheart, most of which were clearly intended as humorous (e.g. “My phone just vibrated and I thought of you.”) Maybe it was the prefab messages, and maybe it was the current knee-jerk fear of how technology is taking over our lives, but a lot of people freaked out. (I highly recommend Knauss’s meta-analysis of the outrage.)

One of the most vehement critics was Evan Selinger, a Fellow at the Institute for Ethics and Emerging Technology, who wrote a takedown of Romantimatic for The Atlantic (“The Outsourced Lover”) and then a companion piece for Wired that linked Romantimatic to other apps that are “Turning Us into Sociopaths”:

While I am far from a Luddite who fetishizes a life without tech, we need to consider the consequences of this latest batch of apps and tools that remind us to contact significant others, boost our willpower, provide us with moral guidance, and encourage us to be civil. Taken together, we’re observing the emergence of tech that doesn’t just augment our intellect and lives — but is now beginning to automate and outsource our humanity. (emphasis in the original)

Note that word “outsource” again, the same word Foer used to describe how technology is taking over aspects of what we used to remember. The implications of “outsource” are not only pejorative, but carry strange class-based economic associations, as if by letting our phones take on certain tasks, we’re stealing jobs from the working masses and taking advantage of cheap labor overseas.

Of course, technology has always promised to make labor cheaper, or at least easier. From the invention of fire to the wheel, from washing machines to nail guns, the goal of technology has always been to lessen the load of physical labor, to “outsource” it, if you will. We don’t find outsourcing our physical labor to be problematic, though. What makes us uncomfortable is when technology encroaches on the labor of our brains rather than our bodies.

This despite the fact that technology has been supplementing our brains for at least as long as we’ve been making tools. Before the internet, before computers, before printing presses, books, or even writing of any kind, people stored information in their brains. But they did so using a primitive technology: most cultures stored their most important historical and spiritual information in the form of song and verse. Why? Because music and rhyme are easier to remember; they are the original brain augmentation technology.

Then, of course, we invented writing and everything went downhill. Socrates famously spoke out against the invention of writing, saying it would, “create forgetfulness in the learners’ souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves.” And now here we are, having to be reminded by our phones to send a sweet message to our sweethearts. Just as Socrates predicted, we’ve lost our humanity!

Except not. Because what is humanity? Isn’t the fact that we create these tools part of what makes us human? I’m reminded of a passage from the amazing essay “Mind vs. Machine” by Brian Christian (which later became a book) about an annual Turing Test, in which humans and computers compete to figure out who seems the most human.

Christian delves into the history of computers and points out that the word “computer” actually used to refer to human beings, most of them women, who did the hard work of calculating any sufficiently complex bit of data that needed to be calculated. When the first computer-like devices were invented, they were described as “digital computers”: digital versions of the human workers. Alan Turing himself said, “These machines are intended to carry out any operations which could be done by a human computer.”

Christian writes about the irony of this reversal:

Now it is “digital computer” that is not only the default term, but the literal one. In the mid–20th century, a piece of cutting-edge mathematical gadgetry was said to be “like a computer.” In the 21st century, it is the human math whiz who is “like a computer.” It’s an odd twist: we’re like the thing that used to be like us. We imitate our old imitators, in one of the strange reversals in the long saga of human uniqueness.

This is why I find it odd that apps like Romantimatic are accused of outsourcing our humanity. I’ve been trying out Romantimatic myself in the past couple of weeks. I chose to delete all the prefab text messages (which may have been a design flaw, though I appreciate their sense of humor). Instead, I use the app as an unscheduled, random reminder to think about my wife and tell her what I’m thinking. This does not rob me of my humanity. If anything, it stops me in the middle of my day and reminds me to think about something of greater importance, not unlike a brief meditation or prayer.

Technology is not our savior, ready to deliver some utopian future, but it does not have to be our enemy. It’s been with us since the beginning, from poetry to reminder apps. Far from making us less human, it can even reawaken us to our humanity in the midst of our mechanized, busy lives. We just have to learn how to use it.