The Price of Great Software

Everybody's writing about and linking to Jared Sinclair's blog post where he breaks down the sales figures of his RSS reader Unread, calculating that "the actual take-home pay from the combined sales of both apps is $21,000, or $1,750/month."

Considering the enormous amount of effort I have put into these apps over the past year, that’s a depressing figure. I try not to think about the salary I could earn if I worked for another company, with my skills and qualifications. It’s also a solid piece of evidence that shows that paid-up-front app sales are not a sustainable way to make money on the App Store.

Most of the commentary so far seems to be pretty pessimistic about what this means for the future of iOS app development. Either it's nearly impossible to make good money designing apps, or you have to make apps that take less work to build. As Benjamin Mayo puts it:

If you want to maximize your profitability, make small apps that do a few things well. The amount of effort you put into an app has very little to do with how much of the market will buy it. This means that making big apps exposes you to substantially more risk, which is not fairly counterbalanced by significantly higher earnings potential.

Marco Arment thinks the way forward is being more efficient:

As the economics get tighter, it becomes much harder to support the lavish treatment that developers have given apps in the past, such as full-time staffs, offices, pixel-perfect custom designs of every screen, frequent free updates, and completely different iPhone and iPad interfaces...Efficiency is key. And efficiency means doing more (or all) of the work yourself, writing a lot less custom code and UI, dropping support for older OSes, and providing less customer support.

But there's another option: simply charge more if your app is worth it, and charge for every major update. On a recent episode of Iterate, Joe Cieplinski of Bombing Brain describes how his company developed the premier teleprompter app for the iPad, and how it not only sold well at a relatively high price but went on to sell even better when they raised the price higher. People have been saying similar things about the Omni Group's pricing for years.

Brent Simmons recently made the case that most indie software developers in the Apple ecosystem make apps for the Mac. The implication is that Mac apps make more money, because developers typically charge more for them. Tyler Hall backs up this point from his own experience:

It’s my experience that you CAN build a sustainable software business selling to consumers as an independent developer. You just can’t do it in the App Store any longer – if you ever could. You need to start building for the Mac, where you can charge a fair price, sell directly to your customers, and charge for upgrades. Even then, success won’t happen overnight. But it is doable.

Of course, I'm just a user, not a developer, so this is all just speculation. But when I look at that sales chart for Unread, and see that huge spike in the first few days, I see myself and other people like me, people who love these beautifully designed, "hand-crafted" apps.

[Chart: Unread for iPhone daily sales, showing a large spike in the first few days after launch]

We aren't buying these apps on impulse. We're buying them because we read the reviews and we know what the best apps are, and we want to own the best. Maybe indie devs need to stop chasing the normals (who think everything should be free anyway) and just ask a fair price of the folks who care.

Why Teaching Innovations Don’t Spread in the US

I was initially turned off by the shame-inducing headline of this article by Elizabeth Green in the New York Times, Why Do Americans Stink at Math? But the answer to her question is actually surprising. The innovations in math education that have spread around the world, and that have shown remarkable success, actually started here in the United States. So why have those innovations failed to spread here? Because we choose not to invest in the professional development of our teachers.

In Finland and Japan, where students perform at or near the top in math assessments, teachers spend only about 600 hours a year in the classroom, using the rest of their time to prepare lessons, observe other teachers, and receive feedback on their own teaching. American teachers, by contrast, spend more than 1000 hours a year in the classroom, and have traditionally received almost no feedback from fellow teachers (though this is starting to change).

My wife taught middle school and high school for about ten years, and I have taught at the college level for the last five, and I'm consistently frustrated with the carrot-and-stick approach to improving our country's schools, as if bribing teachers with merit pay or threatening them with firing were the best ways to motivate them. In fact, most teachers I know are always striving to do better, even if they're already amazing teachers. What they need is the time and the support to actually improve their skills.

As Green writes:

Most policies aimed at improving teaching conceive of the job not as a craft that needs to be taught but as a natural-born talent that teachers either decide to muster or don’t possess. Instead of acknowledging that changes like the new math are something teachers must learn over time, we mandate them as “standards” that teachers are expected to simply “adopt.” We shouldn’t be surprised, then, that their students don’t improve.

(Via @stevenstrogatz)

The Staggering Scale of Minecraft

Five years in, Minecraft (the system) has bloomed into something bigger and more beautiful than any game studio — whether a tiny one like Markus Persson’s or a huge one like EA — could ever produce on its own. The scale of it is staggering; overwhelming. As you explore the extended Minecraft-verse online, you start to get the same oceanic feeling that huge internet systems like YouTube and Twitter often inspire: the mingling of despair (“I’ll never see it all”) with delight (“People made this”) with dizzying anthropic awe (“So… many… people.”)

What impressed me about Minecraft, from the moment I first saw my children playing it, was how its open-ended structure could result in such wildly different forms of play.

The first thing my son showed me was the complex, working roller coaster he'd constructed inside the game, which we could ride in a mining cart. Then my daughter invited me to see the house she'd built. She maneuvered the POV inside the door, and suddenly, dozens of eyes turned and looked at us from every direction. “These are my cats!” she announced. She'd stuffed her house to the brim with these pixelated creatures.

In other words, the same game served as both my son's virtual erector set and a virtual extension of my daughter's growing stuffed animal collection. And that was within the first week of them playing it.

(Via DF)

The Best Podcasting Apps for iPhone

When Marco Arment announced that he was making a podcast app, he was deliberately entering a crowded market. He said that he wanted to rid himself of his irrational fear of direct competition.

The crowded market doesn't seem to be hurting him. The reviews of his app Overcast in MacStories, Macworld, and elsewhere have all been great. But it's also important to note that the crowded market itself is not diluted by the number of independent developers entering it. On the contrary, the diversity of apps, and all the different approaches to the problem of podcast delivery, can serve to improve the quality of all those apps.

There's a post on the blog of Supertop (the developers of the podcast app Castro) about why they welcome the competition from Overcast.

From our perspective, a user trying any third party app is good for all third party apps. If a user is persuaded to download one alternative they should be more likely to consider others in the future, especially given the variety of apps that are available...I encourage you to try Overcast. In fact, if you really love podcasts, I encourage you to try all the others too.

I decided to do just that, purchasing and trying out Instacast, Downcast, Pocket Casts, Pod Wrangler, Castro, Mocast, and Overcast. I made the comparison video below to share what I observed.

I have no background in design, but perhaps because I have a degree in comparative literature, I find it endlessly fascinating to compare the interfaces of apps and savor the details of what each one offers. None of these apps is perfect, but it's inspiring to see how a group of talented humans use their skills in different ways, through the endlessly plastic medium of software, to approach the same problems.

The Genesis of Goodnight Moon

I loved Aimee Bender's appreciation of Goodnight Moon (via Kottke). She does a great job of describing how the book's genius lies in how it creates a structure and then deviates from it in surprising ways:

For writers, this is all such a useful reminder. Yes, move around in a structure. But also float out of that structure. “Goodnight nobody” is an author’s inspired moment that is inexplicable and moving and creates an unknown that lingers. How wonderful that this oddly compassionate moment, where even nobody gets a good night, shows up in the picture book that is the most popular! There is no template, ever.

I wrote a bit about Margaret Wise Brown years ago for The Writer's Almanac, and I was especially interested in how, in writing Goodnight Moon, she drew on her knowledge of the way children learn language.

Brown wanted to become a writer as a young woman, and she once took a creative writing class from Gertrude Stein. But she had a hard time coming up with story ideas, so she went into education. She got a job at an organization called the Bureau of Educational Experiments, researching the way that children learn to use language. What she found was that children in the earliest stage of linguistic development relish language with patterns of sound and fixed rhythms. She also found that young children have a special attachment to words for objects they can see and touch, like shoes and socks and bowls and bathtubs.

She eventually began to write books for children based on her research, and in 1938 she became the editor of a publishing house called William R. Scott & Company, which specialized in new children's literature. The Great Depression had made children's books into luxury items, and most other publishing houses had phased out children's literature. Margaret Wise Brown helped make children's books profitable, because she understood that children experience books as sensual objects. She invested in high quality color illustrations, and she printed her books on strong paper with durable bindings, so that children could grab, squeeze, and bite their books the way they did with all their toys.

Brown had been a fairly successful writer and editor for almost ten years when, one morning, she woke up and wrote a poem, listing the items in a house, and then saying goodnight to each item, including the famous lines “Goodnight room / Goodnight moon / Goodnight cow jumping over the moon … / Goodnight comb / And goodnight brush / Goodnight nobody / Goodnight mush. / And goodnight to the old lady whispering 'hush' / Goodnight stars / Goodnight air / Goodnight noises everywhere.” She thought the poem could be made into a book, so she sent it off to her publisher, and it was published in 1947 as Goodnight Moon.

The influential New York Public Library gave it a terrible review, and it didn't sell as well as some of Brown's other books in its first year. But parents were amazed at the book's almost hypnotic effect on children, its ability to calm them down before bed. Brown thought the book was successful because it helped children let go of the world around them piece by piece, just before turning out the light and falling asleep.

Parents recommended the book to each other, and it slowly became a word-of-mouth best-seller. It sold about 1,500 copies in 1953, 4,000 in 1955, 8,000 in 1960, 20,000 in 1970; and by 1990 the total number of copies sold had reached more than four million.

You Can Delete But You Can’t Forget

Jacqui Shine made a rash decision to delete all the email from her late mother; in a piece for the Atlantic, she writes about why the digital nature of that deletion is uniquely haunting.

In another way, though, those deleted emails do survive—or, at least, the data that Google has extracted from them in order to build your user profile has...Every time I get served an ad for a fawning book about the Founding Fathers or for a deviled egg tray, it’s a kind of tiny haunting: a palimpsest of what once was, stripped of what made it really meaningful. And those tiny traces may be the problem—not because they can’t allow us to recover the things we’ve lost, but because they allow us to believe that we can.

What Tech Offices Tell Us about the Future of Work

Kate Losse at Aeon magazine on the insidious effect of high-end, handcrafted office design in modern tech culture:

Of course, the remaking of the contemporary tech office into a mixed work-cum-leisure space is not actually meant to promote leisure. Instead, the work/leisure mixing that takes place in the office mirrors what happens across digital, social and professional spaces. Work has seeped into our leisure hours, making the two tough to distinguish. And so, the white-collar work-life blend reaches its logical conclusion with the transformation of modern luxury spaces such as airport lounges into spaces that look much like the offices from which the technocrat has arrived. Perhaps to secure the business of the new moneyed tech class, the design of the new Centurion Lounge for American Express card members draws from the same design palette as today’s tech office: reclaimed-wood panels, tree-stump stools, copious couches and a cafeteria serving kale salad on bespoke ceramic plates. In these lounges, the blurring of recreation and work becomes doubly disconcerting for the tech employee. Is one headed out on vacation or still at the office – and is there a difference?

Zoo Animals and Their Discontents

Alex Halberstadt, writing for the New York Times Magazine, on how modern zoo animals, despite being given better enclosures and more "enrichment," still suffer from mental health disorders.

I wondered, too, why disorders like phobias, depression and OCD, documented at zoos, don’t appear to have analogues among animals living in the wild. Irene Pepperberg, a comparative psychologist at Harvard who is known for her work with African gray parrots, told me that she thought one reason had to do with survival. “An animal in the wild can’t afford to be depressed,” Pepperberg said. “It will simply be killed or starve, since its environment requires constant vigilance. The situation kind of reminds me of my Jewish grandparents, whose lives were a lot harder than mine. They never seemed depressed, because I don’t think it ever occurred to them.”

In other words, we'd all be a lot happier if lions were actually trying to eat us.

Punctuated Equilibrium

Joe Pinsker, writing for the Atlantic about the fate of the apostrophe in the 21st century, points out how computers are actually preserving aspects of language we might otherwise be willing to let atrophy:

Autocorrect, the now-ubiquitous software that’s always reading over our shoulders, tends to put apostrophes in when we omit them—which means they might remain a feature of informal writing for longer than they otherwise would. The software may also prop up other formal conventions, among them capitalization and “silent” letters (like the u, g, and h that drop out as though becomes tho). “Autocorrect is acting like a language preservative,” says Alexander Bergs, a linguistics professor at Germany’s Osnabrück University. “Which is funny, as usually new media like cellphones and computers are blamed for language decay.”

Feeling More Comfortable with Computers

Tom Jacobs writing for Pacific Standard about a study of how patients feel when describing symptoms to a computer instead of a human:

The result: People disclosed information more honestly and openly when they were told they were speaking exclusively to the computer. The participants also “reported significantly lower fear of self-disclosure” under those circumstances. These results were reiterated by the analysis of their facial expressions, which found they “allowed themselves to display more intense expressions of sadness” when they believed no human was watching them.

This makes perfect sense to me. I can't remember a time when I felt too embarrassed to tell my doctor something, but I've definitely felt judged by doctors who acted as if I'd wasted their time with minor ailments. The feeling of judgment certainly affected how much I told them, and I know I'm not alone in this experience. I know a woman who once went to a doctor because she had recently experienced some weight loss and was having dizzy spells. Her impression was that the doctor assumed she was anorexic. He gave her a prescription for anti-dizziness pills, as if that were the problem she was trying to solve.

In a piece for the New Yorker about how doctors make decisions, Jerome Groopman wrote,

Doctors typically begin to diagnose patients the moment they meet them. Even before they conduct an examination, they are interpreting a patient’s appearance: his complexion, the tilt of his head, the movements of his eyes and mouth, the way he sits or stands up, the sound of his breathing. Doctors’ theories about what is wrong continue to evolve as they listen to the patient’s heart, or press on his liver. But research shows that most physicians already have in mind two or three possible diagnoses within minutes of meeting a patient, and that they tend to develop their hunches from very incomplete information.

Perhaps using computers for patient intake could improve both sides of the equation: putting the patient more at ease to share all the relevant information, and giving the doctor a fuller picture of that information before they start forming a premature diagnosis.

Too Much Delight?

Interesting take from Sean Madden at Wired on why the Amazon Fire Phone may be too delightful for its own good.

The average smartphone user interacts with his or her mobile device over 100 times per day, and the majority of those interactions fall into just a few categories: opening an app, selecting from a list, bringing up a keyboard, and so on. If each of them is imbued with too much visual whiz-bang, using your phone becomes the digital equivalent of eating birthday cake for every meal.

I would argue that it's not so much the frequency of the effect as the utility that matters. If the effect slows down the experience without offering anything other than eye candy, it's bad design. “Whiz-bang” is a sparkly coat of paint on the surface of the interface. Delight is the spark of life that lives inside the app, coded deep into its DNA.

Welcome Cracks in the Walled Garden

The first good sign was the opening video. Last year's video was a visually pleasing but somewhat abstract statement of purpose about Apple's design principles. The message seemed to be, "We're Apple. We know design. Learn from us." This year, the video focused on people talking about apps and how they improve people's lives. The content wasn't amazing, but the contrast was stark. Apple often takes a moment to nod toward the importance of developers, but this felt bigger than that. Rather than focusing inward on their own expertise, Apple was focusing outward on the people who build on their platforms and use their products. The video ended with the words "thank you" addressed directly to developers. I'm not sure how this went over in the room, but as a user who feels deep gratitude for the apps I use every day, I felt like that thank you was long overdue. And that was just the beginning. Apple spent the rest of the keynote demonstrating this new outward focus by tearing down walls.

Critics of the company love to toss around terms like "walled garden" in reference to Apple's paternalistic approach to interoperability. It's a fair criticism, especially when it comes to iOS. The App Store, sandboxing, and iCloud each put their own restrictions on how users can access software and data. But another way to see it is that Apple has always been a device-centric rather than a data-centric company.

Other players in the computer industry always saw a sharp divide between the hardware and the software, but Apple has always tried to take a holistic view, controlling as much of both the hardware and the software as possible. This approach only increased with iOS, which gave Apple even greater control of what software could be loaded onto the device, how applications could communicate with each other, and what users could (and couldn't) customize about their experience. That level of control made iPhones and iPads more approachable than any computing devices had ever been before. And Apple's device-centric approach filtered down to developers, who made apps that themselves felt like mini-virtual devices, each with their own unique designs, powers, and solutions.

But over time, that device-centric approach has felt more and more limiting. Every time you tap on a notification and get pulled sideways into a new app, or you tap "Open In..." and find yourself flung in a different direction, you feel your head bump against the walls of the walled garden. Apple wants to hide the file system because ordinary users find it confusing, but is it really less confusing to open a photo in a photo editing app, make changes, and then have to export the result to the Photos app as an entirely new photo?

Apple has finally decided to tackle these problems, making the walls of its many walled gardens rather more porous. The most obvious of these changes is a new iCloud document picker, which will allow iOS apps to select a file, edit it, and save it back without creating a second copy. This is the closest Apple has come to introducing a real file system to iOS, and without a demo, it remains to be seen what it will actually look like, but the keynote mentioned that iCloud will not be the only storage option for this document picker. Theoretically, customers could choose Google Drive, OneDrive, or even Dropbox.
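To make that a little more concrete, here is a rough, speculative sketch of what opening a document "in place" might look like for a developer, written against the UIDocumentPickerViewController API that shipped with iOS 8. The view controller, the plain-text UTI, and the delegate behavior are my own illustrative assumptions, not anything Apple showed on stage.

```swift
import UIKit

// A speculative sketch: let the user pick a plain-text file from iCloud Drive
// (or, in theory, another provider) and edit it where it lives, instead of
// importing a second copy into this app's sandbox.
class NotesViewController: UIViewController, UIDocumentPickerDelegate {

    func pickDocument() {
        let picker = UIDocumentPickerViewController(
            documentTypes: ["public.plain-text"],  // UTI for plain text files
            inMode: .Open)                         // open the original, don't copy it
        picker.delegate = self
        presentViewController(picker, animated: true, completion: nil)
    }

    // Called when the user chooses a file in the picker.
    func documentPicker(controller: UIDocumentPickerViewController,
                        didPickDocumentAtURL url: NSURL) {
        // Changes written to this URL go back to the original file,
        // not to a duplicate inside the app.
        print("Picked document at \(url)")
    }
}
```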

Other changes include interactive notifications, such as the ability to accept appointment requests, respond directly to messages, and apparently take action on third-party notifications (though the only example was Facebook). So instead of having to bounce over to the app in question, entering its little garden, you can just interact with the information itself wherever you are. Another example is third-party widgets in the Today view of Notification Center (something that carries over to the Mac). Again, you'd be interacting with the data of an app or a feature of an app without having to enter the app itself. And HealthKit and HomeKit, which were touted as rumors in the run-up to the keynote, were described as aggregators of data from other apps. The data, liberated from its silos, can be collected, examined, and given new meaning.
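On the developer side, the plumbing for those interactive notifications is the UIUserNotificationAction API introduced with iOS 8. Here is a minimal sketch of registering a "Reply" button that can be handled without launching the app; the identifiers and the messaging example are hypothetical, not taken from the keynote.

```swift
import UIKit

// A minimal sketch of an iOS 8-style actionable notification:
// a "Reply" action the user can invoke right from the banner,
// without being bounced into the app. Identifiers are hypothetical.
func registerNotificationActions(application: UIApplication) {
    let replyAction = UIMutableUserNotificationAction()
    replyAction.identifier = "REPLY_ACTION"
    replyAction.title = "Reply"
    replyAction.activationMode = .Background   // handle the tap without opening the UI
    replyAction.authenticationRequired = false

    let messageCategory = UIMutableUserNotificationCategory()
    messageCategory.identifier = "MESSAGE_CATEGORY"
    messageCategory.setActions([replyAction], forContext: .Default)

    let categories: Set<UIUserNotificationCategory> = [messageCategory]
    let settings = UIUserNotificationSettings(
        forTypes: [.Alert, .Badge, .Sound],
        categories: categories)
    application.registerUserNotificationSettings(settings)
}
```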

Apple also pulled down the walls between iOS devices and the Mac. There's a new feature called "Continuity," which gives devices a variety of ways to share data more fluidly. You will be able to use AirDrop to send data between Mac and iOS. You can also "hand off" tasks from one device to the next. Writing an email on your phone? Easily switch to writing it on your Mac. Get a call or an SMS on your phone? View the call or the message on your Mac. Each of these features frees the computing task at hand from its confinement to one specific app or one specific device.
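For developers, Handoff is built on NSUserActivity. Here is a small sketch of how an email app might advertise a draft in progress so another device can pick it up; the reverse-DNS activity type and the userInfo keys are hypothetical examples of mine.

```swift
import Foundation

// A small sketch of advertising a Handoff activity with NSUserActivity,
// so an email draft started on the iPhone can be continued on the Mac.
// The activity type string and keys are hypothetical examples.
func startDraftingEmail(subject: String, body: String) -> NSUserActivity {
    let activity = NSUserActivity(activityType: "com.example.mailapp.compose")
    activity.title = "Compose: \(subject)"
    activity.userInfo = ["subject": subject, "body": body]
    activity.becomeCurrent()   // broadcast this activity to nearby devices
    return activity
}
```

On the receiving device, the counterpart app would implement the corresponding continuation callback to pull the subject and body back out and restore the draft.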

And finally, the feature on almost everyone's iOS wish list came true. Apple introduced "Extensibility," a kind of inter-app communication that will allow apps to open up instances of each other's UI to take certain actions. The example Apple showed was opening a photo in the Photos app and using a filter from another installed app without leaving Photos. It isn't clear yet whether one third-party app will be able to interact with another third-party app, but that was the implication.

The larger implication is that developers can now begin to think about apps either as powerful stand-alone pieces of software or as extensions of other pieces of software. I don't really want to buy any more apps that let me add filters to my photos, but I might buy an extension to my main photo editing app that gives me extra features.

Power users are no doubt cheering all of these additions. For me, what's really exciting is not the features in themselves (though I am excited to try them) but the apparent change in philosophy, the willingness to trust the users and the developers. With iOS 7, Apple seemed to be saying that people are comfortable enough with touch interfaces that they don't need skeuomorphic designs anymore to make them feel comfortable. With iOS 8, Apple seems to be saying that people are comfortable enough with the various data they manage through their devices and their apps. That data can now begin to move more fluidly between those devices and apps.

Recently, in "Sharing the Ecosystem" I wrote,

I find it fitting that the number one request on most people’s lists for iOS 8 is better sharing of information between apps. What Apple needs is better sharing, period. Healthy ecosystems are all about sharing. “Better can’t be better if it doesn’t consider everything.” Just as Tim Cook sees the value in sustaining the world’s ecosystem, he needs to see the value in sustaining the developer ecosystem. It’s those developers who can provide the real return on investment, making both his products, and the world, better.

I came away from the keynote feeling that Tim Cook understands this. He chose to begin the keynote with a thank you to developers, and he ended it by asking all the Apple employees in the audience to stand up to receive recognition. For the last two decades, Apple was defined by one man's vision, even if there were many people behind that vision. Tim Cook wants to celebrate all the people working to make Apple better. I have rarely felt more hopeful about the company.

The Origin of "Don't Be Evil"

When my wife was in graduate school to get a master's degree in education, she took a class about how to teach students of different cultures without racial bias. Near the end of the class, one of her classmates said of the textbook they'd been reading, "You know, this book should just be called, 'Don't Be a Dick.' And all the pages could be blank."

I thought of that story recently while reading Steven Levy's book about Google, In the Plex, which includes the origin story of Google's infamous company motto, "Don't Be Evil." It's common these days for bloggers and journalists to point out all the ways in which Google falls short of the ideal expressed in that motto. So it was surprising, for me at least, to learn that the motto actually started as a kind of joke, not unlike the joke my wife's classmate made about not being a dick.

According to Steven Levy, Google held a meeting in 2001 to try to nail down its corporate values. Stacy Sullivan, the head of human resources, stood at the front of the room with a giant notepad, writing down platitudes like, "Google will strive to honor all its commitments." But engineer Paul Buchheit thought the whole thing was absurd.

Levy writes,

Paul Buchheit was thinking, This is lame. Jawboning about citizenship and values seemed like the kind of thing you do at a big company. He’d seen enough of that at his previous job at Intel. At one point the chipmaker had given employees little cards with a list of values you could attach to your badge. If something objectionable came up you were to look at your little corporate values card and say, “This violates value number five.” Lame. “That whole thing rubbed me the wrong way,” Buchheit later recalled. “So I suggested something that would make people feel uncomfortable but also be interesting. It popped into my mind that ‘Don’t be evil’ would be a catchy and interesting statement. And people laughed. But I said, ‘No, really.’”

The slogan made Stacy Sullivan uncomfortable. It was so negative. “Can’t we phrase it as ‘Do the right thing’ or something more positive?” she asked. Marissa and Salar agreed with her. But the geeks—Buchheit and Patel—wouldn’t budge. “Don’t be evil” pretty much said it all, as far as they were concerned. They fought off every attempt to drop it from the list.

“They liked it the way it was,” Sullivan would later say with a sigh. “It was very important to engineering that they were not going to be like Microsoft, they were not going to be an evil company.”

I just love the fact that the motto did not originate out of some wide-eyed idealism. Instead, it was an attempt to cut through the whole bullshit concept of "corporate values." It's no wonder the company has had trouble living up to that ideal. "Don't Be Evil" is the implicit motto of every idealistic company before it gets mired in the messy, morally compromised world of actually making money.

Better Living (and Less Anxiety) through Software

It was truly a pleasure to be a guest on Brett Terpstra's podcast Systematic this week. He's had some amazingly interesting folks on the show lately, so I just hope I measure up. We talked about my background in radio and then segued into the topic of anxiety and technology.

Fittingly, I began feeling anxious almost as soon as we finished the Skype call. Not that it wasn't a good conversation, but there was one part where I felt I could have explained myself a lot better. I had been talking about a turning point in my life, when I started my second and last job in public radio.

My first job in radio was writing for a show called The Writer's Almanac, and I was good at it, despite the fact that the show's host was notoriously demanding. In my first three years writing for the show, three producers quit, along with several other writers who either quit or got fired. I was finally the only one left standing, so I became the sole writer and producer, and I persisted for two more years. The day I left, they said I should get a plaque for lasting as long as I did. I thought this constituted evidence of my competence.

And yet, when I moved to a different job on a different radio show, I suddenly felt like the least competent person in the world. This was especially confusing because the new job should have been easier. I was no longer the sole writer and producer of a show; I was just one associate producer on a team. I only had to write bits and pieces of script, do occasional research, write the occasional blog post, answer listener emails, book guests, and help edit audio. None of these tasks was high stakes. It should have been a breeze. But it nearly killed me.

Part of the problem was multitasking. At my previous job, I'd been doing one thing at a time. Write this script. Now write that script. I did most of my work from home in a quiet room. I was allowed to focus.

At my new job, I was always juggling multiple projects: researching the next guest, proofreading the latest script, writing a promo, editing tape. I had always relied on my memory to keep track of my to-do list (I rarely wrote down homework assignments in high school or even studied for tests, and still did well), but my memory completely failed me in this new work environment. I began to worry all the time about whether I had forgotten something. Had I booked that guest for the right time? Had I checked the time zone? Did I fact check that script sufficiently? Should I read it through one more time?

Another problem was the office environment. I worked in a cubicle, with team members all around me. There was little space or time to focus deeply on anything. We were all expected to be on email all the time, injecting our thoughts into one another's brains at will. One of my tasks was to respond to listener email, and every Monday we got a flood of responses to our show, both tremendously positive and viciously negative. And if there had been any factual errors in the show, the listeners would take us to task, and the host would not be happy. I began to dread the weekend, imagining the army of potential attackers amassing and hurling their spears into cyberspace, each blow landing in my inbox on Monday morning.

The result of all this anxiety was that I found it harder and harder to concentrate. I began to make the mistakes I so feared making. Which only made me worry more. I started waking up every night at 3:00 AM, unable to get back to sleep, my mind racing with everything I needed to worry about. Then I started waking up at 2:00 AM. Then 1:00 AM. Then Midnight. If this had continued, I would have started waking up before I went to sleep.

If you have not experienced severe depression or anxiety, you might find it hard to understand how physical an illness it really is. I did not just feel sick in my head. Every cell in my body felt scraped out and raw. I had no patience for my children. I had no energy to help my wife around the house. Imagine how you feel when you realize something horrible is about to happen: you forgot the essential thing you need for that important meeting, your car is sliding on the ice, or your child is falling headfirst off the jungle gym in slow motion. Now imagine feeling that kind of dread every waking moment for weeks on end.

That was me at my lowest point. I kept asking myself, "Why can't I do this? This shouldn't be so hard. What's wrong with me?"

In the interview with Brett, I alluded to something I read once that compared depression to a fever (unfortunately, the author was the now-discredited Jonah Lehrer, but I still find the article persuasive). In response to an infection, the body raises its own temperature as a way of killing off the infection. Depression, likewise, raises the frequency of negative "ruminative" thoughts. Psychiatrists have typically seen these kinds of thoughts as part of the problem, but some believe depression may be the body's way of forcing you to focus on what's wrong in your life in order to change it.

Imagine, for instance, a depression triggered by a bitter divorce. The ruminations might take the form of regret (“I should have been a better spouse”), recurring counterfactuals (“What if I hadn’t had my affair?”) and anxiety about the future (“How will the kids deal with it? Can I afford my alimony payments?”). While such thoughts reinforce the depression — that’s why therapists try to stop the ruminative cycle — Andrews and Thomson wondered if they might also help people prepare for bachelorhood or allow people to learn from their mistakes. “I started thinking about how, even if you are depressed for a few months, the depression might be worth it if it helps you better understand social relationships,” Andrews says. “Maybe you realize you need to be less rigid or more loving. Those are insights that can come out of depression, and they can be very valuable.”

Of course, it's important to note that while a fever can help rid your body of germs, it can also kill you. I don't know what might have happened to me if I hadn't talked to a doctor at the time. Medication was definitely part of my recovery. It helped reduce my symptoms so that I could see the root cause of the problem: this was not the right job for me.

So I quit, and took a couple months off before I started my next job. In that time, I realized two things. First, I wanted to learn how to be more organized. Second, I wanted to make time for the kind of deep focus creative work that gave my life real meaning. That was five years ago, and I've managed to accomplish both of those goals, largely with the help of software.

There's been some talk lately about whether software tools actually provide any benefit, and whether software design is solving real problems. But for me, every time I dump my mind into OmniFocus, or add an event to Fantastical, or forward an email with attachments to Evernote, or set a reminder in Due, I feel a little more in control of my life. I can much more easily manage my job as a college writing teacher, juggling multiple projects, multiple classes, lesson planning, grading, committee meetings, department responsibilities, and so on.

Keeping my life more organized also makes it possible to have a clear head when I want to focus on something important. One of my goals after quitting my job was to write a novel, and I finally made time for it. The app Scrivener helped me break the novel down into manageable pieces, and for the first time in my life, writing fiction felt enjoyable rather than fraught. More recently, I was inspired by the power of the app Editorial to start writing this website (and have written almost every post with it).

Of course, there's a danger here. Buying a new notebook and a fancy pen does not make you a writer. Making a to-do list is not an actual accomplishment. Tools are not the end goal, and using a tool, no matter how well-designed, does not make hard work any easier. But the right tool can provide an important cue to help create a habit or build a ritual for doing the actual work.

Software has improved my life by making the work feel more possible, creating virtual spaces where I feel less anxious. And the less anxious I feel, the more I feel capable of doing the work that matters, and the more I feel alive.

The Illusion of Power

I love this Rolling Stone interview with George R.R. Martin, which goes a long way toward explaining why the Game of Thrones books (i.e., A Song of Ice and Fire) are so much more than escapist fiction. I read them as a sword-and-sorcery version of The Wire, with a Hobbesian view of power as the central theme. As Martin says,

One of the central questions in the book is Varys' riddle: The rich man, the priest and the king give an order to a common sellsword. Each one says kill the other two. So who has the power? Is it the priest, who supposedly speaks for God? The king, who has the power of state? The rich man, who has the gold? Of course, doesn't the swordsman have the power? He's the one with the sword – he could kill all three if he wanted. Or he could listen to anyone. But he's just the average grunt. If he doesn't do what they say, then they each call other swordsmen who will do what they say. But why does anybody do what they say? This is the fundamental mystery of power and leadership and war through all history....It's all based on an illusion.

Most people familiar with the books or the TV show remember the dramatic deaths of various characters best, but for me, one of the most powerful scenes in any of the books (mild spoiler from book/season 1) was the moment Ned Stark and Cersei face each other down after the death of the king. Ned holds the king's seal, which he claims gives him the power to rule. Cersei claims the power belongs to her and her son, the heir to the throne. The room is filled with armed guards, who have to decide whom to follow. What makes the scene so dramatic is that Ned and Cersei have no real power. They have no weapons to wield but words. All their power flows from the people around them who choose to believe they have power.

For some reason, that scene lays bare the illusion of power better than almost anything I've ever read. I think about it all the time, in department meetings at the college where I teach, at campus events when the president of my college gives a speech, even when I watch the President of the United States on TV. It reminds me of something the physicist and author Janna Levin said on a radio show where I used to work, about how her cosmological view of the universe sometimes gives her a strange perspective on our race of primates and the ways we organize ourselves on this tiny planet:

You know, for me, it's so absurd, because it's so small and it's so — this funny thing that this one species is acting out on this tiny planet in this huge, vast cosmos. Of course, I take very seriously our voting process and I'm, you know, very, try to be politically conscious. But sometimes, when I think about it, I have to laugh that we're all just agreeing to respect this agreement that this person has been elected for something. And that is really a totally human construct that we could turn around tomorrow and all choose to behave differently. We're animals that organize in a certain way. So it's not that I completely dismiss it or don't take it seriously, but I think a lot of the things we are acting out are these animalistic things that are consequences of our instincts. And they aren't, in some sense, as meaningful to me as the things that will live on after our species comes and goes.

Sharing the Ecosystem

Tim Cook got a lot of attention back in February when he was challenged at a shareholder meeting to explain Apple’s commitment to green energy initiatives. A conservative group of shareholders had put forward a proposal asking Apple to focus only on initiatives that had a clear ROI (return on investment). According to a report in Mac Observer, Tim Cook grew visibly angry at the suggestion:

When we work on making our devices accessible by the blind, I don’t consider the bloody ROI…. If you want me to do things only for ROI reasons, you should get out of this stock.

Cook underlined his commitment to the environment again this past week by providing the voiceover for Apple’s promotional video Better, about Apple’s use of solar and wind energy, among other environmentally friendly practices. But it’s worth noting the difference in the message. At the shareholder meeting, Cook seemed to be saying that he doesn’t care about return on investment – doesn’t care about profits – when it comes to doing things that are just right. But in the video he keeps repeating the word “better” in reference both to Apple’s products and Apple’s commitment to the environment. It’s not that he doesn’t care about return on investment; it’s that he’s enlarging the very meaning of the term.

Better. It’s a powerful word and a powerful ideal. It makes us look at the world and want more than anything to change it for the better, to innovate, improve, to reinvent, to make it better. It’s in our DNA. And better can’t be better if it doesn’t consider everything. Our products. Our values.

If Tim Cook hadn’t gotten so angry at that guy at the shareholder meeting, he might have explained that profits are only one return on the investment. If you’re the most valuable company in the world, and you’re not concerned about the impact of your company on the environment, you’re not playing the long game. We all share the same ecosystem. Investing in that ecosystem is investing in the future. It might not look like a profitable investment, but it could yield immeasurable returns.

So I’m heartened by Apple’s apparent commitment to the environmental ecosystem, but I wish they had the same attitude toward their software ecosystem.

I know the history of that ecosystem from my vantage point as a user. I switched to a Mac in 2007, and as much as I loved the hardware, I discovered pretty quickly that the real advantage was the software, and not just the software made by Apple. Independent developers who cared about design, who wanted to make software for individuals rather than the enterprise, had been using Macs and writing Mac software for years. Using those applications for the first time, I began to see software as a kind of art form in and of itself.

The iPhone and the App Store brought that art form to the masses. By creating a place where customers could easily, and without fear, download any number of apps, Apple made software mainstream. Before that, most customers only bought software when they bought their computers, preloaded with an office suite and maybe one or two more apps. The iPhone, and later the iPad, provided both the perfect delivery mechanism and the perfect medium for software, because the entire device itself changed based on whatever software had just been launched.

The result was that Apple managed to cultivate what I’d argue was the richest software ecosystem in the history of computing. Which is why it’s so strange that Apple now seems to be on the cusp of letting that ecosystem wither. It’s no secret that the App Store is broken, that developers are having a harder and harder time making good money. Marco Arment has been talking about this for a long time, most clearly when he made his case this past fall that paid-upfront apps are dead. Ben Thompson wrote a series of blog posts around the same time at Stratechery, laying out the reason why Apple is motivated to drive down the cost of apps (and why it's a big-picture mistake).

Apple makes money on hardware. It’s in their interest that said hardware be sold for as much of a premium as the market will bear. However, it’s equally in their interest that the complements to that hardware are sold as cheaply as possible, and are preferably free….In the case of apps, the current app store, full of a wide variety of inexpensive apps, is perfect from Apple’s perspective. It’s a reason to buy Apple hardware, and that’s all that matters. Anything that on the surface makes the store less desirable for hardware buyers – such as more expensive apps – is not in Apple’s interest.

This is bloody ROI thinking. In its latest commercials, with iPads on mountaintops and iPhones on motorcycles, Apple wants to trade on the power of its devices to do amazing things. But software is what gives those devices their power. And Apple is doing very little to help sustain the people who create that software, let alone give them the respect and the gratitude they deserve. As Justin Williams recently said about his trip to Microsoft's developer conference:

What’s different though is that it feels like Microsoft is more interested in working with us as a partner whereas Apple has always given off a vibe of just sort of dealing with us because they have to.

I find it fitting that the number one request on most people’s lists for iOS 8 is better sharing of information between apps. What Apple needs is better sharing, period. Healthy ecosystems are all about sharing. “Better can’t be better if it doesn’t consider everything.” Just as Tim Cook sees the value in sustaining the world’s ecosystem, he needs to see the value in sustaining the developer ecosystem. It’s those developers who can provide the real return on investment, making both his products, and the world, better.