Socially Unproductive Attachments

A fascinating but creepy piece by Julie Beck in the Atlantic on the growing sex doll industry:

In her Ph.D. dissertation, [Cynthia Ann] Moya questions why there is something uniquely perverse about owning a sex doll. As she puts it, “A better spatula does not inspire lengthy monologues about human alienation and the reifying effects of technological mechanization on our lifestyles.” Sexuality is an appetite, not unlike hunger, but we treat the devices used to satisfy that appetite differently. If the doll owners aren’t hurting anyone, why should we condemn something that is basically just fancy masturbation?

But sex dolls do retain something of an ick-factor, even as vibrators and other sex toys have become more mainstream. That’s because the dolls are tied up with questions about gender and power in a way that spatulas (and even vibrators) are not.

According to [Dr. Marquard] Smith, any sort of non-reproductive sexual behavior has historically been seen as perverse. These days, though, many people are okay with sex that isn’t reproductive. We’re less okay with emotional attachments that aren’t socially productive.

The Power of a Camera

In light of the fiasco in Ferguson, Derek Thompson has written a piece for the Atlantic online about a small technological tool that could dramatically improve the relationship between police and the policed:

In 2012, Rialto, a small city in California's San Bernardino County, outfitted its police officers with small Body Cams to be worn at all times and record all working hours. The $900 cameras weighed 108 grams and were small enough to fit on each officer's collar or sunglasses. They recorded full-color video for up to 12 hours, which was automatically uploaded at the end of each shift, where it could be held and analyzed in a central database. 

When researchers studied the effect of cameras on police behavior, the conclusions were striking. Within a year, the number of complaints filed against police officers in Rialto fell by 88 percent and "use of force" fell by 59 percent. “When you put a camera on a police officer, they tend to behave a little better, follow the rules a little better,” Chief William A. Farrar, the Rialto police chief, told the New York Times. “And if a citizen knows the officer is wearing a camera, chances are the citizen will behave a little better.”

Two Kinds of Memory

Annie Murphy Paul writing for Slate on the difference between electronic and organic memory and the evolving uses of each in fields like healthcare:

The second insight that emerges from a close look at electronic and organic memory is that E-memory is good for invariant storage, while O-memory is good for elaborated connections. If we make note of an upcoming appointment in our smartphone, its digital calendar won’t misremember the date or time, as our all-too-fallible brains are apt to do. On the other hand, if we enter the germ of an idea in our phone’s note-taking app, we won’t return after a busy weekend or a good night’s sleep to find that the idea has grown new connections and layers of meaning, as an idea planted in our organic memory is likely to do.

This explains why I find apps like Fantastical and Due to be so essential in aiding my leaky memory, but whenever I capture an idea in an app like Evernote, I rarely think about that idea again. The act of capturing gives me an excuse to forget. Perhaps someone should create a note-taking app (if it doesn't already exist) specifically designed for capturing ideas, which then periodically reminds you to think about those ideas.

The Best RSS App for the iPad

I wrote a review of iPad RSS apps for The Sweet Setup. My favorite was Unread, because it's the best app for simply reading your feeds, rather than endlessly processing them, but they each have great features.

In the course of the review, I also decided to make another design comparison video, showing the different features of the three big players in the RSS app space on the iPad: Reeder, Mr. Reader, and Unread.

No Permission Necessary

Love this bit from Kevin Kelly's piece You Are Not Late:

Right now, today, in 2014 is the best time to start something on the internet. There has never been a better time in the whole history of the world to invent something. There has never been a better time with more opportunities, more openings, lower barriers, higher benefit/risk ratios, better returns, greater upside, than now.

(via Shawn Blanc)

Kelly's main point is that the future of the internet still holds many surprises and innovations to come, but really, his statement would be true at any point in time. It reminds me of what Ira Glass said at the end of his recent Lifehacker interview:

Don't wait for permission to make something that's interesting or amusing to you. Just do it now. Don't wait.

And Glass's advice further echoes the advice of Radiolab co-host Robert Krulwich, who said in a commencement address years ago:

Suppose, instead of waiting for a job offer from the New Yorker, suppose next month, you go to your living room, sit down, and just do what you love to do. If you write, you write. You write a blog. If you shoot, find a friend, someone you know and like, and the two of you write a script. You make something. No one will pay you. No one will care. No one will notice, except of course you and the people you’re doing it with. But then you publish, you put it on line, which these days is totally doable, and then… you do it again.

I wrote about this a while back in a blog post about the future of blogging.

I had [Krulwich's] words in mind when I started my blog six months ago, and I’ve had them in mind whenever I think I should be pitching one of my blog posts to an online publication like Slate or Salon or The Magazine. I’d like to get paid for what I write, but there’s something wonderfully satisfying about owning and controlling my own work. I also don’t want to wait to see if someone will publish it. I want to publish, and see if the audience comes to me.

The remarkable thing about the internet is that you don't have to wait, you don't need anyone's permission to put your creative work out in the world, you can just do it.

So do it.

The Price of Great Software

Everybody's writing about and linking to Jared Sinclair's blog post where he breaks down the sales figures of his RSS reader Unread, calculating that "the actual take-home pay from the combined sales of both apps is $21,000, or $1,750/month."

Considering the enormous amount of effort I have put into these apps over the past year, that’s a depressing figure. I try not to think about the salary I could earn if I worked for another company, with my skills and qualifications. It’s also a solid piece of evidence that shows that paid-up-front app sales are not a sustainable way to make money on the App Store.

Most of the commentary so far seems to be pretty pessimistic about what this means for the future of iOS app development. Either it's nearly impossible to make good money designing apps, or you have to make apps that take less work to design. As Benjamin Mayo puts it:

If you want to maximize your profitability, make small apps that do a few things well. The amount of effort you put into an app has very little to do with how much of the market will buy it. This means that making big apps exposes you to substantially more risk, which is not fairly counterbalanced by significantly higher earnings potential.

Marco Arment thinks the way forward is being more efficient:

As the economics get tighter, it becomes much harder to support the lavish treatment that developers have given apps in the past, such as full-time staffs, offices, pixel-perfect custom designs of every screen, frequent free updates, and completely different iPhone and iPad interfaces...Efficiency is key. And efficiency means doing more (or all) of the work yourself, writing a lot less custom code and UI, dropping support for older OSes, and providing less customer support.

But there's another option: simply charge more if your app is worth it, and charge for every major update. On a recent episode of Iterate, Joe Cieplinski of Bombing Brain describes how his company developed the premier teleprompter app for the iPad, and how it not only sold well at a relatively high price but went on to sell even better when they raised the price higher. People have been saying similar things about the Omni Group's pricing for years.

Brent Simmons recently made the case that most indie software developers in the Apple ecosystem make apps for the Mac. The implication is that Mac apps make more money, because developers typically charge more for them. Tyler Hall backs up this point from his own experience:

It’s my experience that you CAN build a sustainable software business selling to consumers as an independent developer. You just can’t do it in the App Store any longer – if you ever could. You need to start building for the Mac, where you can charge a fair price, sell directly to your customers, and charge for upgrades. Even then, success won’t happen overnight. But it is doable.

Of course, I'm just a user, not a developer, so this is all just speculation. But when I look at that sales chart for Unread, and see that huge spike in the first few days, I see myself and other people like me, people who love these beautifully designed, "hand-crafted" apps.

[Chart: Unread for iPhone sales over time, showing a large spike in the first few days]

We aren't buying these apps on impulse. We're buying them because we read the reviews and we know what the best apps are, and we want to own the best. Maybe indie devs need to stop chasing the normals (who think everything should be free anyway) and just charge a fair price to the folks who care.

The Staggering Scale of Minecraft

Five years in, Minecraft (the system) has bloomed into something bigger and more beautiful than any game studio — whether a tiny one like Markus Persson’s or a huge one like EA — could ever produce on its own. The scale of it is staggering; overwhelming. As you explore the extended Minecraft-verse online, you start to get the same oceanic feeling that huge internet systems like YouTube and Twitter often inspire: the mingling of despair (“I’ll never see it all”) with delight (“People made this”) with dizzying anthropic awe (“So… many… people.”)

What impressed me about Minecraft, from the moment I first saw my children playing it, was how its open-ended structure could result in such wildly different forms of play.

The first thing my son showed me was the complex, working roller coaster he'd constructed inside the game, which we could ride in a mining cart. Then my daughter invited me to see the house she'd built. She maneuvered the POV inside the door, and suddenly, dozens of eyes turned and looked at us from every direction. “These are my cats!” she announced. She'd stuffed her house to the brim with these pixelated creatures.

In other words, the same game served as both my son's virtual erector set and as a virtual extension of my daughter's growing stuffed animal collection. And that was within the first week of them playing it.

(Via DF)

The Best Podcast Apps for iPhone

When Marco Arment announced that he was making a podcast app, he was deliberately entering a crowded market. He said that he wanted to rid himself of his irrational fear of direct competition.

The crowded market doesn't seem to be hurting him. The reviews of his app Overcast in MacStories, Macworld, and elsewhere have all been great. But it's also important to note that the crowded market itself is not diluted by the number of independent developers entering it. On the contrary, the diversity of apps, and all the different approaches to the problem of podcast delivery, can serve to improve the quality of all those apps.

There's a post on the blog of Supertop (the developers of the podcast app Castro) about why they welcome the competition from Overcast.

From our perspective, a user trying any third party app is good for all third party apps. If a user is persuaded to download one alternative they should be more likely to consider others in the future, especially given the variety of apps that are available...I encourage you to try Overcast. In fact, if you really love podcasts, I encourage you to try all the others too.

I decided to do just that, purchasing and trying out Instacast, Downcast, Pocket Casts, Pod Wrangler, Castro, Mocast, and Overcast. I made the comparison video below to share what I observed.

I have no background in design, but perhaps because I have a degree in comparative literature, I find it endlessly fascinating to compare the interface of apps and savor the details of what each one offers. None of these apps is perfect, but it's inspiring to see how a group of talented humans use their skills in different ways, through the endlessly plastic medium of software, to approach the same problems.

You Can Delete But You Can’t Forget

Jacqui Shine made a rash decision to delete all the email from her late mother; in a piece for the Atlantic, she writes about why the digital nature of that deletion is uniquely haunting.

In another way, though, those deleted emails do survive—or, at least, the data that Google has extracted from them in order to build your user profile has...Every time I get served an ad for a fawning book about the Founding Fathers or for a deviled egg tray, it’s a kind of tiny haunting: a palimpsest of what once was, stripped of what made it really meaningful. And those tiny traces may be the problem—not because they can’t allow us to recover the things we’ve lost, but because they allow us to believe that we can.

What Tech Offices Tell Us about the Future of Work

Kate Losse at Aeon magazine on the insidious effect of high-end, handcrafted office design in modern tech culture:

Of course, the remaking of the contemporary tech office into a mixed work-cum-leisure space is not actually meant to promote leisure. Instead, the work/leisure mixing that takes place in the office mirrors what happens across digital, social and professional spaces. Work has seeped into our leisure hours, making the two tough to distinguish. And so, the white-collar work-life blend reaches its logical conclusion with the transformation of modern luxury spaces such as airport lounges into spaces that look much like the offices from which the technocrat has arrived. Perhaps to secure the business of the new moneyed tech class, the design of the new Centurion Lounge for American Express card members draws from the same design palette as today’s tech office: reclaimed-wood panels, tree-stump stools, copious couches and a cafeteria serving kale salad on bespoke ceramic plates. In these lounges, the blurring of recreation and work becomes doubly disconcerting for the tech employee. Is one headed out on vacation or still at the office – and is there a difference?

Welcome Cracks in the Walled Garden

The first good sign was the opening video. Last year's video was a visually pleasing but somewhat abstract statement of purpose about Apple's design principles. The message seemed to be, "We're Apple. We know design. Learn from us." This year, the video focused on people talking about apps and how they improve people's lives. The content wasn't amazing, but the contrast was stark. Apple often takes a moment to nod toward the importance of developers, but this felt bigger than that. Rather than focusing inward on their own expertise, Apple was focusing outward on the people who build on their platforms and use their products. The video ended with the words "thank you" addressed directly to developers. I'm not sure how this went over in the room, but as a user who feels deep gratitude for the apps I use every day, I felt like that thank you was long overdue. And that was just the beginning. Apple spent the rest of the keynote demonstrating this new outward focus by tearing down walls.

Critics of the company love to toss around terms like "walled garden" in reference to Apple's paternalistic approach to interoperability. It's a fair criticism, especially when it comes to iOS. The App Store, sandboxing, and iCloud each put their own restrictions on how users can access software and data. But another way to see it is that Apple has always been a device-centric rather than a data-centric company.

Other players in the computer industry always saw a sharp divide between the hardware and the software, but Apple has always tried to take a holistic view, controlling as much of both the hardware and the software as possible. This approach only increased with iOS, which gave Apple even greater control of what software could be loaded onto the device, how applications could communicate with each other, and what users could (and couldn't) customize about their experience. That level of control made iPhones and iPads more approachable than any computing devices had ever been before. And Apple's device-centric approach filtered down to developers, who made apps that themselves felt like mini-virtual devices, each with their own unique designs, powers, and solutions.

But over time, that device-centric approach has felt more and more limiting. Every time you tap on a notification and get pulled sideways into a new app, or you tap "Open In" and find yourself flung in a different direction, you feel your head bump against the walls of the walled garden. Apple wants to hide the file system because ordinary users find it confusing, but is it really less confusing to open a photo in a photo editing app, make changes, and then have to export the result as an entirely new photo to the Photos app?

Apple has finally decided to tackle these problems, making the walls of its many walled gardens rather more porous. The most obvious of these changes is a new iCloud document picker, which will allow iOS apps to select a file and then save it back without creating a second copy. This is the closest Apple has come to introducing a real file system to iOS, and without a demo, it remains to be seen what it will actually look like, but the keynote mentioned that iCloud will not be the only storage option for this document picker. Theoretically, customers could choose Google Drive, OneDrive, or even Dropbox.

Other changes include interactive notifications, such as the ability to accept appointment requests, respond directly to messages, and apparently take action on third party notifications (though the only example was Facebook). So instead of having to bounce over to the app in question, entering its little garden, you can just interact with the information itself wherever you are. Another example is third party widgets in the Today view of Notification Center (something that carries over to the Mac). Again, you'd be interacting with the data of an app or the feature of an app without having to enter the app itself. And HealthKit and HomeKit, which had circulated as rumors in the run-up to the keynote, were described as aggregators of data from other apps. The data, liberated from its silos, can be collected, examined, and given new meaning.

Apple also pulled down the walls between iOS devices and the Mac. There's a new feature called "Continuity," which gives devices a variety of ways to share data more fluidly. You will be able to use AirDrop to send data between Mac and iOS. You can also "hand off" tasks from one device to the next. Writing an email on your phone? Easily switch to writing it on your Mac. Get a call or an SMS on your phone? View the call or the message on your Mac. Each of these features frees the computing task at hand from its confinement to one specific app or one specific device.

But finally, the feature on almost everyone's iOS wish list came true. Apple introduced "Extensibility," a kind of inter-app communication that would allow apps to open up instances of each other's UI to take certain actions. The example Apple showed was of opening a photo in the Photos app and being able to use a filter from another installed app without leaving Photos. It isn't clear yet whether one third party app will be able to interact with another third party app, but that was the implication.

The larger implication is that developers can now begin to think about apps as either stand-alone powerful pieces of software or as extensions of other pieces of software. I don't really want to buy any more apps that let me add filters to my photos, but I might buy an extension to my main photo editing app that gives me extra features.

Power users are no doubt cheering all of these additions. For me, what's really exciting is not the features in themselves (though I am excited to try them) but the apparent change in philosophy, the willingness to trust the users and the developers. With iOS 7, Apple seemed to be saying that people are comfortable enough with touch interfaces that they don't need skeuomorphic designs anymore to make them feel comfortable. With iOS 8, Apple seems to be saying that people are comfortable enough with the various data they manage through their devices and their apps. That data can now begin to move more fluidly between those devices and apps.

Recently, in "Sharing the Ecosystem" I wrote,

I find it fitting that the number one request on most people’s lists for iOS 8 is better sharing of information between apps. What Apple needs is better sharing, period. Healthy ecosystems are all about sharing. “Better can’t be better if it doesn’t consider everything.” Just as Tim Cook sees the value in sustaining the world’s ecosystem, he needs to see the value in sustaining the developer ecosystem. It’s those developers who can provide the real return on investment, making both his products, and the world, better.

I came away from the keynote feeling that Tim Cook understands this. He chose to begin the keynote with a thank you to developers, and he ended it by asking all the Apple employees in the audience to stand up to receive recognition. For the last two decades, Apple was defined by one man's vision, even if there were many people behind that vision. Tim Cook wants to celebrate all the people working to make Apple better. I have rarely felt more hopeful about the company.

Better Living (and Less Anxiety) through Software

It was truly a pleasure to be a guest on Brett Terpstra's podcast Systematic this week. He's had some amazingly interesting folks on the show lately, so I just hope I measure up. We talked about my background in radio and then segued into the topic of anxiety and technology.

Fittingly, I began feeling anxious almost as soon as we finished the Skype call. Not that it wasn't a good conversation, but there was one part where I felt I could have explained myself a lot better. I had been talking about a turning point in my life, when I started my second and last job in public radio.

My first job in radio was writing for a show called The Writer's Almanac, and I was good at it, despite the fact that the show's host was notoriously demanding. In my first three years writing for the show, three producers quit, along with several other writers who either quit or got fired. I was finally the only one left standing, so I became the sole writer and producer, and I persisted for two more years. The day I left, they said I should get a plaque for lasting as long as I did. I thought this constituted evidence of my competence.

And yet, when I moved to a different job on a different radio show, I suddenly felt like the least competent person in the world. This was especially confusing because the new job should have been easier. I was no longer the sole writer and producer of a show; I was just one associate producer within a team. I only had to write bits and pieces of script, do occasional research, write the occasional blog post, answer listener emails, book guests, and help edit audio. None of these tasks was high stakes. It should have been a breeze. But it nearly killed me.

Part of the problem was multitasking. At my previous job, I'd been doing one thing at a time. Write this script. Now write that script. I did most of my work from home in a quiet room. I was allowed to focus.

At my new job, I was always juggling multiple projects: researching the next guest, proofreading the latest script, writing a promo, editing tape. I had always relied on my memory to keep track of my to-do list (I rarely wrote down homework assignments in high school or even studied for tests, and still did well), but my memory completely failed me in this new work environment. I began to worry all the time about whether I had forgotten something. Had I booked that guest for the right time? Had I checked the time zone? Did I fact check that script sufficiently? Should I read it through one more time?

Another problem was the office environment. I worked in a cubicle, with team members all around me. There was little space or time to focus deeply on anything. We were all expected to be on email all the time, injecting our thoughts into one another's brains at will. One of my tasks was to respond to listener email, and every Monday we got a flood of responses to our show, both tremendously positive and viciously negative. And if there had been any factual errors in the show, the listeners would take us to task, and the host would not be happy. I began to dread the weekend, imagining the army of potential attackers amassing and hurling their spears into cyberspace, each blow landing in my inbox on Monday morning.

The result of all this anxiety was that I found it harder and harder to concentrate. I began to make the mistakes I so feared making. Which only made me worry more. I started waking up every night at 3:00 AM, unable to get back to sleep, my mind racing with everything I needed to worry about. Then I started waking up at 2:00 AM. Then 1:00 AM. Then Midnight. If this had continued, I would have started waking up before I went to sleep.

If you have not experienced severe depression or anxiety, you might find it hard to understand how physical an illness it really is. I did not just feel sick in my head. Every cell in my body felt scraped out and raw. I had no patience for my children. I had no energy to help my wife around the house. Imagine how you feel when you realize something horrible is about to happen: you forgot the essential thing you need for that important meeting, your car is sliding on the ice, or your child is falling head first off the jungle gym in slow motion. Now imagine feeling that kind of dread every waking moment for weeks on end.

That was me at my lowest point. I kept asking myself, "Why can't I do this? This shouldn't be so hard. What's wrong with me?"

In the interview with Brett, I alluded to something I read once that compared depression to a fever (unfortunately, the author was the now-discredited Jonah Lehrer, but I still find the article persuasive). In response to an infection, the body raises its own temperature as a way of killing off the infection. Depression, likewise, raises the frequency of negative "ruminative" thoughts. Psychiatrists have typically seen these kinds of thoughts as part of the problem, but some believe depression may be the body's way of forcing you to focus on what's wrong in your life in order to change it.

Imagine, for instance, a depression triggered by a bitter divorce. The ruminations might take the form of regret (“I should have been a better spouse”), recurring counterfactuals (“What if I hadn’t had my affair?”) and anxiety about the future (“How will the kids deal with it? Can I afford my alimony payments?”). While such thoughts reinforce the depression — that’s why therapists try to stop the ruminative cycle — Andrews and Thomson wondered if they might also help people prepare for bachelorhood or allow people to learn from their mistakes. “I started thinking about how, even if you are depressed for a few months, the depression might be worth it if it helps you better understand social relationships,” Andrews says. “Maybe you realize you need to be less rigid or more loving. Those are insights that can come out of depression, and they can be very valuable.”

Of course, it's important to note that while a fever can help rid your body of germs, it can also kill you. I don't know what might have happened to me if I hadn't talked to a doctor at the time. Medication was definitely part of my recovery. It helped reduce my symptoms so that I could see the root cause of the problem: this was not the right job for me.

So I quit, and took a couple months off before I started my next job. In that time, I realized two things. First, I wanted to learn how to be more organized. Second, I wanted to make time for the kind of deep focus creative work that gave my life real meaning. That was five years ago, and I've managed to accomplish both of those goals, largely with the help of software.

There's been some talk lately about whether software tools actually provide any benefit, and whether software design is solving real problems. But for me, every time I dump my mind into OmniFocus, or add an event to Fantastical, or forward an email with attachments to Evernote, or set a reminder in Due, I feel a little more in control of my life. I can much more easily manage my job as a college writing teacher, juggling multiple projects, multiple classes, lesson planning, grading, committee meetings, department responsibilities, and so on.

Keeping my life more organized also makes it possible to have a clear head when I want to focus on something important. One of my goals after quitting my job was to write a novel, and I finally made time for it. The app Scrivener helped me break the novel down into manageable pieces, and for the first time in my life, writing fiction felt enjoyable rather than fraught. More recently, I was inspired by the power of the app Editorial to start writing this website (and have written almost every post with it).

Of course, there's a danger here. Buying a new notebook and a fancy pen does not make you a writer. Making a to-do list is not an actual accomplishment. Tools are not the end goal, and using a tool, no matter how well-designed, does not make hard work any easier. But the right tool can provide an important cue to help create a habit or build a ritual for doing the actual work.

Software has improved my life by making the work feel more possible, creating virtual spaces where I feel less anxious. And the less anxious I feel, the more I feel capable of doing the work that matters, and the more I feel alive.

Sharing the Ecosystem

Tim Cook got a lot of attention back in February when he was challenged at a shareholder meeting to explain Apple’s commitment to green energy initiatives. A conservative group of shareholders had put forward a proposal asking Apple to focus only on initiatives that had a clear ROI (return on investment). According to a report in Mac Observer, Tim Cook grew visibly angry at the suggestion:

When we work on making our devices accessible by the blind, I don’t consider the bloody ROI….If you want me to do things only for ROI reasons, you should get out of this stock.

Cook underlined his commitment to the environment again this past week by providing the voiceover for Apple’s promotional video Better, about Apple’s use of solar and wind energy, among other environmentally friendly practices. But it’s worth noting the difference in the message. At the shareholder meeting, Cook seemed to be saying that he doesn’t care about return on investment – doesn’t care about profits – when it comes to doing things that are just right. But in the video he keeps repeating the word “better” in reference both to Apple’s products and Apple’s commitment to the environment. It’s not that he doesn’t care about return on investment; it’s that he’s enlarging the very meaning of the term.

Better. It’s a powerful word and a powerful ideal. It makes us look at the world and want more than anything to change it for the better, to innovate, improve, to reinvent, to make it better. It’s in our DNA. And better can’t be better if it doesn’t consider everything. Our products. Our values.

If Tim Cook hadn’t gotten so angry at that guy at the shareholder meeting, he might have explained that profits are only one return on the investment. If you’re the most valuable company in the world, and you’re not concerned about the impact of your company on the environment, you’re not playing the long game. We all share the same ecosystem. Investing in that ecosystem is investing in the future. It might not look like a profitable investment, but it could yield immeasurable returns.

So I’m heartened by Apple’s apparent commitment to the environmental ecosystem, but I wish they had the same attitude toward their software ecosystem.

I know the history of that ecosystem from my vantage point as a user. I switched to a Mac in 2007, and as much as I loved the hardware, I discovered pretty quickly that the real advantage was the software, and not just the software made by Apple. Independent developers who cared about design, who wanted to make software for individuals rather than the enterprise, had been using Macs and writing Mac software for years. Using those applications for the first time, I began to see software as a kind of art form in and of itself.

The iPhone and the App Store brought that art form to the masses. By creating a place where customers could easily, and without fear, download any number of apps, Apple made software mainstream. Before that, most customers only bought software when they bought their computers, preloaded with an office suite and maybe one or two more apps. The iPhone, and later the iPad, provided both the perfect delivery and the perfect medium for software, because the entire device itself changed based on whatever software had just been launched.

The result was that Apple managed to cultivate what I’d argue was the richest software ecosystem in the history of computing. Which is why it’s so strange that Apple now seems to be on the cusp of letting that ecosystem wither. It’s no secret that the App Store is broken, that developers are having a harder and harder time making good money. Marco Arment has been talking about this for a long time, most clearly when he made his case this past fall that paid-upfront apps are dead. Ben Thompson wrote a series of blog posts around the same time at Stratechery, laying out the reason why Apple is motivated to drive down the cost of apps (and why it's a big-picture mistake).

Apple makes money on hardware. It’s in their interest that said hardware be sold for as much of a premium as the market will bear. However, it’s equally in their interest that the complements to that hardware are sold as cheaply as possible, and are preferably free….In the case of apps, the current app store, full of a wide variety of inexpensive apps, is perfect from Apple’s perspective. It’s a reason to buy Apple hardware, and that’s all that matters. Anything that on the surface makes the store less desirable for hardware buyers – such as more expensive apps – is not in Apple’s interest.

This is bloody ROI thinking. In its latest commercials, with iPads on mountain tops and iPhones on motorcycles, Apple wants to trade on the power of its devices to do amazing things. But software is what gives those devices their power. And Apple is doing very little to help sustain the people who create that software, let alone give them the respect and the gratitude they deserve. As Justin Williams recently said about his trip to the Microsoft developer’s conference:

What’s different though is that it feels like Microsoft is more interested in working with us as a partner whereas Apple has always given off a vibe of just sort of dealing with us because they have to.

I find it fitting that the number one request on most people’s lists for iOS 8 is better sharing of information between apps. What Apple needs is better sharing, period. Healthy ecosystems are all about sharing. “Better can’t be better if it doesn’t consider everything.” Just as Tim Cook sees the value in sustaining the world’s ecosystem, he needs to see the value in sustaining the developer ecosystem. It’s those developers who can provide the real return on investment, making both his products, and the world, better.

Talking About iPads and Real Work

Shawn Blanc and I were apparently on a similar wavelength yesterday, responding to Lukas Mathis's thoughtful piece about Windows 8 and the shortcomings of iPad productivity. I love Blanc's point about how those of us trying to use devices like iPads for "real work" and "real creativity" aren't just nerds. We are nerds, no doubt, but we're also helping shape what those devices are capable of.

The Affordance of Intimacy

The latest iPad commercial struck me as overwrought when it first came out. You know the one, with the mountain climbers, the scuba divers, the documentary filmmaker on the precipice of a waterfall, and a voiceover by Robin Williams from "Dead Poets Society," talking about poetry and what it means to be alive. It's not a terrible commercial. But unlike the holiday iPhone commercial, which showcased how ordinary people, even anti-social teenagers, can do extraordinary things with technology, the iPad commercial seemed to be about how extraordinary people can do extraordinary things with technology, especially if they have extraordinarily protective or specially designed cases for their iPads (and plenty of AppleCare).

But then I listened to John Gruber on his most recent Talk Show podcast. He was talking to Joanna Stern about her piece in the Wall Street Journal, arguing that tablet computers still aren't good for doing "real" work, like working with documents, Microsoft Office, etc. Articles on this subject seem to be a trend.

We've all spent the last 15-20 years using computers to work with documents, usually Microsoft Office documents, so we've come to see that as the primary productive purpose of computers. In a piece about the Surface Pro 2, Lukas Mathis recently detailed all the ways a simple task like writing a job application is easier on a PC than an iPad, how you can have a webpage open as you're writing, grab images and easily embed them into the document, look at a friend's emailed suggestions alongside what you're writing, all the way up to the choice of file formats for the final product:

...you might want to export your letter and CV as PDFs, maybe combine them into a single PDF, or maybe ZIP them. You want to attach the resulting file to an email. It’s reasonably simple on a Mac or PC, but I’m not sure if some of these things are even possible on an iPad.

All of this is true. These are the productivity strengths of the PC: the multi-window, multitasking, multi-file-formatting abilities. But the question isn't whether the iPad is better at any of these activities. The question is whether the iPad is better at any productive activities. And why do we care?

Which brings me back to John Gruber's podcast. Discussing the iPad commercial with Joanna Stern, Gruber made a point that hadn't occurred to me before about what kinds of "work" can be done with a tablet computer.

[That commercial] shows that a lot, if not most, of the things that you could call work or creation that you can do on tablets are things that weren't really good or still aren't good for doing with the laptop. It's new things, right? One of the examples, there's a hockey team and they've got some kind of app and they're using the camera and showing this, and they've got like a play or something, and the guy can draw on the screen...It seems totally natural that the coach is there on the ice with an iPad in his hand, and it would look ridiculous if he was there holding a laptop.

The operative phrase there is "in his hand." When Steve Jobs gave the first public demonstration of the iPad, he famously began the demo by sitting back in a comfortable chair. For some commentators at the time, this signaled the fact that the iPad was a "lean back" rather than a "lean forward" device. Hence the continuing debate about content consumption versus content creation. But it's important to remember the first thing Steve Jobs said as he was sitting down in that chair: "It's so much more intimate than a laptop."

Designers like to talk about affordances, the property of an object that encourages a specific kind of action. Levers afford pulling, knobs afford twisting, buttons afford pushing, and so on. I am not a designer, but I first learned of the concept of affordances in the field of education. Educational psychologists argue that most behavior in a classroom, both good and bad, is the result of affordances. If you wander around the room checking students' homework and you don't give the students anything to do, you shouldn't be surprised if the class descends into chaos. You afforded that behavior.

What makes the iPad stand out from other tablet computers, and what makes it so much more appealing, is that it was designed with intimacy in mind. And I think we're just on the cusp of discovering how that intimacy affords different kinds of behaviors, different kinds of creativity and productivity.

To give just one example from my own life: I left my job as a public radio producer several years ago and took a job teaching writing. My college serves a large population of West African immigrants, many of whom came to this country as refugees, so there are numerous language issues I have to work with in their writing. I determined early on that writing comments on their papers by hand was too difficult. I couldn't fit my chicken scratch words legibly between the lines, and I often ran out of space in the margins.

So I started having them turn in all their papers digitally. That way, I could use Microsoft Word (and eventually Pages) to track changes and insert comments digitally. I even developed keyboard shortcuts so that I could insert certain comments with a single keystroke. This digital system felt more efficient, because I could type faster than I could write, and I didn't have to deal with so much paper.

But there were also certain drawbacks. The process of grading papers felt less humane somehow, like I was merely at the controls of a machine, cranking out widgets. I also didn't love the look of my printed comments: gray boxes with skeletal lines tying them back to the students' original words. My students were often confused about which comments referred to which words.

So recently, I decided to see if I could grade my students' papers entirely with an iPad. I bought Readdle's PDF Expert based on Federico Viticci's review in MacStories, bought myself a decent stylus (since replaced with this one), converted all my students' papers to PDF documents, and got to work.

In his book "The Hand: How Its Use Shapes the Brain, Language, and Human Culture," the neurologist Frank R. Wilson writes,

When personal desire prompts anyone to learn to do something well with the hands, an extremely complicated process is initiated that endows the work with a powerful emotional charge...Indeed, I would go further: I would argue that any theory of human intelligence which ignores the interdependence of hand and brain function, the historic origins of that relationship, or the impact of that history on developmental dynamics in modern humans, is grossly misleading and sterile.

As someone who hasn't enjoyed writing in longhand since I was about ten years old, I was frankly shocked by how different the grading experience felt when I began to annotate my students' words directly on the screen. Somehow, using my hand more directly made all the difference. Not only could I reach out with my pen, circle, and underline, the way I would have on paper, but I could instantly erase and start again, and even zoom in to impossibly small spaces, and then back out again to see the whole document. And if I wanted to use text instead of handwriting, I could just tap in the column and type, or even dictate my words.

When my students got their papers back, they said my comments were much easier to understand, because most of them were written directly beneath the words to which they referred. It seems like a small thing, but the effects matter. Students who had come to this country as refugees were learning how to write better thanks to the tiny words I could scrawl directly on the screen of this device.

The iPad also freed me from my desk. I could still grade at a desk if I wanted, but I could also sit in an easy chair or curl up on the couch. I even spent a whole Sunday morning (one of our recent double-digit subzero days in Minnesota) grading in bed.

Which leads me to the biggest difference: how I felt about the process. I didn't dread grading the way I used to. It felt less like grinding away at a machine and more like a creative act. The iPad still allowed me to capture my students' work digitally, so there wasn't a mess of papers, but it also engendered a renewed intimacy. By taking my fingers off the keyboard, putting the screen in my hands, and creating that slightly more intimate space, the iPad has turned my interaction with my students' words from an act of digital drudgery into an act of communication.

Can the iPad still improve? Become more powerful? More versatile? Better at inter-app communication? Am I jealous of Lukas Mathis's experience with the Surface Pro's stylus? Of course. But the first thing Apple got right, the most important thing, was how it feels. It's such a small distance from typing in front of a screen to holding the screen in your hands, but something new happens when you reduce that distance. I, for one, am excited to see how developers begin to harness the power that intimacy affords.

Artificial Communication

After watching "Her," the new Spike Jonze movie about a man falling in love with an artificially intelligent operating system, I got in my car, started it up, and then briefly held my thumb down on the home button of my phone. The phone emitted a cheerful, questioning double beep. "Tell my wife," I said, "'I'm on my way home.'" The phone parsed my words into a text message. A woman's voice asked, "Are you ready to send it?" I was.

It's easy to see the movie as an exaggeration of my interaction with Siri, to argue that our current fixation with technology could lead down a slippery slope to imaginary relationships with artificially intelligent beings like Samantha, the Scarlett Johansson-voiced operating system from the movie. Several articles (like this one) have linked the movie to a famous chatbot named ELIZA, created at MIT in the late sixties, which used vaguely empathetic questions to create the illusion of a conversation with human users. Joseph Weizenbaum, the creator of the chatbot, later wrote,

I was startled to see how quickly and how very deeply people conversing with [it] became emotionally involved with the computer and how unequivocally they anthropomorphized it. Once my secretary, who had watched me work on the program for many months and therefore surely knew it to be merely a computer program, started conversing with it. After only a few interchanges with it, she asked me to leave the room.

I expect most people to find it sad, or even disturbing, that humans could be so easily duped by technology. Sherry Turkle (whose theories about how technology is driving us apart may not be supported by the evidence) has written of her horror at observing people interacting with robots.

One of the most haunting experiences during my research came when I brought one of these robots, designed in the shape of a baby seal, to an elder-care facility, and an older woman began to talk to it about the loss of her child. The robot seemed to be looking into her eyes. It seemed to be following the conversation. The woman was comforted.

That final sentence is meant to fill you with dread. The usual narrative about technology in Western culture, going back at least as far as Mary Shelley's "Frankenstein," is that technology makes a lot of promises, but those promises, at best, prove empty. And at worst, they will give rise to monsters that viciously murder everyone we care about. I've written about this before.

The problem with this narrative is that it conflates and denigrates forms of technology that have, in fact, very little to do with each other. My smartphone is addictive (and maddening) not because it listens to me or simulates empathy, but because it can be so many things. I could use it to check my email, Twitter, Facebook, my RSS reader, my Instapaper queue, Flipboard, Tumblr, Instagram. I could also add an item to my to-do list, write a journal entry, write a blog post, take a picture, listen to a podcast, read a book. And just as the device can be many things, so it reminds me that I can be many things: an employee, a teacher, a spouse, a friend, a family member, a reader, a photographer, a writer. I can feel it pulsing with obligations in my pocket. I sometimes find myself flipping through apps, and potential identities, the way I used to flip through TV channels. All that possibility can be overwhelming.

When Steve Jobs introduced the iPhone, he famously said it was three devices: a wide-screen iPod, a revolutionary phone, and a break-through internet communicator. And if you watch the video of that introduction, everyone cheers the idea of a revolutionary phone, not so much an "internet communicator." Of course, as others have pointed out, it was the internet communicator that was the real revolution. And in many ways, it's the phone that's been left behind.

Which is why it's significant that Joaquin Phoenix's character interacts with Samantha, his operating system, through a kind of high fidelity phone call. So much of what feels clumsy and alien about our experience of computers is how we communicate with them. What if that communication became entirely familiar, as familiar as a real conversation? This "input method" of a phone call also removes the need for a screen. Instead of staring at a device, Joaquin Phoenix spends much of the movie staring at the world. And even more importantly, rather than presenting an endless array of possibilities, Samantha unifies those possibilities into one experience, the experience of her company.

You can argue about whether such an artificially intelligent operating system would turn out well for humanity in real life, and I don't want to give anything away about the movie, but if a human being derived meaning from such a relationship, I don't see how that meaning is any less relevant, any less meaningful, simply because it's a relationship with something "artificial." Humans have always derived meaning from artificial things. As Brian Christian writes in a piece about "Her" for The New Yorker's "Page-Turner" blog, the original technology that messed with our heads was language itself.

As both an author and a lover of literature, I would be a hypocrite to condemn too strongly the power of indirect or one-way intimacy. I run the disembodied thoughts of some other mind through my own, like code, and feel close to someone else, living or dead, while risking nothing, offering nothing. And yet the communion, I would argue, is real. Books themselves are perhaps the first chatbots: long-winded and poor listeners, they nonetheless have the power to make the reader feel known, understood, challenged, spurred to greatness, not alone.

Writing, drama, printing, photography, motion pictures, recorded music, typewriters, word processors, the internet: all have at various times been called enemies of culture, even of humanity. But the fact is that technology is part of our culture, part of our humanity. Of course there's the potential that we could get lost in the rabbit hole of communicating with an artificially intelligent being, but would that be any better or worse than getting lost in Netflix TV show marathons or Minecraft expeditions? Or, for that matter, spending one's life reading the classics of literature?

What I loved about "Her" was how it depicted an imaginary relationship with technology that was neither utopian nor dystopian. It was just problematic. Like any passionate, fiery relationship.

Humanity and Technology

Upon the launch of David Pogue's new Yahoo Tech site, I was initially excited, as I had long been wishing for a different kind of tech journalism. The initial word coming out of the CES announcement was that the new site would try to inject a little more humanity into tech coverage. All to the good, I thought.

But then I looked at the site, and found a series of articles about "What the heck is bitcoin?" "How the internet is blowing your mind!" "How to keep your kids safe on Facebook," and "Why selfies are the end of civilization as we know it!" I'm paraphrasing, but only slightly.

In a paroxysm of disgust, I butted (perhaps rudely) into a Twitter exchange Jason Snell and Stephen Hackett were having about the new site.

@ismh @jsnell Tech journalists need criticism, but Yahoo tech is the disease, not the cure.

— Rob Mcginley Myers (@robmcmyers) January 7, 2014

Jason Snell, a writer I very much admire, did not agree.

@robmcmyers @ismh I'd say that simplifies it far too much. Less coverage of VC investors and more practicality is not a bad concept.

— Jason Snell (@jsnell) January 7, 2014

He's right, of course. But the execution of that concept depends entirely upon your definition of "practicality." I agree that the problem with much of technology journalism is that instead of covering technology, it's covering the technology business. This is why there are so many articles about market share and profit share, whether Apple or Google is winning at any given moment, why BlackBerry is dying and why Microsoft is fading in relevance.

I find most of that stuff tremendously boring. I'm not a VC funder or an investor, I'm just fascinated by technology, and I want to read thoughtful coverage of it, not coverage of the money it makes or doesn't make. The problem with Yahoo Tech is that it goes too far in the other direction. It's full of articles about quirky apps and products ("Computerized Jacket Visibly Shows Your Excitement Whenever You Eat Chocolate," "This Digital Whale Will Follow Your Mouse Pointer Around"), 5 most important these things, 5 most intriguing those things, 5 steps to accomplishing this other thing.

Maybe "normals" will care about and click on this stuff, but the reason it feels like a "disease" to me is that it spreads the notion that technology is mostly frivolous, there to entertain or distract us briefly before we get back to doing something important.

So it's refreshing to be reading a series of pieces this week that actually inject what I think of as "humanity" into tech journalism. First there was Shawn Blanc's piece on how the iPad has changed his grandfather's relationship to his family.

My Grandpa’s iPad has enabled him to do something that he’s been unable to do for as long as I can remember. The 9.7-inch touch screen has turned my Grandpa into a photographer.

Then there was Federico Viticci's beautiful story of how he bought his first iPod and his first Mac, and how it changed his life.

As the world is wishing a happy 30th birthday to the Mac, I think about my first iPod and I realize just how important Apple's halo effect has been for my generation. Perhaps I was going to buy a Mac anyway eventually because I was too fed up with Windows, but the iPod made me curious, excited, and, more importantly, a loyal and satisfied customer. The Mac made me eager to learn more about Mac apps and the people who were making them, so I decided to write about it and somehow I had a job again and I've met so many great people along the way, every doubt and criticism was worth it.

Finally, there's John Siracusa's piece about the introduction of the Mac, which he calls "the single most important product announcement of my life." I love that the image that he associates most strongly with the computer is the image of the team of humans that built it.

It wasn’t just the product that galvanized me; it was the act of its creation. The Macintosh team, idealized and partially fictionalized as it surely was in my adolescent mind, nevertheless served as my north star, my proof that knowledge and passion could produce great things.

This is the "humanity" we need in tech journalism. How humans strive through technology to make great things, and how humans are affected by those great things that have been made. More of that please.