The Power of a Camera

In light of the fiasco in Ferguson, Derek Thompson has written a piece for the Atlantic online about a small technological tool that could dramatically improve the relationship between police and the policed:

In 2012, Rialto, a small city in California's San Bernardino County, outfitted its police officers with small Body Cams to be worn at all times and record all working hours. The $900 cameras weighed 108 grams and were small enough to fit on each officer's collar or sunglasses. They recorded full-color video for up to 12 hours, which was automatically uploaded at the end of each shift, where it could be held and analyzed in a central database. 

When researchers studied the effect of cameras on police behavior, the conclusions were striking. Within a year, the number of complaints filed against police officers in Rialto fell by 88 percent and "use of force" fell by 59 percent. “When you put a camera on a police officer, they tend to behave a little better, follow the rules a little better,” Chief William A. Farrar, the Rialto police chief, told the New York Times. “And if a citizen knows the officer is wearing a camera, chances are the citizen will behave a little better.”

Two Kinds of Memory

Annie Murphy Paul writing for Slate on the difference between electronic and organic memory and the evolving uses of each in fields like healthcare:

The second insight that emerges from a close look at electronic and organic memory is that E-memory is good for invariant storage, while O-memory is good for elaborated connections. If we make note of an upcoming appointment in our smartphone, its digital calendar won’t misremember the date or time, as our all-too-fallible brains are apt to do. On the other hand, if we enter the germ of an idea in our phone’s note-taking app, we won’t return after a busy weekend or a good night’s sleep to find that the idea has grown new connections and layers of meaning, as an idea planted in our organic memory is likely to do.

This explains why I find apps like Fantastical and Due to be so essential in aiding my leaky memory, but whenever I capture an idea in an app like Evernote, I rarely think about that idea again. The act of capturing gives me an excuse to forget. Perhaps someone should create a note-taking app (if it doesn't already exist) specifically designed for capturing ideas, which then periodically reminds you to think about those ideas.

The Best RSS App for the iPad

I wrote a review of iPad RSS apps for The Sweet Setup. My favorite was Unread, because it's the best app for simply reading your feeds rather than endlessly processing them, but each of the apps has great features.

In the course of the review, I also decided to make another design comparison video, showing the different features of the three big players in the RSS app space on the iPad: Reeder, Mr. Reader, and Unread.

Some Measure of Innocence

In a piece for the New York Times Magazine, Mark O'Connell writes beautifully about how having children can give you a whole new perspective on the world and its dangers:

Having a child feels like returning some measure of innocence to the world, and this is wonderful in its way; but we are talking here about a world with an exceptionally poor track record in its dealings with innocence. Unforgivably, perhaps, I think of this much more frequently now than I ever did before deciding to bring a child — this particular child — into the world.

Reading this piece, I was reminded of a conversation I had with my best friend from high school not long after my second child was born. My friend did not have kids yet, and wasn't even in a serious relationship, so I was trying to explain to him how it felt, and I remember saying that it's like you've had this clock running your whole life, counting down the days until the next thing happens and then the next thing, the years of school, getting a driver's license, going to college, getting a job, getting married, and so on. And all the while, you're imagining your future.

But when you have a child, suddenly you start a new clock, and you begin to re-experience and re-anticipate all those same experiences. And the worst part of it is that you begin to imagine this new future, not your future but your child's future, and all the precarious possibilities that future could bring that your child doesn't even know about yet, from war to pandemics to global warming.

O'Connell's essay also reminded me of something the writer George Saunders once said in a radio interview (which I've been unable to track down). He was talking about the day one of his children was born, probably his first, and he remembered looking down at this infinitely innocent, infinitely helpless being in his hands and thinking about how all human beings on this planet were once that innocent and that helpless, and maybe if people could remember that, remember the innocence and helplessness we're all born with, the world wouldn't be such a cruel place.

No Permission Necessary

Love this bit from Kevin Kelly's piece You Are Not Late:

Right now, today, in 2014 is the best time to start something on the internet. There has never been a better time in the whole history of the world to invent something. There has never been a better time with more opportunities, more openings, lower barriers, higher benefit/risk ratios, better returns, greater upside, than now.

(via Shawn Blanc)

Kelly's main point is that the future of the internet still holds many surprises and innovations to come, but really, his statement would be true at any point in time. It reminds me of what Ira Glass said at the end of his recent Lifehacker interview:

Don't wait for permission to make something that's interesting or amusing to you. Just do it now. Don't wait.

And Glass's advice further echoes that of Radiolab co-host Robert Krulwich, who said in a commencement address years ago:

Suppose, instead of waiting for a job offer from the New Yorker, suppose next month, you go to your living room, sit down, and just do what you love to do. If you write, you write. You write a blog. If you shoot, find a friend, someone you know and like, and the two of you write a script. You make something. No one will pay you. No one will care. No one will notice, except of course you and the people you’re doing it with. But then you publish, you put it on line, which these days is totally doable, and then… you do it again.

I wrote about this a while back in a blog post about the future of blogging.

I had [Krulwich's] words in mind when I started my blog six months ago, and I’ve had them in mind whenever I think I should be pitching one of my blog posts to an online publication like Slate or Salon or The Magazine. I’d like to get paid for what I write, but there’s something wonderfully satisfying about owning and controlling my own work. I also don’t want to wait to see if someone will publish it. I want to publish, and see if the audience comes to me.

The remarkable thing about the internet is that you don't have to wait. You don't need anyone's permission to put your creative work out into the world; you can just do it.

So do it.

The Price of Great Software

Everybody's writing about and linking to Jared Sinclair's blog post where he breaks down the sales figures of his RSS reader Unread, calculating that "the actual take-home pay from the combined sales of both apps is $21,000, or $1,750/month."

Considering the enormous amount of effort I have put into these apps over the past year, that’s a depressing figure. I try not to think about the salary I could earn if I worked for another company, with my skills and qualifications. It’s also a solid piece of evidence that shows that paid-up-front app sales are not a sustainable way to make money on the App Store.

Most of the commentary so far seems to be pretty pessimistic about what this means for the future of iOS app development. Either it's nearly impossible to make good money designing apps, or you have to make apps that take less work to build. As Benjamin Mayo puts it:

If you want to maximize your profitability, make small apps that do a few things well. The amount of effort you put into an app has very little to do with how much of the market will buy it. This means that making big apps exposes you to substantially more risk, which is not fairly counterbalanced by significantly higher earnings potential.

Marco Arment thinks the way forward is being more efficient:

As the economics get tighter, it becomes much harder to support the lavish treatment that developers have given apps in the past, such as full-time staffs, offices, pixel-perfect custom designs of every screen, frequent free updates, and completely different iPhone and iPad interfaces...Efficiency is key. And efficiency means doing more (or all) of the work yourself, writing a lot less custom code and UI, dropping support for older OSes, and providing less customer support.

But there's another option: simply charge more if your app is worth it, and charge for every major update. On a recent episode of Iterate, Joe Cieplinski of Bombing Brain describes how his company developed the premier teleprompter app for the iPad, and how it not only sold well at a relatively high price but went on to sell even better when they raised that price. People have been saying similar things about the Omni Group's pricing for years.

Brent Simmons recently made the case that most indie software developers in the Apple ecosystem make apps for the Mac. The implication is that Mac apps make more money, because developers typically charge more for them. Tyler Hall backs up this point from his own experience:

It’s my experience that you CAN build a sustainable software business selling to consumers as an independent developer. You just can’t do it in the App Store any longer – if you ever could. You need to start building for the Mac, where you can charge a fair price, sell directly to your customers, and charge for upgrades. Even then, success won’t happen overnight. But it is doable.

Of course, I'm just a user, not a developer, so this is all just speculation. But when I look at that sales chart for Unread, and see that huge spike in the first few days, I see myself and other people like me, people who love these beautifully designed, "hand-crafted" apps.

[Graph: Unread for iPhone sales]

We aren't buying these apps on impulse. We're buying them because we read the reviews and we know what the best apps are, and we want to own the best. Maybe indie devs need to stop chasing the normals (who think everything should be free anyway) and just charge a fair price to the folks who care.

Why Teaching Innovations Don’t Spread in the US

I was initially turned off by the shame-inducing headline of this article by Elizabeth Green in the New York Times, Why Do Americans Stink at Math? But the answer to her question is actually surprising. The innovations in math education that have spread around the world, and that have shown remarkable success, actually started here in the United States. So why have those innovations failed to spread here? Because we choose not to invest in the professional development of our teachers.

In Finland and Japan, where students perform at or near the top in math assessments, teachers spend only about 600 hours a year in the classroom, using the rest of their time to prepare lessons, observe other teachers, and receive feedback on their own teaching. American teachers, by contrast, spend more than 1,000 hours a year in the classroom, and have traditionally received almost no feedback from fellow teachers (though this is starting to change).

My wife taught middle school and high school for about ten years, and I have taught at the college level for the last five, and I'm consistently frustrated with the carrot-and-stick approach to improving our country's schools, as if bribing teachers with merit pay or threatening them with firing were the best ways to motivate them. In fact, most teachers I know are always striving to do better, even if they're already amazing teachers. What they need is the time and the support to actually improve their skills.

As Green writes:

Most policies aimed at improving teaching conceive of the job not as a craft that needs to be taught but as a natural-born talent that teachers either decide to muster or don’t possess. Instead of acknowledging that changes like the new math are something teachers must learn over time, we mandate them as “standards” that teachers are expected to simply “adopt.” We shouldn’t be surprised, then, that their students don’t improve.

(Via @stevenstrogatz)

The Staggering Scale of Minecraft

Five years in, Minecraft (the system) has bloomed into something bigger and more beautiful than any game studio — whether a tiny one like Markus Persson’s or a huge one like EA — could ever produce on its own. The scale of it is staggering; overwhelming. As you explore the extended Minecraft-verse online, you start to get the same oceanic feeling that huge internet systems like YouTube and Twitter often inspire: the mingling of despair (“I’ll never see it all”) with delight (“People made this”) with dizzying anthropic awe (“So… many… people.”)

What impressed me about Minecraft, from the moment I first saw my children playing it, was how its open-ended structure could result in such wildly different forms of play.

The first thing my son showed me was the complex, working roller coaster he'd constructed inside the game, which we could ride in a mining cart. Then my daughter invited me to see the house she'd built. She maneuvered the POV inside the door, and suddenly, dozens of eyes turned and looked at us from every direction. “These are my cats!” she announced. She'd stuffed her house to the brim with these pixelated creatures.

In other words, the same game served both as my son's virtual erector set and as a virtual extension of my daughter's growing stuffed animal collection. And that was within the first week of them playing it.

(Via DF)

The Best Podcast Apps for iPhone

When Marco Arment announced that he was making a podcast app, he was deliberately entering a crowded market. He said that he wanted to rid himself of his irrational fear of direct competition.

The crowded market doesn't seem to be hurting him. The reviews of his app Overcast in MacStories, Macworld, and elsewhere have all been great. But it's also important to note that the market itself is not diluted by the number of independent developers entering it. On the contrary, the diversity of apps, and all the different approaches to the problem of podcast delivery, can serve to improve the quality of all those apps.

There's a post on the blog of Supertop, the developers of the podcast app Castro, about why they welcome the competition from Overcast.

From our perspective, a user trying any third party app is good for all third party apps. If a user is persuaded to download one alternative they should be more likely to consider others in the future, especially given the variety of apps that are available...I encourage you to try Overcast. In fact, if you really love podcasts, I encourage you to try all the others too.

I decided to do just that, purchasing and trying out Instacast, Downcast, Pocket Casts, Pod Wrangler, Castro, Mocast, and Overcast. I made the comparison video below to share what I observed.

I have no background in design, but perhaps because I have a degree in comparative literature, I find it endlessly fascinating to compare the interfaces of these apps and savor the details of what each one offers. None of these apps is perfect, but it's inspiring to see how a group of talented humans use their skills in different ways, through the endlessly plastic medium of software, to approach the same problems.

The Genesis of Goodnight Moon

I loved Aimee Bender's appreciation of Goodnight Moon (via Kottke). She does a great job of describing how the book's genius lies in creating a structure and then deviating from it in surprising ways:

For writers, this is all such a useful reminder. Yes, move around in a structure. But also float out of that structure. “Goodnight nobody” is an author’s inspired moment that is inexplicable and moving and creates an unknown that lingers. How wonderful that this oddly compassionate moment, where even nobody gets a good night, shows up in the picture book that is the most popular! There is no template, ever.

I wrote a bit about Margaret Wise Brown years ago for The Writer's Almanac, and I was especially interested in how, in writing Goodnight Moon, she drew on her knowledge of the way children learn language.

Brown wanted to become a writer as a young woman, and she once took a creative writing class from Gertrude Stein. But she had a hard time coming up with story ideas, so she went into education. She got a job at an organization called the Bureau of Educational Experiments, researching the way that children learn to use language. What she found was that children in the earliest stage of linguistic development relish language with patterns of sound and fixed rhythms. She also found that young children have a special attachment to words for objects they can see and touch, like shoes and socks and bowls and bathtubs.

She eventually began to write books for children based on her research, and in 1938 she became the editor of a publishing house called William R. Scott & Company, which specialized in new children's literature. The Great Depression had made children's books into luxury items, and most other publishing houses had phased out children's literature. Margaret Wise Brown helped make children's books profitable, because she understood that children experience books as sensual objects. She invested in high quality color illustrations, and she printed her books on strong paper with durable bindings, so that children could grab, squeeze, and bite their books the way they did with all their toys.

Brown had been a fairly successful writer and editor for almost ten years when, one morning, she woke up and wrote a poem, listing the items in a house, and then saying goodnight to each item, including the famous lines “Goodnight room / Goodnight moon / Goodnight cow jumping over the moon … / Goodnight comb / And goodnight brush / Goodnight nobody / Goodnight mush. / And goodnight to the old lady whispering 'hush' / Goodnight stars / Goodnight air / Goodnight noises everywhere.” She thought the poem could be made into a book, so she sent it off to her publisher, and it was published in 1947 as Goodnight Moon.

The influential New York Public Library gave it a terrible review, and it didn't sell as well as some of Brown's other books in its first year. But parents were amazed at the book's almost hypnotic effect on children, its ability to calm them down before bed. Brown thought the book was successful because it helped children let go of the world around them piece by piece, just before turning out the light and falling asleep.

Parents recommended the book to each other, and it slowly became a word-of-mouth best-seller. It sold about 1,500 copies in 1953, 4,000 in 1955, 8,000 in 1960, 20,000 in 1970; and by 1990 the total number of copies sold had reached more than four million.

You Can Delete But You Can’t Forget

Jacqui Shine made a rash decision to delete all the email from her late mother; in a piece for the Atlantic, she writes about why the digital nature of that deletion is uniquely haunting.

In another way, though, those deleted emails do survive—or, at least, the data that Google has extracted from them in order to build your user profile has...Every time I get served an ad for a fawning book about the Founding Fathers or for a deviled egg tray, it’s a kind of tiny haunting: a palimpsest of what once was, stripped of what made it really meaningful. And those tiny traces may be the problem—not because they can’t allow us to recover the things we’ve lost, but because they allow us to believe that we can.

What Tech Offices Tell Us about the Future of Work

Kate Losse at Aeon magazine on the insidious effect of high-end, handcrafted office design in modern tech culture:

Of course, the remaking of the contemporary tech office into a mixed work-cum-leisure space is not actually meant to promote leisure. Instead, the work/leisure mixing that takes place in the office mirrors what happens across digital, social and professional spaces. Work has seeped into our leisure hours, making the two tough to distinguish. And so, the white-collar work-life blend reaches its logical conclusion with the transformation of modern luxury spaces such as airport lounges into spaces that look much like the offices from which the technocrat has arrived. Perhaps to secure the business of the new moneyed tech class, the design of the new Centurion Lounge for American Express card members draws from the same design palette as today’s tech office: reclaimed-wood panels, tree-stump stools, copious couches and a cafeteria serving kale salad on bespoke ceramic plates. In these lounges, the blurring of recreation and work becomes doubly disconcerting for the tech employee. Is one headed out on vacation or still at the office – and is there a difference?

Zoo Animals and Their Discontents

Alex Halberstadt, writing for the New York Times Magazine, on how modern zoo animals, despite being given better enclosures and more "enrichment," still suffer from mental health disorders.

I wondered, too, why disorders like phobias, depression and OCD, documented at zoos, don’t appear to have analogues among animals living in the wild. Irene Pepperberg, a comparative psychologist at Harvard who is known for her work with African gray parrots, told me that she thought one reason had to do with survival. “An animal in the wild can’t afford to be depressed,” Pepperberg said. “It will simply be killed or starve, since its environment requires constant vigilance. The situation kind of reminds me of my Jewish grandparents, whose lives were a lot harder than mine. They never seemed depressed, because I don’t think it ever occurred to them.”

In other words, we'd all be a lot happier if lions were actually trying to eat us.

Punctuated Equilibrium

Joe Pinsker, writing for the Atlantic about the fate of the apostrophe in the 21st century, points out how computers are actually preserving aspects of language we might otherwise be willing to let atrophy:

Autocorrect, the now-ubiquitous software that’s always reading over our shoulders, tends to put apostrophes in when we omit them—which means they might remain a feature of informal writing for longer than they otherwise would. The software may also prop up other formal conventions, among them capitalization and “silent” letters (like the u, g, and h that drop out as though becomes tho). “Autocorrect is acting like a language preservative,” says Alexander Bergs, a linguistics professor at Germany’s Osnabrück University. “Which is funny, as usually new media like cellphones and computers are blamed for language decay.”

Feeling More Comfortable with Computers

Tom Jacobs writing for Pacific Standard about a study of how patients feel when describing symptoms to a computer instead of a human:

The result: People disclosed information more honestly and openly when they were told they were speaking exclusively to the computer. The participants also “reported significantly lower fear of self-disclosure” under those circumstances. These results were reiterated by the analysis of their facial expressions, which found they “allowed themselves to display more intense expressions of sadness” when they believed no human was watching them.

This makes perfect sense to me. I can't remember a time when I felt too embarrassed to tell my doctor something, but I've definitely felt judged by doctors who acted as if I were wasting their time with minor ailments. That feeling of judgment certainly affected how much I told them, and I know I'm not alone in this experience. I know a woman who once went to a doctor because she had recently experienced some weight loss and was having dizzy spells. Her impression was that the doctor assumed she was anorexic. He gave her a prescription for anti-dizziness pills, as if that were the problem she was trying to solve.

In a piece for the New Yorker about how doctors make decisions, Jerome Groopman wrote,

Doctors typically begin to diagnose patients the moment they meet them. Even before they conduct an examination, they are interpreting a patient’s appearance: his complexion, the tilt of his head, the movements of his eyes and mouth, the way he sits or stands up, the sound of his breathing. Doctors’ theories about what is wrong continue to evolve as they listen to the patient’s heart, or press on his liver. But research shows that most physicians already have in mind two or three possible diagnoses within minutes of meeting a patient, and that they tend to develop their hunches from very incomplete information.

Perhaps using computers for patient intake could improve both sides of the equation: putting the patient more at ease about sharing all the relevant information, and giving the doctor a fuller picture of that information before a premature diagnosis starts to form.

Too Much Delight?

Interesting take from Sean Madden at Wired on why the Amazon Fire Phone may be too delightful for its own good.

The average smartphone user interacts with his or her mobile device over 100 times per day, and the majority of those interactions fall into just a few categories: opening an app, selecting from a list, bringing up a keyboard, and so on. If each of them is imbued with too much visual whiz-bang, using your phone becomes the digital equivalent of eating birthday cake for every meal.

I would argue that it's not so much the frequency of the effect as its utility that matters. If an effect slows down the experience without offering anything other than eye candy, it's bad design. “Whiz-bang” is a sparkly coat of paint on the surface of the interface. Delight is the spark of life that lives inside the app, coded deep into its DNA.

Welcome Cracks in the Walled Garden

The first good sign at this year's WWDC keynote was the opening video. Last year's video was a visually pleasing but somewhat abstract statement of purpose about Apple's design principles. The message seemed to be, "We're Apple. We know design. Learn from us." This year, the video focused on people talking about apps and how they improve people's lives. The content wasn't amazing, but the contrast was stark. Apple often takes a moment to nod toward the importance of developers, but this felt bigger than that. Rather than focusing inward on its own expertise, Apple was focusing outward on the people who build on its platforms and use its products. The video ended with the words "thank you" addressed directly to developers. I'm not sure how this went over in the room, but as a user who feels deep gratitude for the apps I use every day, I felt like that thank you was long overdue. And that was just the beginning. Apple spent the rest of the keynote demonstrating this new outward focus by tearing down walls.

Critics of the company love to toss around terms like "walled garden" in reference to Apple's paternalistic approach to interoperability. It's a fair criticism, especially when it comes to iOS. The App Store, sandboxing, and iCloud each put their own restrictions on how users can access software and data. But another way to see it is that Apple has always been a device-centric rather than a data-centric company.

Other players in the computer industry always saw a sharp divide between the hardware and the software, but Apple has always tried to take a holistic view, controlling as much of both the hardware and the software as possible. This approach only increased with iOS, which gave Apple even greater control of what software could be loaded onto the device, how applications could communicate with each other, and what users could (and couldn't) customize about their experience. That level of control made iPhones and iPads more approachable than any computing devices had ever been before. And Apple's device-centric approach filtered down to developers, who made apps that themselves felt like mini-virtual devices, each with their own unique designs, powers, and solutions.

But over time, that device-centric approach has felt more and more limiting. Every time you tap on a notification and get pulled sideways into a new app, or you tap "Open In" and find yourself flung in a different direction, you feel your head bump against the walls of the walled garden. Apple wants to hide the file system because ordinary users find it confusing, but is it really less confusing to open a photo in a photo editing app, make changes, and then have to export the result to the Photos app as an entirely new photo?

Apple has finally decided to tackle these problems, making the walls of its many walled gardens rather more porous. The most obvious of these changes is a new iCloud document picker, which will allow iOS apps to open a file and then save it back without creating a second copy. This is the closest Apple has come to introducing a real file system to iOS, and without a demo, it remains to be seen what it will actually look like, but the keynote mentioned that iCloud will not be the only storage option for this document picker. Theoretically, customers could choose Google Drive, OneDrive, or even Dropbox.

Other changes include interactive notifications, such as the ability to accept appointment requests, respond directly to messages, and apparently take action on third-party notifications (though the only example was Facebook). So instead of having to bounce over to the app in question, entering its little garden, you can just interact with the information itself wherever you are. Another example is third-party widgets in the Today view of Notification Center (something that carries over to the Mac). Again, you'd be interacting with the data of an app or a feature of an app without having to enter the app itself. And HealthKit and HomeKit, which were touted as rumors in the run-up to the keynote, were described as aggregators of data from other apps. The data, liberated from its silos, can be collected, examined, and transformed with new meaning.

Apple also pulled down the walls between iOS devices and the Mac. There's a new feature called "Continuity," which gives devices a variety of ways to share data more fluidly. You will be able to use AirDrop to send data between Mac and iOS. You can also "hand off" tasks from one device to the next. Writing an email on your phone? Easily switch to writing it on your Mac. Get a call or an SMS on your phone? Take the call or view the message on your Mac. Each of these features frees the computing task at hand from its confinement to one specific app or one specific device.

And finally, the feature at the top of almost everyone's iOS wish list came true: Apple introduced "Extensibility," a kind of inter-app communication that would allow apps to open up instances of each other's UI to take certain actions. The example Apple showed was opening a photo in the Photos app and using a filter from another installed app without leaving Photos. It isn't clear yet whether one third-party app will be able to interact with another third-party app, but that was the implication.

The larger implication is that developers can now begin to think about apps either as powerful stand-alone pieces of software or as extensions of other pieces of software. I don't really want to buy any more apps that let me add filters to my photos, but I might buy an extension to my main photo editing app that gives me extra features.

Power users are no doubt cheering all of these additions. For me, what's really exciting is not the features in themselves (though I am excited to try them) but the apparent change in philosophy, the willingness to trust the users and the developers. With iOS 7, Apple seemed to be saying that people are comfortable enough with touch interfaces that they no longer need skeuomorphic designs to feel at home. With iOS 8, Apple seems to be saying that people are comfortable enough with the various data they manage through their devices and their apps. That data can now begin to move more fluidly between those devices and apps.

Recently, in "Sharing the Ecosystem" I wrote,

I find it fitting that the number one request on most people’s lists for iOS 8 is better sharing of information between apps. What Apple needs is better sharing, period. Healthy ecosystems are all about sharing. “Better can’t be better if it doesn’t consider everything.” Just as Tim Cook sees the value in sustaining the world’s ecosystem, he needs to see the value in sustaining the developer ecosystem. It’s those developers who can provide the real return on investment, making both his products, and the world, better.

I came away from the keynote feeling that Tim Cook understands this. He chose to begin the keynote with a thank you to developers, and he ended it by asking all the Apple employees in the audience to stand up to receive recognition. For the last two decades, Apple was defined by one man's vision, even if there were many people behind that vision. Tim Cook wants to celebrate all the people working to make Apple better. I have rarely felt more hopeful about the company.