Data, Intuition, and Steve Jobs
11.02.11

There's a great op-ed in the New York Times [link] by Walter Isaacson, author of the recent biography of Steve Jobs, simply titled "Steve Jobs" [link]. In it, he cites Jobs talking about intuition:

[Trained in Zen Buddhism, Mr. Jobs came to value experiential wisdom over empirical analysis. He didn’t study data or crunch numbers but like a pathfinder, he could sniff the winds and sense what lay ahead. He told me he began to appreciate the power of intuition, in contrast to what he called “Western rational thought,” when he wandered around India after dropping out of college. “The people in the Indian countryside don’t use their intellect like we do,” he said. “They use their intuition instead ... Intuition is a very powerful thing, more powerful than intellect, in my opinion. That’s had a big impact on my work.”]

My first job out of college was as a product manager at amazon.com. My initial responsibilities were around data crunching -- everything from analyzing how people shopped to building models estimating the impact of new features and initiatives. (My undergrad degree is in economics.) I later transitioned to a more traditional product management role and eventually pm'ed products ranging from large features like amazon.com's wedding registry to smaller initiatives like improving the "attach rate" at which customers who bought electronics also bought accessories (the margins for accessories are significantly higher than for the base item).

One area that made a deep impression on me was the use, and abuse, of data at amazon.com. Many people were simply seduced by the authority of numerical analysis. Finance was a dominant department at amazon.com -- to the extent that finance managers were typically in every key meeting. In fact, I was recruited to join the finance team with the simple premise that that's where the action was -- not product. And they were, unfortunately, right.

So how was data abused? Nearly every decision was viewed through the lens of data. Superficially, this would seem like a good thing -- no longer making decisions based on intuition, but instead on rigorous analysis. There are a number of problems with this. The first, and most significant, is that a lot of the people using the data were frankly not intelligent enough to be using it. I hate to use so cutting a word, but they were bozos.

I'll give an example. When we got a new VP of our group, Steve, he called me into his office to sit with him as he perused the site. He came across the search results for the electronics store. He had typed in "memory" and the results were completely littered with phones. Now, basic intuition says that if someone is searching for "memory," they're likely looking for memory -- maybe memory cards for their camera, maybe memory for their computer. So why were phones showing up? These phones had the word "memory" in the title -- indicating how many numbers they could store, or something like that. The search had then taken all the possible candidates and ordered them by something called "contribution margin" -- what each item would deliver to amazon.com's bottom line. No problem, right? Just a bug in an otherwise reasonable algorithm. Wrong. I explained to Steve that I too thought this was crazy, and that despite our complaints, nothing had been done. My hope was that he would take up the battle. Why was this a battle at all? Data. The team in charge of those search results claimed that this rejiggering of the algorithm brought in more money for amazon.com. Forget about the damage it might do to customer experience -- the correct solution was in the data.
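To make the failure mode concrete, here's a minimal sketch of that ranking behavior. Everything in it -- the catalog entries, the field names, the numbers -- is invented for illustration; the real system was certainly far more complex:

```python
# Hypothetical catalog entries matching the query "memory": a title,
# a relevance score for the query, and the contribution margin
# (what the sale delivers to the bottom line). All numbers invented.
candidates = [
    {"title": "SD memory card, 512MB",            "relevance": 0.95, "margin": 4.00},
    {"title": "DDR memory module, 256MB",         "relevance": 0.90, "margin": 6.00},
    {"title": "Cordless phone, 50-number memory", "relevance": 0.10, "margin": 22.00},
    {"title": "Answering machine, 30-min memory", "relevance": 0.08, "margin": 18.00},
]

# The behavior described above: take every title that matches the
# query, then order the candidates by contribution margin.
by_margin = sorted(candidates, key=lambda c: c["margin"], reverse=True)

# What intuition (and the customer) wants: order by relevance.
by_relevance = sorted(candidates, key=lambda c: c["relevance"], reverse=True)

print([c["title"] for c in by_margin])     # phones float to the top
print([c["title"] for c in by_relevance])  # memory products come first
```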

This was one of many examples across the company. (I should note that this example is from a decade ago, and a few cursory searches show that they have since rectified the situation.)

Let me give a second (slightly different) example. I mentioned earlier that I used to build financial models / projections for various initiatives and projects. To be blunt, I prided myself on these models and their robustness. I would do something a little unusual (though it seems obvious): after the initiative or project launched, I would go back and analyze how close my projections came to reality. Obviously this was good information for me going forward. By the way, no one else cared about this analysis, which once again speaks to the potential for abusing data. After all, if you allocate your resources based on various models, wouldn't it be helpful to know who tends to give accurate projections and who tends to over- or underestimate theirs?
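For what it's worth, the back-check itself is trivial. Here's a minimal sketch with entirely made-up projections and owners -- the point is just how little work it takes to see who habitually overshoots:

```python
from collections import defaultdict

# Hypothetical projections vs. actuals for launched initiatives,
# grouped by who built the model. All figures invented.
projections = [
    {"owner": "pm_a", "projected": 1_200_000, "actual": 1_050_000},
    {"owner": "pm_a", "projected":   400_000, "actual":   390_000},
    {"owner": "pm_b", "projected":   800_000, "actual":   310_000},
    {"owner": "pm_b", "projected": 2_000_000, "actual":   900_000},
]

errors = defaultdict(list)
for p in projections:
    # Ratio > 1 means the projection overshot the actual result.
    errors[p["owner"]].append(p["projected"] / p["actual"])

for owner, ratios in errors.items():
    avg = sum(ratios) / len(ratios)
    print(f"{owner}: average projection/actual = {avg:.2f}")
# pm_a comes in close to 1.0; pm_b habitually projects over 2x reality.
```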

One day, my boss called me into his office and asked me about a particular model I had built. It turned out that the initiative was up for resources, and it looked like it might actually get them (a somewhat rare event), but the decision had stalled because my model didn't show a high enough return. Could I change the model so that it showed a higher return? My response? Somewhat tongue-in-cheek, I said that I could change the model to project anything I wanted.

Why is that the case? The cold reality of model building is that, like many things, it's all in the details. I *could* change the model to project anything because every model has a handful of assumptions that are very sensitive. Very sensitive. And those assumptions are not ones people have a strong intuitive feel for. Let me give an example. Say the average selling price of a particular product is $25. If you change it to $30, that's a 20% bump. That's a large change, and people can feel it -- they know whether something costs $25 or $30. Now say the conversion rate on that product is 5%. If you change it to 6%, that's also a 20% bump. But people can't feel the difference between 5% and 6% -- at least, people who don't live and breathe the numbers every day can't. That's why it's so critical to have a very dispassionate person building your models: they'll know whether 5% or 6% is realistic and what to use. That 20% flows straight through to the bottom line. Start fiddling with enough of those numbers and suddenly your projections are 2-3x what they originally were.
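Here's a toy version of that kind of model, with invented numbers throughout (this is not any real amazon.com model). It shows how the "unfeelable" 20% flows straight to the projection, and how stacking a few such tweaks multiplies it:

```python
# A toy revenue projection: traffic x conversion rate x average
# selling price. All numbers are made up for illustration.
def projected_revenue(visitors, conversion_rate, avg_selling_price):
    return visitors * conversion_rate * avg_selling_price

base = projected_revenue(1_000_000, 0.05, 25.00)         # $1.25M

# A $25 -> $30 price bump (20%) is something people can "feel"...
price_bump = projected_revenue(1_000_000, 0.05, 30.00)   # $1.50M

# ...but a 5% -> 6% conversion bump (also 20%) slips past intuition
# and moves the projection just as much.
conv_bump = projected_revenue(1_000_000, 0.06, 25.00)    # $1.50M

# Nudge a handful of soft assumptions at once -- 20% more traffic,
# 6% conversion, $30 ASP -- and the projection quietly multiplies.
fiddled = projected_revenue(1_200_000, 0.06, 30.00)      # $2.16M, ~1.7x base

print(base, price_bump, conv_bump, fiddled)
# A couple more such nudges and you're at the 2-3x described above.
```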

I write this for a couple of reasons. The first is that data is easy to manipulate. If you use data not to get a good answer / more information, but for internal political jockeying, then the entire exercise of data analysis is not only pointless but ripe for corruption. If instead you approach data by asking, "Give me your best assessment of what will happen," you work in a world where people are empowered to provide the information that helps make the best decision for the company and its customers. The second is that you have to be good to analyze data. You can't look at a piece of data, say "X is higher than Y," and therefore go with X. The right response might be, "Why are we analyzing this at all? The correct answer is to make search results as relevant to the user as possible, not to order them by contribution margin." That takes someone with taste. Someone with good intuition.

There's another great quote from Steve Jobs that I'd like to cite here:

[Ultimately it comes down to taste. It comes down to trying to expose yourself to the best things that humans have done and then try to bring those things into what you're doing.]

Data is really valuable. I've benefited greatly from robust analysis. But it doesn't help you in the ways one typically thinks it does. Strong data modeling teaches you where the model is sensitive -- which numbers you really have to get right. Figuring out the exact curve of customer adoption is silly; it'll be what it is. If it's too low, you figure out how to increase it. It's that simple. Predicting the future isn't the goal. Making the future is.

Intuition and taste are far more powerful gauges than data for creating great product. People who are guided strictly by data don't know what they're doing. Data informs the process; it doesn't drive it. The Steve Jobs quote says to me that it's our job as product managers to experience the world in such a way that we see and feel product. We understand and appreciate beauty. Then we bring that experience into our work so we intuitively know what's right and wrong when it comes to the thousands of decisions we have to make to ship a good (and hopefully great) product.
