Following the announcement of the Apple iPad earlier this week, my friend Adam Pavlik, a life-long, die-hard Apple fan and shareholder himself, posted his thoughts on the new device to Facebook. Now, when I say Adam is an Apple fan, that is slightly an understatement. While I got my first Mac in 2004, Adam grew up in a home with them and has never owned another brand of computer that I know of. When I commented on his comments, he commented on mine, and a back-and-forth of a caliber I never anticipated ensued.
I am re-posting my conversation with Adam's permission here for two reasons:
- You never see a conversation like this on Facebook. (Frankly, I feel sorry for the other two guys who were tagged in the post because their inboxes were filled with our uber-nerd-fest.) After all, and this is a completely subjective and unscientific observation, most "dialog" on Facebook consists of "OMG lolz!!!1!!one!" and "ur dumb."
- Adam and I both make what I feel are valid points, and we managed to summarize a good portion of the news swirling around the media about the iPad. It saves me from writing a blog post about it myself. My friend and roommate Josh and I, for instance, have conversations like this all the time about myriad topics, including Apple of course. It's just that I've never seen an instance where the conversation was so well documented and played out in written form.
So without further ado, here is the conversation itself:
Adam: If this product is even a mild success, the success of the internally-designed Apple chip (via PA Semi) will be THE story nobody realized was a big deal at the time. Interesting to see how the disparate strands are being drawn together.
I think there's a very limited market for this device (as you can see in the other link I posted), but its best chance is to simply be revolution by evolution: bringing together things (like multitouch, which everybody knows on the iPhone/Droid, and portable computing) that are familiar concepts, but in ways that people didn't realize were as convenient as they (hopefully) end up being.
What this thing has going for it, which nothing has really had going for it before, is a huge installed base: tens of millions of iPhones and iPod touches. The Newton never had that, Windows CE never had that, Windows Mobile doesn't really have it. Moreover, it is not the financial future of the company: it complements an already-existing, largely successful series of products for the company (iPhone/iPod touch). That, combined with the company's "cool factor," the fact that it's an extraordinarily attractive machine physically, and the fact that it's being hailed as the savior of various sorts of traditional print media, may build up enough convergence to get the snowball rolling down the hill unstoppably. But that's a lot of "mays."
Bill: Well Adam, once again you have summarized my own thoughts much more aptly than I've even done myself. I admit I did listen to the first part of the event live. It was from someone streaming it from his/her cell phone and it cut out just before Scott Forstall went on. I do plan on watching the whole thing tonight though.
Also on a related note, if you haven't seen it yet there is a great documentary called Objectified. It is about product design and features an incredibly rare interview with Apple's Jonathan Ive. The film was made by the same director who made Helvetica, if you're familiar with that.
As someone who owns a Kindle, I will say that I plan on selling it once I have my own iPad. I recently finished my first book on the Kindle and was actually quite annoyed at the experience; you can't easily page ahead to see how many pages are left in a chapter, for one. The best part of the Kindle is the blog subscriptions, which I no longer use and which will be easily trumped by the full web on the iPad.
Adam: That is nice of you to say, Bill. I don't mean to come across as a pure skeptic. What it reminds me of in some ways is the Mac mini. For a VERY long time, the John Dvoraks of the world (I use him as a fill-in for the sort of clueless pundit that places like C|Net employ) kept saying that Apple could take over the world if only they would offer a headless iMac so a consumer could provide their own I/O devices and just get the Mac CPU. Well, the Mac mini hasn't failed, exactly, but it didn't set the world on fire, either. I know a number of people who talk about what a great product it is and can rattle off all sorts of things it's "perfect for," but none of them actually own one. Techies are almost preternaturally unable to see the forest for the trees, and always focus on what makes for their ideological notion of what the *gadget* "ought" to be rather than what the *product* NEEDS to be.
Bill: You're welcome, Adam. I feel the same about the Apple TV. There certainly are a lot of skeptics about that too. But again, I own one of those as well and can honestly say I really enjoy it. The video podcasts, movie rentals, and full iTunes library sync have become a regular part of life for me. It's one of those products that serves a few useful purposes that aren't readily apparent. The geeks and tech journalists criticizing the iPad really aren't seeing the whole picture. Apple didn't make it for hardcore uber-geeks; they made it for everyday people like my Mom and Dad. And for that market, I can certainly see it being useful, since they don't need a full PC (or Mac, for that matter).
Ok, seriously, time to go watch the "Stevenote."
Adam: I would say that this theme of sorts, that the product is aimed at the "Mom and Dad" market, does not hold water with me. Apple's products have worked because of broad-based appeal: the iPod, iPhone, iMac, and MacBook have been successful precisely because they are simply useful products, for almost anybody. Lots of very technical people own those products, including very technical people who insisted they'd never buy one for ideological reasons, but end up gravitating toward them because they're just more useful. A product that has a built-in class of people (powerful, influential people) who have little use for it is staring down a dead-end. The test is this: there needs to be a place for this product in the lives of a "culturally significant" number of people who already own an iPhone and a laptop. If that is true, it'll be a success. If it isn't, it'll flop.
Bill: Well, I just finished watching the announcement from Wednesday. Having now seen this in action (remember, I had only seen pictures and heard the audio stream), I can say that all the talk of it not being revolutionary is garbage. Its user interface enhancements specific to tablet computing are refinements unlike anything else on the market. For one, the way they eliminated the WIMP paradigm (windows, icons, menus, pointer) is pure ingenuity and, if I may say so, about damn time. Don't get me started on the modal and non-modal dialog boxes. Side note: remember, I went to grad school for this stuff, pardon the jargon. Now I just want the Apple Store to get them in so I can try it out for myself.
Adam: I guess that surprises me, Bill. The OS strikes me as a refined version of the iPhone OS, optimized for the bigger space. It makes use of the extra space in clever ways, but it is not the sort of discontinuous leap that people mean when they use that word "revolutionary." It seems like a possible revolution-by-evolution; a series of subtle refinements that, when taken in the aggregate, amount to more than the total value of the individual improvements.
You are right that it is like nothing else on the tablet market. I have no doubt that it's the best tablet computer that's ever been made. But I have a hard time seeing clever refinements to a basic framework we already were familiar with (iPhone OS + multitouch) as a "revolution," except in the qualified way I mentioned above. And I have substantial doubts about whether the best tablet computer that *ever could be made* (let alone merely the best one ever made *to date*) has much of a market.
Bill: You read into that exactly right: I meant that it's revolutionary in the sense that it makes especially good refinements to what we are already familiar with. I'm also a big fan of eliminating WIMP as much as possible :-)
Adam: You know more about it than me, but I see WIMP as being substantially more flexible. As I said when I posted the NYT write-up, every aspect of the UI on this thing is heavily engineered. Every. Last. Aspect. That makes me wonder how nimble it will be. To me, it looks like every UI aspect of every application requires something like a one-of-a-kind solution. I don't know how you can keep up with the changing ways users use computers doing that. Just in the last few years, Apple has gradually (but steadily) ripped up the UI in Finder, iMovie, iPhoto, and other apps to accommodate users' need for more powerful search tools to zoom to the few items (pictures, documents, whatever) out of the bajillions they have on computers that are increasingly unconstrained by storage capacity. To this untrained observer, that seems easier to pull off with a conventional WIMP interface.
That's just one example of the sort of basically unforeseeable way that user needs shift over time. There are others we haven't even thought of yet.
Bill: You make a good point there about each app having its own UI to learn. For the iPhone, Apple developed a thoroughly documented style guide to introduce developers to the new interaction methods. (Apple's Human Interface Group, HIG, is renowned, naturally.) It has mostly succeeded in keeping apps uniform so the user does not need to re-learn each one. I would be very interested to see what Apple has done with the iPad, but alas, you have to join the developer program for $99 to get a copy. It will be incredibly interesting to see whether that success continues on the iPad.
One of the "one more things" I wish they had included was a standard format for newspaper and magazine publishers to use for their content. This would have ideally been in the vein of iTunes LP and been included in the iBooks store. Then again, I suppose they have to save something for later.
Regarding what you mention about the UI refreshes in iLife, you have a good point. The desktop metaphor, WIMP and all, is ubiquitous in this day and age. Most any computer user can sit down and successfully use a moderately challenging app within an hour or so (this assumes they aren't afraid of experimenting, of course). What is great about the iPhone/iPad is that it is completely task-driven, or rather that apps function like appliances. It's said that Jef Raskin, who worked on the original Macintosh and wrote The Humane Interface, thought of the personal computer as an "information appliance." With his UI work that eventually became part of the iPhone and now the iPad, his concept of task-driven applications is much more evident than it is in a traditional desktop OS.
Adam: Oh yes, I know all about Jef Raskin. That guy . . . he was a different guy. That's all you can say. I knew that Apple's HIG people were top-notch, although I've read more than one complaint that the OS X UI sacrifices metaphorical consistency and other principles of the original Human Interface Guidelines for visual bells and whistles. I have read about the work that went into the original Mac HIG, and (as a Mac fan at a time when it was not popular to be one) it always was inspiring to me to see how much meticulous thought and energy went into whether a menu item, once chosen, should blink 1, 2, or 3 times (such fierce debates were sparked that they ultimately had to deviate from their usual tactic and leave it as an adjustable user preference).
Gruber (who I sometimes find insufferable) had some comments about the iPad HIG here: http://daringfireball.net/2010/01/various_ipad_thoughts
I would note that a task-based UI is exactly what I'm suspicious of. The WIMP/desktop metaphor at least SEEMS more versatile to me because it generally involves processes that people use to accomplish the tasks they want. But there's a layer between those that the user has to provide: connecting these processes to the user's tasks/goals. While that requires input/knowledge from the user, it also seems to me that it makes the device more versatile. And, of course, over time software manufacturers adjust the processes to adapt to the tasks users want to set their device to, but there's always a layer in between. In my mind, that layer in between is where innovation really occurs. Task-based computing, to me, seems limited by the imagination of the person who makes the tools. The dynamic I am familiar with has, to my mind, a creative tension in it that spurs innovation.
Bill: Those are all good points. I'd like to put more thought into a response (you have a lot there to speak to!), but I'm at work and need to wrap some things up...
Adam: In my mind, it's the difference between a hammer and a nail gun. A nail gun is awesome at driving nails, but that's all it's really good for. A hammer, on the other hand, is designed for driving nails but is useful for pounding objects generally. I daresay that you would never have invented the nail gun unless someone had tired of hammering away at nails, and the existence of the hammer, and its versatility (its ability to be adapted to other pounding-based tasks) means that through its use, we may well think of other innovations (like the nail gun). If we lived in a world where the nail gun had come first and nobody had invented the hammer, would we think of those other innovations?
It's an imperfect analogy, I know.