Bill Buxton strikes again

A few weeks ago, I saw Bill Buxton give a talk at UW for the Puget Sound SIGCHI meeting.

The main takeaway for me was Buxton’s call to study and learn from the history of design. “Know the good parts of history and what to cherry pick from it.” For example, the original set of five-color iPods took many design cues from an early consumer camera product line. “[Apple design chief] Jonathan Ive knows the history.”

Buxton showed photos of what he called the first educational technology device: the PLATO IV from 1972. It included graphical output and a touch screen, and was apparently put in schools all over Illinois. The similarities to the iPad are striking. He demoed a watch from 1984 that included a touch surface capable of doing character recognition on decimal digits. It sold for just $145 (in today’s dollars). Buxton also took a look at the first real smartphone: the “Simon” from 1993. It is shockingly similar to the iPhone, complete with a row of app icons on the home screen. The only app “missing” is a web browser (the HTML web was still a research novelty in 1993).

There were many other examples that I didn’t note specifically, many of them MIT Media Lab prototypes published at SIGGRAPH. Buxton also pointed the audience to archive.org for more, such as a 1988 video on input devices.

The second takeaway was Buxton’s theory of the “long nose”: it takes about 20 years to go from an idea to a $1 billion industry. In other words, “Any technology that is going to have significant impact over the next 10 years is already at least 10 years old.” So the important act of innovation is not the “lightbulb in the head” but rather picking out the correct older ideas that haven’t yet hit the elbow of the exponential curve. When change is slow, humans don’t tend to notice; but you can counteract that by explicitly measuring the change as technology progresses. What are the technologies invented 20 years ago that are about to become huge?

Still Magical

As part of testing our upcoming iOS 4.2 release of OmniGraphSketcher for iPad, I just threw together this graph — a more or less exact replica of a textbook economics diagram.  All on the iPad, without any fuss.

Economics diagram on OmniGraphSketcher for iPad

I could email the PDF directly to a textbook publisher.

Despite the fact that I’ve been working on this app ever since the iPad was announced, the whole thing still kind of boggles my mind. Even though I know in detail how the app works, Apple’s term “magical” is the best way I know of to describe the experience of using it.

Chaos: Making a New Science

If there was any doubt that science is driven by personalities and politics as much as anything else, look no further than James Gleick’s 1987 book, Chaos: Making a New Science.

The book chronicles the history of chaos theory; but “chaos” is also a good word to describe the scientific community’s embarrassingly slow acceptance of the findings and tools of this new mathematical subfield.

The book held particular interest for me because of an unsolved mystery in a research paper I wrote in college, Weather forecasting by computer. Edward Lorenz, who published the first research on what would become chaos theory, calculated in the late 1960s that “even with perfect models and perfect observations, the chaotic nature of the atmosphere would impose a finite limit of about two weeks to the predictability of the weather.” Despite this, I was reading brash predictions in books published in the early 1980s that we would soon be able to forecast the weather months or years into the future. Why did it take more than a decade for this fundamental mathematical result to make its way even to experts writing about weather forecasting?
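
Lorenz’s result is easy to illustrate on any laptop today. Here is a minimal sketch (my own demonstration, not Lorenz’s original calculation): it integrates his three-variable convection model from two starting points that differ by one part in a billion and watches them drift apart.

```python
# A minimal illustration of sensitive dependence on initial conditions,
# using the three-variable Lorenz system with the classic parameters
# sigma=10, rho=28, beta=8/3. The tiny perturbation below is arbitrary.
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.array([dx, dy, dz])

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-9, 0.0, 0.0])   # differs by one part in a billion

for step in range(1, 5001):
    a, b = lorenz_step(a), lorenz_step(b)
    if step % 1000 == 0:
        print(f"t = {step * 0.01:4.0f}   separation = {np.linalg.norm(a - b):.3e}")
```

The separation grows roughly exponentially until it is as large as the attractor itself, at which point the two trajectories share nothing but statistics. The atmosphere’s two-week wall is the same phenomenon at planetary scale.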

Gleick wondered the same thing. The fact that his conclusions took the form of an entire book is testament to the many factors at play. (I was a bit relieved to confirm that I wasn’t just missing something obvious.)

Part of the answer is that chaos theory was outside the scope of existing academic disciplines, almost by definition. It tried to make sense of problems that couldn’t be solved using traditional mathematics — the very problems that most researchers (and entire science departments) stayed away from because the chances of progress seemed slim. Over time, disciplinary boundaries developed such that most of these problems were not considered valid topics in physics, biology, or even weather forecasting.

A second part of the answer is that many of the important results of chaos theory themselves defined limitations on what is possible to know or achieve, especially when seen through the lens of traditional approaches. Scientists and other leaders didn’t want to believe these pessimistic claims, and they were easy to ignore when coming from a suspicious fringe group of career-insensitive mathematicians.

A third part of the answer is that even when the essential properties of chaos theory had been well established by mathematicians, the theory was not useful to mainstream scientists until practical mathematical tools were developed. Several important mathematical results eventually helped to show how disparate data sets all displayed chaotic “bifurcations” and “period doublings,” for example. As scientists were given more concrete patterns to look for, evidence of chaotic behavior became increasingly visible to them.
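
If you want to see a “period doubling” concretely, the textbook example is the logistic map. Here is a tiny sketch (my own illustration, not one of the data sets discussed in the book):

```python
# The logistic map x -> r*x*(1-x): as r increases, the long-run behavior
# doubles from a fixed point to a 2-cycle to a 4-cycle, and eventually
# becomes chaotic. The specific r values below are conventional examples.
def settled_values(r, x=0.2, warmup=1000, keep=8):
    for _ in range(warmup):          # let transients die out
        x = r * x * (1 - x)
    cycle = []
    for _ in range(keep):            # then record the cycle it lands on
        x = r * x * (1 - x)
        cycle.append(round(x, 4))
    return sorted(set(cycle))

for r in (2.8, 3.2, 3.5, 3.9):
    print(f"r = {r}: {settled_values(r)}")
```

At r = 2.8 the map settles to a single value, at 3.2 to a 2-cycle, at 3.5 to a 4-cycle, and at 3.9 it never repeats within the window. That doubling cascade is exactly the kind of concrete pattern scientists learned to recognize in their own data.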

And yet a fourth part of the answer is that the main tools used to investigate chaos theory — computers — were new and unfamiliar to mainstream scientists. Lorenz was one of very few theorists in the 1960’s who had access to expensive computer time (and the knowledge to use it). And although rigorous mathematical proofs were eventually found for many components of chaos theory, for many years the most important results were simply the outputs of clever computer programs. Running experiments like this via numerical simulation was a totally new approach. Scientists and mathematicians had every reason to be skeptical.

At the time Gleick’s book was published, chaos had finally become broadly accepted in science and had led to a few high-profile applications such as heart pacemakers. Yet even now, 20 years later, chaos theory is not part of the standard curriculum at any level of school. I studied it for a few weeks in high school as part of a special end-of-year diversion; and in college as an elective math course that was only offered one semester every other year. And I went to very progressive schools. When Stephen Wolfram unveiled his “new kind of science”, non-experts missed the fact that he was talking about this same line of research. The new science is still in its infancy.

Acting

“…acting is an instrument of freedom, which enables people to realize that they are not imprisoned in themselves…”

-Theodore Zeldin

Automatic color temperature

Bill and I had an interesting conversation today about how computer displays (particularly on mobile devices) should automatically adjust not only the brightness of the backlight, but the color temperature of the pixels. For example, in a room with warm lighting, the “white” on the display should look reddish, the same color as the reflection of that ambient light off white paper.
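
As a rough sketch of what that might look like in code (my own illustration, not any shipping API; the kelvin-to-RGB conversion is a widely circulated curve fit of the blackbody locus, and feeding it a reading from an ambient light sensor is an assumption):

```python
# Sketch: map a measured ambient color temperature (in kelvin) to the RGB
# tint that "white" should take on. This is an approximation of the
# blackbody locus, not any device's actual white-balance pipeline.
import math

def clamp(v, lo=0.0, hi=255.0):
    return max(lo, min(hi, v))

def kelvin_to_rgb(kelvin):
    t = kelvin / 100.0
    if t <= 66:
        r = 255.0
        g = 99.4708025861 * math.log(t) - 161.1195681661
    else:
        r = 329.698727446 * (t - 60) ** -0.1332047592
        g = 288.1221695283 * (t - 60) ** -0.0755148492
    if t >= 66:
        b = 255.0
    elif t <= 19:
        b = 0.0
    else:
        b = 138.5177312231 * math.log(t - 10) - 305.0447927307
    return tuple(clamp(c) / 255.0 for c in (r, g, b))

# Warm incandescent light (~2700 K): full red, reduced green and blue.
print(kelvin_to_rgb(2700))
# Overcast daylight (~6500 K): all three channels close to 1.0.
print(kelvin_to_rgb(6500))

# A display could multiply each channel's gain by these factors so that
# on-screen "white" matches the white paper sitting next to the device.
```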

See Bill’s blog post for the full story.

Sculley on Apple

At the risk of being like all the other bloggers, I feel the need to write down what I didn’t know or found most interesting about the recent interview with John Sculley, who was the CEO at Apple for about ten years.

First, Sculley made it clear that Apple has always been a digital consumer electronics company, and in many ways was the first such company. As digital components continue to become cheaper and more powerful, thus enabling better consumer electronics, it makes sense that Apple will continue to thrive. Another way of saying this is that Apple was so ahead of its time that even after 25 years the market conditions are only beginning to really align with their strategy.

Second, I often explain to people how Apple’s control of both hardware and software is key to their ability to innovate. There is a lot they simply couldn’t do if they didn’t control the whole pipeline. Sculley recounts an excellent example of this, dating back to the first Mac.

Sculley: The original Mac really had no operating system. People keep saying, “Well why didn’t we license the operating system?” The simple answer is that there wasn’t one. It was all done with lots of tricks with hardware and software. Microprocessors in those days were so weak compared to what we had today. In order to do graphics on a screen you had to consume all of the power of the processor. Then you had to glue chips all around it to enable you to offload other functions. Then you had to put what are called “calls to ROM.” There were 400 calls to ROM, which were all the little subroutines that had to be offloaded into the ROM because there was no way you could run these in real time. All these things were neatly held together. It was totally remarkable that you could deliver a machine when you think the first processor on the Mac was less than three MIPs (Million Instructions Per Second). (NOTE. For comparison, today’s entry-level iMac uses an Intel Core i3 chip, rated at over 40,000 MIPS!)

This approach continues to be important in every category of device Apple produces. For example, today’s iPhones and iPads have special-purpose hardware for video decoding (that’s why they can only play movies encoded in certain formats). Microsoft could not really enter the graphical operating system market until standard processors and graphics cards became powerful enough to do most of the graphics routines themselves. If Microsoft produces an innovative operating system that requires hardware that is too specialized or not readily available, the hardware manufacturers will say, “sorry, we can’t support it.” At Apple, Steve Jobs says, “We will find a way.”

There’s a great little quote about a meeting at Microsoft. “All the technical people are sitting there trying to add their ideas of what ought to be in the design. That’s a recipe for disaster.” Sculley thinks that’s part of the Silicon Valley culture started by HP, where engineers are most respected. At Apple, designers are on top. “It is only at Apple where design reports directly to the CEO.”

Sculley repeats over and over how good a designer Steve Jobs is. The following story is about a visit to the inventor of the Polaroid camera.

Dr Land had been kicked out of Polaroid. He had his own lab on the Charles River in Cambridge. It was a fascinating afternoon because we were sitting in this big conference room with an empty table. Dr Land and Steve were both looking at the center of the table the whole time they were talking. Dr Land was saying: “I could see what the  Polaroid camera should be. It was just as real to me as if it was sitting in front of me before I had ever built one.”
And Steve said: “Yeah, that’s exactly the way I saw the Macintosh.”

One thing I did not know (or had forgotten) is that Apple’s Newton product played a central role in the creation of the ARM processor design, which is now used across the industry in most smartphones and other consumer electronics (including the iPhone, iPad, and Apple TV). Moreover:

The Newton actually saved Apple from going bankrupt. Most people don’t realize in order to build Newton, we had to build a new generation microprocessor. We joined together with  Olivetti and a man named Herman Hauser, who had started Acorn computer over in the U.K. out of Cambridge university. And Herman designed the ARM processor, and Apple and Olivetti funded it. Apple and Olivetti owned 47 percent of the company and Herman owned the rest. It was designed around Newton, around a world where small miniaturized devices with lots of graphics, intensive subroutines and all of that sort of stuff… when Apple got into desperate financial situation, it sold its interest in ARM for $800 million. […] That’s what gave Apple the cash to stay alive.

So while Newton failed as a product, and probably burnt through $100 million, it more than made it up with the ARM processor.

So the product that has become a “celebrated failure” of business school lore was actually one of the most successful products in the industry, when taking the long-term technological view. Not only did it save Apple from bankruptcy, it enabled an entirely new category of power-efficient, mobile computing devices. True, the marketing and business strategy used for the Newton failed. But the underlying technologies and vision not only saved Apple in the 90’s, but also formed the basis of its remarkable growth today.

Everything Apple does fails the first time because it is out on the bleeding edge. Lisa failed before the Mac. The Macintosh laptop failed before the PowerBook. It was not unusual for products to fail. The mistake we made with the Newton was we over-hyped the advertising. We hyped the expectation of what the product could actually [do], so it became a celebrated failure.

Apple has gotten much better at this over time. They are often accused of over-hyping their products, but if you look carefully, Steve Jobs and Apple advertising never claim that their products can do anything they can’t actually do. They say things like “1000 songs in your pocket” and show people dancing to those songs. Some people react with “oh my gosh, that’s what I actually want.” Other people say, “no thanks.”

Technologists complained when Apple described the iPad as “magical.” Obviously the device is not actually magical. But I think that word pretty well describes the actual reaction of many customers to the product.

Airline infographic

An interesting infographic from The New York Times is being passed around today. (Click the excerpt below for the full graphic.)

Airline infographic

I suspect that much of this was hand-crafted using an illustration program. Notice how the merger at the right end of Delta’s row maintains the vertical center from before the merger, whereas in most other places the incoming merger adjusts the vertical center by adding on to the top or bottom of the previous bar.

High Dynamic Range

I’m impressed with the new “high dynamic range” (HDR) photography feature in iOS 4.1 for iPhone 4. The feature basically takes three versions of the image in quick succession, each at a different exposure. Software then combines the three photos using image-processing algorithms. The goal is to avoid both washed-out bright areas and dark, almost-black shadows.
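
Apple hasn’t published how the merge step works, but exposure fusion is one common technique that gives the flavor. Here is a sketch using OpenCV’s Mertens fusion; the file names are placeholders, and this is an analogous open-source method, not Apple’s implementation.

```python
# Sketch of exposure fusion: merge three bracketed shots of the same scene
# (underexposed, normal, overexposed) into one well-exposed image.
import cv2
import numpy as np

# Placeholder file names for the three bracketed exposures.
exposures = [cv2.imread(name) for name in ("under.jpg", "normal.jpg", "over.jpg")]

# Handheld shots rarely line up exactly, so align them first.
cv2.createAlignMTB().process(exposures, exposures)

# Mertens fusion weights each pixel by contrast, saturation, and
# well-exposedness, then blends. No exposure times are required.
fused = cv2.createMergeMertens().process(exposures)

# The result is floating point in [0, 1]; convert back to 8-bit to save it.
cv2.imwrite("fused.jpg", np.clip(fused * 255, 0, 255).astype("uint8"))
```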

I took the picture below with HDR turned on. I did not use a tripod, did not set anything manually, and did no post-processing other than cropping. (Click it to see full resolution.)

Seattle skyline using iPhone with HDR

The plain, non-HDR version of the image looked pretty good too, but everything was more washed out, especially the buildings and sky. The trees were a bit brighter but didn’t look as rich. I think the HDR version looks astonishingly professional.

Math in Ancient India

I just checked my web server statistics and found that part of my high school research paper on the history of mathematics is getting well over a thousand requests a month.

The topic of that paper is “the usually unrecognized achievements of Vedic and Hindu mathematicians from 2000-300 B.C.” When writing it, I was surprised at how hard it was to find good research on the topic:

In Mathematical Thought from Ancient to Modern Times, [a “comprehensive” summary of] the history of mathematics, [author Morris Kline] included only half a chapter (out of 50 chapters total) on Indian math. Everything he said seemed to sneer at them, put them down, and belittle their accomplishments.

Despite this dearth of understanding, the facts were clear:

Besides using simple arithmetic operations like addition, subtraction, etc., Indians invented the decimal system and the idea of positional notation, both of which are still in use today. They also used the “Pythagorean” theorem and “Pascal’s” triangle long before either of those men were born!

Today, it turns out that if you google “Sulva Sutras” (the title of the web page getting most of the visits), my research paper is the first result, above Wikipedia and everything else! If you search for “mathematics in vedas”, I’m the third result.

True, this seeming popularity may have something to do with spelling inconsistencies — the Wikipedia article uses “Shulba Sutras” and is the first result if you search with that spelling. It’s also true that my paper was written in 1999 and has been on the web since 2003, so has had time to gather links from other websites (which influence Google’s ranking).

But we’re talking about the founding documents of mathematics! The origin of zero! The “Pythagorean” theorem, recorded hundreds of years before Pythagoras! And the most relevant article was written by a fifteen-year-old?

In my research paper’s conclusion (which is mostly too embarrassing to quote), I wrote, “I find it unbelievable how little work has been done in the field…. The vast majority of the work has been done only by Indians. Most of the books on the subject are written in Sanskrit or Hindi [and are ignored by] eurocentric scholars.”

Ten years later, my incredulity lives on.