Change

“To be alive is by definition messy, always leaning towards disorder and surprise. How we open or close to the reality that we never arrive at safe enduring stasis is the matter, the raw material, of wisdom.”

-Krista Tippett, Becoming Wise (p. 67)

End-user programming is still an experiment

John Gruber, on the departure of Sal Soghoian from Apple and the apparent dissolution of the macOS automation team:

Part of my argument for why I feel so much more productive on a Mac than an iPad revolves around the automation technologies that Soghoian’s group developed. […] I find this to be a profoundly worrisome turn of events for the future of the Mac. […] On a personal note, I’ve known Sal for a long time. I first met him at a WWDC in the early years of Daring Fireball.

I too met Sal at WWDC years ago. (I knew one of the engineers on his team from grad school.) Even in 2008 I wondered about the future of the automation team, for the simple reason that the WWDC session on automation technologies was always scheduled in the smallest room, in the last time slot, on the last day (when many attendees had already left town).

I see these automation tools as experiments of sorts, exploring which programming-like tasks users can accomplish without needing to actually learn to program. AppleScript, for example, adopted an experimental English-like syntax that aimed to be more approachable than other programming languages. Apple’s more recent Automator app took more of a graphical, Lego-block approach. Both rely on other apps to surface third-party functionality in a way that is accessible to the automation tool. It’s not clear how many people ever really used these tools.
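To give a sense of what that English-like syntax looks like, here is a classic AppleScript one-liner (my own illustration, not an example drawn from Soghoian’s team):

    tell application "Finder" to empty the trash

It reads almost like an English sentence, which was exactly the point.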

Meanwhile, the Mac moved over to Unix and gained 30 years’ worth of command-line automation tools. Now it’s often easier to copy and paste a Terminal command from a web search than it is to set up an Automator workflow. And the developer community continued to grow and ship new automation-related apps and scripting languages, for everything from text editing to web design to server maintenance. So the Mac is not losing its ability to be automated — on the contrary, there are more ways to do it than ever before.
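For instance, resizing a folder of images, something that would take several steps to wire up in Automator, can be done with a single command using the sips tool that ships with macOS (just one plausible example of the kind of one-liner people copy from a web search):

    sips -Z 800 *.jpg

That shrinks every JPEG in the current folder, in place, so that its longest side is at most 800 pixels.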

Now Apple has thrown its weight behind different, related efforts: it’s not hard to imagine Siri becoming capable of many of the things Automator could do (even Automator’s robot icon foreshadowed this). And perhaps Apple’s Swift Playgrounds app, designed to help anyone learn how to program, can be seen as an assertion that previous automation technologies were too limiting — you may as well dive in and learn to code.

I think it’s a testament to Soghoian’s commitment that the automation tools team lasted as long as it did — and that the Mac has such an abundance of automation tools today. I’m not really sure what the team’s dissolution means for the future, but I think the space is still ripe for exploration. I hope Apple, Soghoian, and the developer community will continue to experiment.

5-hour workday

“When I tell people my team only works five hours a day, their response is always, ‘That’s nice, but it won’t work for me.’ The 9-to-5 is so ingrained in their minds that they can’t imagine anything else. But you can reduce your hours by 30% and maintain the same level of productivity.”

-Stephan Aarstol, The Five Hour Workday

What is true?

The classic Zen koan goes: “If a tree falls in the forest and no one is there to hear it, does it make a sound?”

Today I ask: “If there exists a fact but no one believes it, is it true?”

Science seeks to uncover objective truths — facts that exist independently of any given person’s beliefs. But scientists are still human, and humans make mistakes (logical and otherwise), so we can never be absolutely certain about any given scientific truth, no matter how many experiments we run. (If you’re a scientist or a science-minded person, it’s easy to overlook this depressing fact.)

Meanwhile, if someone believes something, then the fact that they believe it is, in and of itself, true. This is why journalists report on what people believe, even if there may be compelling evidence that the belief is faulty. As Simon Sinek puts it in his classic TED talk: “People will do the things that prove what they believe.” If you believe that global warming is a hoax, or that immigrants are responsible for economic decline, you will vote for a candidate who appears consistent with those beliefs. So in determining the outcome of an election, the belief carries far more weight than what happens to be objectively true (which no one can be absolutely certain about anyway).

A belief is our own subjective experience of what is true. It’s what’s real to us. It’s internally certain. And as such it often carries far more power than what may or may not be externally factual.

In this sense, you could argue that beliefs are more true than facts. Beliefs are the truest thing there is for the person who believes them. Moreover, if you are interested in taking action in the world, and in inspiring others to act, then knowing what people believe is usually at least as important as knowing the facts.

So if there exists a fact but no one believes it, is it true?

And if it’s true, does it matter?

Designs do not imply truth

“Often, what we have conjured [in the past] assumes the sheen of inevitability, as if its results were inalienable facts in the world rather than the product of someone’s ideas and actions. In other words, design solidifies, and naturalizes, things that start off as opinions, stories and traditions, supplying form to the fictions by which we live. We rarely stop to consider the faith-based proposition represented by our paper money or the imagined national narratives engendered by borders. Unlike words, the meaning of which can be debated, the objective materiality of designed objects exudes a unique power. Once established, it’s difficult to think outside the systems and structures these objects represent.”

-Michael Rock, “The Accidental Power of Design” (NY Times)

Not Knowing

“The vulnerability of not knowing is in fact the only portal through which breakthroughs occur.”

-Amy Whitaker, Art Thinking (2016)

Learning takes time

I recently found a sticky note from around 2004 on which I had written: “In the future, artificial intelligence may obviate the need for certain skills, but the act of learning is a very human process that proceeds at human pace.”

Since then I’ve been puzzling over what exactly I meant by that. (Is it a classic example of the obvious, or of the profound?) Here’s what I think I meant: the rate at which humans learn is essentially limited. Of course, the pace of learning does vary somewhat in different situations, but this variation tends to stay within, say, an order of magnitude, and there is no known way to get around these limits.

Whether out of impatience or optimism, I frequently forget this. As soon as I learn something useful or exciting, I start trying to rapidly explain it to others, expecting them to learn it using far less time and effort than it took me. While in some sense it’s generous to assume that others are such fast learners, it’s also a form of hubris for me to believe that my explanation is somehow orders of magnitude better than the explanation given to me. It would be as if I had somehow found a magical shortcut that makes teaching and learning easy.

None of this is to say that great teaching doesn’t exist — it does, and in every domain teachers have worked tirelessly to make learning more efficient and more fun. Motivation, support, curriculum, collaboration, and many other factors help optimize human learning. The point is simply that these optimizations have diminishing returns: there’s a theoretical maximum to the pace of learning, so no amount of optimization will let someone learn all of calculus in one week. (On the other hand, it is possible to teach in a way that is arbitrarily ineffective — say, by putting a student in a room with no supplies and yelling at them all day.)

If all of this is true, then we would not expect any educational technology to dramatically improve student outcomes, unless we were replacing dramatically ineffective prior teaching methods. In other words, to claim that a dramatic improvement is possible is to state that the current methods are dramatically poor — in which case, there are probably plenty of alternative improvements which do not require advanced technology.

The only way technology can dramatically “speed up learning” is by making certain skills obsolete, so that it is no longer necessary to learn them at all. For example, humans in the developed world can get along fine without knowing how to grow food or survive in the wilderness (or do long division). Modern technologies have obviated the need for those skills. Some people might still want to learn them, but for those who don’t, the learning time has been effectively reduced to zero — a dramatic advance! That in turn leaves time and energy for learning other things that have become more important (say, computer skills).

This is why my focus within educational technology has always been on obviating the need for a skill rather than on tweaking the learning process or transferring existing lessons to tablets and smart boards. This is why I’m so interested in Bret Victor’s work on graphical tools for math and programming that could obviate the need to learn traditional, difficult methods for algebra, calculus, and debugging. It’s why I’ve spent so much time working on tools that make it possible to do data analysis safely without learning all of the details of statistics and visualization techniques.

But perhaps my sticky note was simply a reminder to have patience for learning. Wisdom is prized precisely because it cannot be short-circuited.

The middle manager predicament

“Landing in the middle of the status hierarchy actually makes us less original. When [psychology researchers] asked people to generate ideas, their output was 34 percent less original after being randomly assigned a middle-manager role than a president or assistant role. In another experiment, merely thinking about a time that they were in a middle-status role caused participants to generate 20-25 percent fewer ideas […] than thinking about being in a high-status or low-status role.”

-Adam Grant, Originals (p. 84)

I suppose “death by middle management” is basically a cliché at this point. But normally I just hear people joking about it. It occurs to me now that middle managers are a good case study for examining the problems with traditional management hierarchies, because middle managers are subject to the stresses both of trying to please those above and of being responsible for those below. (To put it another way, the system gives them less than one person’s worth of power, yet more than one person’s worth of responsibility.)

The way I interpret the experimental results above is that being in a power hierarchy at any level stifles creativity, but middle managers receive a double dose.