Most people don't make very much of their bread toasters. These small but hardy metal boxes often come at low prices (from $7) and are not terribly difficult to operate. All in all, the toaster is hardly an example of a sophisticated, complicated or inventive home appliance.
Enter Thomas Thwaites, an art student who made it his mission to re-create the basic bread toaster. In what he called 'The Toaster Project', Thwaites set out to build a common toaster from scratch, using only 15th century "pre-industrial tools and techniques". This would include mining iron ore, smelting it to derive the toaster's metal parts and sourcing other components like copper and mica.
Commendable as his ambitions might have been, Thwaites soon found himself cheating. He used the internet throughout the project as his source of reference. When good old 15th century fire didn't work, he used a microwave to smelt iron. He also used hair dryers and leaf blowers along the way - tools that are arguably more complex than the basic toaster.
In the end, the final product was "half-baked". When connected to a car battery, the machine could purportedly warm bread but not produce toast. Hopeless as this result was, Thwaites' experience in trying to become the ultimate toaster expert is not dissimilar to how people tend to approach real-life issues.
For Tim Harford, author of popular books such as The Undercover Economist, Adapt and The Logic of Life, it is this over-simplification of the innately complex that perpetuates some of the most pressing social, economic and political problems confronting us today: We think we know the toaster, but we do not.
Speaking at a Foreign Correspondents Association talk at Singapore Management University (SMU), Harford joked that while there is no single person in the world who knows how to make metaphorical toasters from start to finish, people seem to expect leaders, experts and gurus to be our toaster messiahs.
Dear CEO, please fix the toaster.
People and societies have traditionally looked to their leaders for answers. "We think if we find the right leader - a new political leader, a new CEO, head of department, editor, etc - that this leader will solve all our problems for us." These kinds of expectations were certainly present when Barack Obama was elected US President, said Harford.
People expected that everything would change and that all of the problems left behind by the previous administration would be fixed by this new president - almost as if an "almost religious leader has come to solve America's problems and the world's problems". This "logic" has been applied to many a political leader who will, in the end, fall short of expectations.
"This isn't because we keep electing the wrong leaders. It is because we have an inflated sense of what leadership can achieve in the modern world," Harford wrote in Adapt. One might argue, however, that presidents and leaders do not operate on their own; that they have access to resources, to teams of expert advisers, etc; that this makes failure less acceptable.
Harford's response to such arguments is to cite the seminal work of Philip E. Tetlock, a professor of leadership at the University of California, Berkeley. In a study that held hundreds of expert predictions up against actual outcomes over time, Tetlock found these forecasts to be largely inaccurate. This was true across the domains of economics, politics and the social sciences.
Expert forecasts may be more accurate in the hard sciences, Harford argued, but when it comes to social, economic or political problems, "the experts can't do it". Because these systems are far too large and complex, it is unrealistic to expect anyone to comprehend them fully enough to make accurate predictions.
While Harford does not discount the value of expert advice, he finds it amusing that people and institutions seem eager to attach significant weight to what is essentially guesswork and premonition. "You know you won't get a good answer, but it's strange we keep asking the questions and asking for forecasts when these forecasts are always terrible."
Learning from failed toasters
Experts in science and technology fail quite often too. Take the example of the Gutenberg printing press, an invention that caused Johannes Gutenberg (and many others who followed) to go bankrupt. The German inventor had thought he might make money off his creation by reproducing the most popular book in Europe at the time: the Bible.
It made perfect sense, Harford said. But in hindsight, the idea was a complete failure because nobody wanted to buy a mass-produced Bible; people wanted their Bibles handwritten, which was the norm in 15th century Europe. So, despite a huge technological breakthrough, Gutenberg did not die a rich man. In fact, the printing industry did not take off until much later, as subsequent producers learned from the failings of those who came before them.
"Failure is important; it is the process of replacing bad ideas with good ideas, good ideas with better ideas," Harford said to a group of executives at Google recently. The Googleplex audience looked unimpressed; as if to signal to him that he was stating the obvious, Harford recounted. "And this is how Google sees the world: If you want to succeed, fail faster."
Some of the internet giant's most recent underperforming products include Google Wave, Google Buzz and Orkut. In fact, Marissa Mayer, a Google vice president, once predicted that 60-80 per cent of the company's products would fail. She qualified this by adding that failure is acceptable at Google because the company encourages a culture of risk-taking.
"That is how they do business... You can see them making all these small bets; little experiments to see what works. They are very much a failure-tolerant company and they are one of the most successful companies in the world," said Harford.
Fail productively
Google's liberal approach towards risk-taking may not apply to those outside the creative, high-tech and market-driven industries. Take a university, for example. If a particular area of research fails more often than it succeeds, should it be shut down? Can a politician afford to take risks when the price of failure is his or her job?
Failures can be deadly too. Here, Harford points to recent events in Iraq and at the Fukushima nuclear plant, where it would be exceedingly difficult to accept failure as part of a learning process or as an antecedent to future success. Harford thus offers three broad principles for "failing productively":
(1) Try lots of things: "If ten per cent of things fail every year... you will need to do a lot of experiments."
(2) Failures should be survivable: "Any individual failure has got to be acceptable - because you have to keep going after you've failed."
(3) Know the difference between success and failure: "(This is) much harder than we think, especially for hierarchical organisations. An example is the US Army in Iraq. That was a situation where it was clear to people on the ground that there was a terrible failure occurring, but to the guys at the top, there was no problem and everything looked fine."
While the points above, in that order, certainly bring to mind Niebuhr's Serenity Prayer, Harford highlighted that they speak to one of the most basic features of the human psyche: the ego. We avoid risks to avoid failing. And when failure comes, we deny it to avoid criticism.
These tendencies can feed a vicious cycle too, he noted. "We like people to tell us everything's great. We seek out people who tell us that. We avoid people who point out problems, and that makes it much harder for us to spot failures." The 2008 subprime crisis is an example of what can happen when denial festers on a massive scale.
Taking a broader, macro perspective, Harford feels that progress in democratic societies ultimately relies on their openness to change. Hierarchies and institutions cannot be expected to experiment when the political risks are high. "We don't reward this as citizens and we don't value this as voters," he noted.
On the contrary, citizens and voters are likelier to reward rigidity. Two of the most successful post-WWII UK politicians have been Margaret Thatcher and Tony Blair, said Harford. Thatcher once notably said, "You turn if you want to. The lady's not for turning." Blair similarly said, "I don't have a reverse gear."
"Now, if I were to sell you a car and say that it doesn't turn or that it doesn't reverse, you probably won't buy that car. But British voters bought into these politicians. They won six election victories (all the while) boasting about their inflexibility; that they couldn't change direction; that they couldn't go back," he exclaimed.
Sure, one might argue that politics is about inspiring confidence. And so we arrive again at society's idea of worshipping a messiah who sees all, knows all and must fix all. "If the world were a simple place, that would be fine," said Harford. But deep, complex problems are not solved by rigidity and inflexibility.
"We solve problems through experimentation, and if we don't get it right the first time, we solve problems through ever correction... trial and error, and then we adapt," he said. "And until we take that more seriously; until we support it as voters, customers and members (of society); we won't be able to solve the problems that face us."