I was having lunch in a Persian restaurant in London with my Nigerian friend Obi the other day when I had an epiphany of sorts. It wasn't a realization about globalization, though by the opening line of my post it could very well have been. Rather, a lightning bolt struck when Obi described to me the qualities he looked for in potential recruits for the unusual investment firm in which he is a partner. His company is "the largest - and sometimes only - foreign portfolio investor in many of the markets of sub-Saharan Africa." Obi and his colleagues spend their time analyzing corporate, economic and political information in that area of the world, in the hopes that this will help them make money for their clients.
I should point out at this juncture that Obi is not your normal investment analyst or fund manager. Far from being maniacally focused on making money, he is more of a Renaissance man of rare ability and kaleidoscopic interests. Obi is a medical doctor who can extemporize for hours about politics - from the sub-Saharan variety to Britain and America's arcane political systems - who discovered a zeal for corporate finance while doing an MBA at London Business School. We met there, and I've enjoyed his company tremendously - and been enlightened by his broad-ranging insights - ever since.
While he and I chatted over lunch that day, I asked him what types of people he sought out to join his firm. After all, few people had his rare combination of medical, business and geopolitical knowledge; surely, there weren't many other 'Obis' out there to snap up for the company. Given that, I expected him to say that they hired either people familiar with Africa and its public companies to whom they could teach corporate finance skills, or clever manipulators of the Black-Scholes options pricing model (an abstruse corporate finance tool) who could be instructed in the ways of the African continent. To my great surprise, Obi responded that while both financial analysis abilities and knowledge of African affairs were valued assets in a potential recruit, the most critical requirement for him was an appreciation for ... history.
I was astounded, and it must have been apparent by my reaction since Obi went on to explain why. It all made perfect sense after he did, too. In a nutshell, the landscape of public companies in Africa today resembles, in an approximate sense, the state of corporate America at the beginning of the 20th century. The uber-corporations that stand astride the globe today began the last century as family-owned companies started in small towns (such as Sam Walton's Bentonville, Arkansas-based Wal-Mart). As they grew into the behemoths they have become, they passed through predictable - and well documented - stages of development and growth. As Obi pointed out, looking at Africa in 2007 is not unlike looking at America in 1907. Consequently, predicting the commercial 'future' in Africa - identifying the right companies and industries in which to invest - has a lot more to do with history and less to do with finance or, to a certain extent, local realities. Obi punctuated his point by adding matter-of-factly, "face it, Ion: there is very little today that is actually new." Cue the lightning bolt here ...
Of course, he was right, and not just about history's applicability to business but to all aspects of life. Every generation tends to think - often through an intoxicating combination of naivete and self-absorption - that the challenges they face are completely unique and unparalleled. While most people concede the existence of historical antecedents, few actually accept the proposition that progress is more of a wheel than a line - that when we experience change it is more often than not as a repetition of history rather than a completely new story.
Take globalization, for example. Authors have written reams about the impact of this 'new' phenomenon. One of my favorite authors, Thomas Friedman, wrote two excellent tomes on the subject, The Lexus and The Olive Tree and the more recent The World is Flat. Both books gained renown for performing the literary equivalent of capturing the lightning of globalization in a bottle. Friedman - and authors like him - have done much to help the lay person understand globalization, but they have also contributed to the common misperception that somehow this is a new development. Globalization - even as we know it - has been a rising tide of change at least since Jean-Baptiste Colbert, the architect of Mercantilism, imported Venetian glass and Flemish tapestries to France in the mid-17th century. Some more imaginative historians trace its roots all the way back to the Mongol Empire and the cross-continental capital and culture flows that stemmed from the Silk Road trade. Regardless of its actual starting point, globalization is neither new nor a particularly revolutionary phenomenon. However, the concept is often denuded of proper historical context and portrayed in the media today as both unfamiliar and unprecedented.
The vast majority of 'analysis' about the Internet is another example of a-historical hyperbole. Don't get me wrong: I'm obviously not one of those deluded souls who believes that the Internet is a passing fad. However, I do submit that the changes wrought by the Internet are both more familiar and less fantastical than the conventional wisdom would have us believe. The Economist's recent survey on the Internet and new media correctly put the Internet 'revolution' in perspective. Far from rejecting its legitimacy as a revolutionary force, the survey made the point that we've experienced such a revolution before and that this one would follow a similar pattern. The magazine likened the Internet's arrival to the emergence of moveable type in the mid-15th century, an era in which a new technology democratized knowledge (the Gutenberg Bible), "turbo-charged an information age" (The Renaissance) and set in motion forces that would reverberate centuries later (the modern ubiquity of mass media).
What The Economist did was to root a contemporary event or phenomenon into proper historical context. In effect, it subjected the Long Tail to the Long View, and in so doing gave us a critical and much-needed perspective on the present. As both the examples of globalization and the Internet show, what society perceives - and anoints - as new and never-before-seen is often neither. What too frequently is lacking is the intellectual reflex and rigor to look at the 'new' through the prism of the 'old'. Society would benefit, it seems to me, if more futurists acted as archeologists and the past informed more of our knowledge of the present.
Fortunately, History has been staging a bit of a comeback lately. With many of the complex conundrums that confound us today - from terrorism and the degradation of the environment to the war in Iraq - people appear to be turning to history to help make sense of it all. Little by little, historical perspectives weave their way into the public discourse on our modern maladies. Al-Qaeda is less and less seen as a terror cell that came to life on 9/11, and more properly put into context as a movement that arose from the 1980s Soviet occupation of Afghanistan and was catalyzed again by the stationing of American troops in Saudi Arabia at the end of the first Gulf War. The phenomenon of global warming was catapulted into the public consciousness by Al Gore's powerful documentary An Inconvenient Truth because it demonstrated that the Earth's temperatures were rising at a rate unseen in human history. The civil unrest in Iraq was once viewed as either the death throes of a recently-toppled totalitarian regime or the desperate final match strikes of a foreign-backed insurgency seeking to inflame the country. Today, it is seen through the prism of a millennia-old schism in Islam between Sunni and Shi'a - the contemporary boiling-over of a sectarian struggle that simmered for decades under Saddam's iron rule.
Even self-avowed anti-historians and current affairs columnists are looking to the past for answers now. The famously anti-intellectual George W. Bush was recently reading, at Henry Kissinger's suggestion, Alistair Horne's A Savage War of Peace, about France's experiences in a guerrilla war against Muslims in Algeria in the mid 1950s, to help frame his foreign policy towards Iraq. In a stark illustration of how 'hip' historical analysis has become, New York Times opinion writer Nicholas Kristof recently reached as far back as Virgil and the travails of Thucydides to offer the appropriate historical analogy to W's Iraq adventure.
History, like the height of hem lines, comes in and out of fashion. After all, it wasn't that long ago (1989) that Francis Fukuyama famously wrote about the 'The End of History'. George Will more recently mused sardonically about the Fukuyaman viewpoint - and the atypical period of peace, progress and prosperity that spawned it - as our collective 'vacation from history.' That holiday ended abruptly, on or about September 11, 2001. Thereafter, the world faced sufficiently chilling and complex challenges that we have increasingly turned to history for context, comfort and courses of action. This is as it should be. George Santayana's famous dictum that "those who cannot remember the past are condemned to repeat it" is as true today as it was a century ago when it was written. These days, however, his aphorism could be updated to say that those who do not listen to the past are condemned to misunderstand the present. Simply put, yesterday gives us perspective on today and a chance at properly explaining tomorrow. After all, as my friend Obi put it so well, there is very little that is truly new anymore ...
The Power of Perpendicular Thinking
Do you know why we have a 40 hour work week? The answer might shock you. Its origins date to the Industrial Revolution (not the Internet Revolution - the industrial one). Back in the middle of the 18th century, it was common for people to work 10 to 12 hours a day, 6 days a week. After almost a century of effort by organized labor to coalesce around a common, acceptable work standard, the US Congress ultimately passed the Fair Labor Standards Act of 1938 and formalized our current work week.
Tim Ferriss famously upended this concept with his book and 'lifestyle design' manifesto, The Four Hour Work Week. In it, he pointed out how anachronistic this concept of 'work' was in light of our contemporary, always-connected knowledge economy. He went further, remarking that all of the 'common sense rules' of the real world are actually "a fragile collection of socially reinforced illusions." But he must have been exaggerating, right? Surely this was an example of argument-by-anecdote; surely most - if not almost all - of what we're told, taught or too terrified to question is really true ... isn't it? Let me answer that question with another question ...
Have you ever wondered why we take summer vacations? Because we needed our kids to help plough corn fields.
The practice is an outdated legacy of the agrarian economy. Summer vacations were not originally 'vacations' at all. They were meant to allow farmers' children (a critical part of the work force at the time) to pitch in during harvest season. Small family farming began to recede in the 19th century - around the time the Industrial Revolution began, and the 40 hour work week took hold - and yet we continue to live with the legacy of both a decade into the 21st century.
This isn't just a quaint custom that we've preserved past its sell-by date, either. As Time Magazine pointed out in a cover story this past August, The Case Against Summer Vacation:
"when American students are competing with children around the world, who are in many cases spending 4 weeks longer in school a year, larking through summer is a luxury we can't afford. What's more, for many children - especially children of low-income families - summer is a season of boredom, inactivity and isolation ... Dull summers take a steep toll, as researchers have been documenting for more than a century. Deprived of healthy stimulation, millions of low-income kids lose a significant amount of what they learn during the school year."
It's called the 'summer slide', and its most pernicious impact is that the problem it creates compounds year after year. By ninth grade, "summer learning loss could be blamed for two-thirds of the achievement gap separating income groups." Whereas the 40 hour week is merely an inconvenience foisted on an unsuspecting workforce, the unintended consequences of summer vacation are bad at best, and a key driver of social inequality at worst.
Noticing a pattern here? The institutions that form the heart of our day to day existence - working 9 to 5, weekends off, summer vacations - are actually throwbacks to other times, anachronisms from other centuries. What we regard as the benign realities of life are actually harmful, outdated societal habits.
You might be tempted to dismiss this as cantankerous post-modernism, but people have been warning us about this for some time now. A century and a half ago, Oscar Wilde pithily summed up what could be the proto-philosophy behind this thinking when he dryly pointed out that "everything popular is wrong." From across the Atlantic, his contemporary Mark Twain added, as if he were nodding in assent: "When you find yourself on the side of the majority, it is time to pause and reflect."
The 40 hour work week and the summer vacation are examples of what some people call the 'QWERTY Effect' (named after the curious reason why most Anglo-Saxon keyboards have letters that spell out Q-W-E-R-T-Y in the upper left-hand corner). In the era when people wrote on typewriters, the mechanics of the earliest machines were very delicate. In fact, the main design challenge was to prevent users from typing too quickly and jamming the type-bars together. In 1873, Christopher Latham Sholes managed to slow down the speed at which people could type by laying out the keys in one of the least efficient ways possible, and the QWERTY layout was born. Fast forward almost 150 years, and the keyboard on which I'm typing (should I write 'keying'?) this post still maintains the QWERTY layout (even though there's no type-bar or ribbon to be found anywhere).
The QWERTY keyboard has become a metaphor for practices that persist well past their point of relevance, and for the power of high switching costs (an economic term quantifying the resistance to change) in preserving outmoded but deeply ingrained activities. When I realized that so many aspects of my life - from how I work to how I type - were based on old habits, I began to wonder what else we've been conditioned to accept as 'fact' without critically assessing it first. It was then that I discovered the power of what I've come to call 'Perpendicular Thinking'.
In geometry, a perpendicular line is one that meets another at a perfect, 90 degree right angle. Aside from the association with the word 'right' (as in correct, not conservative!), I describe the approach this way because we need to develop the reflex of coming at any commonly-held practice from an angle - and preferably one of professional scepticism.
Let me be clear: I'm not advocating blind, contrarian opposition. Indeed, this would be as short-sighted as taking what's given to us as a given. Instead, I believe that we owe it to ourselves to question what we're told, investigate if it's still true, and make a personal, conscious determination about whether to embrace or reject it.
Perpendicular Thinking is not about adopting conspiracy theories, either. I don't believe that we are being misled by unseen Masters of the Universe. Rather, this is more likely a convergence of two intellectual sins: sloth and inertia. We are too lazy to question what we are told, and too hidebound by tradition to change the way we've always done things. But while there is no secret 'star chamber' pulling society's strings, we are still being negligent if we don't examine what we're being sold. We owe it to ourselves to question conventional wisdom; and once we do, we can't help but realize that we're surrounded by 'truths' that are not actually true. Let's look at just 3 examples of modern myths: you must buy, not rent; the secret to success is to work harder; and it is possible to do two things at once (well).
1) Renting v. Owning
Consider the axiom that you should own property. For generations, people were taught that owning one's home was the epitome of the American Dream. This irrationally exuberant (to borrow Alan Greenspan's colourful phrase) idea spread elsewhere - as far away as Iceland and Ireland, we learned later - and a worldwide property bubble ensued. The Great Recession of 2008 brought real estate values crashing down, and with it the now clearly foolish idea that everyone should strive to have their own suburban McMansions.
But we should have realized this sooner; as early as 2005, The Economist had written an editorial (a leader, in their parlance) arguing that, even in the midst of the boom, renting was a more sensible option for most people. The reasons were simple: a renter got more house for their money, and they had the flexibility to deal with changing economic circumstances and job opportunities by being mobile (a fact that would prove critical after many lost their jobs 3 years later). The argument, while cogent and compelling, ran counter to the prevailing 'wisdom' of the moment, and few followed The Economist's sage advice. Five years on, another magazine, Time (a consistently contrarian weekly, it seems) ran a different cover story entitled: "Rethinking Homeownership: Why owning a home may no longer make economic sense." If only we had been paying attention ....
2) Less is More
In 1906, the Italian economist Vilfredo Pareto discovered a fascinating 'power law' of life: that roughly 80% of the effects come from 20% of the causes. The Pareto Principle, or the 80 | 20 Rule, as it's more commonly known, has proved to apply to everything from profits (80% of a company's profits come from 20% of its customers) to personal style (20% of your clothes will be worn 80% of the time). Richard Koch even wrote an entire book on the idea, methodically enumerating the dozens of applications - and implications - of this concept.
What makes this a particularly 'perpendicular' idea is its counterintuitive message that less is indeed more. In an era where people foolishly boast about 80 hour work weeks (forget 4 hour ones!), the underlying message is that we ought to live to work. The power of the Pareto Principle lies in the revelation that we need to work smarter, not harder, in order to succeed. Applying the 80 | 20 Rule to all aspects of your life will revolutionize the way you approach problems, even as it reverses one of our society's central scripts.
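For the statistically curious, the 80 | 20 pattern is easy to see for yourself with a few lines of code. The sketch below is purely illustrative - the 'customers' are synthetic, and the shape parameter 1.16 is simply the value at which a Pareto distribution yields roughly an 80/20 split - but it shows how naturally the lopsidedness emerges:

```python
# Illustrative sketch: simulate 'customer revenues' drawn from a Pareto
# distribution, then measure how much of the total the top 20% account for.
# These are synthetic numbers, not data from Koch's book.
import random

random.seed(42)
alpha = 1.16  # shape parameter that produces roughly an 80/20 split

# 100,000 simulated customers, sorted from biggest spender to smallest
revenues = sorted((random.paretovariate(alpha) for _ in range(100_000)),
                  reverse=True)

top_20_pct = revenues[: len(revenues) // 5]  # the top fifth of customers
share = sum(top_20_pct) / sum(revenues)
print(f"Top 20% of customers generate {share:.0%} of revenue")
```

Run it and the top fifth reliably accounts for the lion's share of the total - no conspiracy required, just the mathematics of a power law.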
3) The Multitasking Myth
Modern life has left us with many misconceptions, perhaps the most pernicious being the conceit that we can do many things at once. This stems from our increasingly symbiotic relationship with computers, and the tendency to converge their functionality with ours. As their processing power increased year after year, computers gained the ability to run multiple programs concurrently. However, while our machines were getting more powerful, our minds were not. Humans, alas, don't benefit from regular hardware upgrades. Nevertheless, we have come to believe that we can operate as efficiently as our devices do, and the result is the myth of multitasking.
Don't let its benign moniker fool you: multitasking is to the millennial generation what marijuana was to the sixties generation. It is the defining - and destructive - opiate of its time, and it is responsible for far more than some jumbled e-mails or errors on an Excel spreadsheet. First, while we believe that we are getting more done in less time by balancing three things at once, research tells another story. Studies demonstrate that when we are 'multitasking', we aren't actually performing each activity concurrently (as PCs do) but rather toggling from task to task. Each attention change brings switching costs (there's that word again), which ultimately means that it takes more time to do multiple things simultaneously than it would to do them sequentially (one after the other). In other words, multitasking is not more efficient than single-tasking; the activities take longer and, unsurprisingly, are more prone to error.
It gets worse. Multitasking is actually a life-threatening habit: texting while driving is more dangerous than drunk driving. One oft-cited study found that a driver texting at 35 mph travels an extra 25 feet before bringing the car to a complete halt, compared to the extra 4 feet a legally drunk driver would cover at the same speed.
Finally, multitasking is even adversely rewiring our minds. Don't believe me? Read The Shallows by Nicholas Carr and you'll be more frightened by reality than you were by 'Paranormal Activity 2'. Carr's book is a cri de coeur on how technology is transforming us - for the worse. We are losing our ability to concentrate and luxuriate in deep reading, opting instead for what Tim Ferriss calls the cocaine pellet dispenser of digital distractions. We contemplate much less; we skim and scan more. Worst of all, our brains are being scrambled by the very practices that we adopt to keep up with the frenetic pace of modern life. Multitasking is yet another big lie we tell ourselves, but - much like the 'summer slide' - the consequences are compounding in our minds every day.
Perpendicular Thinking is part of my personal Pop Philosophy
Perhaps now you understand why it's so important to 'be perpendicular'? I have definitely tried to embrace this philosophical approach. In fact, looking back on my life so far, I've noticed that I've often followed - for better and for worse - my own north star in both actions and thoughts. Indeed, I was perpendicular before I discovered perpendicularity.
I left Montreal right after graduating from university to pursue an adventure in politics in Washington, DC - as the only Canadian on Capitol Hill, no less - at a time when most of my friends were still clinging to the safe harbour of home. Then, 6 years into a reasonably successful career as a Congressional Press Secretary, I pivoted out of politics and into business school - but in London, England for good measure. I spent the next decade in media and telecommunications, half in Europe and half in North America, before making my latest transition into consulting, speaking and writing. My career path has not been haphazard, but it certainly hasn't followed the linear direction that some of my friends' trajectories have - either in geography or in profession. The road I've taken has had its fair share of right angle turns along the way.
In my intellectual journey, I've followed a similar path. Many of the ideas that have caught my eye or sprung from my imagination share the common gene of contrarian thinking. I've written about being alone together, the sex appeal of single moms, the significance of exits over entrances, the 'return' of history and the importance of making magnificent mistakes. Even the raison d'être of this blog - to think seriously about not-so-serious things - is a manifestation of my penchant for perpendicularity.
Along the way, I've taken inspiration from a number of my friends who have had the courage to walk alone in one form or another. I think of my mate Hendo, who stepped away from a lucrative career trading derivatives to work for the World Wildlife Fund, or Ben, who started a new media company at a stage in life - just married, bambino on the way - when others are normally looking to join an established enterprise. I remember a pivotal talk with my great friend Mike Rodman, who told me that, after careful deliberation, he and his wife had decided that they weren't going to have children. It struck me that this was the first time anyone in my circle had openly taken the other side of the parenthood argument - thoughtfully, sincerely and coherently embracing the 'heretical' idea of choosing not to found a family. I respect his decision, but I admire him even more for having the confidence to think on his own, and the courage to take the road less travelled.
Perpendicular thinking is not just useful as a philosophy of life, however. It can also be a sound business strategy. Henry Ford was fond of saying that if he had just listened to his customers, he would have produced a better horse and buggy. Ford realized that if you waited around for someone to ask for your product, you might be too late. Jeff Bezos understood this when he built Amazon in an age when bricks-and-mortar bookstores - and businesses - were the way people overwhelmingly shopped.
Legendary investor Warren Buffett thinks much the same way. He points out that "to make big money in the investment world, you have to learn to think independently; to think independently, you need to be comfortable standing alone. I buy stocks when the lemmings are headed the other way." Buffett walks the walk, too: he bought into Berkshire Hathaway when no one wanted it; he bought into American Express when no one wanted it; he bought into the Washington Post Company when no one wanted it; and most recently, at the bottom of the recent Great Recession, he bought into Goldman Sachs when no one wanted it. He knows the time to buy a stock is when everyone else is selling it - not when everyone else is buying it.
Steve Jobs is cut from the same cloth. He looked at a music industry ravaged by piracy and not yet awash in MP3 players, married an easy, affordable (and oh by the way legal) digital music delivery system (iTunes) to an elegant, eminently portable and oh-so-cool device (iPod) and changed the game forever. As one of his ads famously advocated, Steve Jobs thought differently.
Henry Ford, Jeff Bezos, Warren Buffett and Steve Jobs all proved what business writer Alan Webber once noted: "Counterintuitive can be a great economic model." He should know, too: he went from being editor of the Harvard Business Review to reinventing the business magazine when he launched Fast Company in 1995. These men all parlayed perpendicular thinking into profits, but to reduce this to a novel commercial approach would be missing the point. We owe it to ourselves to follow their lead, but also Mark Twain's and my friend Mike Rodman's as well. We need to become more comfortable about going against the grain.
Living a Perpendicular Life
So what does this all mean for you? How can you adopt these principles and live a perpendicular life?
Too many of us are afraid to be ourselves, so we follow the crowd off the cliff. Robert F. Kennedy, borrowing a line from George Bernard Shaw, once observed: "There are those that look at things the way they are, and ask why? I dream of things that never were, and ask why not?" I'm not usually one to disagree with a Kennedy, but I take issue with the predicate: not enough people go through life really asking 'why', in my opinion. That is at the heart of perpendicular thinking - to come at conventional wisdom obliquely, from an angle, and develop the habit of asking why, or why not? So let me leave you with some suggestions:
Don't subcontract critical thinking to society. More often than not, conventional wisdom is more reliant on convention than sagacity.
Be consistently counterintuitive. Always stop and think when someone says that you're 'supposed' to do something. Are you buying a house because you really want to own, or because you've been told that it's what responsible, smart people do?
Flip the script. Learn from children (as Adora Svitak's delightful TED Talk wisely counsels). Make your mistakes magnificent, by focusing on understanding your own failures rather than trying to reverse-engineer others successes. Plan your exits as carefully as you do your entrances.
Be an innovator rather than a follower. It worked for Warren Buffett and Steve Jobs; what could it do for you?
Contemplate - perhaps even take - the road less travelled. It may turn out that you're the only one who knows what they're doing.
Be uncommon. Always try to be in a 'Category of One' in business and in life. A noted businessman and philosopher once said, "you want to be considered the only ones who do what you do." His name: Jerry Garcia. His achievement: turning The Grateful Dead into the most successful touring band of all time.
Dare to be 'un' popular, in the Wildean sense of the term. Don't worry about what others are doing; worry about whether you're doing the right thing.
Finally, adopt the 'X-Files' approach to life: Trust no one and nothing. What QWERTY practices are you tolerating in your life? Find out.
If you start to ask questions like this, you will be amazed by some of the answers. Make them part of your life philosophy, and tap into the power of perpendicular thinking. But don't just take my word for it ...