Sensemaking is retrospective

This article originally appeared on the Amsterdam Ad Blog.

In advertising, reaching a conclusion through a logical thought process in which one thing leads to another is seen as highly responsible – whereas admitting that the inspiration came first, and the testing only afterwards, is to risk being accused of post-rationalisation.

But why shy away from post-rationalisation when it is a fundamental skill of our game – the art of applying understanding and coherence to an intuitive leap?

Sure, ours is an industry that’s suspicious of science. This is because science is used to dismiss many of our promising ideas. There is a name for this kind of science – ‘bad science’.

Rory Sutherland of Ogilvy describes bad science as the urge to apply the kind of mathematical certainty found in physics and engineering to complex mechanisms like the psychology of consumer behaviour – a practice we also call marketing.

But real scientists are dismissive of this kind of bad scientific thinking. Those who embrace it are the non-scientists, whose backgrounds include management, finance, economics and so on. Such people suffer from a trait which real scientists refer to as ‘physics envy’ – the need to appear more mathematically linear than their disciplines require.

Hence, business schools are rife with this kind of bad scientific thinking. And so are businesses themselves, from corporations and consultancies to adtech, whose managers become ‘psychologically blind’ in their obsessive pursuit of logic and certainty.

Bad science makes them absorb all the methodology and linearity of science whilst rendering them incapable of appreciating the quality of anything that cannot be expressed in numerical terms. It produces the kind of thinking that attempts to codify creativity and judge advertising purely on metrics.

Jeremy Bullmore of WPP finds that a crucial problem with bad science is that it churns out people who are made to believe that ‘the scientific method used in the act of discovery eliminates luck, guesswork, wild hypotheses, human prejudice and naturally, post-rationalisation.’ Which leads them to devalue the role of creativity – the very force that has driven the biggest leaps within our industry.

By contrast, JWT’s Stephen King observes that the smartest thinkers apply a different kind of science which Sir Karl Popper refers to as ‘good science’ – driven by the spirit of adventure, imagination and enquiry. Popper recognises that science is a process of elimination – it cannot prove anything; it can only falsify ideas that are thought to be true.

It took nearly three thousand unsuccessful attempts before Thomas Edison created a bulb that would actually light up. First came inspiration; then came experimentation and a process of elimination; last came logic, which was applied to explain why the light bulb worked. Edison recognised that in thinking by intuition, logic comes last.

Similarly, asked about his amazing ability to consistently identify the best moves, Magnus Carlsen, who became the world’s top-ranked chess player at 19, said that he usually knows immediately what move to make next, not having to figure things out in the way the rest of us do. On being reminded of the number of times he had taken up to an hour between moves, Carlsen replied, ‘…well, because I have to, you know, verify my opinion; see that I haven’t missed anything.’

Cédric Villani is the equivalent of Magnus Carlsen in the world of mathematics. He has won the Fields Medal, often called the Nobel Prize of mathematics, awarded once every four years to the best in the world. Villani, who insists mathematics isn’t about numbers but concepts, explains how the highest form of mathematics is performed. He says that, like detectives, mathematicians use intuition to guess the solution and then logic to prove it.

Edison, Carlsen and Villani remind us that there are two parts to any creative process, which Bullmore identifies as ‘discovery’ and ‘proof of justification’ – and both need to be kept apart.

For instance, although penicillin may have been discovered accidentally, ‘a lot of retrospective digging had to be done before it was released onto a trusting public.’ Nobody said, ‘Here, take these tablets. I invented them yesterday. Trust me, I’m a scientist.’ Similarly, ‘This is a very good idea because I thought of it’ will almost always fail to win budget approval. Therefore, while discovery can be undisciplined, the act of discovery can never be its own justification.

Unfortunately, the stigma around post-rationalisation breeds the false belief that progress arrived at through tinkering or imaginative leaps is somehow cheating. When in reality, it is how most significant ideas, from the light bulb, penicillin, DNA, plastic, the X-ray, the microwave oven, the airplane, Viagra and Post-it Notes to the Internet, were created.

So just as scientists have skilfully led us to believe that almost all discoveries were arrived at through a sequential process, in ad agencies we have to post-rationalise our best ideas before we can sell them. This is because, in reality, the truly unconventional, behaviour-shifting ideas are often products of our unconscious – and this is simply too random and frightening for our physics-envious audience to accept.

Good agencies understand what Bill Bernbach meant when he said that ‘persuasion is an art, not a science.’ You simply will not arrive at great advertising through the rigorous application of logic. But good science, as Sutherland puts it, will help you understand and explain what is good about your idea – after you’ve had it.

Bullmore reflects Edward de Bono’s philosophy that it is sometimes necessary to get to the top of the mountain in order to discover the shortest way up. The point is, no matter how you look at it, post-rationalisation is true, and nobody gains by pretending it isn’t.

Therefore, post-rationalisation isn’t just essential, but a fundamental skill of our game. And it absolutely deserves respect for applying understanding and coherence to intuitive leaps.

Don’t apologise, adland

This article originally appeared on the Amsterdam Ad Blog.

It is often said that advertising makes people want more things. Sure, advertising can do that. But it very rarely does. In fact, it is more likely to have the opposite effect: make people happier with less – which is also the more lucrative use of advertising.

Why spend $1,000 or $10,000 on a watch that does nothing more than a $100 watch? The answer, of course, is status, figured Thorstein Veblen, an eccentric economist known for his satirical portrayal of the upper class in the 1899 classic, The Theory of the Leisure Class.

Turns out, people “conspicuously” spend on certain types of items, such as silverware or oversized houses, just to show their place in society. These items, called “Veblen goods”, are desired because they’re expensive (pleases the rich) or exclusive (pleases the snobs).

A “Veblen effect” exists when people reject a perfectly good solution, like a $100 watch, simply because it is “too common” or “not costly enough.” Rory Sutherland of Ogilvy points to Stella Artois’ slogan that tries to create a Veblen good with two words, “Reassuringly expensive.”

Advertising is basically criticised for amplifying Veblen effects: prompting people to want things they “don’t need” or “can’t afford”.

Except this age-old belief isn’t entirely true.

Firstly, Veblen effects don’t depend on advertising. People have used lavish items to show status since ancient times: tribal elites held women and slaves as trophies; the Romans threw gladiatorial spectacles; Cleopatra dissolved pearls in her drinks etc.

Secondly, Veblen goods don’t depend on advertising. Rolex watches, Rolls-Royce cars and Reinast toothbrushes sell not because of advertising, but for their rarity and price. Veblen goods, for the most part, are not promoted through ad agencies at all, but through PR.

Thirdly, almost all advertising that ad agencies produce is for things people need, or would be very reluctant to lose: insurance, detergents, food, broadband, washing machines, travel, beverages etc. And given the choice, which ultimately benefits people, it is advertising that helps them separate the wheat from the chaff.

Quite fittingly, ad agency Young & Rubicam ran an ad that read: “Yes, advertising does sell things that people don’t need…TV sets, radios, cars, ketchup, mattresses, and so on. People don’t really need these things. They don’t really need art or music or cathedrals…They don’t absolutely need literature, newspapers and historians. All people really need is a cave, a piece of meat, and possibly a fire.”

Put simply, advertising rarely adds to Veblen effects. In fact, as Sutherland explains, it is far more likely to have an anti-Veblen effect: create products that become “social levellers”.

Coca-Cola is the world’s best-selling soft drink, accepted by everyone when it should be rejected for being so common. Nike focuses on individual glory whilst stimulating the desire to belong to an egalitarian “Nike community”. Apple’s democratic values began with the slogan, “The computer for the rest of us”; its products, except the gold watch (still not purely Veblen), are affordable to most people. And anyone can wear the working man’s fabric, Levi’s denim; a benefit reserved for mass-advertised brands.

So advertising often works not by persuading people to trade up, but to retain their mass tastes instead. It favours an egalitarian society in which the rich people essentially buy the same things as the rest. “The President can’t get a better Coke than the bum on the street,” said Andy Warhol, the American artist who saw advertising as anti-elitist.

Brands that depend on mass advertising, like Coca-Cola, Nike, Apple, Levi’s, McDonald’s, Google, Sony, IKEA, Vodafone, Dove, Colgate, Volkswagen, Philips, Heinz, Virgin, ING, Marlboro, Kellogg’s, Heineken, Nestlé etc., have it in their interests to be commercial and democratic. It’s the things that aren’t advertised that create social inequality.

Therefore, to bash advertising for dividing society or persuading people to want expensive options is unfair and wrong.

Likewise, advertising is conveniently blamed for peddling the desire that new is better.

It might seem like our upgrade culture is the result of “planned obsolescence”: a strategy of deliberately making products with shortened life spans. But only brands with no competition benefit from such inefficiencies. For categories in which switching is an option – Apple to Samsung or vice versa – brands that peddle unnecessary upgrades lose consumers.

Instead, advertising mocks planned obsolescence; it redefines status; conditions us to resist Veblen effects; makes things palatable to the masses; and instils pride in our position as the middle class, which is discerning of price versus quality.

At its best, advertising presents a sensible lack of pretense; a no-nonsense practicality that respects the intellect of its audience. Which is why the likes of Volkswagen’s “Think Small” and “Live Below Your Means” are held up as the epitome of great advertising.

Besides, advertising can act as a psychological primer for important social change. Which means, we can tell people it’s cool to wear recycled clothes; save energy; eat food waste; stay comfortable; or even resist consumerism.

So, don’t apologise, adland.

Thanks for persuading people to be happy with less, rather than making them want more. We need more advertising, and far more categories of spending where mass advertising creates brands that become socially acceptable to all, destroying price discrimination and pretentiousness.

Stick to your guns.

Waste is magnificent

This article originally appeared on the Amsterdam Ad Blog.

Waste refers to the portion of an advertising budget that seems lavish and unnecessary. It underpins one of the world’s most widely held beliefs about advertising: that half of all advertising dollars are misspent. Jeremy Bullmore of WPP calls it a “superstition”. Not just because there’s no evidence that the retailer John Wanamaker actually said that, but because in a hundred years there’s been no evidence that companies that spent twice as much as they needed to on advertising suffered as a result. Yet the belief remains: that waste should be cut.

There’s only one problem: it’s the waste that adds to the effectiveness of an ad.

Take a peacock’s tail. It seems lavish, serves no functional purpose, and is counterproductive to its survival. But the tail also sends out a signal to peahens. “I’ve survived in spite of this huge tail; hence, I’m fitter and more attractive than others.” This is what biologists call “the handicap principle.” Across the animal kingdom, such lavish displays signal biological fitness.

Similarly, people are attracted to brands that invest in lavish displays, such as Super Bowl commercials or recession-time advertising, because such extravagance signals a successful brand. Rory Sutherland of Ogilvy points to Darwinian psychology, which shows that people attach huge significance to “brand-bling, confidence, display and conscious waste”. Which implies that companies with money to spare and expensive reputations rarely produce bad products. The relationship between waste and effectiveness is the subject of Tim Ambler and E. Ann Hollier’s paper, suitably titled “The Waste in Advertising Is the Part That Works”. Hence, the investment – the waste itself – is what makes the ad reliable.
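
To make the signalling logic concrete, here is a minimal sketch in Python. The margins and campaign costs are entirely hypothetical: the point is that a lavish campaign pays back only for a brand whose product is good enough to earn repeat purchases, so only such brands can afford the ‘waste’.

```python
# A minimal sketch of advertising as a costly signal, in the spirit of
# the handicap principle. All figures are hypothetical.
def campaign_payoff(yearly_margin, campaign_cost, years_of_repeat_business):
    """Net gain from a lavish campaign over its payback period."""
    return yearly_margin * years_of_repeat_business - campaign_cost

# A good product keeps its customers; a bad one earns one year of trial.
good_brand = campaign_payoff(yearly_margin=30, campaign_cost=100,
                             years_of_repeat_business=5)
bad_brand = campaign_payoff(yearly_margin=30, campaign_cost=100,
                            years_of_repeat_business=1)

print(f"good brand: {good_brand:+d}")  # +50: the 'wasteful' signal pays back
print(f"bad brand: {bad_brand:+d}")    # -70: the signal never pays back
```

Because only brands that expect repeat business come out ahead, the extravagance itself becomes believable information.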

That’s why, when our industry compared itself with Hollywood and aspired to ‘magnificence’, it created advertising that had impact. Sure, many called it wasteful and indulgent. But it got people talking. Cabbies spoke about it. Kids spoke about it. Sometimes prime ministers spoke about it. It made brands famous. People buy brands they’ve heard of, because fame is a huge source of reassurance. Brands like Coca-Cola, Levi’s, BMW, Heineken, Nike and Virgin are famous because their lavish exposure deliberately exceeds their categories and audiences. Everyone knows them, not just those who buy or use them.

And then marketers rushed to adtech, whose breed of snake oil salesmen treated advertising as a science, replacing creativity and intuition with rationality and data. Their weapon of choice: targeting. An elixir that attempts to limit the exposure of advertising to those already or potentially in the market for any given product. Ridiculous enough to compound people’s existing decisions by repeatedly showing them what they’ve already decided against.

Adtech’s promise of effectiveness, and its claim to know which half of the advertising budget is wasteful, proved too tempting. For the chance to be successful with limited or no advertising at all seemed too good to be true. Why would a marketer continue to spend huge amounts of money to reach millions, many of whom may have no interest in the category, let alone the brand?

A case to cut the waste in advertising.

But the numbers prove otherwise. Real brand growth, across almost all categories, comes not from loyalists, but from occasional users. This is the subject of Professor Byron Sharp’s book, How Brands Grow. For instance, most of Coca-Cola’s sales come from occasional drinkers; the ones who buy it just once or twice a year. Surely, these people aren’t likely to do anything, let alone engage, with the brand online. Hence, adtech’s claim to fame – cutting waste by targeting specific people, such as loyalists, based on their online behaviour – is not only statistically ineffective, but kills the very signal a brand needs to send out to everyone in order to find fame.
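
A small simulation makes the arithmetic of light buyers concrete. This is a minimal sketch assuming a gamma-distributed spread of individual purchase rates (the NBD-style model Sharp’s work draws on); the shape and scale values are illustrative, not Coca-Cola’s.

```python
# A minimal sketch of why occasional buyers dominate sales volume under
# a gamma-Poisson (NBD) purchase model. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_buyers = 1_000_000
# Long-tailed individual purchase rates: most people buy rarely.
rates = rng.gamma(shape=0.3, scale=2.0, size=n_buyers)  # mean 0.6 buys/year
purchases = rng.poisson(rates)                          # buys in one year

active = purchases[purchases > 0]  # people who bought at all
light = active <= 2                # the once-or-twice-a-year buyers
print(f"light buyers: {light.mean():.0%} of the brand's buyers")
print(f"their share of total volume: {active[light].sum() / active.sum():.0%}")
```

Targeting only the heavy, engaged minority leaves most of that volume out of reach.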

So just as animals use wasteful characteristics to signal their biological fitness, excesses in advertising signal brand strength. Precisely the part that adtech wants to cut. The writer Don Marti notes that targeting not only “burns out” a medium, but also turns an ad into the “digital version of a cold call.”

Admen like Bullmore recognise that publicists have known this instinctively. When the Beatles came to the US in 1964, their manager, Brian Epstein, didn’t set up a series of targeted interviews with fan magazines. Instead, he got them three appearances on the Ed Sullivan Show, with an audience of about seventy million for each show. Only a fraction of them might have gone on to buy a Beatles record or a ticket. But it’s unlikely that Epstein thought of this exposure as wasted.

It’s become easier to call the bluff on adtech’s delusional claims, which are clearly not as effective as they may seem. But the damage is done. Adtech has successfully replaced our industry’s once exciting, Hollywood-esque aspiration of creating exposure and fame with a boring, unambitious aspiration of being accountable.

Perhaps it’s time we reconsider our industry’s aspirations, and convince our friends that cutting the waste in advertising means cutting the part that works. The part that actually builds brands.

Don’t lose your feathers. They’re what makes you dazzle.

The illusion of price

A McDonald’s burger that cost just 20 cents a few decades ago now costs around $5. Putting astronauts on the moon, which cost $25 billion a few decades ago, would now cost almost $160 billion.

It seems like everything is more expensive these days. It seems like our earnings don’t match the prices we’re expected to pay these days. It seems like yesterday’s prices are a bargain compared to today’s. But, in most cases, this isn’t true. It is rather, at best, an illusion.

We define prices in money, which doesn’t have a constant value. As prices go up over time, the value of money goes down. Therefore, the prices we pay aren’t an accurate guide to the real cost of things.

A better way to understand the real worth of things is to measure it against something that doesn’t change — time. Time, in terms of real productivity, is determined by hours and minutes of work. Which is why Benjamin Franklin famously said, ‘Time is money.’

Hence, the real price of what we buy makes more sense if we mark it, not against money, but against the length of time we have to work in order to earn the money for it. Henry David Thoreau sums it up well: ‘The cost of a thing is the amount of what I will call life.’ Productivity determines the value of our time, i.e. our wages.

So, if we convert prices into hours and minutes of work, the cost of just about everything we consume today hasn’t actually gone up, but gone down; some to the point of a bargain.

Numbers from the U.S. Bureau of Labor Statistics, Department of Housing and Urban Development, Motion Picture Association of America, Hertz Corporation and Retail and Consumer Reports help calculate the evolution of change, from the 70s till now.

The time taken to buy eggs has gone down by 45%. The time taken to buy an air conditioner has gone down from 45 hours to 23 hours. The time taken to put three square meals on the table has gone down from 2 hours and 22 minutes to 1 hour and 45 minutes. The time taken to buy a movie ticket has gone down from 28 minutes to 19 minutes. The time taken to buy a house with a 5% down payment has gone down by 9 months of work. The time spent on the job to rent a car has gone down by 37%. The time taken to buy a colour TV has gone down from 1 month’s work to 3 days’ work. The time taken to buy a microwave oven has gone down from 97 hours to 15 hours. Be it food, clothing, petrol, electricity, cinema tickets, laundry, phone calls, housing or the like, it now takes us less time to afford them than ever before.
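
The arithmetic behind these comparisons is just division: price over hourly wage. Here is a minimal sketch, with hypothetical wage and price figures chosen to echo the microwave example above.

```python
# A minimal sketch of the 'time price' idea: mark goods against work time
# instead of money. The wages and prices below are hypothetical.
def time_price(price, hourly_wage):
    """Hours of work needed to afford an item."""
    return price / hourly_wage

# A microwave oven, then and now, at illustrative wages.
then = time_price(price=400, hourly_wage=4.1)   # ~97 hours of work
now = time_price(price=150, hourly_wage=10.0)   # 15 hours of work
print(f"then: {then:.0f} hours of work, now: {now:.0f} hours of work")
```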

We’re better off as a result of our wages, which, thanks to productivity, have gone up faster than prices. Thanks to the companies and industries that employ us. But, as economists Michael Cox and Richard Alm point out in Myths of Rich and Poor, capitalism’s critics wrongly believe the economy benefits the wealthy at the expense of the poor. They explain that economic progress emerges from a system of price discrimination against the wealthy, not against the working class. While the rich take advantage of the masses in most economic systems, under capitalism it is the masses that benefit at the expense of the rich. The rich pay most of the early fixed costs, making things available to the masses; companies know that the real money lies not in selling to the rich, but in bringing products within reach of the masses.

Therefore, without the rich, fewer things would get to the rest of us. There is relentless competition between companies and industries on price and quality. And those who benefit from such competition are people, who get the best of both worlds: improved products for less effort. People get better value – more for their money and more money for their time. What were once luxuries have now become everyday necessities. As a result, many new industries and jobs have been created. That is real capitalism at work.

With our hourly output rising over time, the cost of production has fallen, pushing wages up. The idea of working less to buy more explains why we can afford to consume so much more than before.

Despite getting and spending more, many of us see a world only of rising prices, and fail to recognise how much further each hour of our work goes. By simply marking the real cost of our consumption against ‘time’ instead of ‘money’, we can free ourselves from price illusions.

Now or later

The decision to pick now over later is a phenomenon that’s prevalent across everything, from biology to economics. Genes must decide whether to build strong young bodies now at the expense of weak ones later. Plants must decide whether to use up their resources now or save them for the future. Bears must decide whether to overeat now or face a deprived winter later. People must decide whether to splurge everything or set aside funds for expenses later. The psychologist Steven Pinker observes that at every moment, consciously or unconsciously, we’re choosing between good things now or better things later.

The economist Thomas Schelling explains that people behave as if they have two selves. One wants good things now and the other wants better things later. One wants a cigarette and the other wants clean lungs. One wants a dessert and the other wants a lean body. One wants to watch television and the other wants to read economics. The two selves are always fighting for control. An imminent reward engages the self that deals with immediate sure things. A distant reward engages the self that deals with uncertain future things. Schelling believes that one self outranks the other, as if the whole person were designed to believe that a bird in hand is worth two in the bush.

Even the most rational of us end up defaulting to good things now over better things later. Our preference for immediate certainty over distant uncertainty is evolutionary. As nomadic tribes, our ancestors couldn’t store things for long. So they had to be on the move, searching for greener pastures. And with the imminence of death, lower life expectancy and an inability to accumulate possessions, the payoff for consuming something now rather than later was higher. As a result, an urge to indulge now was built into our emotions.

So people have a tendency to prefer smaller payoffs now over larger payoffs later. With instincts evolved to appreciate things without delay, or risk missing out altogether. Which is why we disregard long-term disasters in favour of short-term happiness. This is what economists call ‘hyperbolic discounting’.
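
The usual formula behind hyperbolic discounting divides a reward by one plus its delay, scaled by an impatience constant. Here is a minimal sketch, with illustrative rewards and a made-up constant k, showing the signature preference reversal: we pick the small reward when it’s imminent, the large one when both are distant.

```python
# A minimal sketch of hyperbolic discounting: perceived value falls off
# as amount / (1 + k * delay). The rewards and k are illustrative.
def hyperbolic_value(amount, delay_days, k=0.1):
    """Perceived value of a reward that arrives delay_days from now."""
    return amount / (1 + k * delay_days)

for wait in (0, 30):  # decide today vs. decide about rewards a month away
    small_soon = hyperbolic_value(100, wait)       # $100 after the wait
    large_late = hyperbolic_value(120, wait + 7)   # $120 a week after that
    choice = "small now" if small_soon > large_late else "large later"
    print(f"rewards {wait} days away -> {choice}")
# -> rewards 0 days away -> small now
# -> rewards 30 days away -> large later
```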

The same reasoning explains why people are happier to make commitments long in advance, provided that commitment doesn’t require immediate action. The same reasoning causes people to underestimate the future consequences of things like having unprotected sex, losing their temper, procrastinating, buying junk food or taking drugs. So powerful is this instinct that businesses benefit from it every day. For instance, the finance industry owes its fortunes to hyperbolic discounting. Because borrowing money and paying interest are actions that spend future resources for benefit in the present.

Picking good things now over better things later is not treated as a sign of intelligence, because it betrays a weak will and poor self-control. And by undervaluing the future we fail to anticipate it and plan accordingly. We give in despite knowing that it may not be the most rational thing to do.

But we try hard to defeat our own self-defeating behaviour. Through ways Schelling rightly calls strange. Like putting the alarm clock away from our beds so we have to walk up to it to switch it off, setting up a bank account that prevents us from cashing out, keeping snacks out of reach or setting our watches ten minutes ahead.

No matter how hard we try, our brain, which is very good at doing many things well, is surprisingly bad at planning for the future.

Why rituals matter

Ever wondered why Corona comes with a lime? We might think that mixing beer and lime must be a Mexican thing. Or that the lime’s acidity must kill off any bacteria formed during packaging. But it actually came about as a bet. A random one between a bartender and his mate: could the bartender get patrons to ask for a lime in their Coronas simply by seeing him do it a few times? The 30-year-old bet eventually became a global ritual.

Why is Magners served on ice? Many pubs in the Irish county of Tipperary didn’t have fridges. So patrons decided to pour the cider over ice and cool it down themselves. The ritual helped cut down the sweetness of the cider and improve its taste. So bartenders started serving Magners bottles over pints with lots of ice. It’s now referred to as ‘Magners on ice.’

Consider the Guinness ‘wait’ ritual. The bartender first pours three-quarters of the pint by tilting the glass. He then stops and waits for it to settle. Finally, he tops up the rest in the same glass, now held straight. It takes about 120 seconds to pour a perfect pint of Guinness. The wait seemed too long for people at pubs across the British Isles, and Guinness was seeing its sales slide. So it launched an ad campaign with the line ‘Good things come to those who wait’, showing the beauty of its artful pour. The wait seemed cool all of a sudden. Guinness had something unique. It had a ritual.

Rituals and superstitions make the things we buy memorable. Even science tells us that they connect with us emotionally. Not as rational actions that need reasons, but as beliefs that make us behave in a certain way. It starts with change. Things are changing fast, from economics to technology. Even the speed at which we walk is – a British Council global study revealed we’re walking 10% faster than we did a decade ago. So is the speed at which we talk – watching classic films will reveal how far we’ve come. Change basically brings about uncertainty. Stress takes over when we’re no longer in control. The more unpredictable the world becomes, the more control we want over our lives. And the more anxious and uncertain we feel, the more rituals we follow.

British churches saw a huge rise in attendance during the 2008 recession. Areas hit by Scud missiles saw a rise in superstitious beliefs during the Gulf War. Even the most rational of us fall prey to this kind of thinking. No matter how rational we might be, rituals are something we all follow every day. Right from the time we wake up to the time we sleep.

BBDO’s global study identified four stages of universally consistent daily rituals. First, we start by ‘preparing for battle’, as we wake up to face the day. The stage includes checking emails, texts, weather and news, brushing, bathing, shaving etc. Then, we enter ‘feasting’, as we meet over meals. This includes eating together with friends, colleagues or family. The social act unites us with our tribe. Soon, we’re ‘sexing up’ with an indulgent series of acts that transforms us from our workaday selves into our best-looking, confident selves. This involves primping, grooming, and asking others for reassurance and validation. Finally, we finish by ‘protecting ourselves from the future.’ This involves the acts we perform before going to bed at night – turning off computers and lights, charging phones, checking on kids and pets, locking the doors and windows, packing bags for the next day etc. The same round of rituals starts all over again the next day. Basically, these rituals put us in control, or at least give us the illusion of it.

There are also rituals we follow without even knowing it. Things like going out of our way to avoid walking through a narrow lobby. Hugging or kissing foreigners based on local customs. Saying ‘bless you’ when someone sneezes. Buying the newest, most complex-sounding anti-wrinkle creams despite knowing they’re pointless. Do we eat our Big Macs with one hand or two? Do we eat our French fries before or after the burger, or in alternating bites? Do we twist, lick and then eat Oreos, or dunk them into milk first?

Then there are less productive rituals, based on superstitions. A Norwich Union study found that the number of car accidents in London and Berlin doubles on Friday the 13th. Number 4 is considered unlucky in China because ‘si’ sounds like the word for death. Many hotels in China don’t even have fourth or forty-fourth floors. But the same country likes number 8, as it sounds close to a word that signifies wealth. Which is why the Beijing Olympics started on 8/8/08 at 8:08:08 pm. Nestlé’s Kit Kat was a huge hit in Japan for sounding like ‘kitto katsu’, which translates to ‘win without fail.’ Michael Jordan never played a game without wearing his old university shorts underneath; he covered them with longer shorts, starting off a trend in the sport. Serena Williams never changes her socks during tournaments. Sachin Tendulkar always sat on the left side of the team bus.

Being obsessed with rituals is much like being obsessed with brands. Both involve habitual, repeated actions that have little or no logical basis. Both start with the need for control in an overwhelmingly complex world. Both tend to accumulate and become part of our lives. Rituals may be everywhere around us, but it’s easy to miss them. In Together, the sociologist Richard Sennett explains three ways to spot and understand rituals better.

Rituals keep routines fresh. They start out as habits before becoming routines. As routines, they rely on repetition for intensity: doing the same thing over and over again. The exercise often dulls our senses. But when the same repetition sharpens our concentration, it becomes a ritual. Listening to a song over and over helps us concentrate on its specifics. Specifics like lyrics, tone, length and pause then become ingrained in us. This sort of ingraining defines a ritual. It is precisely what religious rituals intend: performing a chant over a thousand times will ingrain it into our lives. Unlike habits or routines, which remain stuck in the first stage of learning, rituals self-renew and keep things fresh.

Rituals turn objects, movements or words into symbols. The point of a handshake is more than just feeling someone else’s skin. It signifies an achievement or a relationship. A stop sign doesn’t just warn us of the dangers ahead; it also tells us what to do.

Rituals make expression dramatic. Walking down the street is nothing like walking down the aisle as a newly married couple. Every step feels immense.

Right-brain revival

Humans tend to pick sides. Splitting everything into two groups. Seeing life in contrasting pairs. Good versus bad. God versus evil. East versus West. Mars versus Venus. Left versus right. Logic versus emotion. Yet, in most cases, we don’t have to pick sides. And it’s often dangerous if we do. Say, logic without emotion is cold. Emotion without logic is weepy. The yin always needs the yang. It’s the same with our brains too. Left brains need right brains and vice versa. Only using them together turns the brain into a real thinking machine. Using either one alone leads to absurdity.

But some people are comfortable with logical, linear reasoning. A form of thinking that is functional, textual and analytic. They tend to become lawyers, accountants and engineers. The left brain directs their thinking. And it has led to the Information Age that started in the 1970s. So it’s overemphasised in schools and prized by firms. Encouraging left-brain results.

Others are comfortable with intuitive, nonlinear reasoning. A form of thinking that is simultaneous, visual and aesthetic. They tend to become artists, inventors and entertainers. The right brain directs their thinking. It is underemphasised in the Information Age. So schools neglect it and firms disregard it. Undervaluing right-brain results.

It is these individual inclinations that go on to shape families, institutions and societies. While both ways are needed to form productive, fulfilling lives, there remains a tilt towards the left – our society seems to prize left-brain directed thinking.

For much of the last century, parents fed their kids the same advice. Get good grades; go to college; and pursue a profession that’ll offer a decent standard of living with some prestige. Kids who were good at maths and science were told to become doctors or engineers. Kids who were good at English and history were told to become lawyers. Kids who were slack on verbal skills were told to become accountants. As computers arrived, kids were told to take up high tech. The rest flocked to business schools, finishing with MBAs. Lawyers, doctors, accountants, engineers and executives were classified by the father of management, Peter Drucker, as ‘knowledge workers’. For Drucker, these people were paid for putting to work what they learnt in school, rather than for their manual skills. What set this group apart was their ability to grasp and apply theories. They excelled at left-brain directed thinking. Entrance exams for such professions measure what is essentially undiluted left-brain directed thinking. They are filled with linear, sequential exercises bounded by time, with rewards based on logic and single correct answers. Most developing countries are producing plenty of such left-brained knowledge workers.

So, twentysomethings in developing countries are doing what was, until recently, mostly done in the United States. They are doing it just as well, if not better, and just as fast, if not faster, for the wages of a typical McDonald’s worker. This includes software and operations work for everything from banks to airlines. Half of Nokia, Sony and GE’s software is developed in India. Which seems feasible for a country producing half a million engineering graduates a year. Number-crunching work for financial firms like Lehman Brothers, Morgan Stanley and JP Morgan Chase is done by Indian MBAs. Most low-level editorial work for Reuters is done there too. This is difficult news for the white-collared, left-brained worker of the developed world. Machines proved they could replace human backs in the last century; technology is proving it can replace human left brains in this century. So any job that can be reduced to a set of rules, or a set of repeatable tasks, is at risk.

For much of history, our lives were defined by scarcity. Today it’s not scarcity, but abundance. Abundance defines our social, economic and cultural lives. It has lifted our standard of living to remarkable levels. It has made us rich. But left-brained thinking has also produced an ironic result in its triumph. It has lessened its own significance. While placing a premium on less rational, more right-brain sensibilities. Such as beauty, spirituality and emotion. We no longer create products purely on price or function. We create them on meaning and beauty. This has led to a new middle-class obsession with design.

Abundance has brought beautiful things to our lives. But material goods haven’t made us any happier. This is the essence of the paradox of prosperity. Which is why more people, freed by prosperity but not fulfilled by it, are searching for meaning instead. As the social critic Andrew Delbanco puts it, the most striking feature of contemporary culture is the unslaked craving for transcendence. It is this craving that’s made yoga a mainstream business, along with books on meditation and spirituality. Meaning and purpose in life are now more crucial than ever.

We’ve progressed from a society of farmers to a society of factory workers to a society of knowledge workers. Now, we’re progressing yet again, as put by Daniel Pink in A Whole New Mind, into a society of creators and empathisers, of pattern recognisers and meaning makers. In other words, we’ve moved from an economy built of people’s backs to an economy built on people’s left brains to an economy built more on people’s right brains.

Columbia University’s medical students are being trained in narrative medicine. For it turns out that a patient’s story can sometimes tell more than diagnostics. Yale’s medical students are learning the art of observation at the university’s art school. For it turns out that studying paintings helps them pick up subtle details of a patient’s condition. Japan tops the world’s scores in maths and science. Yet it’s recreating its education system to foster greater creativity, artistry and play. Which is why Japan’s most lucrative export today isn’t automobiles or electronics, but pop culture.

Asia has turned MBA graduates into this century’s blue-collar workers – people who entered the workforce with great promise, only to see their jobs move overseas. Meanwhile, corporate recruiters are looking to art schools for talent. America’s art and design industry now employs more people than law or accounting. Global giants like Unilever employ painters and poets to inspire the rest of their staff.

In a world tossed by abundance, Asia and automation, as identified by Pink, left-brain directed thinking remains necessary but is no longer sufficient. We must become more proficient in right-brain directed thinking and its aptitudes. Just as the factory workers of the last century learnt to bend pixels instead of steel, today’s knowledge workers need to command a new set of aptitudes. They’ll need to do what workers abroad can’t do equally well for much less money. This means using right-brain aptitudes to forge relationships rather than execute transactions. Applying aptitudes like empathy, design and play over logic and function. Tackling novel challenges instead of solving routine problems. Synthesising the big picture rather than analysing a single component.

Back in the savannah, our ancestors weren’t plugging numbers into spreadsheets. They were telling stories, putting play into seriousness, showing empathy, designing innovations and doing things with purpose. It is these abilities that have defined what it means to be human. These are fundamental human attributes. But after a few years in the Information Age, these muscles have shrunk. It’s time to work them back into shape. For right-brain directed thinking, once disdained for its artistry, empathy and longer view, is what decides who gets ahead now.

Lure of unreason

The philosopher Thomas Aquinas found that the pursuit of wisdom, through reason, is the most perfect, sublime and profitable of all human pursuits. As did Voltaire, Paine and Aristotle, who all maintained that happiness, from the use of reason, is the ultimate goal of life. For many such great thinkers, reason is the greatest good to which humans can aspire. The trouble is, many of us are notoriously good at abandoning it.

Reason is a faculty that separates us from all other living things. It is basically a cause, explanation or justification for an action or event. But as a powerful tool, it’s demanding. And like many power tools, it’s difficult to use well. Often, we’re simply not up to the necessary mental exercise.

We often fall short of being reasonable, for many reasons. Sometimes we’re beaten by emotion. Sometimes we fall prey to logical fallacies. Sometimes figuring out the pros and cons seems trivial. Sometimes we just don’t want to be reasonable. But mainly, many of us aren’t as good at it as we might wish. Even when the ability exists, some of us are just not inclined to use reason well, or especially often, or with much determination. Therefore, when it comes down to the things that count – whether to marry someone, have kids, or commit to a political ideology – most of us don’t bother counting.

Humans are, by nature, characterised by reasoning above all else. And yet, much impressive thinking has been used to debunk its role. Because emotion seems sexy – it’s cool, exciting, juicy and heart-thumping. Whereas reason, by definition, is unsexy – it’s boring, dry, head-pounding and number-crunching. So a rejection of reason seems reasonable. Consider how rare it is for someone caught in the grip of strong emotion to be overcome by a fit of reason, and how often it goes the other way. Even trying to be reasonable can tip us into emotion, usually negative and troublesome kinds. And when faced with difficult situations, in which reason alone may be unhelpful, we shift gears into yelling, crying and so on. Strong emotion can be wonderful, especially when it involves love. But it can be terrible when it calls for hatred, fear or violence.

Even the most rational of humans are no strangers to unreason. Pythagoras sacrificed a bull to Apollo after proving his famous theorem. Isaac Newton, despite discovering gravity, tried to explain the prophecies in the Book of Daniel. Blaise Pascal gave up mathematics to pursue religion after helping found probability theory. He even famously said, ‘the heart has its reasons that reason does not know.’ As did the English poet Henry Aldrich, whose ‘Reasons for Drinking’ suggests that sometimes we make up our minds first and find reasons only later. When reason and unreason clash, people sometimes choose unreason – to be stubborn, to disagree, or just because.

People don’t just want to abandon reason; often they’re not up to the task either. Take game theory in mathematical economics, associated with John Nash – the subject of the film A Beautiful Mind. It assumes that people seek to maximise their positive outcomes. However, we don’t often see such logic-based maximisation in practice. We turn to bounded rationality instead of true rationality – a finding put forward by the Nobel Prize-winning psychologist Herbert Simon. Which means the capacity of the human mind to formulate and solve complex problems is very small compared to the size of the problems whose solution is required for objectively rational behaviour in the real world.

Reasoning is also bounded by practical constraints. Which means, when it comes to evaluating outcomes in real-life situations, people are more likely to pay attention to the basic distinction of ‘satisfactory or unsatisfactory’ than to go for the perfect maximisation of game theory. And so emerged the term ‘satisficing’ – a hybrid of satisfying and sufficing. Instead of seeking perfection, people are likely to sample their options and choose one that is satisfactory because it suffices. Which is why people looking for a new car, or a house, or even a mate, nearly always stop looking after finding something or someone that meets certain simple, realistic requirements.
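
Here is a minimal sketch of the contrast, with randomly scored options and an arbitrary ‘good enough’ threshold: the maximiser inspects every option; the satisficer stops at the first one that suffices.

```python
# A minimal sketch contrasting maximising with Simon's satisficing.
# The options and the threshold are illustrative.
import random

random.seed(42)
options = [random.random() for _ in range(1000)]  # scored candidate choices

best = max(options)  # the maximiser inspects all 1000 options

def satisfice(options, threshold=0.8):
    """Return the first option that is 'good enough', and the looks used."""
    for looks, value in enumerate(options, start=1):
        if value >= threshold:
            return value, looks
    return max(options), len(options)  # nothing sufficed: fall back

value, looks = satisfice(options)
print(f"maximiser: {best:.3f} after 1000 looks")
print(f"satisficer: {value:.3f} after {looks} looks")
```

The satisficer gives up a sliver of quality for a huge saving in search effort, which is usually the rational trade in real life.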

Perhaps abandoning reason is a kind of strategy – an unconscious one played out by evolution. As Simon found, our mind is incapable of solving many of the problems posed by the real world, not simply because the world is big and the mind is small, but because the mind did not develop as a calculator designed to solve logical problems. It evolved for a limited purpose: to maximise our evolutionary success. That is its most important, natural function – not reason. And as the evolutionary psychologist Leda Cosmides found, it is not adapted to solve rarefied problems of logic. There’s nothing in the biological specification for brain-building that calls for a device capable of high-powered reasoning, solving abstract problems, or providing a clear picture of the outside world beyond what’s needed to enable its possessors to thrive and reproduce. Our rationality is bounded not merely by our inherent limitations as small creatures in a large universe. It is bounded more so by what our brains were constructed – that is, evolved – to do.

Reason may seem like complicated logical analysis. But as David Barash puts it in The Survival Game, at its heart it’s simply a means to an end. Which is perhaps why animals often act more reasonably than humans. As a means to an end, reason is only a guide to achieving potential outcomes. Sort of like a road map. And like any map, it can only tell us how to get somewhere, not where to go.

Thinking beyond purpose

There is a difference between existence and essence. It is best understood by the classic thought experiment of humans versus hole-punchers. Before the hole-puncher we got at the stationer’s, there was the hole-puncher factory. The factory was built to certain specifications that someone designed. And before the hole-puncher itself, there was the idea of paper, holes, punching holes in paper, and making a machine for that purpose. As soon as the hole-puncher exists, it plays the part assigned by its designers. We buy it and put paper into it. It then punches holes into the paper. This is its essence; its purpose. To use it as a paperweight or a hammer is to go against its essence. The essence of the hole-puncher precedes its existence. For the hole-puncher, and almost all other machines, essence precedes existence. For humans, existence precedes essence. But only some of us understand this.

The human being ‘exists, turns up, appears on the scene, and, only afterwards, defines himself,’ said the French philosopher Jean-Paul Sartre. And the science writer Brian Christian, in The Most Human Human, explains that what really defines us is that we don’t know what to do. Each of us, individually, has to make it all up from scratch. We don’t know what we’re supposed to do, where we’re supposed to go, who we are, where we are, or what comes next. This, to most of us, is stressful. Existence without essence is stressful.

Which is why many, including subscribers to intelligent design, put essence before existence. Because it’s easier to believe that the human being is a designed thing. A thing with a purpose. It gets people saying things like, ‘Bolt was born to run.’ It is therefore critical to understand that while our body parts may have an essence, we ourselves don’t — our bones and blood have a function; we don’t. Aristotle sums it up in his view that hammers are made to hammer and humans are made to contemplate. Which means essences, or purposes, aren’t discovered, because they don’t exist ahead of us. They are, instead, invented.

We invent purposes, just as computers do. The computer was initially built as an arithmetic organ. It now turns out to process almost anything. The computer is probably the first machine to precede its essence. It was first built, and then came what to do with it. Apple’s ‘There’s an app for that!’ marketing rhetoric proves just that. Our relationship with technology has evolved accordingly. We don’t decide what we need first and then go buy a piece of technology to do it. We buy a piece of technology first and then figure out what we need it to do. With both humans and computers, the idea of existence precedes essence.

But when existence precedes essence, there’s a struggle to define the essence. Because it appears that the human being is nothing at all, and that goals really don’t matter. But because our lives are filled with forms of games, it helps to look at goals through games. A game is basically a situation in which a clear or agreed definition of success exists. For a private company, there may be a number of goals, any number of definitions of success. For a publicly traded company, there is only one: returns. Not all business is a game — although much of big business is. In real life, however, as observed by Sartre, there is no real notion of success. If success is having the most Facebook friends, then our social life becomes a game. If success is gaining entry into heaven upon death, then our moral life becomes a game. Life itself is no game. There is no finish line. The Spanish poet Antonio Machado puts it well: ‘Searcher, there is no road. We make the road by walking.’

So, while games have a goal, life does not. This puts us into an anxiety of freedom, where anything that provides temporary relief from existential anxiety becomes life. This is why games are such a popular form of procrastination. And this is why, on reaching our goals, the risk is that existential anxiety re-enters even before the thrill of victory wears off. To the point that we’re thrown immediately back on the uncomfortable question of what to do with our life.

Zen philosophy observes that we’re always trying to do something, trying to change something into something else, or trying to attain something. Trying to find the real essence. When the real beauty lies beyond essence itself – in existence. Beyond the idea of trying to attain something.

By putting existence before essence, we’re celebrating true originality and authenticity. Which is why Bertrand Russell stressed that men and women have need of play, of periods of activity having no purpose. Which is why Aristotle understood that the best form of friendship is the one with no particular purpose or goal. Which is why, besides ourselves, we regard dolphins and bonobos — animals that have sex for fun — as the smartest.

Hip and its hipsters

It seems like everyone knows what hip is. Or at least, everyone seems to recognise it when they see it. And yet, no one is quite clear about what hip is. Even Tower of Power tried a song on it. They called it ‘What Is Hip?’ But they landed where most of us would. ‘Hipness is — what it is! And sometimes hipness is — what it ain’t!’

A way to understand hip is to look back at its origins. The word hip comes from hepi, which means ‘to open one’s eyes.’ And the one who has his eyes open is called a hepicat. Or, as we’ve come to call him, a hipster. Hepi and hepicat come from the West African language of Wolof.

Hip meant enlightenment. More so to those West Africans taken to America as slaves. Being hip to them was about being aware. Sort of an understanding to help negotiate the new world. The only way to negotiate was to be smarter. They had to find a way to outwit their powerful oppressors. They had to become tricksters. They became hipsters. Hip has stood for rebellion ever since. One that values autonomy over wealth. With freedom from the demands of money. A system devoid of hierarchies. An alternative system. A culture.

The culture of hip has consistently rediscovered autonomy, creating circumstances that spark changes in society. And, as explained by John Leland in Hip: The History, it has converged six times so far. During these convergences, a criterion for what’s cool and what’s not emerged. The first foundations of hip were set by great thinkers like Walt Whitman, Herman Melville and Henry David Thoreau in the 19th century. Their philosophy of autonomy preached nonconformism, civil disobedience, homoeroticism and the sensuality of the new. It set the tone for today’s hipsters — from skaters and ravers to indie-rockers and tech-geeks. The second convergence of hip happened during the 1920s population movement into cities. The third happened during the bebop and Beat Generation years after World War II. These intellectual movements rejected the mainstream in favour of love and happiness. The fourth happened during the 1970s fall of urban neighbourhoods. The fifth happened during the rise of Silicon Valley and the Internet. These movements brought about new forms of creative media — graffiti, skateboarding, breakdancing, punk and hip-hop music, and geeky exploration into a virtual bohemia. The sixth is now.

In all these convergences, hip remained the darling of outcasts, outlaws and outsiders, whose smartness hip relied on to save the mainstream from its own limits. Its ever-reliable crew included artists, poets, gamers, geeks, gangsters, gays, rebels and dropouts. They fed each other and formed inner circles of hipness. These are the hipsters who sell the next new to the world. All whilst pushing hip to move on beyond the new.

By keeping change constant, hip creates ever-new needs to buy. What’s hip today will therefore be mass tomorrow. Much to the pleasure of corporations, which have made hip their most desirable proposition. To the point of being glued to anything even remotely hip. Hip sells cars, soft drinks, clothes, computers, gadgets, skateboards, booze, drugs, shoes and shades. Hip shapes how we drive, whom we admire, and whose love we yearn for. Like the adland that grew alongside it, hip creates value through image and style. In its emphasis on being watched, it predicts the modern mediascape, and values people not for what they have, but for what they stand for as images.

Once just opposed to mainstream values, hip today is a step ahead. Living by creativity, putting off marriage, travelling between countries and continents, and seeking sensory euphoria is how everyone lives today. For all its professed disregard for wealth, rules and hierarchies, hip wouldn’t have thrived unless it was turning a profit. Because movements that don’t turn a profit have short lives. Even institutions such as religion aren’t exempt from the rule. For it’s by preaching obedience and delayed gratification that religion churned out a productive workforce and sustained itself. Similarly, although hip seems to encourage idleness and underachievement, it has always helped when the economy needed something to boost consumption. The hip convergences, particularly during the manufacturing and technology booms, appealed to the bohemian rallying cry — out with the old, in with the new — and introduced radicalised consumerism. Where religion created workers, hip created consumers.

Hip today is what it’s always been. It’s still an alternative system. It’s still about enlightenment, based on contradictions and anxieties. Its trendiness is still a by-product, not a goal. And it’s still not simply the sum of what’s hip now. However, the cultural anxieties that produce it have moved, if not faded. The syntheses now are global, rather than local. And the information is overwhelming rather than pinched.

Hipi or hepi — to see or to open one’s eyes — is essential for negotiating modern life. The shelf life of trends may have shrunk, but the premium on knowledge is greater than ever. In a society run on information, hip is all there is.

Although, after three centuries of hipi and hepi, the old binaries of black and white, alternative and mainstream, no longer go very far. Even businesses have come to sell themselves in just two strengths: hip and hipper. And hip itself has reached a new meaning, moving from hip to ‘hip’. The inverted commas, as put by Leland, mean, ‘We’re too hip to care about this hip stuff, but, you know, isn’t that pretty hip?’

Influencing society

The population of India reached half a billion in 1974. Most couples had at least three children at the time. It seemed like a huge problem requiring extreme measures. The government forced through a new law: men with three or more children had to be sterilised. Those who failed to comply were to be arrested. Vasectomy camps were opened across the country, and force tactics were employed, including the withholding of rations, licences and medical treatment. Despite public outrage, 8 million people were sterilised in the following year. The government lost support for driving people against their customs, wishes and beliefs. Eventually, it had to abandon the programme amidst violent protests. India’s population kept on rising.

India’s population is still rising. Except in the South Indian state of Kerala, which addressed the brief differently. Kerala understood that the problem lay with the definition of the problem itself: family planning, set against the backdrop of an Indian culture that celebrated large families. Family planning was seen as a matter of personal freedom, and forcing people to let go of an individual right was a difficult proposition. People’s individual desires and the country’s desired social outcome were far too contradictory. Trying to change individual psychology on the issue was not only difficult, but a lengthy exercise. Therefore, by definition, enforcing family planning was a problem that couldn’t be solved. The problem had to be changed into something that could be.

Kerala then changed its strategy to wipe out illiteracy instead, hoping to influence collective patterns, if not individual desires, and link them back to the social outcome – ultimately making the three-child norm unattractive. Kerala redefined the problem by shifting the focus from family planning to education. And by education, it didn’t mean family planning education but general education: simple reading, writing and basic mathematics. A problem that couldn’t be solved was now changed into one that could be.

A large number of illiterates were tracked down, two-thirds of whom were women. Small teams of volunteers were sent out to teach them the basics. Since most of them were farmers, classes were held in farms, cowsheds and courtyards. In just three years, Kerala became India’s first fully literate state. Soon, the three-child norm became two; and amongst highly educated couples, one. Without using any force, Kerala solved the problem that the rest of India couldn’t. It set an example for the rest of the country, and brought about a philosophy that made it almost embarrassing to have more than two children, if not one. It is now the only state in India with stabilised population growth. A few other states are following suit.

The link between family planning and education is in itself no mystery. As women become more educated, birth rates fall: education lets women pursue interests outside the home, in work and otherwise. We’ve seen this in Western countries over the past 50 years. Kerala’s transition, however, is a mystery for its suddenness.

Mark Buchanan, in The Social Atom, explains that a way to understand a sudden link between individual desires and social outcomes – as in the case of Kerala – is to think in patterns, not people. Patterns exist collectively in society and keep reinforcing themselves. None of us lives in isolation, which means we’re always influenced by the actions of others.

Similarly, when everyone else is educated, and when life comes to depend on education, what was formerly an understandable decision to forgo education becomes obviously unattractive to everyone. The suddenness of Kerala’s transition is a product of this collective pattern at play. Education became self-sustaining and, in turn, encouraged family planning – not because people changed as individuals, and not because their beliefs or psychology changed, but because of the logic of collective patterns. By linking contradictory individual desires and social outcomes through a collective pattern, Kerala was able to execute a social miracle in a short time.
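This logic can be made concrete with a toy threshold model in the spirit of Mark Granovetter’s classic work on collective behaviour – a minimal sketch, not anything from Buchanan or the Kerala programme, and every number in it is an illustrative assumption rather than data. Each simulated person adopts the new norm once the share of adopters around them crosses a fixed personal threshold. Nobody’s psychology ever changes; only the pattern people respond to does.

```python
import random

random.seed(1)

N = 100_000  # population size (illustrative assumption)

# Fixed personal thresholds: the share of adopters a person needs to see
# before adopting the norm themselves. Drawn from Normal(0.25, 0.12) and
# clipped to [0, 1] -- purely illustrative numbers.
thresholds = [min(1.0, max(0.0, random.gauss(0.25, 0.12))) for _ in range(N)]

# A small random seed of early adopters (the volunteer teachers, say).
adopted = [random.random() < 0.10 for _ in range(N)]

for round_no in range(20):
    share = sum(adopted) / N
    print(f"round {round_no}: {share:.1%} adopted")
    # Everyone responds to the same collective pattern, not to persuasion.
    new = [a or t <= share for a, t in zip(adopted, thresholds)]
    if new == adopted:  # stable: no remaining threshold is met
        break
    adopted = new
```

Run it and the adoption share crawls for a couple of rounds, then leaps to near-total in the next few – suddenness produced entirely by the collective pattern, while every individual’s threshold stays exactly as it was.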

Trapped in common sense

It doesn’t matter if we’re squeezed in against others on a crowded train, but it does matter if someone stands close to us when the train’s empty. It doesn’t matter when we’re facing each other in enclosed spaces, but it does matter if someone stands facing us in a lift. It seems we’re wired with some kind of rules that encourage us to spread out when there’s space. These rules are unwritten, and they’re everywhere around us. No matter where we live, our lives are guided by them. A reasonable person is expected to know them. At times, a reasonable person is also expected to ignore them.

Unwritten rules are informal; written rules are formal. Formal rules seem less important because we so readily break them. Informal rules seem more important because we constantly use them – to solve many of our problems, from the personal to the social.

We can refer to these unwritten, informal rules as common sense. Common sense is so common that we notice it only when it’s missing. Yet it guides many of our daily activities. It is common sense that tells us to keep clean, behave appropriately, pick the right clothes or maintain relationships. It is common sense that tells us when to follow rules, when to ignore them and when to challenge them.

Common sense has two defining features, which the sociologist Carl Taylor put forward in 1946. The first is that it’s overwhelmingly practical. The second is that it’s driven by spontaneity. These features make common sense different from, say, science or mathematics. Common sense is more about providing answers to problems and less about how we arrive at them – more about what and less about why. Its power lies in its ability to deal with situations on their own terms: common sense doesn’t always reflect on a problem, it simply attempts to deal with it. This raises a couple of issues.

Firstly, this is why common sense is hard to teach. Those who lack it don’t see why they need to pay attention; they don’t seem to understand what it is they’re not getting. And because it isn’t clear what they’re not getting, it’s hard to help them – and surprisingly hard to explain what they’re doing wrong.

Secondly, common sense, by its nature, isn’t always right. It’s a mixed bag of intuition, experience and knowledge, and its contextual character means these ingredients don’t always come in the best proportions. Our common sense is bound to make mistakes, and recognising how it fails matters if we are to understand the present and plan the future better. It turns out that common sense tends to fail in three ways, which the sociologist Duncan Watts explains in Everything Is Obvious.

Firstly, the way we understand individual behaviour is flawed. When we think about why people do what they do, we tend to focus on rewards and motivations but not on reactions. No matter how carefully we put ourselves in someone else’s shoes, we’re likely to make mistakes predicting how they’ll behave outside a given situation.

Secondly, the way we understand group behaviour is worse. When people meet at parties, markets or gigs, they connect. As they connect, ideas, opinions and thoughts are exchanged, and in some way or other they influence each other’s views. Opinions are formed and decisions are made as to whether something’s right or wrong, fair or unfair, cheap or good. These influences pile up in unexpected ways, so our picture of group behaviour tends to reflect what we imagine groups should do rather than what’s actually happening.

Thirdly, we learn less from history than we think we do, and this misunderstanding distorts the way we see the future. Whenever something exciting, dramatic or terrible happens, we look for explanations. But since we look for explanations only after something has happened, our focus falls on what did happen rather than on what might have happened but didn’t. So we deceive ourselves into believing we can make predictions that are impossible, even in principle.

Common sense is a great way to make sense of the world, but not necessarily to understand it. It provides ready explanations and helps us through our everyday lives. But because it skips the why, we end up deceiving ourselves with plausible-sounding stories instead. And it weakens our motivation to treat social problems the way we treat problems in medicine, engineering and science. Hard as it may be to imagine, common sense restricts our understanding of the world.

The next big culture

A caterpillar happily eats its way through the leaves that surround it – about a hundred times its own weight every day. It goes on eating for about three weeks, until it has grown big, then slows down and begins spinning a chrysalis, a shell-like casing in which the insect develops into an adult. Inside the chrysalis, tiny new cells appear and multiply on the caterpillar’s body. The caterpillar’s immune system treats these new cells as foreign and destroys them. But the cells multiply so rapidly that they begin to link themselves together, and eventually the immune system gives up. Soon the caterpillar’s body liquefies, and the tiny cells recycle the liquefied mass into a new entity: a butterfly.

Just as the caterpillar held the blueprint for the butterfly all along, every culture retains parts of the one that preceded it. Cultures don’t start from scratch; they build the new by rearranging the old. At the moment, we’re transitioning between two global cultural systems – control culture and integrative culture – definitions put forward by Philip Slater in The Chrysalis Effect. Since the two systems hold such opposing values, the transition is a struggle.

Control culture is about eight thousand years old. Although it’s dying, it’s still strong and keeps rallying. It celebrates a state of power in which life revolves around getting and maintaining control – so much so that it separates the mind and body from feelings and nature. It is built on a dependence on authoritarian rule and sees order as something that has to be imposed. It produces a fixed mind with a static vision of the world and the universe. Control culture is a macho culture in which parents raise their boys to be ‘from Mars’ – stoic, rigid and aggressive. And because men are trained to be insensitive, women take up the areas that men neglect – love, cooperation and nurture. They are trained to be ‘from Venus’.

Integrative culture is still in its youth, and is growing stronger by the day. It celebrates interdependence and permeates artificial walls and boundaries. It is built on a democratic ethos and sees order as something that evolves, just as it does in nature. Integrative culture is about embracing and integrating diversity, and it made serious inroads only in the last fifty years or so. The women’s movement, the ethics movement, the gay rights movement, the ecology movement, changing sexual relations, the rise of the Internet, the global economy, international law, shifting attitudes to religion and science, and organic farming are some examples of integrative culture at play.

Yet just as integrative culture is growing, so are fundamentalism and control culture, which means we’re being dragged in opposite directions. Slater captures the state of our transition quite accurately. He points out that we’ve never been more concerned about our environment yet never more destructive of it; never more distrustful of technology yet never more dependent on it; never more opposed to violence yet never more fascinated by it; never more ego-driven yet never more hungry to lose ourselves in something beyond ego; never more health-conscious yet never more unhealthy; and while we’ve never had more ways to connect with each other, we’ve never felt more disconnected. It all tells us one thing – we’re in transition; a rather turbulent transition in which old habits seem irrelevant or destructive while new habits feel awkward and uncomfortable.

Control culture has also created a common definition of what we see when we look at something and what we hear when we listen to something. It has trained us to see the world through a predefined lens, which means we don’t experience the world directly. So we attack innovations that seem morally infectious and views that challenge our assumptions.

Perhaps it’s evolutionary architecture at work. Frank Herbert, in his best-selling 1965 science fiction novel Dune, observes that evolution is a continuous process of integrating dissimilar elements to create richer wholes. Seen that way, this turbulent process of integrating the conflicting elements of control culture and integrative culture is a fitting attempt to form a richer entity.

Just as the caterpillar held the blueprint for the butterfly all along, we need to recognise that integrative culture, even when fully established, will still carry traits of control and competitiveness. These traits, although diminished in importance, will not be disregarded entirely. Could this be the kind of richer entity Herbert was suggesting – one that demands the sort of acceptance of integrative culture that control culture enjoyed for thousands of years?

As hunter-gatherers for millions of years – long before control culture took over – our survival depended on cooperation more than anything else. Given that the values of this richer entity lean far more towards integrative culture, the transition is an inevitable development – more so, a necessary one. For it will help us look beyond the predefined lens and cultivate a productive approach to the world’s problems – from the environment to economics – even if it cannot solve them entirely. And while it may or may not create more happiness, it will certainly bring about stability.