What is the best strategy for getting ahead in business? Does being a good guy really pay or is it better to be cunning, even sneaky, in one’s own self-interest?
We were delighted to discover that Game Theory explores these questions and provides mathematical evidence for the answers, reaffirming our approach to doing business. Dolphin Bay has grown tired of businesspeople trying to be clever by using strategies that manipulate or misuse others, an approach which, in our experience, breaks down trust.
The Game Theory experiment we shall refer to applies not only to business but to any human endeavour that involves relating to other humans, from personal relationships (“must I be the one to do the dishes yet again?”) to the grand scale of world conflicts and everything in between.
Game theory is the branch of maths concerned with strategies for dealing with competitive situations where the outcome for one player depends on how others react.
A YouTube video optimistically called “What Game Theory teaches us about life, the universe and everything” addresses the most frequent problem in game theory: what is the best strategy for interacting with others?
It is a question “that pops up everywhere” and will even determine the difference between “flourishing and destruction of the planet,” says science communicator Derek Alexander Muller in the video on his YouTube channel Veritasium, which has over 16 million subscribers.
“Figuring out the best strategy can mean the difference between life and death, between war and peace.”
He starts with a story. At the start of the Cold War, America had discovered that the USSR, too, was developing nuclear bombs. Both superpowers were growing their arsenals, and the future of humanity was on a knife edge.
Something needed to be done, and fast, but what? Some thought America should become “an aggressor for peace” and launch an unprovoked nuclear strike against the USSR. Others felt this would be a disaster. Which side was right?
Mathematicians at a US-based think tank called the Rand Corporation were exploring this question and turned to Game Theory as part of their research. Meanwhile, other mathematicians at Rand had invented a new game for two players called the Prisoner’s Dilemma. “Thousands of papers have been published on the Prisoner’s Dilemma, which pops up everywhere,” observed Muller.
The aim of this game is to get as many coins as possible. For each move, the players can choose to cooperate with each other (in which case they each get three coins) or “defect” in their own self-interest (in which case the defector gets a whopping five coins, more than cooperation would earn, while the opponent gets nothing).
The most profitable option seems to be to defect. However, if both players defect, they get only one coin each, far less than the three apiece that cooperation would have brought.
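To make those numbers easy to check, here is a minimal sketch of the payoffs in Python; the constant names and layout are ours, and the figures are simply the coin values quoted above.

```python
# Payoffs for one move of the coin game described above:
# (my coins, opponent's coins), given my choice and the opponent's choice.
COOPERATE, DEFECT = "C", "D"

PAYOFFS = {
    (COOPERATE, COOPERATE): (3, 3),  # both cooperate: three coins each
    (DEFECT,    COOPERATE): (5, 0),  # defect against a cooperator: five coins versus none
    (COOPERATE, DEFECT):    (0, 5),  # cooperate against a defector: none versus five
    (DEFECT,    DEFECT):    (1, 1),  # both defect: one coin each
}

def play_round(my_move, their_move):
    """Return (my_coins, their_coins) for a single move."""
    return PAYOFFS[(my_move, their_move)]

# The dilemma in miniature: defecting always pays more on the individual move,
# yet mutual defection (1, 1) leaves both players worse off than mutual cooperation (3, 3).
print(play_round(DEFECT, COOPERATE))     # (5, 0)
print(play_round(COOPERATE, COOPERATE))  # (3, 3)
print(play_round(DEFECT, DEFECT))        # (1, 1)
```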
It’s a dilemma that the Cold War exemplified perfectly.
“It was this approach (of defection, or acting in their self-interest) that led the US and USSR to develop tens of thousands of nuclear weapons each, more than enough to destroy each other many times over,” said Muller. “But since both had nukes, neither could have used them… they would have been better off cooperating and agreeing not to develop the technology any further.”
As we know, the two superpowers opted to cooperate, agreeing to gradually reduce their nuclear arsenals but inspecting each other’s progress carefully before making each new move, to be sure that the cooperation was genuine.
How does the game play out?
Political scientist Robert Axelrod wanted to establish, unequivocally, the best strategy in the Prisoner’s Dilemma. He invited the world’s leading game theorists to submit strategies that would play against each other.
Fifteen strategies were loaded onto a single computer, cooperating and defecting in varying degrees and combinations. Some always cooperated or always defected; others began defecting after a time, or in particular circumstances.
The whole tournament was repeated five times over, with each pairing playing 200 rounds, in a bid to reveal reliable patterns.
The results surprised Axelrod. “I had imagined that like with computer chess, you’d need a pretty complicated programme to play a sophisticated game… but it wasn’t like that at all,” he said. “The crazy thing was that the simplest programme ended up winning, a programme called Tit for Tat.”
This programme started by cooperating, then defected if its opponent did, but cooperated again as soon as its opponent resumed doing so.
Overall, when each strategy’s points from its games against all the other players were added up, Tit for Tat gained more than any other – although whenever two players became locked in a pattern of mutual defection, both did poorly in the end.
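To show how such a tournament fits together, the sketch below runs a miniature round-robin in Python. The entrants are stand-ins of our own choosing (always cooperate, always defect, and Tit for Tat, implemented exactly as described: cooperate first, then copy the opponent’s previous move) rather than the fifteen programmes actually submitted, and each pairing plays the 200 moves quoted above.

```python
import itertools

COOPERATE, DEFECT = "C", "D"
PAYOFFS = {("C", "C"): (3, 3), ("D", "C"): (5, 0),
           ("C", "D"): (0, 5), ("D", "D"): (1, 1)}

# Each strategy is a function of the opponent's moves so far.
def always_cooperate(opponent_history):
    return COOPERATE

def always_defect(opponent_history):
    return DEFECT

def tit_for_tat(opponent_history):
    # Cooperate on the first move, then copy whatever the opponent did last.
    return opponent_history[-1] if opponent_history else COOPERATE

def play_match(strategy_a, strategy_b, rounds=200):
    """Play one 200-move game and return each side's total coins."""
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_b)   # A reacts to B's past moves
        move_b = strategy_b(history_a)   # B reacts to A's past moves
        points_a, points_b = PAYOFFS[(move_a, move_b)]
        score_a += points_a
        score_b += points_b
        history_a.append(move_a)
        history_b.append(move_b)
    return score_a, score_b

def round_robin(strategies, rounds=200):
    """Every strategy plays every other once; return total coins per strategy."""
    totals = {name: 0 for name in strategies}
    for (name_a, strat_a), (name_b, strat_b) in itertools.combinations(strategies.items(), 2):
        score_a, score_b = play_match(strat_a, strat_b, rounds)
        totals[name_a] += score_a
        totals[name_b] += score_b
    return totals

entrants = {"Always cooperate": always_cooperate,
            "Always defect": always_defect,
            "Tit for Tat": tit_for_tat}
print(round_robin(entrants))
```

With such a tiny, hand-picked field the ranking means little in itself: the always-defect “bully” tops this particular pool because it can exploit the unconditional cooperator, which is exactly the point made further down about the best strategy depending on who else is playing. It was only across Axelrod’s full field, and repeated runs, that Tit for Tat came out ahead.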
After Axelrod published his findings, he announced another tournament, which gained far more entries, with even greater complexity. Again, the “nice” strategies did much better, and Tit for Tat was the most effective.
Four winning qualities
The best-performing strategies shared four qualities, Axelrod observed.
First, they were all “nice,” in that they were never the first to defect. The various “nice” strategies, of which Tit for Tat was one, made up 14 of the top 15 strategies.
The second most important quality was being forgiving – retaliating, but not holding a grudge in subsequent rounds. A “maximally unforgiving” strategy called Friedman, which would defect “without mercy,” fared badly.
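Axelrod’s actual Friedman programme is not reproduced in the article, but a “maximally unforgiving” rule in the sense described, often called a grim trigger, can be sketched in a few lines: cooperate until the opponent defects once, then never cooperate again.

```python
COOPERATE, DEFECT = "C", "D"

def grim_trigger(opponent_history):
    """Cooperate until the opponent defects once, then defect for the rest of the game."""
    return DEFECT if DEFECT in opponent_history else COOPERATE

print(grim_trigger([COOPERATE, COOPERATE]))          # "C": no provocation yet
print(grim_trigger([COOPERATE, DEFECT, COOPERATE]))  # "D": a single defection is never forgiven
```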
These findings – that it pays to be nice and forgiving – came as a shock to the experts who submitted the strategies, many of whom had tried to be tricky and subtle.
Interestingly, when Axelrod “ran the numbers” he found that another strategy, one which defects only after its opponent has defected twice in a row, would have won the tournament ahead of Tit for Tat, had it only been submitted.
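That more lenient rule, commonly known as Tit for Two Tats (the article itself does not name the entry), is a one-line change to Tit for Tat; the sketch below uses the same history-based convention as the earlier code.

```python
COOPERATE, DEFECT = "C", "D"

def tit_for_two_tats(opponent_history):
    """Cooperate unless the opponent has defected on both of the last two moves."""
    if opponent_history[-2:] == [DEFECT, DEFECT]:
        return DEFECT
    return COOPERATE

# A single defection is let go...
print(tit_for_two_tats([COOPERATE, DEFECT]))           # "C"
# ...but two defections in a row draw a response.
print(tit_for_two_tats([COOPERATE, DEFECT, DEFECT]))   # "D"
```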
The third quality was being retaliatory: if your opponent defects, strike back immediately; don’t be a pushover. To always cooperate would be to become a pushover. “Tit for Tat, on the other hand, is very hard to take advantage of,” said Axelrod.
The fourth and last quality was being clear. With programmes that “you couldn’t figure out, because they were too complicated,” it was very hard to establish a pattern of trust, and competitors concluded they may as well defect.
Nice, forgiving, provokable and clear – it’s a lot like the morality that has developed in cultures around the world, notably the “eye for an eye” morality of the Old Testament, observed the mathematicians in the video.
It depends on the context
There was no single best strategy for every game; it always depended on the other strategies you were playing against, observed Muller. For example, when Tit for Tat played against bullies who only ever defected, it came last.
But over thousands of games, playing against a variety of strategies, Tit for Tat always won.
“What if the world you started off in was different? What if there was only a small cluster of Tit for Tat players in a very nasty world?” asked Muller.
Over time, they would build up a lot of points, and start taking over the population. “A little island of cooperation can emerge and spread – and will eventually take over.”
As long as one strategy performs better than others, it can take over a population.
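A rough way to see that takeover numerically is the toy “evolutionary” run sketched below, built on assumptions of our own rather than anything in the article: replicator-style reproduction, 200-move matches, and a starting cluster of just two per cent Tit for Tat players in an otherwise always-defecting world. Each generation, strategies reproduce in proportion to the coins they earn against a randomly met member of the population, and the small nice cluster gradually snowballs.

```python
COOPERATE, DEFECT = "C", "D"
PAYOFFS = {("C", "C"): (3, 3), ("D", "C"): (5, 0),
           ("C", "D"): (0, 5), ("D", "D"): (1, 1)}

def tit_for_tat(opponent_history):
    return opponent_history[-1] if opponent_history else COOPERATE

def always_defect(opponent_history):
    return DEFECT

def match_score(strategy_a, strategy_b, rounds=200):
    """Total coins earned by strategy_a over one match against strategy_b."""
    history_a, history_b, score_a = [], [], 0
    for _ in range(rounds):
        move_a, move_b = strategy_a(history_b), strategy_b(history_a)
        score_a += PAYOFFS[(move_a, move_b)][0]
        history_a.append(move_a)
        history_b.append(move_b)
    return score_a

strategies = {"Tit for Tat": tit_for_tat, "Always defect": always_defect}
shares = {"Tit for Tat": 0.02, "Always defect": 0.98}   # a small island of cooperation

for generation in range(50):
    # Expected coins for each strategy against a randomly drawn member of the population.
    fitness = {name: sum(shares[other] * match_score(strat, strategies[other])
                         for other in strategies)
               for name, strat in strategies.items()}
    # Replicator step: population shares grow in proportion to fitness.
    average = sum(shares[name] * fitness[name] for name in strategies)
    shares = {name: shares[name] * fitness[name] / average for name in strategies}

# After 50 generations the cooperative cluster has essentially taken over.
print({name: round(share, 3) for name, share in shares.items()})
```

The exact numbers depend entirely on these toy assumptions; the point, as above, is simply that a strategy which scores even slightly better compounds its advantage generation after generation.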
Axelrod’s theories have since been applied to fields ranging from evolutionary biology to international conflict.
Can misunderstandings ruin everything?
What about small errors of misperception? In 1983, for example, a Soviet early-warning system mistook sunlight reflecting off high-altitude clouds for ballistic missiles launched from America.
“These are life and death decisions. They could mean the death of the whole planet. It’s perhaps a misnomer calling it Game Theory,” said Steven Strogatz, Professor of Applied Mathematics at Cornell University, wryly in the video.
In fact, building a little extra forgiveness into a strategy helps to iron out such misperceptions, in which a cooperative move is misread as a hostile one, he observed. “Instead of retaliating after every defection, you only do so nine out of every 10 times.”
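The benefit of that extra forgiveness can be sketched with a noisy version of the game, again under assumed parameters of our own: each side misreads the other’s move two per cent of the time over a 200-move match. Plain Tit for Tat tends to get dragged into long runs of retaliation by a single misread move, while a “generous” variant that retaliates only nine times out of ten usually breaks the echo and ends up with more coins.

```python
import random

COOPERATE, DEFECT = "C", "D"
PAYOFFS = {("C", "C"): (3, 3), ("D", "C"): (5, 0),
           ("C", "D"): (0, 5), ("D", "D"): (1, 1)}

def flip(move):
    """The opposite move, used here to model a misperceived signal."""
    return DEFECT if move == COOPERATE else COOPERATE

def tit_for_tat(perceived_history):
    return perceived_history[-1] if perceived_history else COOPERATE

def generous_tit_for_tat(perceived_history):
    # Retaliate against a perceived defection only nine times out of ten,
    # echoing the rule of thumb quoted above; otherwise let it go.
    if perceived_history and perceived_history[-1] == DEFECT:
        return DEFECT if random.random() < 0.9 else COOPERATE
    return COOPERATE

def noisy_match(strategy, rounds=200, error_rate=0.02):
    """Two copies of `strategy` play each other, but each side misreads the
    other's move with probability `error_rate`. Returns average coins per player."""
    seen_by_a, seen_by_b = [], []   # what each side *thinks* the other did
    total = 0
    for _ in range(rounds):
        move_a = strategy(seen_by_a)
        move_b = strategy(seen_by_b)
        points_a, points_b = PAYOFFS[(move_a, move_b)]
        total += points_a + points_b
        seen_by_b.append(flip(move_a) if random.random() < error_rate else move_a)
        seen_by_a.append(flip(move_b) if random.random() < error_rate else move_b)
    return total / 2

def average_score(strategy, matches=500):
    return sum(noisy_match(strategy) for _ in range(matches)) / matches

random.seed(0)  # keep the sketch reproducible
print("Tit for Tat with misperception:         ", round(average_score(tit_for_tat)))
print("Generous Tit for Tat with misperception:", round(average_score(generous_tit_for_tat)))
```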
Such forgiveness stands in stark contrast to the bullying strategy of always defecting: an approach that can never lose an individual game, only draw or win it, but which nevertheless performs poorly over time.
“When many people think about winning, they think about beating the other person,” said Prof Strogatz. “In games like chess or poker, this is true (because) these games are zero sum.
“But most of life is not zero sum. You don’t get your reward from your opponent – you can get it from your banker, except that your banker is everything around you.”