Ah, rhetoric! Ars bene dicendi, as the ancient Romans would put it. In short, the theoretical know-how and the practical skill of effective text production and its (usually oral) delivery.
Used and abused over the centuries, it is often nowadays equated with manipulation and bombast. But let us have a closer look. What does “effective” in this context really mean?
Webster’s tells us, for instance, that effective is what’s “adequate to accomplish a purpose; producing the intended or expected results”. Basically, rhetoric in and of itself is neither a good thing nor a bad one. It all depends on the purpose it serves. According to Plato, speaking the truth (dialectic, or the ars vera dicendi) is the only valid purpose that can give rhetorical efforts substance and justification.
I tend to agree with him. But here’s the deal: communication satisfies a variety of practical needs on a daily basis. Language has the power to play tricks on us not because speaking effectively is vicious, but because we are simply too irrational. Our reasoning is often weaker than we think, and we rely on intuition, or mental shortcuts, much more often than we like to admit. The decisions we make and the responses we give depend greatly on momentary mindsets, influences from the immediate environment, the complexity of the task at hand, emotions, past actions, what other people do, how the choice set is formulated and much more. There is a very interesting class on these issues by Dan Ariely – Professor of Psychology and Behavioral Economics at Duke University – on Coursera right now, called A Beginner’s Guide to Irrational Behavior. Can we blame people for trying to accomplish their goals? Or should we just learn to pay more attention, examine everything more carefully and take more time to reflect? Understanding rhetoric can help you deconstruct many fallacies and spurious arguments.
In my homeland, there is a famous anecdote which testifies to the power of rhetoric and of effective phrasing.
The story goes like this: In a monastery there were two new monks. They had only joined recently and hadn’t yet been able to kick their old worldly habits. They were both avid smokers. Now, in this abbey, every afternoon all the monks were supposed to go for a walk in the garden and pray. Our two young monks were dying for a smoke. They simply couldn’t take it anymore. So each of them, separately, went to the abbot to ask for permission. Later in the afternoon, they met again in the garden. The first one was morose and depressed, for he had not received the abbot’s permission to smoke. To his dismay, as he entered the garden, he saw the second one walking about puffing away happily at his cigarette with a serene face. He was up in arms:
‘How is this possible?!’ the first one shouted. ‘How come the abbot forbade me to smoke, but allowed you to do it?’
‘Well’, the second one replied. ‘What did you ask him?’
‘I asked if I could smoke while praying in the garden. And you?’
‘Well, see, I asked him if I could pray while smoking in the garden.’
There is also a famous experiment by Tversky and Kahneman, the psychologists whose work laid the foundations of behavioral economics, who presented people with a dilemma and got two entirely different answers simply based on the way the choice was formulated. This is called “the framing effect”.
A group of people was presented with the following problem:
“Imagine that the US is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. Assume that exact scientific estimates of the programs are as follows:
If Program A is adopted, 200 people will be saved.
If Program B is adopted, there is a 1/3 probability that 600 people will be saved, and a 2/3 probability that no people will be saved.”
What do you think most people chose? Which program would you choose? Wait! Don’t scroll down yet…
The vast majority of respondents chose Program A.
Now another group of similar respondents was presented with the same problem, but phrased differently:
“If Program A is adopted, 400 people will die.
If Program B is adopted, there is a 1/3 probability that nobody will die, and a 2/3 probability that 600 people will die.” (Tversky and Kahneman, 1981)
Any guesses about what people chose this time? What would you choose?
That’s right: most respondents chose Program B.
Please note that the two formulations are practically equivalent and also, for those of you versed in economics, that – strictly rationally speaking – Program A and Program B are equivalent in terms of expected value, because:
100% × 200 = 200
1/3 × 600 + 2/3 × 0 = 200 + 0 = 200
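For anyone who likes to see the arithmetic spelled out, here is a small Python sketch – purely my illustration, not part of the original experiment – that computes the expected values of both framings using exact fractions:

```python
from fractions import Fraction

def expected_value(outcomes):
    """Sum of probability-weighted outcomes, given (probability, value) pairs."""
    return sum(p * x for p, x in outcomes)

# Gain framing: lives saved.
ev_a = expected_value([(Fraction(1), 200)])                          # 200 saved for sure
ev_b = expected_value([(Fraction(1, 3), 600), (Fraction(2, 3), 0)])  # gamble on saving all

# Loss framing: lives lost (the very same situation, reworded).
ev_a_loss = expected_value([(Fraction(1), 400)])
ev_b_loss = expected_value([(Fraction(1, 3), 0), (Fraction(2, 3), 600)])

print(ev_a, ev_b)            # 200 200 – identical expected lives saved
print(ev_a_loss, ev_b_loss)  # 400 400 – identical expected deaths
```

In every version the numbers come out the same; only the wording changes, and yet the wording is what drives the choice.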
This experiment suggests that people, even economic agents, are not – as neoclassical economics generally assumes – fully rational. Even though there is no substantive difference between the two versions of the problem, they evoke different associations and evaluations: framed as gains, the certain option looks safer; framed as losses, people would rather gamble than accept a certain loss of life.
QED. Words are more important than we sometimes give them credit for. After all, it was through the word that God created the world – and even today, the words we use generate realities in our and in other people’s minds. So use them carefully and respectfully – and if possible – with a sense of responsibility!