Author: Renee Ho
This is the first of two posts that investigate the Center for Global Development’s recent blog post, The Political Paradox of Cash Transfers.
I am of the species Homo sapiens, not Homo economicus.
As a human, I make mistakes. My decision-making is full of cognitive biases. I use mental heuristics—little shortcuts—to process the massive amounts of information I receive each day. I don’t perfectly weigh costs and benefits to maximize utility. And even when I make a calculated decision, I fail to follow through and do what I know I need to do.
You’re also human. And as a result of our humanness, we need to be careful how we ask for feedback.
Recently, the Center for Global Development (CGD) conducted a poll in Tanzania: would Tanzanians prefer to use the country’s gas revenues for individual cash transfers or government spending?
It turns out that the responses vary depending on how you frame the decision and the context in which the decision is made.
The psychology of choice
In his blog post, The Political Paradox of Cash Transfers, Justin Sandefur explains that a majority of polled Tanzanians (62%) supported the use of gas revenues for direct cash transfers when the idea was presented as a stand-alone option.
That is, when there was no alternative presented.
But when presented as a choice between two things—using gas revenues for either cash transfers or government services—two-thirds preferred the latter: government services.
What’s a policy-maker supposed to do?
Behavioral economics anticipates these discrepancies. It shows that while human behavior is not rational, it can be predictable if we understand the heuristics and biases at play.
Adding choice results in different decisions and, in this case, different feedback. More choice often doesn’t result in a better decision. Think about the last time you went to the market to buy jam and felt a little overwhelmed by all the options.
We need to consider: if presenting a choice for constituent feedback, how many options are there? Which do you present first? Is there a default option?
This is called choice architecture, and how we present information matters.
The power of the familiar
Sandefur also suggests that, “Perhaps respondents preferred the known to the unknown: schools and health clinics have, after all, been built in most villages in rural Tanzania in the last 30 years.”
In other words, why not go with something familiar? Especially if the alternative is perceived as a little uncertain and therefore risky?
From behavioral psychology we know that humans use simplifying tactics—heuristics, or shortcuts—when making judgments or decisions under uncertainty. One of these tactics is the “availability heuristic” (which also results in bias): we use knowledge that is readily available—what easily comes to mind.
Moreover, we mentally anchor to what is known.
This might explain why incumbents are often re-elected. We don’t calculate pros and cons. Instead, the lazy mind slips into the familiar, and relies on the past to determine the future.
Groupthink and conformity
Perhaps the Tanzanians in the poll just needed more time to think. So think they did.
CGD organized a deliberative poll. Deliberative polling tries to elicit an informed decision rather than a haphazard, uninformed one. Participants in a deliberative poll are given balanced information and facilitated sessions to discuss the issues with peers and “experts”.
Guess what? Among the group that participated in deliberative polling, the share preferring government services over cash transfers rose further, from 66% to 77%.
This may seem shocking to CGD, but to behavioral psychologists the result makes a lot of sense.
In his recent book Wiser, Cass Sunstein explains why groups go wrong when they deliberate. He discusses two types of influences on group members:
Information signals, which lead people to fail to disclose what they know out of respect for information publicly announced by others.
Think of the “halo effect”: leaders project a little something that makes them seem unusually smart. Individuals in groups will follow them even if against their own best judgment.
Social pressures, which lead people to silence themselves to avoid various penalties.
Think about the last meeting you were in—did you avoid speaking up because you might get the disapproval of your peers? Did you not want to seem foolish or disagreeable?
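These informational signals can be illustrated with the classic information-cascade model from behavioral economics (a sketch, not anything from the original post or from CGD's data): each person gets a noisy private signal, sees everyone else's public choices, and weighs the visible herd against their own evidence. Once the herd's lead grows large enough, private signals are simply outvoted.

```python
import random

def run_cascade(n_agents=20, signal_accuracy=0.7, true_state=1, seed=3):
    """Simulate a simple informational cascade.

    Each agent receives a private signal that is correct with probability
    `signal_accuracy`, observes all earlier agents' public choices, and
    picks whichever option a naive tally (public choices + own signal)
    favors, breaking ties with the private signal.
    """
    rng = random.Random(seed)
    choices = []
    for _ in range(n_agents):
        # Noisy private signal about the true state.
        signal = true_state if rng.random() < signal_accuracy else 1 - true_state
        votes_for_1 = choices.count(1) + (1 if signal == 1 else 0)
        votes_for_0 = choices.count(0) + (1 if signal == 0 else 0)
        if votes_for_1 > votes_for_0:
            choices.append(1)
        elif votes_for_0 > votes_for_1:
            choices.append(0)
        else:
            choices.append(signal)  # tie: trust your own signal
    return choices

print(run_cascade())
```

With this decision rule, as soon as the public tally leads by two, no single private signal can flip the majority, so every later agent copies the crowd regardless of what they privately know—exactly the "following others against your own best judgment" dynamic Sunstein describes.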
Beyond deliberative polls, think of all the focus group discussions that qualitative researchers conduct. If we’re honest with ourselves, we know that these discussions are full of groupthink biases.
Behavioral economics can explain how we unintentionally bias the feedback that’s given. And there are dozens of other mental heuristics and biases that go unmentioned here.
The field is trending, and despite all of the discussion around its “(libertarian) paternalism”, we have to see its value in informing how we get feedback.
That is, if we’re going to get feedback at all (arguably the opposite of paternalism), we might as well do it right.
Understanding all of the predictable patterns of human behavior, and seeing them as perfectly normal, can ultimately help us get better feedback.