But a new behavioral economics study shows that the “solution” may actually be part of the problem.
“None of us are saying that transparency is a bad thing,” Daylian Cain, a behavioral economist at Yale University, told the Boston Globe. “But almost always, it fails to work as well as we think it does.”
The first study asked people to serve as experts advising other participants on how to estimate the price of a house. When the experts were paid more the higher the estimator guessed, the experts gave worse advice.
No surprise there, says the Globe. But when the researchers required the experts to disclose their conflict of interest, the experts’ advice got even worse.
“After having behaved honestly and virtuously, you then feel licensed to indulge in being a little bit bad,” said Don Moore at the University of California Berkeley, who collaborated with Cain on the study.
A separate series of experiments, conducted at Duke University’s Fuqua School of Business, tested whether patients could make better, more informed decisions after being told of their doctor’s potential bias.
People said that if a doctor prescribes a drug but discloses that she has a financial interest in the company that makes it, they’d be less likely to take the drug. But in practice, people were actually more likely to comply with the advice when the doctor’s bias was disclosed. The researchers say that people feel an increased pressure to take the advice to avoid insinuating that they distrust their doctor.
People were more likely to make better decisions (i.e., to discount biased advice from a doctor) if the disclosure came from a third party, if it wasn’t made face to face, or if there was a “cooling off” period before the person was asked to decide.
We don’t really want to speculate on what this might mean for the FTC’s blogger rules or Michael Arrington. But it’s likely to cause waves down the line.