For a while I have wanted to write something about what psychologists call “cognitive dissonance”. The way I think about cognitive dissonance is that individuals tend to look for confirmation of their already held views.
To give an example, imagine that I think all Muslims are potential terrorists (I do not…). I would then take any news story about war or terror in the Middle East as confirmation of this belief, while tending to ignore all other information. That would be cognitive dissonance.
Politics and cognitive dissonance – partners in crime
It seems to me that cognitive dissonance is very common in political discussions. Just take a look at your Facebook feed. Your libertarian FB friends will put out stories about police brutality to show that the government’s use of unlawful violence is widespread. Your socialist friends will put out stories about the evils of multi-national corporations that exploit Chinese workers and your conservative friends will put out stories about immigrants who have committed crimes.
I am not a saint. I do exactly the same thing. I tend to read news stories that confirm my views rather than challenge them. That is just how it is, but at least I am fairly aware of it, while I think most people don't really give it much thought.
The reason cognitive dissonance is so widespread in the world of public political “thinking” is, in my view, that the individual cost of cognitive dissonance is very small. For the average Dane the cost of thinking that “all Muslims are terrorists” or that “all Romanians are criminals” is very small. You get some utility from holding an idiotic view, and you get some utility from spreading it to your Facebook friends who have similarly idiotic ideas and “like” what you write. But you feel no urge to spread views that might challenge your own and your friends’ biases – somebody might call you an idiot even for believing in gravity.
Rational politicians will happily play along. After all, it would be rather costly for the average politician to speak out against the average voter’s wrongful biases. Voters normally don’t vote for people who tell them that they are wrong.
This is of course a variation of Bryan Caplan’s rationally irrational voter. What Bryan has argued in a number of papers and in his great book The Myth of the Rational Voter is that the cost to the individual voter of holding wrongful biases is small. As a result, voters tend, for example, to be nationalistic and protectionist, while economists tend to be cosmopolitan and pro-free trade.
Markets make cognitive dissonance very costly
Imagine on the other hand a situation where cognitive dissonance is very costly to you. Let’s say you are convinced that you can fly. That conviction can be deadly. You are at the top of the Empire State Building. There is a queue for the lift. Why wait? You can just jump out of a window and fly down. And now you are dead…
I guess there is natural selection at work here – people who don’t believe in gravity or think they can fly end up killing themselves, while those of us who understand basic physics tend to live a bit longer.
This is exactly how markets deal with cognitive dissonance. Let’s take the example of financial markets.
Anybody who has spent a bit of time on a trading floor will tell you about the typical trader – the homo tradicus. The only thing that matters to the homo tradicus is his P/L – his profit and loss. His P/L, on a second-by-second basis, also tells him whether his view of the world – his trading position – is right or wrong. This strongly reduces cognitive dissonance – suffering from it would quickly wipe out the homo tradicus.
This is likely also the reason why many traders seem so horribly (but rationally) “inconsistent” when you talk to them. It is very common that a trader will tell you how successful he has been buying dollars, and then two days later tell you that he has been a seller of dollars all along. A successful trader never falls in love with a position, and he knows when to cut a loss.
A successful trader follows Keynes’ dictum: “When the facts change, I change my mind.”
This, I believe, goes for markets in general – markets force economic agents to be unbiased, whereas in politics cognitive dissonance actually seems to make you more successful.
Monetary policy and cognitive dissonance
I believe that cognitive dissonance is also highly relevant to the decision-making process in monetary policy. Just take the concepts of hawks and doves. A hawk (dove) is a monetary policy maker who in general believes that monetary policy should be tighter (easier).
But does it make any sense to be always, or nearly always, in favour of tighter or easier monetary policy? If the central bank on average hits its target over time, then logically policy needs to be tightened roughly half of the time and eased the other half – so a policy maker should be hawkish about as often as he is dovish.
However, observing central bankers, it is pretty clear that they tend to be very biased. Just take somebody like Dallas Fed president Richard Fisher. Since 2008 he has been consistently hawkish. As inflation (and inflation expectations) has more or less consistently been below 2% and unemployment has been high, we know today that Fisher’s hawkish stance has been wrong. Had he been a trader, he would probably have been out of a job long ago.
Interestingly enough, back in 2008-10 Fisher was warning about inflationary risks from the Fed’s policies. Today he is warning about bubbles. He has maintained his hawkishness but changed the reasons for it. That, to me, is a very clear indication that Fisher suffers from serious cognitive dissonance.
Fisher is not unusual. In fact I believe he is a pretty “normal” central banker. Most of them suffer from cognitive dissonance precisely because the individual cost of being wrong is very low for them.
Dealing with cognitive dissonance in monetary policy making
There are numerous ways of reducing cognitive dissonance in monetary policy making.
The obvious possibility is simply to take away central bankers’ discretionary powers and instead leave the implementation of monetary policy to the markets. This is essentially what we would have if the central bank, for example, implemented monetary policy by “pegging” market expectations for inflation or nominal GDP growth, as suggested by Robert Hetzel and Scott Sumner.
A slightly less revolutionary suggestion would be to let “good forecasters” have more votes in the monetary policy making body. Let’s take the Fed’s FOMC. Here the members could be asked to make forecasts on the Fed’s two more or less explicit targets – unemployment and core inflation.
Then, on a rolling six- or twelve-month basis, each FOMC member would be ranked by forecasting accuracy. The two worst forecasters would then lose their vote on the FOMC for a period of, for example, six months.
This would introduce a cost – a “shaming cost” – for being biased. Obviously the individual FOMC members could still vote as they please, and there wouldn’t necessarily have to be consistency between their voting and their forecasts, but too large an inconsistency between the two would likely cause quite a bit of embarrassment for the most biased members. This, I believe, would do quite a bit to reduce cognitive dissonance among FOMC members.
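To make the mechanics concrete, here is a minimal sketch of the ranking rule described above. Everything specific in it is my own assumption for illustration – the scoring rule (mean absolute forecast error on inflation and unemployment over the rolling window), the equal weighting of the two targets, and the number of suspended members:

```python
# Hypothetical sketch of the "worst forecasters lose their vote" rule.
# The scoring rule and the suspension count are illustrative assumptions.

def rank_forecasters(forecasts, outcomes):
    """Rank members by mean absolute forecast error, best first.

    forecasts: {member: [(inflation_fc, unemployment_fc), ...]}
    outcomes:  [(inflation, unemployment), ...] for the same periods
    """
    scores = {}
    for member, fcs in forecasts.items():
        errors = [abs(f_inf - inf) + abs(f_un - un)
                  for (f_inf, f_un), (inf, un) in zip(fcs, outcomes)]
        scores[member] = sum(errors) / len(errors)
    return sorted(scores, key=scores.get)

def voting_members(forecasts, outcomes, n_suspended=2):
    """Everyone except the n_suspended worst forecasters keeps a vote."""
    ranking = rank_forecasters(forecasts, outcomes)
    return ranking[:-n_suspended]

# Made-up numbers: realized (core inflation, unemployment) per period
outcomes = [(1.5, 7.0), (1.6, 6.8)]
forecasts = {
    "A": [(1.6, 7.1), (1.5, 6.9)],   # small errors
    "B": [(2.5, 6.0), (2.6, 5.9)],   # persistently hawkish forecasts
    "C": [(1.4, 7.2), (1.7, 6.7)],   # small errors
    "D": [(0.5, 8.5), (0.4, 8.4)],   # persistently dovish forecasts
}
print(voting_members(forecasts, outcomes))  # ['A', 'C'] – B and D are suspended
```

The point of the sketch is simply that a persistent bias in either direction shows up as a persistently large forecast error, so a biased member gets suspended regardless of whether the bias is hawkish or dovish.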
It should be noted that the FOMC has actually moved a bit in this direction over the last couple of years, but in my view the cost to the individual FOMC member of being biased is still very small, and cognitive dissonance therefore seems to dominate monetary policy making.
PS this post is actually a bit of an attempt to start dealing with my own problems with cognitive dissonance. I am not sure that I am succeeding, but I am at least trying to think of methods to get around cognitive dissonance problems.
UPDATE: A couple of readers have suggested that I use the term “cognitive dissonance” in the wrong way. I acknowledge that I am a bit (maybe a lot) sloppy with the term here, and it might have been more telling (or correct) to use the term “confirmation bias” instead, or even Bryan Caplan’s term “rational irrationality”. That does not, however, change the conclusions.