April 24, 2013

A Must Read from Chris Dunford on Research-Practitioner Relationships

By Timothy Ogden

A regular theme in our writing is the need for the microfinance industry to learn from and adapt to the needs of poor households. A few weeks ago, a new paper appeared based on an interesting attempt to test whether MFIs are interested in generating and using rigorous evidence. The researchers sent emails to 1,419 MFIs inquiring about their interest in "a partnership to randomly evaluate their programs." However, three different versions of the email were sent: 1) a neutral email, 2) an email that emphasized positive findings from other studies of microfinance, and 3) an email that emphasized "null" findings from other studies of microfinance.

Unsurprisingly, the positive emails had double the response rate of the negative emails. The authors interpret this finding as evidence of confirmation bias among MFIs--that they are only interested in good news that backs up their existing beliefs, and less interested in learning how to improve.

When Chris Blattman blogged about the paper, Chris Dunford, former president of Freedom from Hunger, responded at length. His response is a must-read (for a list of papers on savings that Chris thinks are must-reads, check out his expert recommendations here). Do read the whole thing, but here is an excerpt:

Think about it. Some academic researchers are offering to help you learn about the effectiveness of your program. You are interested because you have lots of questions and you seek answers to improve your program performance, both for your clients and for your MFI. Then you consider how likely it is that these unknown academics will really help you learn. You think back on what you’ve heard about similar research and researchers and their public reports. You think about the tepid message in the email, especially if you got the one that is explicitly negative about microfinance effectiveness. Likely to be helpful enough to justify the cost and aggravation of having these folks from New York City (think of the brand issues there!) camped in your operational midst for months?

Come on! What has this got to do with “confirmation bias” or “aversion to learning?” Perhaps more researchers should put more energy into learning how to work with practitioners rather than belittling them with this kind of arrogant presumption.

Many of the other comments on Chris Blattman's blog post are also worth reading. 
