The faiV

Week of May 8, 2017

Unintended Consequences Edition

1. American Inequality: American exceptionalism in promoting home ownership--as the signifier of middle-class status and/or upward mobility, and a generally accepted keystone of building wealth--has persisted despite the Great Recession/housing crisis. But that doesn't mean things haven't changed--the availability of housing that costs less than 30% of a household's income has dramatically decreased. Matt Desmond, author of Evicted, writes in the New York Times Magazine that the American emphasis on home ownership has become one of the primary engines of inequality. Non-profits--or at least how we measure and fund them--are another (unintended) engine of inequality. In New York state, non-profits pay wages just above those in retail and food service (and 80 percent of these workers are women, and 50 percent people of color).

2. Our Algorithmic Overlords: The goal of machine learning and using algorithms to analyze data is to yield better decisions, or at least better ones than human beings would make given our biases and the challenges of causal inference. A(nother) new book looking into how this works is Everybody Lies. I haven't read it yet, but I'm looking forward to it. In the meantime, there's an excerpt in Science of Us that takes a look at one of those areas where humans always struggle to make good decisions: who is credit-worthy. Loan officers substituting bias against minorities (or at least people different from themselves) and the poor for careful judgment is well documented and widespread. Netzer, Lemaire and Herzenstein turn the machine loose on data from Prosper, an online platform for peer-to-peer lending, and find that the words borrowers use are predictive of repayment behavior. You should read the whole excerpt because it does focus on the unintended consequences of using machine learning and big data. I, of course, immediately wonder how quickly borrowers and lenders will adapt to the findings.

Meanwhile, here's a Quora forum with Jennifer Doleac on the American criminal justice system, which dwells a lot on how machine learning is affecting decisions in another area where humans have a lot of trouble: who's guilty and who is a recidivism risk. And, of course, on the unintended consequences of our efforts to punish people. And here's a speculation that Donald Trump is a dynamic neural network/machine-learning algorithm with narrow goals. Here's an alternate version of the same argument, which, in addition to being even more frightening, provides additional insight into the potential unintended consequences of data analysis without theory (of Mind).

3. Digital Finance: The item above on Prosper and algorithms determining credit-worthiness from the language borrowers use is, of course, about digital finance. But in the domain of more traditional ways of thinking about digital finance, here's a story about M-Pawa in Tanzania, interesting for its integration of savings, lending and education. The bottom line: more savings, larger loans, better repayment. In other news, M-Pesa is supporting proposed regulations for cross-platform transfers in Kenya. And MicroSave has some ideas on how to enable digital finance among the illiterate, since traditional approaches to inclusion via digital channels have the unintended consequence of excluding those who can't read.

More specifically on the "unintended consequences" theme, though having relatively little to do with digital finance, here's some new research on how global de-risking in banking has cut the number of correspondent banking relationships (what makes cross-border payments even somewhat efficient) by 25% since 2009, pushing whole regions out of the regulated banking sector.

4. Finance Frames: I couldn't come up with a pithy and clear intro to this item, so we're stuck with 'finance frames.' The point is that how we think about finance--the mental frames and analogies we use--has an often unintended impact on what happens in the real world. Here's a Twitter exchange that started from a discussion about how investment advice is provided to retail investors in the US (are financial advisors like store clerks?) but quickly moved on to something more globally relevant: how much financial advice is, or should be, like medical care. The exchange is a bit difficult to follow, but it's worth it.

I struggle with the appeal to the medical care analogy for a number of reasons, not least of which is that the comparison to health tends to idealize the provision of medical care. In fact, medical care the world over is delivered poorly, with bad or conflicting incentives, rife with misinformation and poor decisions. It's why, when someone asked "do you really want a doctor who can't afford a Ferrari?", my answer was "Hell, yes." If the medical field is what finance is aspiring to, or taking its lead from...


5. Charity and Philanthropy: Many years ago, one of the first things that got me some attention writing about charity and philanthropy was an ongoing critique of "embedded giving," the jargony term for purchases that include a donation to charity. I even created a scoring mechanism for judging the campaigns! How naive I was back in my youth. A new paper from Gneezy, Gneezy, Jung and Nelson shows yet again why such schemes are suspect: they can drive up profits for businesses while driving down the amount donated. In this case, people paid significantly more for products that included a charitable donation but did not distinguish between 1% and 99% of the proceeds going to charity. If you're as cynical as I am, you'd dispute that this is an unintended consequence.

And here's Larry Kramer, president of the Hewlett Foundation, on the unintended consequences of philanthropy's fad toward "big bets."

Economist William Baumol died last week. He did a lot of work on entrepreneurship but is probably best known for what he called "cost disease," which explains why the costs of goods and services can rise quickly in sectors with little productivity gains when there are large productivity gains in other sectors. One way of thinking about this is that we're spending too much time automating the wrong jobs (which relates back to the "Hell, yes" above). Source: Vox