1. Our Algorithmic Overlords: I've long argued that teaching kids to code is as much of a waste of time as financial literacy. The simplified version of the argument is that most people are terrible programmers and computers are already better at coding than the average human. As a consequence, I emphasize to my own kids, and to others blinkered enough to ask my advice, that learning how to communicate and write is a much more important tool for the future (yes, yes, cognitive dissonance). While I still think I'm right about the first part, it turns out I'm wrong about the second part. Yesterday OpenAI "released" work on an AI system that writes shockingly good text. I use scare quotes because, in another sign of things to come, OpenAI has published only a small subset of their work, believing that the potential for malicious use of the technology is great enough to restrict access. There are a bunch of news stories about this. Here's Wired, for instance. But the most interesting one I've come across is The Guardian's, because they had the algorithm write an article based on their lede.
Let's stick with the disturbing for a bit, because it's that kind of day. The World Food Programme has formed a partnership with Palantir to analyse its data on food distributions, apparently with the main motivation of looking for "anomalies" that indicate aid is being diverted or wasted. The idea of handing over data about some of the world's most vulnerable people to a private company that specializes in surveillance and tracking hasn't gone over well with a wide variety of people. As background, here's an article about what Palantir does for its biggest client, the NSA. Sometimes it seems like some people at the UN look at the one-world-government kooks and think, "What could we do to make their conspiracy theories more plausible?"
On a more theoretical level, Kleinberg, Ludwig, Mullainathan and Sunstein have a new paper on "Discrimination in the Age of Algorithms," arguing that despite fears of algorithmic discrimination, proving discrimination by algorithms is a lot easier than proving discrimination by humans. Of course, that requires putting regulations in place that allow algorithms to be examined. I'm going to flatter myself by pointing out that it's similar to an argument I made in my review of Automating Inequality. So I feel validated. Speaking of transparency, regulation and algorithmic surveillance, here's David Siegel and Rob Reich arguing that it's not too late for social media to regulate itself by setting up something like FINRA (the Financial Industry Regulatory Authority, which polices securities firms). It's an argument I would have given short shrift to, but the FINRA example is credible. Finally, I'll be dating myself in the Graphic of the Week below, but here's another way to figure out how old I am: when I was an undergrad, most of the "power imbalance" literature on developing countries and private firms was about GM. Here's a new piece from Michael Pisa at CGD on the new power imbalance and its implications: the relationship between developing countries and tech giants.
2. Digital Finance: That feels like as reasonable a transition as I'm going to get to new data from Pew on the global spread of smartphones. Given limited consumer protections, regulatory and enforcement capacity, and "digital literacy" in many developing countries, I will confess this worries me a lot; cf. Chris Blattman's thread on "creating a 20th Century...system in an 18th Century state." Here's a particular instance of that concern, tying together the last few items: the rapidly growing use of "alternative credit scores" based on things like digital footprints and psychometrics. You can make an argument that such things are a huge boon to financial inclusion by tackling the thorny problem of asymmetric information. But there are big questions about what such alternative metrics are actually measuring. For instance, as the article above illustrates, the argument is that in lending, character matters, and that psychometrics can effectively evaluate character. But it doesn't ask whether character is inborn or shaped by circumstance. No matter which way you answer that question, you're going to have a tough time arguing that discriminating based on character is fair. And that's all before we get to all the other possible dimensions of opaque discrimination. The growing use of alternative data is starting to get attention from developed-world regulatory agencies, but the first frontier of regulation is likely to be securities regulators. I don't think they are going to be particularly interested in protecting developing-world consumers. I guess that idea about self-regulation is starting to look more appealing, particularly if it's trans-national.
Meanwhile, the frontier of digital finance is advancing rapidly, even without alternative data. Safaricom introduced what is described here as an "overdraft facility" in January, but I think of it more as a digital credit card. In the first month it was available, $620 million was borrowed.
The pricing seems particularly difficult to parse, but that may just be the reporting. One of the very first things I wrote for FAI argued for the development of a micro-line-of-credit. Now that it's here, I confess it makes me very nervous.
3. Financial Inclusion: That's not to say that digital tools don't hold lots of promise for financial inclusion; just check the Findex. This week CGAP hosted a webinar with MIX on "What Makes a Fintech Inclusive?" There are some sophisticated answers to that question, with some good examples, but I often return to the simplest answer: it cares about poor and marginalized people. And so I especially worry when I see answers to that question that lead with tech. The financial inclusion field as a whole has been in something of a slow-moving existential crisis for the last few years. The best evidence of that is the number of efforts to define or map the impact of financial services and financial inclusion, several of which I'm a part of. Last week I linked to an IPA-led evidence review on financial inclusion and resilience; the week before that, to a Cochrane Collaboration review of reviews of evidence on financial inclusion. This week, UNCDF and BFA published their take on pathways for financial inclusion to impact the SDGs (full report here). I could say I expect there will be more, but I know there will be more in this vein, if I can finish my revisions, etc.