MAY 21, 2018 | Merritt Baer
In the citizen/consumer world, I see two phenomena colliding: Chinese “social credit,” and the EU’s General Data Protection Regulation (GDPR). Both seem to be arriving in the United States in some form. Is either of them getting us closer to the relationship we want to have with consumers, government, and technology?
Chinese social credit refers to the fact that “platform” owners (in China, companies like Baidu, Tencent, and Alibaba) have access to vast and varied data about consumers, across applications. These companies own so many parts of the technology and information landscape, from banking to home ownership to healthcare and travel, that they can use the data under their own umbrella of applications to cobble together a picture of a person. This picture is a “social credit score.” In China, companies explicitly state that this score is a moral and fiscal judgment from which they extrapolate every other determination, including whom a person should marry. Add machine learning algorithms to that, and it looks something like the similarly themed Black Mirror episode. One woman tweeted that while she was on the train, the announcer warned that breaking train rules would “hurt our personal credit scores.”
For Americans who feel grateful not to be subject to this scrutiny: don’t. While Americans are not sanctioning the same level of government involvement in a “social credit score” movement, we already have this system in “lite” form. In some ways, this is a more coherent version of a long line of quasi-private entities in the US performing quasi-public functions. Fair Isaac and Company, the firm behind FICO scores, was founded in 1956. Equifax is a private entity performing a quasi-public function, creating credit scores without individual consent. And today, Google, Facebook, and other tech behemoths operate multiple applications across a platform, aggregating emails and search terms and cell phone records. I was in Mark’s year as an undergraduate student at Harvard, so it’s personal to me—I was in a version of facebook even before I had a choice. To some extent, he and other platform builders are framing the questions we now live with.
Meanwhile, the European Union will implement the GDPR this month, premised on the notion that government can and should limit the reach of companies into citizens’ data. In a funny way, the US is simultaneously getting this, too, in “lite” form. First, multinational companies are already rushing to comply with the regulation, meaning that companies that do business in America as well as the EU are likely to simply raise their practices to the bar required by GDPR. But perhaps more interestingly, we’re also seeing uniquely American conversations about the depths to which companies should own consumer/citizen information.
GDPR creates regulations around the storage and re-use of data. It defines consent online as “freely given, specific, informed and unambiguous.” It provides heavy penalties for breaches (perhaps too heavy, in my estimation). Many companies must appoint a data protection officer, and encryption is among the safeguards the regulation prescribes. In other words: processing personal data must be compliant.
It doesn’t do what I believe we are reaching for, which is to limit the role of companies in weaving government-level control into technology platforms’ ability to know all.
One way that American companies are confronting this conundrum is by creating their own moral compasses: Amazon has 14 leadership principles, around which it orients its own true north. Google, at least notionally, promises, “Don’t be evil.” But companies can’t be solely responsible for quilting together the bones of our value systems. They have a legal fiduciary duty to prioritize their bottom line.
Facebook is having a reckoning – as I’ve described in an earlier post, content moderation is increasingly clearly within its purview (even if the decision is not to moderate). This juncture is more specific: what duty, if any, do companies owe to consumers not to become entwined with government? And even if they do not: what duty, if any, do companies owe to consumers not to extrapolate data into a mosaic of intimacy that can be wielded against them? This is likely to be defined through statute, which means it’s Congress’ domain.
Though Facebook had a demonstrably manipulative effect on the 2016 election, maybe Facebook’s invasive practices are dismissible as social media, not life or death. But what happens when a technology platform influences your healthcare, your retirement, your ability to buy a house? When is it inherently a governmental issue? Which rights are inherently those of citizens, not consumers?
We should use this high-water mark as a time to hold our congressional candidates and other government officials to the light. Views on technology should be a fundamental part of a candidate’s platform. When Senator Lindsey Graham said to Mark, “If that was my employee, I would have fired him,” an obvious reaction was, “Cool. You’ve never run a technology company.” The Senate can and should hold tech companies and their CEOs responsible, but we need our lawmakers to host real conversations about parsing out the good, not posturing for a short clip on CNN.
There is a shared responsibility that we ought to own, from each of our vantage points. Whether we’re working on smart cities or child services, corporate decision-making or government scalability, we need to have these conversations. Other countries have started moving in directions that we may want to avoid, integrate, or riff upon. Let’s choose representatives who are grappling with the issues in ways that reflect a more perfect union.
Image credit: WhiteDragon/Shutterstock
The views expressed herein are the personal views of the author and do not necessarily represent the views of the FCC or the US Government, for whom the author works.