What Responsibility Does Facebook Have to Moderate Content?

APRIL 30, 2018 | Merritt Baer

Last week I wrote about the coming roadblocks to coherently prosecuting child pornography now that technology allows for “deep fakes.” These questions sit interestingly alongside the ways we share and distribute content online. In other words, even if we think that Facebook went too far with Cambridge Analytica, we might still want child pornography investigations to go extremely far, even when not conducted by the police.

While we want child protection laws enforced, that enforcement may take the form of greater surveillance on platforms. Could we identify deep fakes if we knew more about how a photo was produced? Could we learn more about the production of child pornography, both writ large and case by case? Could law enforcement trace these images better, and if so, would we tolerate more data trails on that justification?

Recently, The Intercept ran a story claiming that Facebook was helping ICE agents track immigrants. In fact, the subpoena it excerpted came from a child exploitation case. As Turkish tech expert Zeynep Tufekci writes, “most people would probably want companies to comply with legal subpoenas on child abuse cases. Most probably would not want social media data to be used to hunt dissidents.” Thus, “There are trade-offs. If they don’t retain any data, there is nothing to turn over in child-abuse cases or if dissidents are targeted.”

Most of us feel some burn when it comes to social media platforms transacting personal data as their stock in trade, but child abuse images are the perennial outlier that makes us question how far we are willing to go to prosecute.

And as we get closer to a point where we may transact all data across proprietary platforms, we see a new incarnation of some older questions. For example, what is the role of government in your living room? That question goes back to the Third Amendment, which is to say: can the government quarter troops there?

The National Center for Missing and Exploited Children (NCMEC) has long been an example of a government-private hybrid. Under 18 U.S.C. § 2258A, “communication service providers and computing service providers” (in other words, Google, Yahoo, and other providers) must report any known or suspected child pornography to NCMEC, which in turn refers it to law enforcement.

At the same time, Section 230 of the Communications Decency Act shields “publishers and speakers” who want some control over moderating content on their platforms, without civil liability for making wrong judgment calls. Multiple lawsuits allege that Section 230 enabled, and legally shielded, sites like Backpage, where children have been trafficked for sex. Backpage was recently shut down for facilitating illegal acts, including not just sex trafficking but also adult prostitution.

As a result of these high-profile cases and a desire to hold social media sites accountable for their content, including bot-originated content, we have been eager to reform this key piece of legislation. Accordingly, the Stop Enabling Sex Traffickers Act (SESTA) passed last month. But it is worth recalling the impetus for Section 230 in the first place: to allow responsible moderators to do some moderation, rather than staying hands-off to avoid liability.

And after all, shouldn’t we expect Facebook and Twitter to do something more than they have about Russian bots influencing our elections, and about ad pricing on social media, including Facebook, that favors controversial candidates like Trump?

Ultimately, it’s fair that we continually evaluate what level of law enforcement oversight we want in our social media, email, and other avenues of communication. Child pornography is a crime that lives in the extreme shadows of the law, but it is important both as a case study in itself and as a measure of how far we are willing to go to lawfully patrol certain channels of communication. That which can be monitored and measured for purposes that rankle us may also be used for purposes where we universally agree to go quite far in prosecuting, and the questions must look the same under both scenarios.

Image credit: Den Rise/Shutterstock


The views expressed herein are the personal views of the author and do not necessarily represent the views of the FCC or the US Government, for whom the author works.
