Facebook CEO Mark Zuckerberg says social media platforms, as well as the internet in general, need to have more oversight and perhaps some government regulation.
In an op-ed published in the Washington Post and on his own Facebook page, Zuckerberg said he’s spent the better part of the last two years thinking about the issue and concluded that there are roles for private companies and public institutions in regulating content.
“I believe we need a more active role for governments and regulators,” Zuckerberg wrote. “By updating the rules for the internet, we can preserve what’s best about it – the freedom for people to express themselves and for entrepreneurs to build new things – while also protecting society from broader harms.”
The soul-searching follows the deadly terror attack on a mosque in New Zealand last month that was live-streamed on Facebook. Australia has already taken a hard line on violence on social media, warning that executives could be prosecuted if they fail to remove terrorist content from their platforms.
Zuckerberg says he sees regulation taking shape in four areas: harmful content, election integrity, privacy and data portability.
“We have a responsibility to keep people safe on our services,” Zuckerberg wrote. “That means deciding what counts as terrorist propaganda, hate speech and more. We continually review our policies with experts, but at our scale we’ll always make mistakes and decisions that people disagree with.”
At the same time, Zuckerberg said he agrees with members of Congress who have argued that Facebook has too much power over speech. Facebook has created an independent review body to hear user appeals of the company’s content decisions, and it is working with French officials to make its review systems more effective.
Global reach becomes an issue
Part of Facebook’s problem is that its users span the globe and reside in countries with different laws and customs concerning free speech. Zuckerberg also notes that different social media companies have different policies about what is acceptable.
“One idea is for third-party bodies to set standards governing the distribution of harmful content and measure companies against those standards,” Zuckerberg wrote. “Regulation could set baselines for what’s prohibited and require companies to build systems for keeping harmful content to a bare minimum.”
Last week, Facebook announced it would ban white nationalist pages from its platform. Previously, the company classified white supremacy as hate speech but did not extend the same designation to white nationalism or white separatism.