Frances Haugen, the Facebook whistleblower who has exposed the company’s alleged inaction to fight misinformation and hate speech, has urged European lawmakers to seize a “once-in-a-generation opportunity” to set global standards and inspire other countries around the world.
Appearing before the European Parliament on 8 November, Haugen told lawmakers that if the EU rules are done right, “you can create a game-changer for the world, you can force platforms to price in societal risk to their business operations so the decisions about what products to build and how to build them are not purely based on profit maximisation, and you can show the world how transparency, oversight and enforcement should work.”
Following her damning revelations about how Facebook amplifies harmful and divisive content to make more money, MEPs agreed to ask Haugen to present her findings to the European Parliament and explain how her insights into the tech behemoth could apply to the EU’s upcoming content moderation rules, the Digital Services Act (DSA).
The parliament is currently working on the fine print of the DSA, a proposed regulation that aims to impose new obligations on how tech companies like Facebook and Google police content on their platforms.
The American data engineer fielded questions from MEPs that focused on how to make platforms more accountable and how to ensure that the risk assessment and risk mitigation provisions in the proposed DSA are strong enough to prevent abuses and polarisation and to address risks to democracy.
Members also asked Haugen for her views on regulating not only illegal but also harmful content, on content moderation tools, and whether targeted advertising should be banned.
In her replies, Haugen emphasised the importance of requiring companies like Facebook to publicly disclose their data and how they collect it (on content ranking, advertising and scoring parameters, for example), so that people can make informed decisions, and of prohibiting “dark patterns” online.
On countering disinformation and demoting harmful content, Haugen stressed that Facebook is substantially less transparent than other platforms and could do much more to make its algorithms safer: limiting how many times content can be re-shared, supporting more languages, conducting transparent risk assessments, making platforms more human-scaled, and finding ways for users to moderate each other rather than being moderated by artificial intelligence.
The whistleblower also mentioned how crucial it is for governments to protect tech whistleblowers, as their testimonies will be key to protecting people from harm caused by digital technologies in the future.
Haugen also commended lawmakers for their content-neutral approach but warned against possible loopholes and exemptions for media organisations and trade secrets.
“The devil will be in the details. For example, if you write a broad exemption from transparency for anything classified as a ‘trade secret,’ the companies will say everything is a trade secret,” she told lawmakers.
Companies should not be allowed to use trade secrets as “an excuse” to refuse access to data, she added, highlighting the importance of greater access to Facebook’s inner workings for independent researchers, investigative journalists and NGOs.
Haugen also warned that news media shouldn’t be exempted from the Digital Services Act, because the rules need to be neutral to fight harmful content such as disinformation. “Let me be very clear — every modern disinformation campaign will exploit news media channels on digital platforms by gaming the system,” she said.
As the European Parliament considers amendments to the legislation, press freedom organisations sent an open letter to MEPs urging them to seize the opportunity of the Digital Services Act, particularly Article 29, to create a diversified and decentralised environment for recommender systems (the algorithms that suggest relevant items to users, such as movies to watch, articles to read or products to buy).
That way, platform users could opt in to such profiling systems, ensuring that they are fully aware of the profiling and explicitly consent to it beforehand. A measure of this kind would address the core mechanism within Facebook that drives the way toxic content currently spreads.