
New rules challenge Google and Facebook to change how they moderate content



In recent years, content moderation has reached a breaking point. All sorts of ugliness have thrived on platforms like Facebook, Twitter, and YouTube, whether it's coordinated harassment, scams, political influence campaigns, or strange algorithmically promoted content. At the same time, inconsistent and sometimes heavy-handed moderation has become an increasingly partisan political issue, with conservative personalities appearing before Congress to make dark claims about censorship. In both cases, the loss of trust caused by a lack of transparency is unmistakable. Much of the world's speech now takes place on closed platforms like Facebook and YouTube, yet users have little visibility into the rules that govern that speech.

Today, a coalition of nonprofit groups is attempting to fill that gap with a list of basic moderation standards called the Santa Clara Principles on Transparency and Accountability in Content Moderation, designed as a set of minimum requirements for handling user content on the internet. The final product draws on work by the American Civil Liberties Union, the Electronic Frontier Foundation, the Center for Democracy & Technology, and New America's Open Technology Institute, as well as a number of independent experts. Together, they call for more thorough notice when a post is taken down, a stronger appeals process, and new transparency reports detailing the total number of posts and accounts removed.

They are simple measures, but they would give users far more information and recourse than they currently have on Facebook, YouTube, and other platforms. The result is a new roadmap for platform moderation, and an open challenge to any company that moderates content online.

Under the Santa Clara rules, any time an account or other content is removed, the user would be given a specific explanation of how and why the content was flagged, with a reference to the specific policy it violated. The user could also appeal the decision and make their case to a new human moderator. Companies would also publish a regular moderation report, modeled on existing reports on government data requests, listing the total number of accounts flagged and the justification for each flag.

"What we are talking about is the internal law of these platforms," ​​says Kevin Bankston, director of the Open Technology Institute, who has been working on the document. "Our goal is to make sure the process is as open as possible."

So far, the companies themselves have had little to say about the new guidelines. Google and Twitter declined to comment on the new rules; Facebook did not respond to multiple requests.

But while the companies have yet to act on the Santa Clara rules, some have already taken similar measures on their own. Last month, Facebook released its full moderation guidelines for the first time, laying out the specific rules on violence and nudity that have guided its decisions for years. The company also launched its first formal appeals process for users who believe they have been mistakenly blocked.

YouTube is closer to compliance, though it still falls short on transparency. The platform already has a notification and appeals process, and its policies have been public from the start. In April, YouTube released its first quarterly moderation report, which lists 8.2 million videos removed in the last quarter of 2017. But while the report breaks down the rationale behind human flags, it does not offer the same detail for content flagged by the automated systems that account for the majority of removals on YouTube.

The Santa Clara document is limited to questions of process and avoids many of the thorniest questions in moderation. The rules do not say what content should be removed or when a particular post should be treated as a threat to user safety. They also say nothing about political speech or newsworthiness exceptions, such as Twitter's controversial policy on world leaders.

But many of the experts involved say the rules are meant as a minimum standard rather than a definitive list of demands. "I have my criticisms of specific policies, from nudity to terrorism," says Jillian C. York, who worked on the rules for the EFF. "Ultimately, though, I don't believe content moderation is going away anytime soon, so ensuring it comes with transparency and due process is a good start."

