Frances Haugen says Facebook’s algorithms are dangerous. Here’s why.


In her testimony, Haugen repeatedly emphasized how these problems are far worse in non-English-speaking regions because of Facebook’s uneven coverage of different languages.

“In the case of Ethiopia there are 100 million people and six languages. Facebook only supports two of those languages for integrity systems,” she said. “This strategy of focusing on language-specific, content-specific systems for AI to save us is doomed to fail.”

She continued: “Investing in non-content-based ways to slow the platform down not only protects our freedom of speech, it protects people’s lives.”

I explored this in another article from earlier this year on the shortcomings of large language models, or LLMs:

Despite these linguistic deficiencies, Facebook relies heavily on LLMs to automate its content moderation globally. When the war in Tigray[, Ethiopia] first broke out in November, [AI ethics researcher Timnit] Gebru saw the platform flounder to get a handle on the flurry of misinformation. This is emblematic of a persistent pattern that researchers have observed in content moderation. Communities that speak languages not prioritized by Silicon Valley suffer the most hostile digital environments.

Gebru noted that this isn’t where the harm ends, either. When fake news, hate speech, and even death threats aren’t moderated out, they are then scraped as training data to build the next generation of LLMs. And those models, parroting back what they’re trained on, end up regurgitating these toxic linguistic patterns on the internet.

How does Facebook’s content ranking relate to teen mental health?

One of the most shocking revelations from the Wall Street Journal’s Facebook Files was Instagram’s internal research, which found that the platform is worsening mental health among teenage girls. “Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse,” the researchers wrote in a slide presentation from March 2020.

Haugen connects this phenomenon to engagement-based ranking as well, which she told the Senate today “is causing teenagers to be exposed to more anorexia content.”

“If Instagram is such a positive force, have we seen a golden age of teenage mental health in the last 10 years? No, we have seen escalating rates of suicide and depression amongst teenagers,” she said.

In my reporting, I heard from a former AI researcher who saw this effect play out on Facebook as well.

A team of researchers… found that users with a tendency to post or engage with melancholy content, a possible sign of depression, could easily spiral into consuming increasingly negative material that risked further worsening their mental health.

But like Haugen, the researcher found that leadership wasn’t interested in making fundamental changes.

The team proposed tweaking the content-ranking models for these users so that they stopped maximizing engagement alone, and the users would be shown less of the depressing content. “The question for leadership was: Should we be optimizing for engagement if you find that somebody is in a vulnerable state of mind?” he remembers.

But anything that reduced engagement, even for reasons such as not exacerbating someone’s depression, met with a lot of hemming and hawing from leadership. With their performance reviews and salaries tied to the successful completion of projects, employees quickly learned to drop the ones that received pushback and continue working on those dictated from the top down….

That former employee, meanwhile, no longer lets his daughter use Facebook.

How do we fix this?

Haugen opposes breaking up Facebook or repealing Section 230 of the US Communications Decency Act, which protects tech platforms from liability for the content they distribute.

Instead, she recommends carving out a more targeted exemption in Section 230 for algorithmic ranking, which she argues would “get rid of engagement-based ranking.” She also advocates for a return to Facebook’s chronological news feed.


