14.03.2021., 19:24
#28
McG
Registered: Feb 2014
Location: Varaždin
Posts: 8,258
Quote:
How Facebook got addicted to spreading misinformation: The company’s AI algorithms gave it an insatiable habit for lies and hate speech. Now the man who built them can't fix the problem.
Quote:
Joaquin Quiñonero Candela, a director of AI at Facebook, was apologizing to his audience. It was March 23, 2018, just days after the revelation that Cambridge Analytica, a consultancy that worked on Donald Trump’s 2016 presidential election campaign, had surreptitiously siphoned the personal data of tens of millions of Americans from their Facebook accounts in an attempt to influence how they voted. It was the biggest privacy breach in Facebook’s history, and Quiñonero had been previously scheduled to speak at a conference on, among other things, “the intersection of AI, ethics, and privacy” at the company. He considered canceling, but after debating it with his communications director, he’d kept his allotted time.
The reason is simple. Everything the company does and chooses not to do flows from a single motivation: Zuckerberg’s relentless desire for growth. Quiñonero’s AI expertise supercharged that growth. His team got pigeonholed into targeting AI bias, as I learned in my reporting, because preventing such bias helps the company avoid proposed regulation that might, if passed, hamper that growth. Facebook leadership has also repeatedly weakened or halted many initiatives meant to clean up misinformation on the platform because doing so would undermine that growth.
But anything that reduced engagement, even for reasons such as not exacerbating someone’s depression, led to a lot of hemming and hawing among leadership.
With their performance reviews and salaries tied to the successful completion of projects, employees quickly learned to drop those that received pushback and continue working on those dictated from the top down.
Zuckerberg’s obsession with getting the whole world to use Facebook had found a powerful new weapon.
Source: MIT Technology Review