The Week in Tech: Facebook’s First Step Toward Treating Our Data Better


Researchers who study disinformation told me that the platforms’ swift action showed some progress. “In the past it was denial,” said Sinan Aral, a professor at the M.I.T. Sloan School of Management, describing tech platforms’ earlier responses to misinformation. “Then it was slow reaction. Now things are moving in the right direction.” But, he added: “I wouldn’t say it’s where we want it to be. Ultimately it needs to be proactive.”

That’s not easy to achieve, for many reasons. A look at the Chinese content that Facebook and Twitter responded to shows that not all disinformation is created equal. Russia’s tactics, used to interfere with the 2016 and 2018 elections in the United States, were offensive, focused on so-called wedge issues to “widen the middle ground” and make it harder for people “to come together to negotiate,” said Samantha Bradshaw, a researcher at the Oxford Internet Institute. China’s were defensive, “using the voice of authoritarian regimes, for suppressing freedom of speech” and “undermining and discrediting critical dissidents.”

I asked Professor Aral which kind of misinformation was more effective. “Let me be very clear,” he said. “We have very little understanding about its effectiveness.”

There’s no consensus on how to track it, or how to measure its impact. In large part, that’s because social media platforms have been reluctant to share details about how their algorithms work, or how content is moderated. “Some of these really basic stats, researchers still don’t have access to,” Ms. Bradshaw said.

Only by better understanding how misinformation works will we be able to figure out how to overcome it. And unless we want tech platforms to solve the problem unilaterally, they will need to hand over some data to make that happen.

If the conclusions of these two stories seem to be in conflict, that’s because they are. Social networks are under pressure to better protect user data. They are also being asked to open up so we can understand how they’re tackling issues like misinformation and hate speech.

Professor Aral called this the “Transparency Paradox,” a term he coined in 2018. “The only way to solve it,” he said, “is to thread the needle, by becoming more transparent and more secure at the same time.”



Source: Nytimes.com
