submitted 1 year ago by 0x815@feddit.de to c/technology@beehaw.org

Here's a brief summary, although you'll miss something if you don't read the study (trigger warning: stats):

  • The researchers propose a novel incentive structure that significantly reduced the spread of misinformation in their experiments, and they provide insights into the cognitive mechanisms that make it work. The structure can be adopted by social media platforms at no cost.

  • The key was to offer reaction buttons that participants were likely to use in a way that discerned between true and false information. Users who found themselves in such an environment shared more true than false posts.

  • In particular, the platform offered ‘trust’ and ‘distrust’ reaction buttons, which, in contrast to ‘likes’ and ‘dislikes’, are by definition associated with veracity. For example, the study authors say, a person may dislike a post about Joe Biden winning the US presidential election; however, this does not necessarily mean that they think it is untrue.

  • Study participants used ‘distrust’ and ‘trust’ reaction buttons in a more discerning manner than ‘dislike’ and ‘like’ reaction buttons. This created an environment in which social rewards and punishments, in the form of clicks, were strongly associated with the veracity of the information shared (see the sketch after this list).

  • The findings also held across a wide range of topics (e.g., politics, health, science) and a diverse sample of participants, suggesting that the intervention is not limited to a set group of topics or users, but instead relies more broadly on the underlying mechanism of associating veracity and social rewards.

  • The researchers conclude that the new structure reduces the spread of misinformation and may help correct false beliefs. It does so without drastically diverging from the existing incentive structure of social media networks, since it still relies on user engagement. Thus, this intervention may be a powerful addition to existing interventions such as educating users on how to detect misinformation.
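To make the mechanism concrete, here's a minimal sketch of how a platform might record ‘trust’/‘distrust’ reactions alongside ‘like’/‘dislike’ and derive a per-post veracity signal. This is not code from the study; `Post`, `react`, and `trust_score` are hypothetical names invented for illustration:

```python
from collections import Counter
from dataclasses import dataclass, field

# Hypothetical reaction types: 'like'/'dislike' express preference,
# 'trust'/'distrust' express a judgment about veracity.
REACTIONS = {"like", "dislike", "trust", "distrust"}

@dataclass
class Post:
    text: str
    reactions: Counter = field(default_factory=Counter)

    def react(self, kind: str) -> None:
        if kind not in REACTIONS:
            raise ValueError(f"unknown reaction: {kind}")
        self.reactions[kind] += 1

    def trust_score(self) -> float:
        """Net veracity signal in [-1, 1]: +1 if all 'trust', -1 if all 'distrust'.

        Unlike raw like/dislike counts, this is the kind of signal the study
        found users apply in a discerning way, so a platform could feed it
        into ranking as a social reward tied to veracity.
        """
        up = self.reactions["trust"]
        down = self.reactions["distrust"]
        total = up + down
        return 0.0 if total == 0 else (up - down) / total

# Usage: a post can be disliked yet still trusted, and vice versa.
post = Post("Joe Biden won the US presidential election.")
post.react("dislike")   # preference: the reader is unhappy about it
post.react("trust")     # veracity: the reader still believes it's true
post.react("trust")
print(post.trust_score())  # 1.0 -- trusted despite the dislike
```

The point of the two separate axes is exactly the distinction the authors draw: preference and veracity are independent judgments, and only the trust/distrust axis is rewarded for accuracy.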

[-] WhoRoger@lemmy.world 5 points 1 year ago

I dunno. There are lots of topics where most people simply can't tell what's true. Having an opinion is fine, but this feels a tad too far.

Personally I think it's fine if even misinformation exists, as long as there's a warning/banner like Twitter has, saying the general consensus is that it's incorrect. You never know when something controversial may turn out to be true, but facts aren't up to democracy.

[-] alyaza@beehaw.org 5 points 1 year ago

You never know when something controversial may turn out to be true, but facts aren’t up to democracy.

from a metaphysical standpoint sure, there might be inarguable facts--but i think the past few years especially have demonstrated that for social purposes almost all "facts" are kind of up to democracy, and many people have no interest in believing what is metaphysically true. i mean, we had people literally dying of COVID because they believed it was a hoax or because they believed in crank treatments of the virus.

[-] WhoRoger@lemmy.world 4 points 1 year ago

And I think it's fine for anyone to believe anything and express it (unless they literally call for physical harm and such).

But this is on one end of the spectrum, with censorship on the other end. Neither is ideal. As I said, I think having banners and warnings that a topic is controversial or disputed, or that the consensus is the opposite, seems like the best compromise.

I mean, it would be fine if only people who actually understand the topic voted, but everyone would, and thus the trustworthiness rating loses its meaning imho.
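The banner compromise described in this comment could be layered on the same signal as the earlier sketch. A minimal, hypothetical version (reusing the `Post`/`trust_score` names from above; the threshold and minimum-reaction values are arbitrary assumptions, not taken from the study or from Twitter's actual system):

```python
DISPUTED_THRESHOLD = -0.5  # arbitrary cutoff: strong 'distrust' consensus
MIN_REACTIONS = 50         # don't flag posts before enough people weigh in

def banner_for(post: Post) -> str | None:
    """Return a warning banner instead of hiding the post outright."""
    total = post.reactions["trust"] + post.reactions["distrust"]
    if total >= MIN_REACTIONS and post.trust_score() <= DISPUTED_THRESHOLD:
        return "Readers dispute this post: the general consensus is that it is inaccurate."
    return None  # no banner: the post is shown as-is
```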
