[-] fluffykittycat@slrpnk.net 174 points 1 week ago

It was never designed to protect children

Glad to see it's not even working. Let's keep fighting against these evil laws.

[-] expr@piefed.social 65 points 1 week ago

I mean, social media should be banned for everyone, not just teenagers. It's a great evil in the world today, and in a functional democracy that wasn't braindead, we should ban them outright for the mass harm and destruction they have caused.

That being said, I fully understand that the motivations of countries for these kinds of bans have little to do with the harm of social media and are much more about surveillance.

[-] Link@rentadrunk.org 23 points 1 week ago

Which type of social media are we referring to here?

Doesn’t Lemmy count as social media?

[-] nguarracino@programming.dev 7 points 1 week ago

There's a list of 10 or 12 social networks that are banned: YouTube, Instagram, TikTok, etc.

Lemmy is still legal.

[-] Grainne@lemmy.dbzer0.com 21 points 1 week ago

Lemmy is legal because it’s too small for them to notice.

And YouTube is an incredible resource for finding information. It’s not social media at all.

It's also an incredible resource for finding misinformation and disinformation, unfortunately.

[-] yardratianSoma@lemmy.ca 9 points 1 week ago

It's so bonkers how most of the older generations agree that being on the internet cannot make you social, yet it became the default method of communication.

Ban it for everyone? I mean, Lemmy itself is a social network platform, if you want it to be. But I know what you mean: social media being the most used platforms, Google, Facebook, TikTok, etc . . . And for that, yeah, I do agree with a full ban. We need a cultural reset, where we aren't being fed sensationalist bullshit and pure brainrot as entertainment via an algorithm trained on our insufficient capacity to regulate our attention.

[-] Dave@lemmy.nz 27 points 1 week ago

In my view social media is probably not the problem, but the algorithms they use that are designed to be addictive and manipulative.

I saw an article once arguing that the algorithms should be regulated in a similar way to medicine. Give some base ingredients they can use freely (e.g. sort by newest first), then require any others to run studies to prove they are not harmful.

There would be an expert board that approves or declines the new algorithm in the same way medicines are approved today (the important bit being that they are experts, not politicians making the decision).

[-] Instigate@aussie.zone 11 points 1 week ago

This is the correct response. Social media, as a construct, is not evil and does not do harm to anyone. The commodification and commercialisation of social media by capitalistic companies is what has caused the harm we see today.

All of the harms and evils of social media can be boiled down to a single concept: the algorithm. Because algorithmic recommendation of content wants to encourage people to stay on a platform (for capitalistic reasons), and the most enticing and attention-grabbing content is hate-content, these companies have forced hate-inducing concepts down the throats of people in an endeavour to make more money and destroyed individuals and families/friends in the process.

If we regulate the algorithms, we regulate the harm without disempowering anyone. We can, and we should, regulate algorithms on social media to turn it back into what it was 20-odd years ago - a measure to keep in touch with people you know or care about.

[-] expr@piefed.social 7 points 1 week ago

If you take such a broad definition of social media, then nearly the entire Internet becomes "social media" and the term loses its meaning, IMO.

[-] Lodespawn@aussie.zone 18 points 1 week ago

I don't think they are evil. A bunch of people with good intentions who didn't understand the problem are trying to solve it with a gut feeling rather than analysis and evidence. It's really disappointing that they would waste so much of our time and money like this.

[-] Scotty_Trees@lemmy.world 16 points 1 week ago

Former Facebook higher-ups have gone on the record to say that Facebook uses destructive algorithms to keep people hooked. They know exactly what they are doing and don't care how it affects us, as long as they can squeeze more info from us for more profit. Thinking Silicon Valley tech billionaires actually care about you? Bro, you need to wake up.

[-] Deceptichum@quokk.au 10 points 1 week ago* (last edited 1 week ago)

The “good intention” was the packaging. The real intent was population control.


Good intentions without the spirit of cooperation or respect for consent is still evil.

The main problem with all of these internet surveillance tools being marketed as ways to protect children is that people are engaging with them on that basis.

As far as I'm concerned they haven't done anything to establish that they actually intend to protect children or that this is a reasonable way to do it. This seems like a solution to a different problem that ignores all of the problems it creates.

Parents should be responsible for their children. A random website creator shouldn't have to be responsible for your children.

Websites aren't stores where people walk in off of a public street. They are services that people reach out to and engage with specifically and intentionally. If we can address the non-consensual non-intentionality part of internet tracking and surveillance a lot of this stuff goes away. So maybe rather than regulating the website to protect your children we should be regulating the website to protect consent.

[-] gurty@lemmy.world 124 points 1 week ago

‘…internally the government was aware of a lack of evidence to support the ban before they passed the legislation anyway’

Terrific job, gov.

[-] Australis13@fedia.io 49 points 1 week ago

Our government is usually technologically inept.

The first online census (2016) crashed the system because they didn't allow enough capacity. Anyone with half a brain could have told them that most people were going to try to use it during one particular time -- after dinner (especially since the paper census is supposed to count everyone on that particular night). Instead, they decided to rate it for only 1 million form submissions per hour, despite estimating that two-thirds of Australians would fill it out online. At one submission per family, that's around 4 million online submissions. Now factor in that the eastern states have most of the population (and are all in the same time zone at that time of year) and, predictably, the site went down after dinner on census night.

https://www.abc.net.au/news/2016-08-09/abs-website-inaccessible-on-census-night/7711652

[-] Lexam@lemmy.world 51 points 1 week ago

I don't know. There's some joy in saying I told you so, to people who had the hubris to try and stop teenagers from being teenagers.

[-] MagicShel@lemmy.zip 21 points 1 week ago

We will simply pass laws requiring them to be adults! Easy!

[-] sbv@sh.itjust.works 43 points 1 week ago

With a 70% non-compliance rate, that isn't entirely surprising.

Platforms are even less likely to implement the real reforms the author alludes to.

[-] Psythik@lemmy.world 33 points 1 week ago* (last edited 1 week ago)

A similar thing happened where I live with porn. They recently passed a law requiring ID. Instead of complying, I just started going to different websites. No way am I giving up my identity to a sketchy porn site, no matter what the law says.

[-] BygoneNeutrino@lemmy.world 7 points 1 week ago

I still think it's a step in the right direction. Once you make it illegal for children to use social media, you can start going after the platforms for knowingly manipulating children.

[-] evilcultist@sh.itjust.works 51 points 1 week ago* (last edited 1 week ago)

Or we can just go after the platforms for knowingly manipulating everyone. And for their invasive data collection. This is probably one reason why Meta spent more on lobbying (primarily for age verification) than Boeing and Lockheed Martin did on lobbying last year. Once the kids are identified, no one gives a shit about the adults so the problem (for them) just fades away.

[-] Alwaysnownevernotme@lemmy.world 24 points 1 week ago

You know what they say: prohibition works every time.

[-] BygoneNeutrino@lemmy.world 7 points 1 week ago* (last edited 1 week ago)

Prohibition is effective, it's just that it doesn't work for easy-to-manufacture compounds such as alcohol or marijuana. Every known human culture has independently discovered alcohol, and marijuana is a weed that is ready to smoke in its natural form.

As far as social media goes, my country has reached a point where TikTok and Facebook are preinstalled on every phone. If a parent buys their kid a phone and removes the apps, they reinstall themselves after an automatic update. When you take into consideration the "streamlined" registration process, one can argue this is a means to target prepubescent children.

...I guess an 8 year old could download a VPN and steal their parents identification, but I feel like some form of prohibition would help.

[-] Alwaysnownevernotme@lemmy.world 18 points 1 week ago

So you not only create a grey market, you immediately inculcate the children into it.

Prohibition is generally ineffective against anything that doesn't involve violating someone else's rights.

If we're talking about getting rid of slopware, I'm all for it. But this law, and other laws like it, are an incredibly thinly veiled attempt to silence dissent by tying people's online comments to their employment, and subsequently their housing and healthcare.

And I will never believe that this is done out of concern for children.

[-] insaneinthemembrane@lemmy.world 10 points 1 week ago

The pre-installed apps are the problem; make that illegal instead.

[-] shortwavesurfer@lemmy.zip 42 points 1 week ago

Speak for yourself. I find quite a bit of joy in "I told you so".

[-] deathbird@mander.xyz 40 points 1 week ago

Key point: "Ultimately, the fundamental problem with age-gating is that it fails to address any of the root problems with our current online landscape – that is, the extractive business models and pernicious design features of mainstream tech companies. We all exist in a highly commercialised information ecosystem, rife with algorithmically amplified misinformation, scams, harmful content and AI slop. Children are particularly vulnerable to these issues but the reality is that it impacts everyone, even if you’re blissfully absent from Facebook or Instagram."

[-] imjustmsk@lemmy.ml 10 points 1 week ago

They don't wanna solve the root problem, they just want to make the big tech companies happy, as well as the people who are saying shit about social media. Age verification is their stupid answer, which translates to "We don't give a flying shit about kids".

[-] BranBucket@lemmy.world 39 points 1 week ago

What if, instead of trying and failing to kick kids off social media, we focused our attention on the reasons why being online is so often detrimental in the first place?

Pre-fucking-cisely.

[-] Jimbel@lemmy.world 38 points 1 week ago

The addictive design of platforms, software and algorithms should be addressed, not the users' age.

And the tech companies should be held responsible for designing healthier platforms, etc.

The problem is the design of tech, not the people using it.

[-] A_Random_Idiot@lemmy.world 13 points 1 week ago

Why is everyone forgetting the parents in this shit? They are the ones giving their kids access to this shit, not monitoring and moderating their access to this shit, and letting screens do the job of raising their kids instead of doing it themselves.

[-] coolmojo@lemmy.world 7 points 1 week ago

But without the addictive design the users don't spend enough time to see all the ads and tracking required to reach the target growth. Somebody think of the shareholders /s

[-] commander@lemmy.world 36 points 1 week ago

They're propaganda laws. Internet censorship laws. The Palestinian genocide started trending on social media and suddenly all the countries out in the West wanted to start banning/controlling social media. Plus the earlier push by Facebook to ban TikTok, to try to ladder-pull the market from competitors.

[-] wewbull@feddit.uk 33 points 1 week ago* (last edited 1 week ago)

The fallback argument for the social media ban is that it’s better than nothing. But with results like these, it may be worse than nothing, given it potentially creates new problems. Children will remain online with arguably less supervision and support, new privacy and digital security vulnerabilities seem to have appeared and the worst aspects of social media lay largely unaddressed.

I wish more people understood this. Changing something can mean you've caused harm unintentionally, even if you haven't identified it yet. Too many people seem to have the thought process "We have to do something! This is something. Let's do this." without ever considering the harm they might do.

[-] FlashMobOfOne@lemmy.world 25 points 1 week ago

A 30% reduction of kids being exposed to these harmful platforms is a good thing and I'm glad to see it.

Also, all laws are imperfect, and expecting 100% efficacy is moronic.

[-] fodor@lemmy.zip 8 points 1 week ago

Right, but the politicians didn't sell the law at 30% efficacy. They sold it at something like 95% efficacy. So they lied, and they haven't solved anything.

Maybe they could have used all of that money to run campaigns to help convince parents to properly supervise their children. Maybe that would have done more than this 30% figure.

[-] FlyingCircus@lemmy.world 11 points 1 week ago

Or maybe, instead of creating privacy-infringing laws or blaming parents, we actually dismantle the tech companies who created them and imprison their leaders. We all know corporate social media is cancer, that's why we're on Lemmy. So let's fucking do something about the cancer instead of targeting the victims or, worse, exploiting the situation to expand the surveillance state.

[-] Baggie@lemmy.zip 18 points 1 week ago

This and the porn thing have been massively invasive in terms of privacy. It's so transparently just building a database of facial data. It doesn't even make an attempt to comprehensively block everything on the internet, or realistically enforce compliance.

[-] melsaskca@lemmy.ca 17 points 1 week ago

Censorship is never the answer. Teaching values and the corresponding ethics and morals that come with it is closer to the answer. A world where you burn down shit just to get a job as a firefighter makes this path a bit more difficult and harder to follow.

[-] UnderpantsWeevil@lemmy.world 9 points 1 week ago

Censorship is never the answer.

https://en.wikipedia.org/wiki/Paradox_of_tolerance

Formally banning certain forms of vulgar and bigoted expression establishes a code of conduct for the community, even if the bans aren't strictly enforced.

Teaching values and the corresponding ethics and morals that come with it is closer to the answer.

Morality is as much about proactive and affirmative pursuit of justice as internalized codes of conduct.

If there is no social consequence for immoral behavior, there is no reason to believe the act is immoral.

[-] Reviever@lemmy.world 7 points 1 week ago

Censorship was never their intention. So they couldn't give any less fucks. They just want to control us.

[-] blind3rdeye@aussie.zone 15 points 1 week ago

I've talked to heaps of parents and heaps of kids about this. What I think is interesting is that, face-to-face, people seem to be generally supportive of the law. They say that social media is problematic, and that the law helps by discouraging its use. A few different kids have said that it helps them break an addiction. Other kids say they don't care, because it hasn't blocked them. So mostly positive or neutral responses when face-to-face.

But every time I see this mentioned on the internet, it's very negative. There are always heaps of comments saying that it is a failure, and could never work, and that the government is stupid; and there are often other comments saying it is part of a secret plan for the government to track us or whatever. In any case, mostly negative views - with just a sprinkling of fairly neutral views such as "it hasn't been active for very long. Let's wait and see."

I just think that's interesting. I guess my real-world social circles don't totally match my internet social circles.

[-] emmy67@lemmy.world 10 points 1 week ago

Kids will often just repeat what they've heard to adults.

But the largest problems to these laws is the way they affected minority groups. If followed, the law would disproportionately affect disabled and queer teens who may suddenly be unable to access help and community.

I suspect there's some selection bias in the kids you're speaking to.

[-] Amnesigenic@lemmy.ml 10 points 1 week ago

The vast majority of new systems throughout history have required some iterative refinement, the fact that this specific implementation attempt didn't work perfectly on day one isn't a particularly strong argument against the concept, and there are plenty of good arguments to be made against it

[-] scarabic@lemmy.world 10 points 1 week ago

7 in 10 children remain on major social media services? Does this mean they got 30% of the children off of them? I’d say that’s something other than total failure. A start.

[-] M0oP0o@mander.xyz 9 points 1 week ago

What? There is an immense amount of joy in "I told you so". The majority of people warned them this was a stupid idea, and now you want to piss on the good feeling of smugly calling out a clear failure? Fuck off.

[-] someguy3@lemmy.world 8 points 1 week ago

IMO it's not a question of whether they remain on, but how much time they spend on it. She's focusing on the wrong metric.

this post was submitted on 03 Apr 2026
526 points (100.0% liked)

Technology
