It was never designed to protect children
Glad to see it's not even working. Let's keep fighting against these evil laws
I mean, social media should be banned for everyone, not just teenagers. It's a great evil in the world today, and a functional democracy that wasn't braindead would ban these platforms outright for the mass harm and destruction they have caused.
That being said, I fully understand that the motivations of countries for these kinds of bans have little to do with the harm of social media and are much more about surveillance.
Which type of social media are we referring to here?
Doesn’t Lemmy count as social media?
There's a list of 10 or 12 social networks that are banned: YouTube, Instagram, TikTok, etc.
Lemmy is still legal.
Lemmy is legal because it’s too small for them to notice.
And YouTube is an incredible resource for finding information. It’s not social media at all.
It's also an incredible resource for finding misinformation and disinformation, unfortunately.
It's so bonkers how most of the older generations agree that being on the internet cannot make you social, yet it became the default method of communication.
Ban it for everyone? I mean, Lemmy itself is a social network platform, if you want it to be. But I know what you mean: social media being the most used platforms (Google, Facebook, TikTok, etc.). And for that, yeah, I do agree with a full ban. We need a cultural reset, where we aren't being fed sensationalist bullshit and pure brainrot as entertainment via an algorithm trained on our insufficient capacity to regulate our attention.
In my view, social media itself is probably not the problem, but rather the algorithms they use, which are designed to be addictive and manipulative.
I saw an article once arguing that the algorithms should be regulated in a similar way to medicine. Give some base ingredients they can use freely (e.g. sort by newest first), then require any others to run studies to prove they are not harmful.
There would be an expert board that approves or declines the new algorithm in the same way medicines are approved today (the important bit being that they are experts, not politicians making the decision).
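As a rough sketch of the "base ingredient" idea above (all names here are hypothetical, invented for illustration): the pre-approved option, sort by newest first, is trivial to express and has no engagement-tuning knobs, which is part of why it would be easy for an expert board to wave through.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Post:
    author: str
    text: str
    created_at: datetime

def chronological_feed(posts: list[Post]) -> list[Post]:
    """The 'base ingredient' feed: newest first, no engagement signals,
    no personalization, nothing for an approval board to audit."""
    return sorted(posts, key=lambda p: p.created_at, reverse=True)

posts = [
    Post("alice", "posted earlier", datetime(2024, 1, 1, 9, 0, tzinfo=timezone.utc)),
    Post("bob", "posted later", datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)),
]
feed = chronological_feed(posts)
# bob's newer post comes first; no ranking model involved
```

Anything beyond this (engagement-weighted ranking, personalized recommendation) would, under the proposal, need a study before approval.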
This is the correct response. Social media, as a construct, is not evil and does not do harm to anyone. The commodification and commercialisation of social media by capitalistic companies is what has caused the harm we see today.
All of the harms and evils of social media can be boiled down to a single concept: the algorithm. Because algorithmic recommendation of content wants to encourage people to stay on a platform (for capitalistic reasons), and the most enticing and attention-grabbing content is hate-content, these companies have forced hate-inducing concepts down the throats of people in an endeavour to make more money and destroyed individuals and families/friends in the process.
If we regulate the algorithms, we regulate the harm without disempowering anyone. We can, and we should, regulate algorithms on social media to turn it back into what it was 20-odd years ago - a measure to keep in touch with people you know or care about.
If you take such a broad definition of social media, then nearly the entire Internet becomes "social media" and the term loses its meaning, IMO.
I don't think they are evil. A bunch of people with good intentions who didn't understand the problem are trying to solve it with a gut feeling rather than analysis and evidence. It's really disappointing that they would waste so much of our time and money like this.
Former Facebook higher-ups have gone on the record to say that Facebook uses destructive algorithms to keep people hooked. They know exactly what they are doing and don't care how it affects us, as long as they can squeeze more info from us for more profit. Thinking Silicon Valley tech billionaires actually care about you? Bro, you need to wake up.
The “good intention” was the packaging. The real intent was population control.
Good intentions without the spirit of cooperation or respect for consent is still evil.
The main problem with all of these internet surveillance tools being marketed as ways to protect children is that people are engaging with them on that basis.
As far as I'm concerned they haven't done anything to establish that they actually intend to protect children or that this is a reasonable way to do it. This seems like a solution to a different problem that ignores all of the problems it creates.
Parents should be responsible for their children. A random website creator shouldn't have to be responsible for your children.
Websites aren't stores where people walk in off of a public street. They are services that people reach out to and engage with specifically and intentionally. If we can address the non-consensual non-intentionality part of internet tracking and surveillance a lot of this stuff goes away. So maybe rather than regulating the website to protect your children we should be regulating the website to protect consent.
‘…internally the government was aware of a lack of evidence to support the ban before they passed the legislation anyway’
Terrific job, gov.
Our government is usually technologically inept.
The first online census (2016) crashed the system because they didn't allow enough capacity. Anyone with half a brain could have told them that most people were going to try to use it during one particular time -- after dinner (especially since the paper census is supposed to count everyone on that particular night). Instead, they decided to rate it for only 1 million form submissions per hour, despite estimating that two-thirds of Australians would fill it out online. At one person per family, that's around 4 million online submissions. Now factor in that the eastern states have most of the population (and are all in the same time zone at that time of year) and, predictably, the site went down after dinner on census night.
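A back-of-the-envelope check of the numbers in that comment (figures as stated there, not independently verified) shows why the rated capacity was never going to hold:

```python
# Figures from the comment above: ~4 million expected online submissions,
# site rated for only 1 million form submissions per hour.
expected_submissions = 4_000_000
rated_capacity_per_hour = 1_000_000

# Assume (as the comment argues) most eastern-state households submit in
# roughly the same 2-hour window after dinner on census night.
peak_window_hours = 2
peak_demand_per_hour = expected_submissions / peak_window_hours

# Peak demand is double the rated capacity, so the outage was predictable.
print(peak_demand_per_hour > rated_capacity_per_hour)  # prints True
```

Even generously spreading submissions over four evening hours would have left zero headroom at the rated capacity.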
https://www.abc.net.au/news/2016-08-09/abs-website-inaccessible-on-census-night/7711652
I don't know. There's some joy in saying I told you so, to people who had the hubris to try and stop teenagers from being teenagers.
We will simply pass laws requiring them to be adults! Easy!
With a 70% non-compliance rate, that isn't entirely surprising.
Platforms are even less likely to implement the real reforms that the author alludes to.
Similar thing happened where I live with porn. Recently passed a law requiring ID. Instead of complying, I just started going to different websites. No way am I giving up my identity to a sketchy porn site, no matter what the law says.
I still think it's a step in the right direction. Once you make it illegal for children to use social media, you can start going after the platforms for knowingly manipulating children.
Or we can just go after the platforms for knowingly manipulating everyone. And for their invasive data collection. This is probably one reason why Meta spent more on lobbying (primarily for age verification) than Boeing and Lockheed Martin did on lobbying last year. Once the kids are identified, no one gives a shit about the adults so the problem (for them) just fades away.
You know what they say: prohibition, works every time.
Prohibition can be effective; it just doesn't work for easy-to-manufacture compounds such as alcohol or marijuana. Every known human culture has independently discovered alcohol, and marijuana is a weed that is ready to smoke in its natural form.
As far as social media goes, my country has reached a point where TikTok and Facebook are preinstalled on every phone. If a parent buys their kid a phone and removes them, they will reinstall themselves after an automatic update. When you take into consideration the "streamlined" registration process, one can argue this is a means to target prepubescent children.
...I guess an 8-year-old could download a VPN and steal their parents' identification, but I feel like some form of prohibition would help.
So you not only create a grey market you immediately inculcate the children into it.
Prohibition is generally ineffective in anything that doesn't involve violating someone else's rights.
If we're talking about getting rid of slopware, I'm all for it. But this law, and other laws like it, are an incredibly thinly veiled attempt to silence dissent by tying people's online comments to their employment, and subsequently their housing and healthcare.
And I will never believe that this is done out of concern for children.
The pre-installed apps are the problem; make that illegal instead.
Speak for yourself. I find quite a bit of joy in "I told you so".
Key point: "Ultimately, the fundamental problem with age-gating is that it fails to address any of the root problems with our current online landscape – that is, the extractive business models and pernicious design features of mainstream tech companies. We all exist in a highly commercialised information ecosystem, rife with algorithmically amplified misinformation, scams, harmful content and AI slop. Children are particularly vulnerable to these issues but the reality is that it impacts everyone, even if you’re blissfully absent from Facebook or Instagram."
They don't wanna solve the root problem; they just want to make the big tech companies happy, as well as the people who are saying shit about social media. Age verification is their stupid answer, which translates to "We don't give a flying shit about kids."
What if, instead of trying and failing to kick kids off social media, we focused our attention on the reasons why being online is so often detrimental in the first place?
Pre-fucking-cisely.
The addictive design of platforms, software and algorithms should be addressed, not the users' age.
And the tech companies should be made responsible to design more healthy platforms, etc.
The problem is the design of tech, not the people using it.
Why is everyone forgetting the parents in this shit? They are the ones giving their kids access to this shit, not monitoring and moderating that access, and letting screens do the job of raising their kids instead of doing it themselves.
But without the addictive design the users don't spend enough time to see all the ads and tracking required to reach the target growth. Somebody think of the shareholders /s
They're propaganda laws. Internet censorship laws. Palestinian genocide started trending on social media, and suddenly all the countries out in the west wanted to start banning/controlling social media. Plus there was the earlier push to ban TikTok, backed by Facebook trying to ladder-pull the market from competitors.
The fallback argument for the social media ban is that it's better than nothing. But with results like these, it may be worse than nothing, given it potentially creates new problems. Children will remain online with arguably less supervision and support, new privacy and digital security vulnerabilities seem to have appeared, and the worst aspects of social media lie largely unaddressed.
I wish more people understood this. Changing something can mean you've caused harm unintentionally, even if you haven't identified it yet. Too many people seem to have the thought process "We have to do something! This is something. Let's do this." without ever considering the harm they might do.
A 30% reduction of kids being exposed to these harmful platforms is a good thing and I'm glad to see it.
Also, all laws are imperfect, and expecting 100% efficacy is moronic.
Right, but the politicians didn't sell the law at 30% efficiency. They sold it at something like 95% efficiency. So they lied and they haven't solved anything.
Maybe they could have used all of that money to run campaigns to help convince parents to properly supervise their children. Maybe that would have done more than this 30% figure.
Or maybe, instead of creating privacy-infringing laws or blaming parents, we actually dismantle the tech companies who created them and imprison their leaders. We all know corporate social media is cancer; that's why we're on Lemmy. So let's fucking do something about the cancer instead of targeting the victims or, worse, exploiting the situation to expand the surveillance state.
This and the porn thing have been massively invasive in terms of privacy. It's so transparently just building a database of facial data. It doesn't even make an attempt to comprehensively block everything on the internet, or realistically enforce compliance.
Censorship is never the answer. Teaching values and the corresponding ethics and morals that come with it is closer to the answer. A world where you burn down shit just to get a job as a firefighter makes this path a bit more difficult and harder to follow.
Censorship is never the answer.
https://en.wikipedia.org/wiki/Paradox_of_tolerance
Formally banning certain forms of vulgar and bigoted expression establishes a code of conduct for the community, even if the bans aren't strictly enforced.
Teaching values and the corresponding ethics and morals that come with it is closer to the answer.
Morality is as much about proactive and affirmative pursuit of justice as internalized codes of conduct.
If there is no social consequence for immoral behavior, there is no reason to believe the act is immoral.
Censorship was never their intention. So they couldn't give any less fucks. They just want to control us.
I've talked to heaps of parents and heaps of kids about this. What I think is interesting is that people face-to-face seem to be generally supportive of the law. They say that social media is problematic, and that the law helps by discouraging its use. A few different kids have said that it helps them break an addiction. Other kids say they don't care, because it hasn't blocked them. So mostly positive or neutral responses when face-to-face.
But every time I see this mentioned on the internet, it's very negative. There are always heaps of comments saying that it is a failure, and could never work, and that the government is stupid; and there are often other comments saying it is part of a secret plan for the government to track us or whatever. In any case, mostly negative views, with just a sprinkling of fairly neutral views such as "it hasn't been active for very long. Let's wait and see."
I just think that's interesting. I guess my real-world social circles don't totally match my internet social circles.
Kids will often just repeat what they've heard to adults.
But the largest problem with these laws is the way they affect minority groups. If followed, the law would disproportionately affect disabled and queer teens, who may suddenly be unable to access help and community.
I suspect there's some selection bias in the kids you're speaking to.
The vast majority of new systems throughout history have required some iterative refinement. The fact that this specific implementation attempt didn't work perfectly on day one isn't a particularly strong argument against the concept, though there are plenty of good arguments to be made against it.
7 in 10 children remain on major social media services? Does this mean they got 30% of the children off of them? I’d say that’s something other than total failure. A start.
What? There is an immense amount of joy in "I told you so". The majority of people warned them this was a stupid idea, and now you want to piss on the good feeling of smugly and correctly calling out a clearly doomed idea? Fuck off.
IMO it's not a question of whether they remain on, but how much time they spend on it. She's focusing on the wrong metric.