1
1
submitted 1 minute ago by 0x815@feddit.de to c/news@beehaw.org

Archived link

Since Russia’s full-scale invasion of Ukraine in February 2022, Russia’s attritional campaign has bombarded Ukrainian schools, hospitals and other vital civilian infrastructure in an attempt to make life unsustainable for remaining civilians.

But Russia has waged an aggressive campaign online as well as off, seeking to exploit and exacerbate divisions and tensions created by the war in Ukraine. While the strategies used are not new, the full-scale invasion saw an intensification of efforts to spread fear, muddy the waters, sow division, and ultimately, undermine support for Ukraine.

More than two years on, as more than two billion people across 50 countries head to polling stations in 2024, democracies around the globe are increasingly vulnerable to Russian influence operations that attempt to polarise public opinion, discredit governments, and cast doubt on democracy itself.

**The information laundering process**

While we’ve heard plenty about the Kremlin’s narratives and disinformation campaigns – and the bot networks and troll farms behind them – we’ve heard less about the specific strategies that are making dis- and misinformation increasingly difficult to detect.

Key to this process is information laundering. Akin to money laundering, information laundering is when propaganda is spread through layers of media to conceal and distance it from its Kremlin origins. Actors use a range of techniques to build credibility and embed laundered information within public discourse, allowing falsehoods at the fringes of the media environment to go global and shape mainstream narratives.

One of the aims is to subtly manipulate information in a way that makes inaccuracies difficult to detect or debunk. In simple terms, clean and dirty information – or fact and fiction – are washed together until the two become indistinguishable, explains Belén Carrasco Rodríguez, director of CIR’s Eyes on Russia project.

“Information laundering is a multi-layered influence process involving the combination and progressive application of a set of influence techniques that seek to distort an event, a claim, or a fact,” explains Rodríguez.

“Instead of just disinformation, this involves a more complex process where facts are mixed up, decontextualised, misappropriated or misconstrued. Once a fact is recycled through a network of accounts or layers of media, it becomes completely distorted, and the original source is obscured.”

Laundering information involves the gradual application of techniques such as disinformation, misappropriation, click-bait headlines, and the ‘Woozle effect’ – when fabricated or misleading citations are used repeatedly in laundered news items in an attempt to provide ‘evidence’ of their integrity.

The information is then integrated into and spread around the information ecosystem through processes such as smurfing – a term borrowed from money laundering – where an actor sets up multiple accounts or websites to disseminate the information. There’s also what disinformation analysts call ‘Potemkin villages’, a network of accounts or platforms that share and endorse each other’s content, serving to amplify and propagate material.

The goal of such dissemination techniques is to boost visibility while building credibility – based on the premise that audiences tend to trust information more if it’s widely reported by different actors or organisations.

**An international operation**

CIR has seen numerous examples of information laundering in different languages and online environments since Russia’s full-scale invasion of Ukraine. In 2023, our investigators worked alongside the BBC’s disinformation team to investigate Yala News – a UK-registered, Syrian-linked media company that was found to be spreading Russian state disinformation to millions in the Arabic-speaking world.

The topics and rhythm of Yala’s social media posts revealed traits of information laundering, with many of the posts identical to those seen on Russian state media just a few hours earlier.

Some videos – including one that claimed President Zelensky was drunk and ‘lost his mind’ – generated over a million views. According to Rodríguez, such content “hits the right audiences”, allowing outlets such as Yala not only to disseminate pro-Russian, anti-western messages but to drive their readership at the same time.

In another case, in February 2023, CIR saw a fake UK government letter circulated online and addressed to UK sponsors of Ukrainian refugees. The letter asked for the personal details of Ukrainian men living in the households, information that had allegedly been demanded by the Ukrainian Embassy in London for unspecified reasons.

It was an operation that Rodríguez describes as hybrid, combining a forgery with an information laundering operation that was designed to stoke fear among the Ukrainian refugee community while portraying the Ukrainian armed forces as desperate and running out of manpower – and prepared to go to cruel lengths to obtain recruits.

Such narratives were embedded into social media groups in countries supporting the Ukrainian war effort, like Lithuania and Latvia, with posts suggesting authorities were collecting information on Ukrainian men so they could be deported for conscription.

“They used that forgery as an initial entry point to a further influence campaign involving information laundering,” explains Rodríguez, adding that the letter was swiftly shared online alongside stories from individuals who had supposedly received it, or knew someone who had. These narratives were an attempt to add legitimacy to the claims, she says.

“This is how laundering works – online and offline networks mobilise to spread a piece of disinformation, in this case, a forgery.”

**Sowing division, casting doubt**

As with large-scale money laundering operations, it is the transfer of narratives into other countries’ political environments that helps to strengthen their credibility while serving the purpose of the launderer – making the strategy especially dangerous in the so-called year of elections.

Rodríguez says what is particularly concerning is Russia’s ability to embed its influence networks in different communities and launder information by “speaking the domestic language and understanding the domestic grievances.”

Recent CIR analysis shared with Bloomberg revealed how X (formerly Twitter) accounts being used to promote Russian interests in South Africa are attempting to rally support for a new party backed by former President Jacob Zuma. Investigators identified several accounts that praised Russia’s invasion of Ukraine and drew parallels between Zuma’s leadership and Putin’s. One such account has around 170,000 followers and regularly interacts with other users to amplify its reach – at times generating over 1 million impressions per post.

Ahead of elections in the U.S. and Europe, military aid to Ukraine has been a key topic for debate, and American officials have expressed concern that Russia will increase support for candidates opposing Ukrainian aid.

Recent reporting by the New York Times details Russia’s intensified efforts to amplify arguments for isolationism, with the ultimate aim of derailing military funding for Ukraine. While the initial arguments against additional aid may be organic, it is the amplification that is being “engineered” by Russia through the replication and distortion of legitimate news sites – a clear example of the information laundering described by Rodríguez.

Another key Russian tactic for destabilising support for Ukraine is attacking, discrediting and undermining the country’s political figures. The Washington Post recently uncovered a Kremlin disinformation campaign designed to push the theme that Zelensky “is hysterical and weak”, and to “strengthen the conflict” between Zelensky and Zaluzhny – the top military commander he dismissed in early February.

One senior European security official commenting on the campaign told The Washington Post: “Russia survived and they are preparing a new campaign which consists of three main directions: first, pressure on the front line; second, attacks on Ukrainian infrastructure; and thirdly, this destabilization campaign.”

**Fragmented societies and social media bubbles**

But as democracies around the world prepare to open their polling booths, U.S. officials have also warned that Russia may be attempting to move beyond targeting individuals, instead sowing seeds of doubt over the future of democracy itself.

A U.S. review of elections between 2020 and 2022 identified 11 contests in nine countries where Russia attempted to undermine confidence in election outcomes. More subtle campaigns – which attempted to cast doubt and amplify domestic questions about the reliability of elections – were identified in a further 17 democracies.

While content moderation by Silicon Valley companies has been strengthened in the wake of the 2016 U.S. elections, research has repeatedly raised the issue of comparatively inconsistent and weak moderation of non-English language content, leaving hundreds of millions of voters particularly vulnerable to campaigns and strategies that Russia has expertly refined.

Facebook whistleblower Frances Haugen previously warned that 87% of Facebook’s spending on combating misinformation was spent on English content, despite only 9% of users being English speakers – a disturbing finding for non-English speaking voters as they head to the polls. Meanwhile, after Elon Musk’s controversial takeover of X, disinformation and hate speech reportedly surged.

Research indicates that public trust in government, the media and democracy is waning, while conspiracy theories have flourished in recent years, particularly in the wake of the pandemic, a trend noted by Rodríguez:

“Societies are suffering a post-covid effect, we’re still extremely divided, and audiences are being held in social media bubbles. It’s very easy to disseminate false narratives and amplify them online, shaping cognitive processes and impacting public perceptions.”

Coupled with weak or understaffed content moderation from social media companies, this fragmentation provides fertile ground for influence operations to thrive, Rodríguez warns.

“The recent changes in social media platforms like Twitter favour this trend. It is a very volatile environment in an electoral year.”

3
1
submitted 4 minutes ago by BrikoX@lemmy.zip to c/globalnews@lemmy.zip

Dutch photographer and multimedia journalist Jelle Krings revisits the families that have kept a war-torn country moving, despite the Russian onslaught and in the face of great personal sacrifice

Archived version: https://archive.ph/BpveN

5
1
submitted 4 minutes ago by BrikoX@lemmy.zip to c/globalnews@lemmy.zip

University of Melbourne cancels classes as activists at Deakin defy directives on encampments

Archived version: https://archive.ph/a2eS6

6
1
submitted 6 minutes ago by Syl@jlai.lu to c/lemmediapart@jlai.lu
7
1
submitted 11 minutes ago by scrwd@mastodon.social to c/vuejs@programming.dev

I was just thinking it would be cool if at conferences the @vuejs team gave out little bottles of condiments to attendees - Vue Sauce

Who doesn't like sauce?

8
2
submitted 13 minutes ago by ahimsabjorn@lemmy.world to c/buddhism@lemmy.world


Feed Your Love, Not Your Suffering

NOTHING CAN survive without food, not even suffering. No animal or plant can survive without food. In order for our love to survive, we have to feed it. If we don't feed it, or we feed it the wrong kind of nutrients, our love will die. In a short time, our love can turn into hate. Our suffering, our depression also needs food to survive. If our depression refuses to go away, it's because we keep feeding it daily. We can look deeply into the source of nutrition that is feeding our suffering.

9
1
10
1
submitted 16 minutes ago by ZippyBot@lemmy.zip to c/gaming@lemmy.zip
11
15
submitted 18 minutes ago* (last edited 14 minutes ago) by Confidant6198@lemmy.ml to c/comics@lemmy.ml
12
1
submitted 18 minutes ago* (last edited 10 minutes ago) by BrikoX@lemmy.zip to c/globalnews@lemmy.zip

The reported friendly-fire incident in Jabalia is one of the deadliest since the war began.

Archived version: https://archive.ph/HLsNX

13
3
submitted 19 minutes ago by BrikoX@lemmy.zip to c/globalnews@lemmy.zip

Britain's food watchdog has applied extra control measures on all spice imports from India, it said on Wednesday, becoming the first to ramp up scrutiny of all Indian spices after contamination allegations against two brands sparked concerns among global food regulators.

Archived version: https://archive.ph/XRFdJ

14
2
submitted 20 minutes ago* (last edited 4 minutes ago) by Syl@jlai.lu to c/france@jlai.lu
15
4
wow.wav (youtu.be)
submitted 21 minutes ago by MentalEdge@sopuli.xyz to c/vtubervids@ani.social
16
1
submitted 26 minutes ago by Syl@jlai.lu to c/quefaitlapolice@jlai.lu
17
3
submitted 27 minutes ago by ZippyBot@lemmy.zip to c/apple@lemmy.zip
18
3
submitted 27 minutes ago by BrikoX@lemmy.zip to c/globalnews@lemmy.zip

A suspected arsonist sprays a mosque with petrol and sets it on fire over a family dispute, police say.

Archived version: https://archive.ph/6ncz0

19
14
submitted 28 minutes ago by misk@sopuli.xyz to c/europe@feddit.de
20
6
21
1
submitted 28 minutes ago by BrikoX@lemmy.zip to c/aviation@lemmy.zip

CEO will be replaced by chief financial officer Kenton Jarvis, who like Lundgren joined from Tui

22
3
submitted 29 minutes ago by Syl@jlai.lu to c/france@jlai.lu
23
2
24
1
submitted 30 minutes ago* (last edited 21 minutes ago) by khaosworks@startrek.website to c/daystrominstitute@startrek.website

The title refers to Labyrinths of the Mind, a book written by Dr Marina Derex, a Betazoid and one of the group that hid the Progenitor technology 800 years prior. A labyrinth is another term for a maze; the original Labyrinth of Greek myth was designed by the inventor Daedalus to house the Minotaur.

As mentioned in DIS: “Erigah”, L’ak was the Scion, a direct descendant of the Breen emperor, and held the genetic code of the Yod-Thot, “they who rule”, without whom his uncle, Primarch Ruhn, could not claim the throne. In DIS: “Jinaal”, Stamets discovered that the Progenitor technology could potentially bring someone back to life.

Discovery jumps to just outside the Badlands, first appearing in DS9: “The Maquis” as an area of violent plasma storms in proximity to Bajor and Cardassia.

Hy’Rell’s head bumps resemble those of the Xindi-Primates, who first appeared in ENT: “The Xindi”, one of six intelligent Xindi species native to Xindus.

Cherenkov radiation is created when charged particles exceed the speed of light in a given medium, creating a shockwave with a characteristic blue glow. In real life, it is most often seen around nuclear reactors submerged in water (the speed of light in water is about 75% of that in vacuum, allowing emissions from the reactor to exceed it).
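As a quick sanity check on that figure, the Cherenkov threshold is just the phase velocity of light in the medium, c/n, using the standard textbook refractive index of water (n ≈ 1.33):

```python
# Cherenkov threshold: a charged particle radiates when its speed
# exceeds the phase velocity of light in the medium, c / n.
C = 299_792_458.0   # speed of light in vacuum, m/s
N_WATER = 1.33      # refractive index of water (textbook value)

threshold = C / N_WATER           # minimum particle speed, m/s
fraction = threshold / C          # as a fraction of c

print(f"threshold = {fraction:.2f} c")  # ≈ 0.75 c
```

So a particle travelling faster than roughly three-quarters of the vacuum speed of light will glow blue in water, which is why submerged reactor cores are the canonical example.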

Discovery was given the ability to cloak when it was refitted (DIS: “That Hope is You, Part 2”). In the 23rd and 24th Centuries, the Treaty of Algeron forbade the Federation from using or developing cloaking devices, with a notable exception being the Defiant during the Dominion War (DS9: “The Search”). Apparently that prohibition no longer applies in the 32nd Century.

Kwejian, Book’s world, was destroyed in DIS: “Kobayashi Maru”, making him one of the last of his species.

An oubliette is a specific type of dungeon whose only access is a trapdoor in its ceiling, and which is usually so narrow that the prisoner was unable to sit down.

The Tuli tree was native to Kwejian and had a distinctive smell to its sap. The decor of Book’s ship was made to simulate Tuli wood (DIS: “Stormy Weather”). Inside the box are cuttings from the World Root, a tree root system that reached all the way around the planet (DIS: “Kobayashi Maru”) and was sacred to the Kwejian.

Culber identifies the device affecting Burnham as a nucleonic emitter. Nucleonic particles appear in a number of places in Star Trek lore, but most appropriately in TNG: “The Inner Light”, where a nucleonic beam from a Kataan probe was responsible for sending Picard into a mindscape where he lived out a simulated lifetime in a similar manner to what Burnham is experiencing. In that episode, an attempt to disrupt the beam nearly killed Picard, which is the risk Culber is alluding to.

The old school card index drawers Burnham looks at make me nostalgic for the days when I was a student librarian (yes, I’m old). The mindscape Archives’ category number for history is 002818/5; in our Dewey Decimal System, history (and geography) is 900.

Book says “Those who learn history aren’t doomed to repeat it.” The usual phrasing of that adage is “Those who do not learn from history are doomed to repeat it.” The philosopher George Santayana is credited with the original “Those who cannot remember the past are condemned to repeat it.”

Burnham refers to the itronok, a predatory species they encountered on Trill while searching for the clue there (DIS: “Jinaal”).

Trémaux’s algorithm is a maze-solving method devised by Charles Pierre Trémaux, which involves drawing lines on the floor to mark the path already taken. A version of it, called depth-first search, is used to search tree or graph data structures.
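For the curious, the depth-first idea can be sketched in a few lines. This is a minimal illustration, not the exact procedure shown on screen: the maze is a hypothetical adjacency dict, and the visited set plays the role of Trémaux’s chalk marks on the floor.

```python
def solve_maze(maze, start, goal, path=None, visited=None):
    """Return a start-to-goal path as a list of cells, or None if unreachable."""
    if path is None:
        path, visited = [], set()
    path.append(start)
    visited.add(start)                  # "draw a line on the floor"
    if start == goal:
        return path
    for neighbour in maze.get(start, []):
        if neighbour not in visited:    # never re-enter a marked passage
            result = solve_maze(maze, neighbour, goal, path, visited)
            if result is not None:
                return result
    path.pop()                          # dead end: backtrack
    return None

maze = {"A": ["B", "C"], "B": ["D"], "C": [], "D": ["E"], "E": []}
print(solve_maze(maze, "A", "E"))       # ['A', 'B', 'D', 'E']
```

Marking passages so they are never explored twice is the whole trick: it guarantees the walker eventually either reaches the goal or exhausts the maze.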

Derex’s reading list references Talaxians, Neelix’s species from VOY and Hupyrians, the species of the Ferengi Grand Nagus’ servants (DS9: “The Nagus”, et al.). Euclid was a Greek mathematician who devised an axiomatic system for geometry.

Rhys intends to use the plasma storms for cover, which is exactly what made the Badlands effective as a hiding place for the Bajoran Resistance and the Maquis back in their day.

Matching weapons to shield frequencies to get past them is a tried and true method, demonstrated most dramatically when the Enterprise-D was destroyed in ST: Generations. Duodeca is a base-20 system.

Hysperia is a planet where the inhabitants have a culture based on a medieval fantasy motif (LD: “Where Pleasant Fountains Lie”). In the 24th Century, Chief Engineer Billups of the USS Cerritos was a native of Hysperia and the ostensible Crown Prince, although he abdicated that position.

Commander Jemison shares a last name with former astronaut Mae Jemison, the first African-American woman in space, who appeared in TNG: “Second Chances” as Lt. (j.g.) Palmer.

A tergun is a sacred Breen oath. Ruhn’s remark that the Federation would risk the many to save the few is reminiscent of Kirk’s inversion in ST III of Spock’s adage from ST II about the needs of the many and the few: “The needs of the one outweighed the needs of the many.”

“Never turn your back on a Breen” is a Romulan saying (DS9: “By Inferno’s Light”), cited by Rayner in DIS: “Erigah”.

25
6
submitted 30 minutes ago by BrikoX@lemmy.zip to c/globalnews@lemmy.zip

A group of determined teenagers has taken the government to court for not doing more to protect their community from the noxious byproduct of oil extraction in the northern Amazon.

Archived version: https://archive.ph/Snoy0
