Combating the reproduction and recirculation of extremist memes has been an escalating feature of the Trump era.
The same syncretic groups that Trump shielded during the Charlottesville crisis have taken his tacit “both sides” support as a signal to act. As with #TrumpRussia, collusion carries a certain ambiguity when ideological allies act in concert.
Domestic terrorists now seem to revel more openly in using social media to organize and celebrate killing. This is a moment when prosecutors’ examination of collusion and coordination reveals the degree of cooperation. Terror cell organization is not random, whether the media networks involved are formal or informal. ProPublica shows it can be mapped.
What seemed to be isolated incidents are now connected to a network of Nazi cells.
Some tech companies have been taking action to combat such online activity.
The continuing problem in the public sphere is distinguishing between actionable and non-actionable messages.
A further problem is that disinformation framed as extreme opinion tends to circulate faster, sometimes unintentionally and sometimes by design. Yes, framing works, and extreme views are not identical to extreme actions.
At first glance, five killings in three states since last May appeared to be unrelated, isolated cases.
But a common thread is emerging. Three young men have been charged, and all appear to have links to the same white supremacist group: the Atomwaffen Division.
Atomwaffen is German for "atomic weapons," and the group is extreme. It celebrates Adolf Hitler and Charles Manson, its online images are filled with swastikas, and it promotes violence.
The action comes after ProPublica reports detailing the organization’s terrorist ambitions and revealing that the California man charged with murdering Blaze Bernstein, a 19-year-old college student found buried in an Orange County park earlier this year, was an Atomwaffen member.
Activists and journalists with other media outlets have criticized the tech firms — among them chat services, web merchants, social media channels and gaming platforms — for enabling the outfit, which has members in 23 states and Canada, records show.
Meanwhile, a researcher has found that YouTube's algorithm has inadvertently created a network of interrelated conspiracy videos, sending unwitting viewers down a rabbit hole of fake news.
Jonathan Albright, the research director for the Tow Center for Digital Journalism at Columbia University, pulled the "next up" recommendations for several hundred videos that he found using the search term "crisis actors" and then mapped the 9,000 results. Here's one of his snapshots of the network.
As you can see, YouTube's algorithm will eventually expose you to conspiracy theories about the Illuminati, Pizzagate and Hollywood pedophilia rings once you start watching videos about so-called crisis actors. Albright writes,
Every time there's a mass shooting or terror event, due to the subsequent backlash, this YouTube conspiracy genre grows in size and economic value. The search and recommendation algorithms will naturally ensure these videos are connected and thus have more reach.
In other words, due to the increasing depth of the content offerings and ongoing optimization of YouTube's algorithms, it's getting harder to counter these types of campaigns with real, factual information.
[Jonathan Albright via Medium]
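The crawl-and-map approach Albright describes — start from seed search results, follow each video's "up next" recommendations, and build a network from the links — can be sketched as a breadth-first crawl. The snippet below is a minimal illustration, not his actual pipeline: the video IDs and the `fetch_recommendations` stub are hypothetical stand-ins for a real scraper or API call.

```python
from collections import deque

# Hypothetical stand-in for a real scraper/API call; in a study like
# Albright's this would return the "up next" videos served for video_id.
SAMPLE_RECS = {
    "seed1": ["conspA", "conspB"],
    "seed2": ["conspB", "conspC"],
    "conspA": ["conspC", "conspD"],
    "conspB": ["conspD"],
    "conspC": ["conspD"],
    "conspD": [],
}

def fetch_recommendations(video_id):
    return SAMPLE_RECS.get(video_id, [])

def crawl_recommendation_graph(seeds, max_depth=2):
    """Breadth-first crawl of 'up next' links; returns the edge list."""
    edges, seen = [], set(seeds)
    queue = deque((s, 0) for s in seeds)
    while queue:
        vid, depth = queue.popleft()
        if depth >= max_depth:
            continue
        for rec in fetch_recommendations(vid):
            edges.append((vid, rec))
            if rec not in seen:
                seen.add(rec)
                queue.append((rec, depth + 1))
    return edges

edges = crawl_recommendation_graph(["seed1", "seed2"])

# Videos recommended from many different starting points accumulate
# in-links and sit at the center of the network -- the "rabbit hole"
# effect visible in Albright's snapshots.
in_degree = {}
for _, dst in edges:
    in_degree[dst] = in_degree.get(dst, 0) + 1
hubs = sorted(in_degree, key=in_degree.get, reverse=True)
```

In this toy data, `conspD` is reachable from every branch of the crawl, so it ends up with the highest in-degree — the graph equivalent of the recommendation engine funneling disparate viewers toward the same conspiracy content.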
Social Media Algorithms Still Can't Tell The Difference Between Genuine Shares And Hate-Shares
Another researcher, from New Media Frontier, looked at the spread of the Hoggs' "crisis actor" conspiracy theory on Twitter and found that it was at least partially amplified by people who were horrified by the theory. It is just the latest example of how social media algorithms are unable to distinguish between people sharing content because they like it and people sharing content because they hate it.
People outraged by the conspiracy helped to promote it — in some cases far more than the supporters of the story. And algorithms — apparently absent the necessary "sentiment sensitivity" that is needed to tell the context of a piece of content and assess whether it is being shared positively or negatively — see all that noise the same.
This unintended amplification created by outrage-sharing may have helped put the conspiracy in front of more unsuspecting people.
[Wired]
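The missing "sentiment sensitivity" described above can be illustrated with a toy comparison: an engagement counter that treats every share as amplification, versus one that discounts shares whose commentary condemns the content. The keyword heuristic below is a deliberately crude, hypothetical stand-in for a real stance-detection model, and the sample share texts are invented.

```python
# Crude, hypothetical stance heuristic: a share whose commentary contains
# condemning language is treated as a "hate-share" rather than an endorsement.
CONDEMNING = {"disgusting", "horrified", "debunked", "shameful", "lie"}

def is_hate_share(comment):
    words = {w.strip(".,!?") for w in comment.lower().split()}
    return bool(words & CONDEMNING)

shares = [
    "finally the truth about crisis actors",    # endorsing share
    "horrified that this lie is spreading",     # condemning share
    "this has been debunked, stop sharing it",  # condemning share
    "everyone needs to see this",               # endorsing share
]

# What engagement-driven ranking sees: every share counts the same.
raw_score = len(shares)

# A stance-aware score: condemning shares subtract instead of add.
stance_aware = sum(-1 if is_hate_share(s) else 1 for s in shares)
```

Here `raw_score` is 4 while `stance_aware` nets out to 0: half the "engagement" is outrage. An algorithm reading only the raw count boosts the conspiracy on the strength of the very people trying to refute it — the outrage-sharing dynamic the Wired piece describes. In practice stance detection is far harder than keyword matching, which is part of why platforms still get this wrong.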