The physics professor who says online extremists behave like curdled milk

Lone wolves. Terror cells. Bad apples. Viral infections.

The language we use to talk about violent extremism is rich in metaphors from nature. To understand why some people behave so inhumanely, we reach for comparisons from biology, ecology and medicine.

But what if we have been borrowing from the wrong scientific discipline? What if the spread of hatred is less like a cancer metastasizing through the body politic and more like bubbles forming in a pot of boiling water?

Such is the claim made by Neil Johnson, a professor of physics at George Washington University and lead author of a study published in Nature this week analyzing the spread of online hate. If that sounds like a strange topic for a physicist, it is. Johnson began his career at Oxford University, where he published extensively on quantum information and “complexity theory”. After moving to the United States in 2007, he embarked on a new line of research applying theories from physics to complex human behavior, from financial markets and conflict zones to rioting and terrorist recruitment.

Johnson’s unusual approach has led to some surprising conclusions – he says the world’s online hate is organized into roughly 1,000 online “clusters” – as well as some counterintuitive policy proposals. On Wednesday, he spoke to the Guardian about his findings.

The interview has been edited and condensed for length and clarity.

How did you get from physics to studying these social problems of violent extremism and online hatred?

Most people think that physics is about breaking things down into smaller and smaller pieces, but there is actually a whole branch of physics that goes the other way and looks at what happens when you put things together. When I put water molecules together, all of a sudden I get a liquid, and ice forms, and icebergs, and the Titanic sinks. There are all sorts of consequences to putting objects, good and bad, together.

We tend to want to blame individual objects, but in the world of physics you would never do that. There is no such thing as a bad molecule that can make water boil. It’s a collective effect. And so we wondered if many of the social problems we face are actually better viewed through this lens.


[For this study] we were just naively asking: what does the online world of hate look like? So we set out to find out, and what we found was an incredible global network of hatred.

I study networks in biological systems and in economic systems. This is the most complicated network I have ever studied – ten times more complicated – because it crosses geographies, continents, languages, cultures and online platforms. Trying to police it on a single platform is a bit like believing that weeding your own yard will get rid of the weeds in the whole neighborhood.

You describe hatred in terms of chemical bonds and “gelation theory”. How did you develop this framework?

These are not analogies. We looked at the behavior of the data and the numbers and found that it resembled [chemical bonding] not just in how the numbers change, but microscopically, in terms of the interactions.

If you have milk slowly going off in the refrigerator, one day that milk suddenly curdles. That’s because, microscopically, you get this aggregation of objects into communities. And the math for that works perfectly for people aggregating into communities. The typical reaction is, “Oh, but I’m an individual, I don’t act like a milk molecule.” Yes, but we do this together because we are constrained by each other. There are only a certain number of things we can actually do, and we tend to do them over and over again.
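To see the kind of mathematics he is gesturing at, here is a minimal, illustrative sketch in Python. It is not the study’s actual model, just a standard random-coalescence toy in which people pair up at random and their clusters merge; the sudden appearance of one giant cluster is the “curdling” moment.

```python
import random

def curdling_demo(n_people=10_000, seed=0):
    """Toy coalescence model (illustrative only, not Johnson's model):
    repeatedly pick two people at random and merge their clusters,
    then watch how the largest cluster grows."""
    random.seed(seed)
    parent = list(range(n_people))           # union-find forest
    size = [1] * n_people
    largest = 1

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]    # path compression
            x = parent[x]
        return x

    for step in range(1, n_people + 1):
        a = find(random.randrange(n_people))
        b = find(random.randrange(n_people))
        if a != b:                           # two distinct clusters meet: merge them
            parent[b] = a
            size[a] += size[b]
            largest = max(largest, size[a])
        if step % (n_people // 10) == 0:
            print(f"{step:6d} pairings: largest cluster holds "
                  f"{largest / n_people:.1%} of everyone")

curdling_demo()
```

Run it and the largest cluster typically stays below a few percent for roughly the first half of the pairings, then jumps to a majority of the population over a fairly narrow window: an abrupt, collective transition of the kind he is comparing to milk curdling.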

Neil Johnson, professor of physics at George Washington University. Photo: William Atkins/GWU

So it’s not an analogy. People say [online hate] is like cancer, it’s like a virus, it’s like this, it’s like that – no. It is gelation, which is another way of saying bubble formation.

How did you create your map?

We started with a number of clusters that were already banned on Facebook, such as the KKK. We looked at which other clusters they also connect to and continued down that chain.
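In graph terms, the procedure he describes is a snowball crawl: start from a handful of known banned clusters and keep following cluster-to-cluster links. A hedged sketch of that idea is below; the data structures and the `get_linked_clusters` fetcher are hypothetical stand-ins, not the study’s actual tooling, which spanned multiple platforms and languages.

```python
from collections import deque

def snowball_map(seed_clusters, get_linked_clusters, max_clusters=2_000):
    """Breadth-first 'snowball' crawl over cluster-to-cluster links.

    seed_clusters: IDs of clusters already known (e.g. ones banned on Facebook).
    get_linked_clusters: hypothetical fetcher returning the IDs of clusters
        that a given cluster links out to.
    Returns the set of discovered clusters and the links between them.
    """
    seen = set(seed_clusters)
    queue = deque(seed_clusters)
    edges = []

    while queue and len(seen) < max_clusters:
        cluster = queue.popleft()
        for neighbour in get_linked_clusters(cluster):
            edges.append((cluster, neighbour))
            if neighbour not in seen:          # a cluster we have not mapped yet
                seen.add(neighbour)
                queue.append(neighbour)

    return seen, edges
```

The crawl ends when it stops turning up new clusters or hits a size cap.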

We found that there is a closed network of around 1,000 clusters online worldwide, across all platforms, spreading hate of every kind. If there are about 1,000 people in each of them (in reality it’s anywhere from 10 up to maybe 100,000, so let’s just say 1,000 on average), you have 1,000 clusters of 1,000 people – that’s a million people. And that’s our very, very rough first estimate of the number of people involved in this online.
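The back-of-envelope arithmetic, and how sensitive it is to the assumed average cluster size, is simple to restate (these figures just replay his own rough numbers):

```python
n_clusters = 1_000                          # roughly 1,000 hate clusters mapped
for avg_members in (10, 1_000, 100_000):    # his stated range of cluster sizes
    total = n_clusters * avg_members
    print(f"if clusters average {avg_members:,} members, "
          f"about {total:,} people are involved")
# 10 -> 10,000; 1,000 -> 1,000,000; 100,000 -> 100,000,000
```

Which is why he calls the one million figure a very rough first estimate: the answer shifts by orders of magnitude with the assumed cluster size.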

That’s an amazingly manageable number – 1,000 networks.

Not if you’re trying to find them among seven billion people. But they have already done the job for you. They have already gathered themselves into communities.

Do you have a list of those 1,000 groups?

We built that. I expected the process would never end. But we got to the stage where we thought, gosh, we have more or less mapped the universe of online hate. Now we can understand how things are connected and what they look like.

What was the range of ideology in these hate clusters? Is it mainly anti-Semitism, racism, white nationalism?

We thought we were going to see a discrete, well-defined set of boxes. But just as people haven’t been able to sort [mass] shooters into well-defined boxes, we did not find well-defined boxes online either. Instead of being like a menu of flavors, it’s actually a continuous spectrum. And it’s not a one-dimensional spectrum – it’s multidimensional.

In the study you describe the ways hate groups show resilience when they come under threat. Of particular concern was this warning: “Monitoring within a single platform (like Facebook) can make the situation worse and ultimately lead to global ‘dark pools’ where online hatred will thrive.” Can you describe how that works? What do you mean by a “dark pool”? Is that like 8chan?

No, it’s worse. 8chan is a bit of an isolated island unto itself. I am talking about the dark pools that are forming within the major commercial platforms that we “trust”.


When the KKK was banned from Facebook, they looked for another platform, and suddenly there was this welcoming committee on [the Russian social network] VK. It was like orientation week in college, [with people saying] “We’ll hold your hand, bring you into the community and help you find what you’re looking for here.” They are now in a close-knit group, like a freshman orientation group, where they can quickly develop bonds and discuss what got them banned and how to avoid it when they go back.

One of those adaptations was that they then slipped back onto Facebook with a simple change of name: writing [KKK] in Cyrillic, the Russian alphabet. It looks almost identical to a human reader, yet to a machine-learning algorithm that knows nothing about Cyrillic it is a different name entirely. It is clever.
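The evasion he describes is easy to demonstrate: to a person the two strings look alike, but to a naive keyword filter they share nothing. A minimal illustration in Python (not any platform’s actual moderation code):

```python
import unicodedata

latin = "KKK"      # three LATIN CAPITAL LETTER K (U+004B)
cyrillic = "ККК"   # three CYRILLIC CAPITAL LETTER KA (U+041A), visually near-identical

print(latin == cyrillic)                        # False: a plain string match misses it
print([hex(ord(c)) for c in cyrillic])          # ['0x41a', '0x41a', '0x41a']
print(unicodedata.name(cyrillic[0]))            # CYRILLIC CAPITAL LETTER KA
print(unicodedata.normalize("NFKC", cyrillic) == latin)   # still False:
# standard Unicode normalization does not map one script onto another
```

Catching this kind of homoglyph substitution takes an explicit confusables mapping or script-aware checks, which is why a filter trained only on Latin spellings can let the renamed group straight back in.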

How did you come up with your four policy proposals?

If I want to stop water boiling, I don’t have to stop individual molecules from jumping into the steam; I have to stop bubbles from forming. We know that the big bubbles form from smaller ones. And today’s big bubble becomes old news for the next generation.

[The first proposal] is to go after the smaller bubbles. Smaller bubbles are weaker, they have less money and less powerful people, and they are what the big ones grow out of. So when you eliminate the small ones – and we have shown this mathematically – the ecology quickly shrinks. It interrupts the supply.

People gather in Chicago to protest against the far right after the deadly rally in Charlottesville, Virginia. Photo: Scott Olson/Getty Images

Number two is that instead of banning individuals, because of how networked this whole system is, we have shown that you actually only need to remove about 10% of the accounts to make a huge difference to the cohesiveness of the network. If you remove 10% of the members at random, worldwide, this thing falls apart.
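The 10% figure comes from the study’s own modelling, but the underlying idea (that removing a modest random fraction of accounts can fragment a loosely connected network) is straightforward to test on any edge list. A hedged sketch of such a test, illustrative only:

```python
import random
from collections import defaultdict, deque

def largest_component(nodes, edges):
    """Node count of the largest connected component among `nodes`,
    using only edges whose endpoints both survive."""
    node_set = set(nodes)
    adj = defaultdict(set)
    for a, b in edges:
        if a in node_set and b in node_set:
            adj[a].add(b)
            adj[b].add(a)
    seen, best = set(), 0
    for start in node_set:
        if start in seen:
            continue
        seen.add(start)
        queue, count = deque([start]), 0
        while queue:
            x = queue.popleft()
            count += 1
            for y in adj[x]:
                if y not in seen:
                    seen.add(y)
                    queue.append(y)
        best = max(best, count)
    return best

def random_ban_test(nodes, edges, fraction=0.10, seed=0):
    """Ban a random `fraction` of accounts and report the giant component
    before and after, both as fractions of the original population."""
    random.seed(seed)
    nodes = list(nodes)
    survivors = random.sample(nodes, k=int(len(nodes) * (1 - fraction)))
    before = largest_component(nodes, edges) / len(nodes)
    after = largest_component(survivors, edges) / len(nodes)
    return before, after
```

Fed with the edge list from the crawl sketch above, something like this would show how much of the mapped network stays connected once a random 10% of accounts are removed; whether 10% really is the tipping point depends entirely on the network’s structure, which is what the paper’s mathematics works out.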

That’s a really interesting idea. It also seems to cut a bit against our general understanding of fairness to say that only 10% of the people who engage in a certain bad behavior should be punished.

Arguably, everyone involved has violated the platform’s terms of use, so everyone could have their accounts removed. Facebook is trying to remove them all anyway. Our approach simply says: don’t go looking for the most important people first.

Your third proposal involves deploying “anti-hate clusters” to engage with hate groups. How would that work?

You get [the hate clusters] into a battle, basically, and they think it’s some kind of supreme battle. It slows down their recruiting; it keeps them busy with something that isn’t really that important.

So you’re saying it’s worth your time battling trolls online?

Right. But do it as a group, do it as a cluster. Don’t do it one by one. That will break you.

Are there any examples you’ve observed where it has been effective?

I have not. As I said, this comes from thinking: how can I break up a bubble? Well, there is no such thing as an anti-bubble. But in physics there is the idea of bringing two [opposites] together so that they neutralize each other. They form a tightly bound pair, the plus and the minus.

[The fourth proposal] is my favorite because it really exploits the weakness that comes from the multidimensional flavors of hatred. There are two neo-Nazi groups, both in the UK, that supposedly want the same thing. But they don’t – one wants a unified Europe, the other wants to break everything apart and wipe out the rest of the countries. So imagine introducing a cluster that highlights those differences. I don’t believe [the introduced cluster] would necessarily include members of the public, unless they were trained in some way. But it would certainly include people who are good at psychology and social psychology and know what kind of story might actually gain traction.

This strategy sounds like a tactic used by the FBI under Cointelpro when it tried to pit different sectors of the civil rights movement against each other and exploit potential ideological divisions to create a splinter effect.

I didn’t know that [history], and I certainly wouldn’t want it used in a bad way. But I see it as a way of tying up the individuals in these hate clusters. In the end, they’ll just get fed up. It’s not that the hate goes away. It’s just that now they hate the other group more than they hate the Jews. It shifts the focus.
