Associate Professor Marian-Andrei Rizoiu uses algorithms to combat misinformation. His research involves predicting the popularity of content, understanding the spread of misinformation and developing tools to counter it.
Stopping the spread of misinformation

Dr Marian-Andrei Rizoiu
In the Research Café, Marian-Andrei recently shared how he collaborates with social scientists, political scientists and digital communication experts.
“I'm calling this area of research into online digital communication environments a brand new world of opportunities,” he explained, beginning his presentation with a story about how Russian-controlled bots were able to influence the outcome of the US presidential election.
“These bots were implanted in the digital information environment to sway public opinion,” Marian-Andrei said.
Marian-Andrei warned that if you take a content-based approach to detecting such accounts, you quickly run into a ‘Red Queen’ effect.
“In the story Alice in Wonderland, the Red Queen tells Alice: ‘Look, now all you need to do is run as fast as you can just to stay in the same place. If you want to get anywhere, you need to run twice as fast!’”
Marian-Andrei likened the fight against disinformation and misinformation to an arms race.
“When we're trying to detect these individuals, the moment we build a detector, they will change their approaches. The moment we make a little gain, they will adapt to it. It's an arms race that we can't really win,” he said.
Rather than looking at what trolls say, Marian-Andrei looks at how people react to them.
“Instead of looking at what they say, we can look at what they do, and what they stir. Because if the purpose of the Russian trolls was to instil uncertainty about democratic institutions, and to put fear into the population, that population is going to react in a particular way. And this is something that we can study and detect.”
The 2025 Eureka Prizes Finalist is working to protect Australia’s interests through new models that detect digital threats.
Replacing current effort-intensive processes for identifying malevolent agents, his mathematical models analyse the responses and reactions that harmful content provokes. They also model how consumption of misinformation can lead to violent extremist acts – insight that is vital for national security agencies.
“We’ve built massive datasets. We’ve learned the patterns in which typical users react to content, and then we focused on how users react to content that is injected into the information environment. This enables us to understand the identity of trolls,” he said, adding that consuming misinformation not only leads to radicalisation, but can also lead to violent acts.
“That’s why understanding where people are on this pathway is really important, and it can help us to understand which interventions work best.”
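To give a concrete (and deliberately simplified) picture of this reaction-based approach, the sketch below is a minimal Python illustration, not the team’s actual models: the reaction features (reply volume, how quickly replies arrive, the share of angry or fearful replies) and the synthetic data are hypothetical assumptions, but they show how an account can be flagged from what it stirs rather than from what it says.

```python
# Minimal sketch of reaction-based account classification (illustrative only).
# Each account is described by features of the reactions it provokes, not by
# the text it posts; a standard classifier then separates troll-like accounts.
# Feature choices and the simulated data below are hypothetical assumptions.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)

def simulate_accounts(n, troll_like):
    """Generate hypothetical reaction features for n accounts."""
    if troll_like:
        replies = rng.poisson(12, n)      # provocative content stirs more replies
        latency = rng.exponential(5, n)   # reactions arrive quickly (minutes)
        arousal = rng.beta(6, 2, n)       # replies skew angry or fearful
    else:
        replies = rng.poisson(4, n)
        latency = rng.exponential(30, n)
        arousal = rng.beta(2, 6, n)
    return np.column_stack([replies, latency, arousal])

# Build a labelled dataset: 0 = regular account, 1 = troll-like account.
X = np.vstack([simulate_accounts(500, False), simulate_accounts(500, True)])
y = np.array([0] * 500 + [1] * 500)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test),
                            target_names=["regular", "troll-like"]))
```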
Marian-Andrei and his team are now working with organisations like Facebook and the Defence Innovation Network.
"In order to make a real dent, I quickly realised I needed to partner up with social scientists, political scientists, digital communication experts, journalists and so on."
Marian-Andrei’s top tips for impact
- Identify and collaborate with experts from various fields to enhance the impact and scope of your research.
- Work closely with your partners to develop and implement tools and prototypes that can be used in real-world scenarios.
- Create methods and tools that reduce the workload on your partners, making collaboration more efficient and sustainable.
- Consider the context in which your research will be applied, including the type of content, the actors involved, and the platforms used.
- Carefully evaluate the potential impact of your interventions and choose the best approach to avoid unintended consequences.
Marian-Andrei says that by following these tips, other UTS researchers can collaborate effectively with external partners and enhance the impact and reach of their research.