
Online misinformation expert Amil Khan voices media concerns on Politics for Drummies

By Richard Draycott, Associate Editor

June 28, 2024 | 5 min read

The CEO of Valent Projects joins podcast host Alastair Duncan in the run-up to the UK general election to talk about the dangers of media misinformation and how genAI could soon be targeting voters one by one.

Khan joins Politics for Drummies host Alastair Duncan

Amil Khan graduated in languages shortly after 9/11 and, as an Arabic and Persian speaker based in Cairo, began his career as a freelance journalist, working with the Reuters news agency as a Middle East correspondent. However, it wasn’t long before he moved away from news and into documentaries, seeing them as a more effective way to tell stories that would reach and affect a wider audience. He then became a consultant, working for political parties, non-profits and other organizations across the information space.

On his move into disinformation, he explains: “I ended up working on Syria and that’s when the phrase disinformation really became popularized. My window on that was seeing the chemical weapons attacks that happened between 2013 and 2017 and seeing alternative narratives that these communities had gassed themselves, stories that the BBC didn’t believe, that the UN didn’t believe, but were finding their way from obscure little websites to parliamentary discussions in a matter of days.

“While everyone was focusing on the fact-checking of these things, all I could see was the manipulation of people’s accounts, manipulation of platforms, the same accounts going out on multiple platforms and spamming people’s accounts. I thought, ‘I want to do more on this.’ So, really, Valent came out of those technical aspects of disinformation.”

Today, Valent Projects works with various organizations and uses its AI software tools to protect clients against disinformation and other forms of online manipulation and threats.

During the conversation, Khan voices his concerns about the volume of misinformation that now reaches mass audiences with relative ease. He lays much of the blame at the door of a mainstream media focused on using technology to push information out quickly and cheaply, putting profit before its responsibilities as the fourth estate.

He says: “It’s very much like those gatekeepers [editors] have gone because of technology and it’s a fight for eyeballs. And that fight for eyeballs has often been a race to the bottom. So, you can just say provocative stuff, racism, misogyny, polarizing stuff, and it’ll get eyeballs. And then, if you can’t do that organically, you go and get some bots, you know, or you have a company out somewhere that will help you do it authentically.”

Khan also offers his thoughts on how AI is making it much easier and faster for journalists and media outlets to create content, which brings its own challenges around presenting information fairly and accurately.

“With AI, it is becoming easier and easier to generate the content and, at the same time, we are discussing that you need to make more provocative content and there are some really interesting issues around that and some big questions around it all.”

One of the questions he ponders is how AI could potentially be used by political parties to disseminate information in future elections.

“We know that AI is about big data spotting trends, modeling and predicting, so your ability to target small audiences is huge. Add to that, as part of your toolkit, generative AI, and you can see we could easily get to the next election, or an election in another part of the world in a couple of years, where every voter is profiled individually and then targeted with very bespoke individual content.”
