Microsoft says it has developed a technique to detect online predators who use the chat function in multiplayer video games to groom children for sexual purposes.
The tech company, which makes the Xbox gaming system, announced Thursday that it's sharing the tool with nonprofit organizations and other gaming and messaging service developers.
Nicknamed “Project Artemis,” the tool automatically scans text-based conversations and rates them on the probability that a user might be trying to sexually exploit children. Human moderators are then able to review flagged conversations and determine whether to report them to law enforcement.
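Microsoft has not published the system's code, but the general pattern described is a familiar one: score each conversation with a risk model, then route anything above a threshold to a human review queue rather than acting on it automatically. The sketch below illustrates that pattern only; the risk scorer, threshold, and data structures are hypothetical stand-ins, not Project Artemis itself.

```python
# Illustrative sketch only: the classifier, threshold, and queue below are
# hypothetical stand-ins for the pattern the article describes, in which a
# conversation is scored and high-scoring ones go to human moderators.

from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Conversation:
    conversation_id: str
    messages: List[str]


def flag_for_review(
    conversations: List[Conversation],
    risk_scorer: Callable[[str], float],  # hypothetical model returning 0.0-1.0
    threshold: float = 0.9,               # hypothetical cutoff chosen by moderators
) -> List[Conversation]:
    """Return conversations whose combined text scores above the threshold."""
    flagged = []
    for convo in conversations:
        score = risk_scorer(" ".join(convo.messages))
        if score >= threshold:
            # Queued for human review; nothing is reported automatically.
            flagged.append(convo)
    return flagged
```

The key design point the article emphasizes is the human in the loop: the tool only surfaces candidate conversations, and moderators make the final call on whether to involve law enforcement.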
An engineering team led by Dartmouth College digital forensics expert Hany Farid developed the technique. Microsoft worked with Farid and the makers of messaging services like Kik and the popular game Roblox. It will be distributed for free starting Friday through the anti-trafficking group Thorn.