Broadcast United

Children are more likely to be victims and perpetrators of violence

Broadcast United News Desk


According to Europol, a growing number of pornographic images and videos of children and young people are circulating on the internet, and artificial intelligence is often involved in producing them. This poses a problem for investigators.

An increasing number of online child abuse images are created or altered with the help of artificial intelligence.


Basak Gubz Dema/Getty

In a small town in Spain, nude photos of young girls circulated on social networks and WhatsApp. The girls were horrified and scared, because they had never posted nude photos of themselves: the circulated images had been generated by artificial intelligence. The perpetrators were classmates aged between 13 and 15.

This September 2023 case illustrates two trends that Europe's police agency Europol emphasizes in its latest cybercrime report: more and more minors are becoming victims of online sexual abuse or blackmail, and at the same time, more and more minors are becoming perpetrators of such crimes.

Artificial intelligence makes crime easier

Both trends are linked to artificial intelligence, because it makes creating pornographic material virtually trivial. The production of material depicting child sexual abuse is widely accessible and does not require a high level of technical knowledge, Europol writes in its latest report. As a result, the number and range of possible perpetrators are increasing.

Abusive material can be real photos altered with AI, as in the case of the Spanish girls, whose innocuous profile photos were turned into realistic-looking nude images using an app.

Child sexual abuse material on the internet is also increasingly created entirely by artificial intelligence. In such cases the image does not depict a real child, but these AI-generated images are ultimately based on photos of real victims used to train the model, and they normalize the sexualization of children.

Sex offenders are also increasingly using artificial intelligence to obtain genuine nude photos. Chatbots help them talk to victims online in almost any language and persuade them to take and send explicit photos.

According to the Europol report, a large portion of the material depicting the sexual abuse of teenagers is self-created. Once the perpetrators obtain explicit images or videos of the victims, they often use them to threaten or blackmail the victims.

Perpetrators are increasingly difficult to identify

All of this may be just the beginning. The police agency expects the use of AI in cybercrime to expand significantly in the near future.

Not only is AI becoming more powerful and easier to use, but according to Europol, more and more AI models specialized for illegal activities are also being distributed via the dark web. Unlike publicly available models such as ChatGPT or Gemini, these models have no safeguards against the creation of abusive material. Some are even further trained to render such material more realistically.

On the one hand, AI is increasing the volume of illegal images and videos of child abuse on the internet, making it harder for authorities to investigate individual cases. On the other hand, images created or modified with AI are difficult or impossible to distinguish from real photos. Identifying real victims and perpetrators is becoming increasingly complex, Europol writes.

Suitable tools are therefore needed to identify which audio, image, and video content, or parts of it, are deepfakes; the report does not specify what such tools would look like. There must also be a greater focus on prevention work with perpetrators, given that the perpetrators are often themselves minors.

In the future, more resources will be needed to prevent children from becoming victims or perpetrators of abuse.

