Your search:
131 result(s) in 0.04 s
DUKAS_186893109_NUR
Videomapping Luminous Memory Mexico-Tenochtitlan 700 Years In Mexico City
A panoramic view of different colossal works is seen during the video mapping show Memoria Luminosa Mexico-Tenochtitlan 700 años in the Zocalo in Mexico City, Mexico, on July 11, 2025. The show runs until July 27 and lasts one hour, with projections in the afternoon and evening so that the videos and the reflections of the images can be better appreciated. (Photo by Jose Luis Torales/NurPhoto) -
DUKAS_186893108_NUR
Videomapping Luminous Memory Mexico-Tenochtitlan 700 Years In Mexico City
A replica of the Sun Stone is seen during the video mapping show Memoria Luminosa Mexico-Tenochtitlan 700 años in the Zocalo of Mexico City, Mexico, on July 11, 2025. The show runs until July 27 and lasts one hour, with projections in the evening so that the videos and the reflections of the images can be better appreciated. (Photo by Jose Luis Torales/NurPhoto) -
DUKAS_186893107_NUR
Videomapping Luminous Memory Mexico-Tenochtitlan 700 Years In Mexico City
A panoramic view of different colossal works is seen during the video mapping show Memoria Luminosa Mexico-Tenochtitlan 700 años in the Zocalo in Mexico City, Mexico, on July 11, 2025. The show runs until July 27 and lasts one hour, with projections in the afternoon and evening so that the videos and the reflections of the images can be better appreciated. (Photo by Jose Luis Torales/NurPhoto) -
DUKAS_186893106_NUR
Videomapping Luminous Memory Mexico-Tenochtitlan 700 Years In Mexico City
A replica of Coatlicue is seen during the video mapping show Memoria Luminosa Mexico-Tenochtitlan 700 años in the Zocalo of Mexico City, Mexico, on July 11, 2025. The show runs until July 27 and lasts one hour, with projections in the evening so that the videos and the reflections of the images can be better appreciated. (Photo by Jose Luis Torales/NurPhoto) -
DUKAS_186893093_NUR
Videomapping Luminous Memory Mexico-Tenochtitlan 700 Years In Mexico City
A replica of Coatlicue is seen during the video mapping show Memoria Luminosa Mexico-Tenochtitlan 700 años in the Zocalo of Mexico City, Mexico, on July 11, 2025. The show runs until July 27 and lasts one hour, with projections in the evening so that the videos and the reflections of the images can be better appreciated. (Photo by Jose Luis Torales/NurPhoto) -
DUKAS_186893091_NUR
Videomapping Luminous Memory Mexico-Tenochtitlan 700 Years In Mexico City
A replica of Coatlicue is seen during the video mapping show Memoria Luminosa Mexico-Tenochtitlan 700 años in the Zocalo of Mexico City, Mexico, on July 11, 2025. The show runs until July 27 and lasts one hour, with projections in the evening so that the videos and the reflections of the images can be better appreciated. (Photo by Jose Luis Torales/NurPhoto) -
DUKAS_186893080_NUR
Videomapping Luminous Memory Mexico-Tenochtitlan 700 Years In Mexico City
A replica of Coatlicue is seen during the video mapping show Memoria Luminosa Mexico-Tenochtitlan 700 años in the Zocalo of Mexico City, Mexico, on July 11, 2025. The show runs until July 27 and lasts one hour, with projections in the evening so that the videos and the reflections of the images can be better appreciated. (Photo by Jose Luis Torales/NurPhoto) -
DUKAS_186893059_NUR
Videomapping Luminous Memory Mexico-Tenochtitlan 700 Years In Mexico City
A replica of the Teocalli is seen during the video mapping show Memoria Luminosa Mexico-Tenochtitlan 700 años in the Zocalo of Mexico City, Mexico, on July 11, 2025. The show runs until July 27 and lasts one hour, with projections in the afternoon and evening so that the videos and the reflections of the images can be better appreciated. (Photo by Jose Luis Torales/NurPhoto) -
DUKAS_186893047_NUR
Videomapping Luminous Memory Mexico-Tenochtitlan 700 Years In Mexico City
A replica of Coatlicue is seen during the video mapping show Memoria Luminosa Mexico-Tenochtitlan 700 años in the Zocalo of Mexico City, Mexico, on July 11, 2025. The show runs until July 27 and lasts one hour, with projections in the evening so that the videos and the reflections of the images can be better appreciated. (Photo by Jose Luis Torales/NurPhoto) -
DUKAS_186893046_NUR
Videomapping Luminous Memory Mexico-Tenochtitlan 700 Years In Mexico City
A panoramic view of the Metropolitan Cathedral and a replica of Coatlicue is seen before the video mapping show Memoria Luminosa Mexico-Tenochtitlan 700 años in the Zocalo of Mexico City, Mexico, on July 11, 2025. The show runs until July 27 and lasts one hour, with projections in the afternoon and evening so that the videos and the reflections of the images can be better appreciated. (Photo by Jose Luis Torales/NurPhoto) -
DUKAS_186893036_NUR
Videomapping Luminous Memory Mexico-Tenochtitlan 700 Years In Mexico City
A panoramic view of the Metropolitan Cathedral and a replica of Coatlicue is seen before the video mapping show Memoria Luminosa Mexico-Tenochtitlan 700 años in the Zocalo of Mexico City, Mexico, on July 11, 2025. The show runs until July 27 and lasts one hour, with projections in the afternoon and evening so that the videos and the reflections of the images can be better appreciated. (Photo by Jose Luis Torales/NurPhoto) -
DUKAS_186893150_NUR
Videomapping Luminous Memory Mexico-Tenochtitlan 700 Years In Mexico City
A replica of a colossal work is seen during the video mapping show Memoria Luminosa Mexico-Tenochtitlan 700 años in the Zocalo of Mexico City, Mexico, on July 11, 2025. The show runs until July 27 and lasts one hour, with projections in the afternoon and evening so that the videos and the reflections of the images can be better appreciated. (Photo by Jose Luis Torales/NurPhoto) -
DUKAS_186893146_NUR
Videomapping Luminous Memory Mexico-Tenochtitlan 700 Years In Mexico City
A replica of a colossal work is seen during the video mapping show Memoria Luminosa Mexico-Tenochtitlan 700 años in the Zocalo of Mexico City, Mexico, on July 11, 2025. The show runs until July 27 and lasts one hour, with projections in the afternoon and evening so that the videos and the reflections of the images can be better appreciated. (Photo by Jose Luis Torales/NurPhoto) -
DUKAS_186893126_NUR
Videomapping Luminous Memory Mexico-Tenochtitlan 700 Years In Mexico City
A replica of Coatlicue is seen during the video mapping show Memoria Luminosa Mexico-Tenochtitlan 700 años in the Zocalo of Mexico City, Mexico, on July 11, 2025. The show runs until July 27 and lasts one hour, with projections in the evening so that the videos and the reflections of the images can be better appreciated. (Photo by Jose Luis Torales/NurPhoto) -
DUKAS_186893125_NUR
Videomapping Luminous Memory Mexico-Tenochtitlan 700 Years In Mexico City
A replica of Coatlicue is seen during the video mapping show Memoria Luminosa Mexico-Tenochtitlan 700 años in the Zocalo of Mexico City, Mexico, on July 11, 2025. The show runs until July 27 and lasts one hour, with projections in the evening so that the videos and the reflections of the images can be better appreciated. (Photo by Jose Luis Torales/NurPhoto) -
DUKAS_185249025_NUR
Daily Life In Kerman, Iran
Iranian ecotourism activists capture videos as young Iranian musicians play traditional musical instruments at the Shafiabad Caravanserai, 118 km (73 miles) east of the city of Kerman, 1335 km (830 miles) southeast of Tehran, Iran, on May 22, 2025. (Photo by Morteza Nikoubazl/NurPhoto) -
DUKAS_184247226_NUR
Cherry Blossoms In Toronto, Canada
People take photos and videos of cherry blossom trees at Trinity Bellwoods Park in Toronto, Ontario, Canada, on May 3, 2025. (Photo by Arrush Chopra/NurPhoto) -
DUKAS_184247223_NUR
Cherry Blossoms In Toronto, Canada
People take photos and videos under a cherry blossom tree at Trinity Bellwoods Park in Toronto, Ontario, Canada, on May 3, 2025. (Photo by Arrush Chopra/NurPhoto) -
DUKAS_160673926_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673963_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673928_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673958_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673973_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673974_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673962_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673932_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
Pictured: Dan Sexton, Chief Technical Officer.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673975_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
Pictured: Dan Sexton, Chief Technical Officer.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673980_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
Pictured: Dan Sexton, Chief Technical Officer.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673967_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
Pictured: Dan Sexton, Chief Technical Officer.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673936_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
Pictured: Dan Sexton, Chief Technical Officer.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673925_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
Pictured: Chris Hughes, Hotline Director.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673965_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
Pictured: Chris Hughes, Hotline Director.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673960_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
Pictured: Chris Hughes, Hotline Director.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673970_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
Pictured: Chris Hughes, Hotline Director.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673982_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
Pictured: Chris Hughes, Hotline Director.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673983_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
Pictured: Chris Hughes, Hotline Director.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673971_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
Inside The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
Pictured: Martin Hines works at his desk at the IWF.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673931_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
Inside The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
Pictured: Martin Hines works at his desk at the IWF.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673981_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
Inside The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
Pictured: Martin Hines works at his desk at the IWF.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673933_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
Inside The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
Pictured: Martin Hines works at his desk at the IWF.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673972_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
Inside The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
Pictured: Martin Hines works at his desk at the IWF.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673929_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
Inside The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
Pictured: Martin Hines works at his desk at the IWF.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673924_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
Inside The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
Pictured: Martin Hines works at his desk at the IWF.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673968_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
Inside The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
Pictured: Martin Hines works at his desk at the IWF.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673959_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
Inside The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
Pictured: Martin Hines works at his desk at the IWF.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673930_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
Inside The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
Pictured: Martin Hines works at his desk at the IWF.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673969_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
Inside The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
Pictured: Martin Hines works at his desk at the IWF.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673934_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
Inside The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
Pictured: Martin Hines works at his desk at the IWF.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673961_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
Inside The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
Pictured: Martin Hines works at his desk at the IWF.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved. -
DUKAS_160673966_EYE
Paedophiles using open source AI to create child sexual abuse content, says watchdog.
People sharing on dark web how to modify software on their computers to manipulate photos of children, says IWF.
Freely available artificial intelligence software is being used by paedophiles to create child sexual abuse material (CSAM), according to a safety watchdog, with offenders discussing how to manipulate photos of celebrity children or known victims to create new content.
The Internet Watch Foundation said online forums used by sex offenders were discussing using open source AI models to create fresh illegal material. The warning came as the chair of the government’s AI taskforce, Ian Hogarth, raised concerns about CSAM on Tuesday as he told peers that open source models were being used to create "some of the most heinous things out there".
Open source AI technology can be downloaded and adjusted by users, as opposed to closed model tools such as OpenAI's Dall-E or Google's Imagen whose underlying models - which underpin the creation of images - cannot be accessed or changed by members of the public.
Inside The Internet Watch Foundation (IWF). The IWF is an international charity working to make the internet a safer place by minimising the availability of online child sexual abuse videos and images.
Pictured: Martin Hines works at his desk at the IWF.
© Graeme Robertson / Guardian / eyevine
Contact eyevine for more information about using this image:
T: +44 (0) 20 8709 8709
E: info@eyevine.com
http://www.eyevine.com
(FOTO: DUKAS/EYEVINE)
© Guardian / eyevine. All Rights Reserved.