Here’s an AI tool that will protect your online images from manipulation

AI being used to keep AI in check


Generative Artificial Intelligence (AI) systems have made steady inroads into the work of both professional organisations and educational institutions the world over. Beyond the text-based responses offered by chatbots such as ChatGPT and Google Bard, generative AI models can now also create and edit images. Some of these image generators include DALL-E, Stable Diffusion, and Midjourney.

While image-based generative AI has made work exponentially easier for those engaged in graphic design, whether on smartphones, laptops, or tablets, it also brings dangers of its own, namely the manipulation or theft of existing images. This is where PhotoGuard, a tool developed by the Massachusetts Institute of Technology’s Computer Science & Artificial Intelligence Laboratory (MIT CSAIL), steps in.

What is PhotoGuard?

PhotoGuard is a tool that alters select pixels in an image in a way that throws off an AI model’s ability to understand the image. These alterations, the team at MIT says, are invisible to the human eye but can be read by machines. To achieve this, the tool uses two different methods: the “encoder” and “diffusion” attack methods.

What are the encoder and diffusion attack methods?

In the “encoder attack” method, the tool targets the latent representation of the image, the set of mathematical data points that describes the position and colour of every pixel to an AI model. By introducing subtle changes to the image, the method corrupts the model’s understanding of those data points, preventing the AI from interpreting, and thereby manipulating, the image.
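To illustrate the idea, here is a minimal, hypothetical sketch in Python. A random linear map stands in for a real image encoder, and projected gradient descent nudges the pixels, within a tiny per-pixel budget, so that the image’s latent representation drifts toward a meaningless decoy latent. Every name, dimension, and number here is an illustrative assumption, not PhotoGuard’s actual code.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(16, 64)) / 8.0     # hypothetical toy encoder weights

def encode(x):
    return W @ x                        # stand-in latent representation

def encoder_attack(image, decoy_latent, steps=300, lr=0.01, budget=0.05):
    """Perturb `image` within an imperceptibly small per-pixel budget so
    its latent representation drifts toward a meaningless decoy latent."""
    delta = np.zeros_like(image)
    for _ in range(steps):
        residual = encode(image + delta) - decoy_latent
        grad = 2 * W.T @ residual       # analytic gradient for the linear toy
        # Gradient step, then clip so no pixel changes by more than `budget`
        delta = np.clip(delta - lr * grad, -budget, budget)
    return image + delta

image = rng.uniform(0, 1, size=64)      # flattened pixel values in [0, 1]
decoy = np.zeros(16)                    # latent of a featureless "grey" image
protected = encoder_attack(image, decoy)
```

Because each pixel moves by at most 0.05, the protected image looks essentially unchanged, yet its latent representation is noticeably degraded, which is the core of the encoder attack.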


The more complex “diffusion attack” method camouflages an existing image as a different image to the AI. It does so by defining a target image and optimising the changes made to the original so that, to the model, it resembles that target. Any manipulations an AI tries to make to the protected image are effectively applied to the “target” image instead, resulting in unrealistic-looking output.

To better understand this method, consider an original image, say a drawing of any sort, and a totally different target image. The diffusion attack method introduces changes to the drawing such that, to the AI model, it begins to resemble the second image.

From then on, any attempt the model makes to manipulate or alter the original image will produce changes as if it were dealing with the target image. This protects the original image, since any attempted edits yield unusable, unrealistic-looking results.
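The same toy setup can sketch the diffusion-attack idea: instead of pushing the latent toward a blank decoy, the perturbation pulls the original image’s latent toward that of a chosen target image. This is a heavily simplified, assumption-laden illustration; the real attack optimises through the full diffusion model, which a linear stand-in cannot capture.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(16, 64)) / 8.0     # hypothetical toy encoder weights

def encode(x):
    return W @ x                        # stand-in latent representation

def diffusion_style_attack(image, target_image, steps=300, lr=0.01,
                           budget=0.05):
    """Perturb `image` within an imperceptible per-pixel budget so its
    latent moves toward the latent of `target_image`."""
    target_latent = encode(target_image)
    delta = np.zeros_like(image)
    for _ in range(steps):
        residual = encode(image + delta) - target_latent
        grad = 2 * W.T @ residual       # analytic gradient for the linear toy
        delta = np.clip(delta - lr * grad, -budget, budget)
    return image + delta

image = rng.uniform(0, 1, size=64)      # the "drawing" to protect
target = rng.uniform(0, 1, size=64)     # the totally different target image
protected = diffusion_style_attack(image, target)
```

After the attack, the protected image still looks like the original to a person, but its latent sits closer to the target’s, so a model editing it is effectively editing the wrong picture.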


Is PhotoGuard foolproof?

The PhotoGuard tool, MIT CSAIL admits, is not completely foolproof. Though effective, once an image is online, individuals could attempt to reverse-engineer the protection by applying noise to the image, or by cropping or rotating it, and then manipulating it at will. However, more robust perturbations could be employed to keep images secure against such common manipulations.


