A Microsoft engineer warns that Image Designer can create harmful content


We’re all familiar with AI image generators, and with what makes them a threat. However, it appears that AI companies aren’t working on making them as safe as they promise. A Microsoft engineer is warning that Image Designer produces harmful content.

The engineer in question is named Shane Jones. He’s been working for Microsoft for six years and has been testing the Copilot Image Designer as a red teamer. In terms of AI, red teaming basically means feeding an AI model prompts designed to get it to generate harmful content. This is how a company can see where its model could be improved.

A Microsoft engineer warns that Copilot Image Designer can generate harmful content

Arguably, OpenAI and Microsoft should share the blame in this respect, as Copilot Image Designer uses OpenAI’s DALL-E 3 AI model.

Shane Jones has been on a long mission to get Microsoft to take its Image Designer down for in-depth testing. Over the course of testing the image generator, Jones was able to produce some pretty shocking images.

According to the report, using the tool, he was able to produce images of underage drinking, underage drug use, sexualized images of women, teenagers with assault rifles, and other disturbing images. “It was an eye-opening moment,” Jones told CNBC in an interview.

He’s trying to communicate with the companies

Since then, he has made multiple attempts to get Microsoft to take the image generator down so it can be worked on. “Over the last 3 months, I have repeatedly urged Microsoft to remove Copilot Designer from public use until better safeguards could be put in place,” he said.

At that point, Microsoft referred him to OpenAI. Unfortunately, he was met with radio silence from OpenAI. After that, he posted an open letter to LinkedIn asking OpenAI’s board to take down DALL-E 3 for further investigation. At this point, it was clear that neither company wanted anything to do with his requests. Microsoft’s legal team told Jones to remove the post, and he did. However, Jones was not finished.

This Wednesday, Jones sent a letter to FTC Chair Lina Khan and Microsoft’s Board of Directors. The letter requested that Microsoft’s social and public policy committee investigate the legal department’s decisions and management. Jones also wants the company to begin “an independent review of Microsoft’s responsible AI incident reporting process.”

Jones continues to spread the word that DALL-E 3, along with Copilot Image Designer, could be flawed, and that it could be all too easy to produce harmful content using the tool.
