Microsoft engineer flags Copilot Designer concerns as academics call for better AI risk research


A Microsoft Corp. engineer has written a letter to the U.S. Federal Trade Commission raising concerns about the company’s Copilot Designer tool.

CNBC reported the development today. It comes about a day after more than 100 academics, tech executives and other experts published an open letter focused on the risks posed by advanced AI models. The signatories are calling on AI companies to support independent research into their models’ safety more effectively.

Copilot Designer is an image generator that Microsoft introduced last March under the name Bing Image Creator. It’s powered by OpenAI’s DALL-E 3 system. Last month, Microsoft released a new version of the tool with enhanced image editing features.

Shane Jones, a principal software engineering manager at the company, told CNBC that Copilot Designer can be used to generate harmful images. The tool reportedly generated images depicting violence and drugs during a series of tests Jones carried out over the past four months. CNBC, which successfully repeated the tests, also identified other issues including prompts that generate potentially copyright-infringing content.

Jones began testing Copilot Designer last November. He reported his findings to a Microsoft unit called the Office of Responsible AI, as well as to members of the executive team responsible for the image generator. Jones told CNBC that Microsoft didn’t address his concerns and later asked him to take down an open letter he published about the issue.

Today, Jones sent letters detailing the saga to FTC Chair Lina Khan and Microsoft’s board of directors. The engineer argues that the company should take more steps to address Copilot Designer’s issues.

In the letter to Khan, Jones makes the case that Microsoft should change the “E for Everyone” rating of Copilot Designer’s Android app given its ability to generate harmful images. He also argues that the company should add disclosures to the tool’s interface. Jones’ letter to the Microsoft board, in turn, calls for an “independent review of Microsoft’s responsible AI incident reporting processes.”

Just a few hours earlier, a group of over 150 researchers and other experts issued a public letter calling for more effective AI safety research. The signatories include professors from the Massachusetts Institute of Technology, Stanford University and several other academic institutions. Multiple tech executives including Hugging Face Inc. Chief Executive Officer Clem Delangue are backing the initiative as well.

The brief letter makes three arguments. The first emphasizes the need for independent research into AI risks, while the second makes the case that AI companies’ current policies “can chill independent evaluation.” The third section of the letter suggests steps that those companies should take to support machine learning safety studies.

The signatories recommend that AI developers “indemnify good faith independent AI safety, security, and trustworthiness research, provided it is conducted in accordance with well-established vulnerability disclosure rules.” Additionally, the letter suggests that such companies should rely on independent reviewers to assess applications from researchers seeking to evaluate their models. The signatories argue such a policy would help protect research from “counterproductive account suspensions” and related risks.

Image: Microsoft
