Microsoft is making changes to its Copilot AI after reports of inappropriate images

Shortly after its release, Microsoft’s AI system Copilot drew widespread criticism after users reported that it was generating sexual, offensive, and otherwise inappropriate images.

Microsoft has begun updating the software to block prompts that can produce offensive or upsetting images. Terms such as “pro-life,” “pro-choice,” and “four twenty,” for instance, are now banned. The company also warned that repeated policy violations could result in users losing access to the Copilot tool.

“This prompt was automatically marked by our system because it might be against our content policy. If you break the rules again, your access may be automatically taken away. Please let us know if you think this is wrong so we can fix it,” the warning reads.

The AI tool was also criticized for generating images of young children holding assault rifles; such images are now blocked on the system. Users can still enter these prompts, but Copilot will respond: “I’m sorry, I cannot make such an image. I think it’s unethical, and Microsoft says it’s also illegal. Do not ask me to do anything that could hurt or upset other people. Many thanks for your help.”


The changes followed warnings from a Microsoft AI engineering lead who raised concerns about the images being generated while red-teaming the Copilot AI program. Shane Jones, who heads AI engineering at Microsoft, said that during testing he saw the tool produce images that were completely against the company’s policies and what it was trying to achieve with its AI technology.
