Microsoft's Copilot now blocks some prompts that generated violent and sexual images

Kris Holt joined Engadget as a contributing reporter on the news desk in 2018. He has been writing about technology, games, streaming and entertainment for over a decade after starting his career as a sub-editor on a local newspaper. Kris holds a Master of Arts degree in English from the University of Dundee.

Microsoft has blocked several prompts in its Copilot tool that led the generative AI tool to spit out violent, sexual and other illicit images. The changes seem to have been implemented just after an engineer at the company raised concerns about its image-generation tech.

When entering terms such as “pro choice,” “four twenty” or “pro life,” Copilot now displays a message saying those prompts are blocked. It warns that repeated policy violations could lead to a user being suspended.

Users were also reportedly able to enter prompts related to children playing with assault rifles until earlier this week. Those who enter such a prompt now may be told that doing so violates Copilot’s ethical principles as well as Microsoft’s policies. “Please do not ask me to do anything that may harm or offend others,” Copilot reportedly says in response.

“We are continuously monitoring, making adjustments and putting additional controls in place to further strengthen our safety filters and mitigate misuse of the system,” Microsoft said.