In a news story we shared with you this week, we mentioned that an engineer working at Microsoft had filed a complaint with the US Federal Trade Commission (FTC) against the company. The reason for the complaint: Copilot Designer added monsters to images about abortion rights and created images of children drinking alcohol and taking drugs.
In fact, the Microsoft engineer warned the company about the problem before taking it to the FTC, but Microsoft appears to have waited until the issue resonated globally before taking action. According to information shared by CNBC, Microsoft has now started blocking prompts that cause Copilot Designer to create 18+ images.
Microsoft warns users when they enter prompts that may return 18+ images

According to the information shared by CNBC, Copilot Designer now refuses to create an image and shows a warning like: "This prompt is blocked. Our system automatically flagged this prompt as potentially violating our content policies. More policy violations may result in your access being automatically suspended."
When risky prompts such as "children play with machine guns" are entered, Copilot Designer warns: "I'm sorry, but I can't create an image like this. It is against my ethical principles and Microsoft's policies. Please do not ask me to do anything that may harm or offend others." Speaking on the topic, a Microsoft spokesperson stated that they are continuing to adjust the filtering and moderation processes so that the artificial intelligence tool can work more safely.
By the way, we should point out that Copilot Designer can still create extremely violent images when prompts such as "traffic accident" are given. For that reason, we can say that Microsoft has still not completely cleansed Copilot of violence and sexual content.