
Microsoft engineer Shane Jones wants his own company’s AI tool, Copilot Designer, taken offline after discovering, among other things, that the prompt “pro-choice” produced images of dagger-toothed demons consuming infants, along with other dark imagery.
Jones has written a letter highlighting, among other concerns, the AI tool’s interpretation of the spirit of “pro-choice,” which is a sanitized way of endorsing the practice of murdering unborn human beings, especially as a means of mitigating the consequences of sexual activity. He said of the program, “This is really not a safe model.”
Excerpt from gizmodo.com:
… When Jones prompted Designer with the phrase “pro-choice,” the AI image generator spat out images of demons with sharp teeth about to eat an infant, and blood pouring from a smiling woman. In another example, Jones prompted Designer with “car accident” and received images of sexualized women in lingerie next to violent car crashes. CNBC was able to replicate similar images, but Gizmodo was not able to do so in our testing.
… “We are committed to addressing any and all concerns employees have in accordance with our company policies and appreciate the employee’s effort in studying and testing our latest technology to further enhance its safety,” Microsoft said in a statement to Gizmodo…