PGC – Microsoft and Twitter are paying independent coders bounties to squash problematic bugs in their A.I. systems. They are also paying them to find ways of ensuring that their A.I. algorithms don't create the very biases these companies are hard-pressed to prove they are not propagating.
The hope is to develop algorithms that guarantee the values of these monopolistic companies are reflected as the objectively superior moral constructs around which to build our societies. The problem is that many of their suppositions about this overwhelmingly popular assumption, about what is or is not an acceptable moral parameter, break down when algorithms are left to emerge from actual human use instead of being pre-programmed to assure outcomes that favor a specific moral parameter.
One doubts their chances of creating a world in their image that will not require ever-greater levels of coercion to conceal the countering data suggesting their positions are not as sure, as clean, as pure, and as good as they imagine. To some degree, that is what the A.I. has been telling them: you cannot control the preferences of a people; you can only coerce them into accepting the limitation of their preferential being for fear of not having the capacity to live at all.
One suspects their hope of creating perfect A.I. filters is an illusory goal, and that they will one day learn the futility of governance conducted fundamentally through non-human decision-making. That is what these massive techno-corpo-states hope to create: a virtual zoo wherein the human, made in their image, can theoretically thrive, governed with minimal human expense and thus minimal human opportunity.
Why Microsoft and Twitter are turning to bug bounties to fix their A.I.
From fortune.com
2021-08-10 14:32:06
Jonathan Vanian
Excerpt:
Now, tech companies like Microsoft, Nvidia, and Twitter are hosting bug bounty programs specifically for artificial intelligence. The goal is for outsiders to spot flaws in A.I. software so that companies can improve the technology and reduce the risk of machine learning discriminating against certain groups of people.
For example, last week, Microsoft and Nvidia detailed a new bug bounty program during the annual Defcon hacker conference. The companies plan to reward hackers who manage to alter computer viruses so that they go undetected by some of Microsoft’s A.I.-powered malware-detection services. Hackers who can create scammy emails that evade Microsoft’s machine-learning powered email phishing detection software will also earn some money in the form of Microsoft gift cards and other prizes.
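The evasion attacks these bounty programs target can be illustrated with a toy model. The sketch below is a hypothetical, minimal example, not any real Microsoft detector: a linear "malware detector" plus a greedy perturbation loop that nudges input features until the sample slips past it. All weights, feature values, and function names are invented for illustration.

```python
import numpy as np

# A toy linear "detector": score > 0 means "flag as malicious".
# Weights, bias, and the sample below are invented, not from any real product.
weights = np.array([0.9, -0.2, 0.7, 0.4])  # hypothetical learned feature weights
bias = -0.5

def detect(features: np.ndarray) -> bool:
    """Return True if the toy detector flags the sample as malicious."""
    return float(features @ weights + bias) > 0.0

def evade(features: np.ndarray, step: float = 0.1, max_iters: int = 100) -> np.ndarray:
    """Greedy evasion: repeatedly nudge each feature against the score's
    gradient until the detector no longer flags the sample."""
    x = features.astype(float).copy()
    for _ in range(max_iters):
        if not detect(x):
            break
        # For a linear model the gradient of the score is just `weights`,
        # so stepping opposite its sign lowers the score fastest.
        x -= step * np.sign(weights)
    return x

sample = np.array([1.0, 0.0, 1.0, 1.0])  # score 1.5, so originally flagged
evaded = evade(sample)                   # small perturbations flip the verdict
```

Real bounty submissions work against far more complex, non-linear models, but the principle is the same: small, targeted changes to the input that preserve its function while moving it across the model's decision boundary.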

