Actors outside the top tier of the profession are finding life economically difficult as the SAG-AFTRA strike continues into its fifth month. Some have found work with Meta, the parent company of Facebook and Instagram, “training” AI through acting. While the actors have signed contracts stating their likenesses cannot be used for future endeavors, the language is written in such a way that Meta may well have plenty of legal loopholes to betray that simple promise.
The actors were paid to participate in a study called the “emotion study,” which ran from July through September of this year. The study offered actors $150 per hour to “emote” for AI systems to analyze, in the hope that AI can then be trained to “emote” back at humans in seamless ways.
The irony of the work is not lost on the actors, who recognize they’re helping to train machines that might one day replace them, even if Meta “promises” not to use their image in any commercial way in the future.
Even though the actors are “acting,” the project claims it is not “struck work” because Meta is not one of the employers the guild is striking against. As Duncan Crabtree-Ireland, SAG-AFTRA’s chief negotiator, puts it: “This isn’t a contract battle between a union and a company. It’s existential.”
Max Kalehoff, the VP of growth and marketing at Realeyes, the company running the study for Meta, said, “The vast majority of our work is in evaluating the effectiveness of advertising for clients—which has nothing to do with actors and the entertainment industry except to gauge audience reaction.”
From MIT Technology Review:
In addition to the job posting, MIT Technology Review has obtained and reviewed a copy of the data license agreement, and its potential implications are indeed vast. To put it bluntly: whether the actors who participated knew it or not, for as little as $300, they appear to have authorized Realeyes, Meta, and other parties of the two companies’ choosing to access and use not just their faces but also their expressions, and anything derived from them, almost however and whenever they want—as long as they do not reproduce any individual likenesses.
Some actors, like Jessica, who asked to be identified by just her first name, felt there was something “exploitative” about the project—both in the financial incentives for out-of-work actors and in the fight over AI and the use of an actor’s image.
Jessica, a New York–based background actor, says she has seen a growing number of listings for AI jobs over the past few years. “There aren’t really clear rules right now,” she says, “so I don’t know. Maybe … their intention [is] to get these images before the union signs a contract and sets them.”
While this writer does not support government laws prohibiting the use of AI trained by humans to better manipulate humans into buying things they might not actually want or need, companies that choose to use this technology should be considered unethical, untrustworthy, and not worth investing in or doing business with. Of course, being a realist, this writer recognizes that cutting ourselves off from the companies already using these services is not very realistic, and won’t be until we self-steward-minded people build our own institutions, institutions that would not deign to use such unethical practices to manipulate the masses for the sake of profit.
Political factions are sure to follow, if they aren’t already, which makes it all the more urgent for you and me to be self-stewarded people who guard our preferences, beliefs, and minds against such crafty manipulations.