A San Francisco judge will decide this month whether to approve a settlement in a class-action lawsuit that could affect more than 70 million Facebook users. The $20 million deal would mark the end of a years-long battle over the social network’s “sponsored stories” advertising.
But Facebook users’ images could still appear in ads if they don’t change their settings. And many users say the deal before the judge doesn’t go far enough to protect their privacy.
The Back Story
The lawsuit alleges that the company “unlawfully used the names, profile pictures, photographs, likenesses, and identities of Facebook users in the United States to advertise or sell products and services through Sponsored Stories without obtaining those users’ consent.”
It happened to the teen daughter of Kim Parsons of Hermitage, Tenn. Neighbors called Parsons when they saw her daughter’s picture posted with an ad for a local ice cream store. At first Parsons thought her 13-year-old had managed to visit the ice cream shop without her, but she hadn’t. Her daughter had just clicked a like button online.
Her daughter’s photo and the endorsement of the business were being used by Facebook to make money online. That’s generally how the Sponsored Stories service works. Facebook started the program in 2011, and typical posts show a photo of a user with a tag line saying, for example, “Steve Henn likes Patriotic Pants.” (Friends would see that ad because at some point in the past I had clicked “like” on one of the company’s ads.)
Parsons’ daughter had clicked “like” on Facebook more than 200 times, so her image was being used in ads constantly. And Parsons felt she had no way to stop it. “I should not have to come in on the back end trying to protect my child; that should be understood,” she said.
“There is a very strong legal case here,” said Heidi Li Feldman, a law professor at Georgetown University who specializes in class-action torts and ethics. “I have no question in my mind that as a matter of business ethics Facebook acted entirely unscrupulously. This is bad behavior. They intentionally and knowingly appropriated people’s images without getting their permission for commercial use.”
Facebook denies any wrongdoing, but in the settlement deal before the judge, the company has agreed to pay $20 million. If approved, the deal could result in $10 payouts for individual users.
As part of the settlement proposal, Facebook will let adults opt out of this ad program, but only for two years. The settlement would also create an elaborate system to give parents the ability to prevent their kids’ images from appearing in these ads. But before that could happen, both the parents and children would have to tell Facebook they are related, and then the parent would need to dig into his or her settings and ask Facebook to stop using the child in ads. Feldman says it’s laughable.
“Do you know what is hilarious about that?” asked Feldman. “That becomes just another data collection mechanism for Facebook. I mean, just think how valuable it would be for them to find out who is related to whom on Facebook. For marketing purposes — I mean, my God — parents are already targeted.”
Facebook calls the settlement proposal both fair and adequate. If the judge doesn’t sign off this month, the attorneys will try to negotiate a new deal or head toward trial.
Parsons, the mother of three from Tennessee, has another idea. She’d like the court to require Facebook to simply stop using images of minors in ads. And, she says, if the company wants to use her picture it should have to ask first — for each and every ad.
NPR’s Elise Hu contributed to this report.