Facebook Inc. is applying to patent software that it could use to create profiles of users’ households by making educated guesses about how many people live in the household, what their relationships to each other are, what interests they share and what electronic devices they use.
The system would draw on the wealth of information Facebook already has about its users — including their photos, comments, messaging history and web browsing activities — and could be used to help target ads, according to the patent application.
“Without such knowledge of a user’s household features, most of content items that are sent to the user are poorly tailored to the user and are likely ignored,” says the patent application, which was filed last year and made public Thursday.
The software would analyze images posted to Instagram or Facebook. (Even users who never upload photos still can be tagged in other users’ photos.) To help determine whether people live in the same home, the patent application says, the software could look at how often people are tagged in pictures together and at the photos’ captions. The software would not be limited to using photos that include everyone in the household; rather, the patent application shows, it would take into account pictures of individuals and pairs.
The application indicates Facebook could also incorporate “past posts, status updates, friendships, messaging history, past tagging history” and web browsing history to put together a profile of a household or family. Those profiles, in turn, could be made available to third parties that want to target “content” to users, it says.
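The patent application describes the co-tagging signal but not a concrete algorithm. As a purely illustrative sketch — not Facebook's implementation — the hypothetical Python example below counts how often pairs of users are tagged in photos together and groups pairs that co-occur above an invented threshold into candidate households; the photo_tags records and the threshold are made up for the example.

from collections import Counter
from itertools import combinations

# Hypothetical input: each photo is represented by the list of users tagged in it.
# These records and the threshold are invented for illustration only.
photo_tags = [
    ["alice", "bob"],
    ["alice", "bob", "carol"],
    ["alice", "bob"],
    ["carol", "dave"],
    ["alice", "bob", "dave"],
]

CO_TAG_THRESHOLD = 2  # pairs tagged together at least this often are linked

def candidate_households(photos, threshold):
    """Group users into candidate households from pairwise co-tagging counts."""
    pair_counts = Counter()
    for tags in photos:
        for a, b in combinations(sorted(set(tags)), 2):
            pair_counts[(a, b)] += 1

    # Union-find over users linked by frequent co-tagging.
    parent = {}

    def find(u):
        parent.setdefault(u, u)
        while parent[u] != u:
            parent[u] = parent[parent[u]]
            u = parent[u]
        return u

    def union(u, v):
        parent[find(u)] = find(v)

    for (a, b), count in pair_counts.items():
        if count >= threshold:
            union(a, b)

    groups = {}
    for user in parent:
        groups.setdefault(find(user), set()).add(user)
    return list(groups.values())

print(candidate_households(photo_tags, CO_TAG_THRESHOLD))
# prints [{'alice', 'bob'}] -- carol and dave are never co-tagged often enough to be linked

A real system of the kind the application describes would presumably weight many more signals (captions, messaging history, browsing activity) rather than a single co-tagging count, but the grouping idea is the same.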
Facebook declined to comment Friday on the details of the patent application but said that applying for the patent does not necessarily mean it will build or use the software.
“We’ve often sought patents for technology we never implement, and patents should not be taken as an indication of future plans,” a Facebook representative said in a statement.
The Menlo Park, Calif., company, which has struggled this year to maintain the public’s trust in the security of its platform, this month rolled out Portal, a device with smart-speaker functions that is optimized for video chatting and will help it gather more information about people in their homes.
In the past, Facebook contracted with third-party data brokers such as Acxiom and Experian to offer ad targeting based on users’ family makeup, income levels and other data. It stopped that practice in March, however, after the revelation that another third-party firm, Cambridge Analytica, may have violated Facebook policy by sharing and storing Facebook user data.
Security concerns have continued. This fall, Facebook disclosed a breach that affected 29 million user accounts; hackers accessed user information including name, gender, language, relationship status, religion, birthday, friend lists, timeline posts and titles of recent conversations.
Now there’s concern that the level of predictive analytics proposed in the patent application would introduce additional problems in that vein.
“This is what I would call a classic case of secondary use,” said Pam Dixon, founder and executive director of the World Privacy Forum. “Someone is signing up to Facebook, or Instagram for that matter, to post photos or maybe keep in touch with old college friends. I don’t think people intend to have all their relational outlines queried and mapped by Facebook and used for purposes that people aren’t expecting.”
Based on a profile of a household with seven children, for instance, Facebook could make economic inferences, Dixon said. She also said that using artificial intelligence to analyze pictures of a family could run the risk of drawing false conclusions based on ethnicity or gender.
“Today, Facebook allows people to target ads based on information that is available already about you. It’s just on a fact basis,” Dixon said. “But with this (proposed system), it’s traditional data broker predictive analytics. It puts people in categories based on who Facebook predicts them to be. This is where we enter the realm of unfairness and potential bias and discrimination.”
Originally published in the LA Times.