Facebook was forced to apologize after reportedly allowing advertisers to target emotionally vulnerable people as young as 14.
According to a news agency, research conducted by two top Facebook executives in Australia used algorithms to collect data (through posts, images and reactions) on the emotional state of 6.4 million school pupils, university students and young people in Australia and New Zealand, "indicating the moments when young people need a boost to their self-confidence." In other words, the data reveals when young people feel "worthless" or "insecure" and are therefore more receptive to advertising.
A Facebook spokesman said the information collected was never used for targeted advertising: "Facebook does not offer tools to target users based on their emotional state. The analysis carried out by an Australian researcher was intended to help marketers understand how people express themselves on Facebook," the spokesman said. "We are currently working to correct the problem that has arisen."
However, there is little doubt that data-collection algorithms like this not only exist but, following the basic logic of profit-driven output, are in constant use.
What makes things worse for Facebook is that tracking young people's emotions in real time appears to violate Australia's rules on advertising and marketing to children.
As The Australian reports, the code defines a child as a person aged 14 or younger, and states that children must "obtain explicit parental or guardian consent before performing any activity that will lead to the collection or disclosure ... of their personal information," that is, "information that identifies the child or can identify the child."
Facebook's collection of information about young people and their negative feelings, such as "anxiety" and "defeatism," also appears to run contrary to the ethical standards of the Australian Association of National Advertisers (AANA).
The report is the most recent example of Facebook's service being used for what some would regard as unethical advertising. A 2016 ProPublica investigation claimed that the platform allowed advertisers to discriminate by race through what Facebook calls an "ethnic affinity" label.
Perhaps the news that Facebook allowed ads to target young Australians based on their low emotional state will lead to another policy change. Either that, or the company may build yet more AI tools to try to tackle the problem.