As bot-driven misinformation campaigns continue to grow, Facebook and other social media platforms are taking steps to curb the phenomenon. Political misinformation aims to manipulate voters: campaigners create fake accounts intended to undermine elections and spread false information.
With the upcoming US elections, misinformation campaigns are particularly dangerous, and both technology companies and the government are constantly taking action. As the presidential election approaches, questions are being asked about whether bots significantly influenced the 2016 election. Facebook noted that in 2016 there were "coordinated online efforts by foreign governments and individuals to intervene in elections".
Some time ago, Facebook removed 13 accounts and 2 pages that attempted to mislead Americans.
But what do experts think about Facebook and its handling of political misinformation campaigns?
Blind, an anonymous professional network based in San Francisco, asked 1,332 technology employees two questions to gauge what they think about Facebook's efforts to limit political misinformation on its platform.
- The first question was: "Do you think it is Facebook's responsibility to prevent election misinformation?"
- The second question was: "Were you surprised by Zuckerberg's attitude, given his previous stance on freedom of speech?"
Last October, Zuckerberg spoke at Georgetown University and highlighted the importance of free expression.
The results of the survey showed that almost seven out of ten (68%) tech professionals believe that preventing misinformation is Facebook's responsibility. By contrast, only 47% of Facebook employees believe the same.
In addition, one in three (33%) tech professionals said they were surprised by Zuckerberg's attitude, given his previous stance on freedom of speech (the corresponding figure among Facebook employees was 27%).
Given Facebook's free expression policies, it is worth considering any discrepancies in its advertising policies.
Facebook will also remove posts that appear to pressure voters, such as those claiming that people will be infected with COVID-19 if they go to vote.
The results of the above survey show that Facebook employees disagree with other tech professionals about the company's responsibility for managing misinformation.
ZDNet raises some interesting questions: should Facebook users be allowed to speak, express their views, and reach their own conclusions based on the information they see?
If President Donald Trump can influence public opinion through social media, can former Vice President Joe Biden do the same to sway voters in the other direction?
Finally, can Facebook decide who wins this election by allowing different views and information to circulate? Or does the responsibility lie with the voters, since we must always be careful about what we read on social media and on the Internet in general?