Liana Romulo contributes Part 2 on Social Media
In the year leading up to the US November elections, in which Biden replaced Trump, social media companies came under intense pressure to stanch the flow of campaign-related misinformation on their platforms. While other tech CEOs proposed and implemented various safeguards, Mark Zuckerberg stubbornly refused to interfere with political content on Facebook, considered the most important digital platform for campaigners. The Facebook blog states “that people should be able to hear from those who wish to lead them, warts and all, and that what they say should be scrutinized and debated in public.”
Mocha Uson, warts and all, wishes to lead her flock via her blog page on Facebook. Her content has been scrutinized and debated publicly, culminating in a couple of movements to get her and her online presence expunged. Known as the Queen of Fake News, she is still going strong with about six million followers on her page, while Vice-President Leni Robredo, often the target of fake news, has around 1.5 million.
Ms. Uson has the right to free speech, I think – to get up on stage, grab a microphone, and say anything she wants. By all means, let her appear on television and write for print media. The trouble is, unlike traditional media, social media – Facebook, in particular – manipulates users’ behavior, exacerbating the fake news problem. It also spreads disinformation at overwhelming scale, sometimes virally, thanks in large part to click farms and troll farms, easily drowning out real news.
“The short-term, dopamine-driven feedback loops we’ve created are destroying how society works,” said Chamath Palihapitiya, Facebook’s former vice president of user growth. “No civil discourse, no cooperation; misinformation, mistruth. You are being programmed.”
Facebook algorithms exploit the workings of the human brain in order to get the user’s attention and keep it. I’d be less concerned if Facebook were manipulating its users to be kinder, gentler, and more compassionate – and even then it would still be wrong. But its algorithms instead drive mistrust and feed on emotions like hatred and insecurity, quickly leading to social divisiveness, polarization, and radicalization – all without users’ awareness or consent.
A great many places in the Philippines aren’t yet wired for internet, but people have access to Facebook on their phones. If you have a cheap mobile and a prepaid SIM card, all you need is a P1.00 load for unlimited Facebook access, while full internet access is far more expensive and slower. Currently, we have a whopping 80.5 million Facebook accounts, and for millions in hard-to-reach areas, Facebook is the internet. Indeed, it’s very possible that the information they skim off Facebook is all the information they know, period.
It’s no wonder, then, that the Philippines has again taken the world’s top spot in social media usage, with Filipinos spending an average of four hours and 15 minutes each day on social media. As we’ve held this dubious ranking for six years in a row, according to Rappler, the potential for full-on exploitation of our population is truly frightening.
“Algorithms tend to give people what they want to hear or read … and they do it repetitiously. And those two things … are what lead to people believing false content, even when you give them true information. And this is how you radicalize people,” said Dr. Nolan Higdon, author of The Anatomy of Fake News.
People who love President Duterte want to have their love confirmed – that is how humans tend to pursue information – and they don’t want to hear anything bad about their idol. So Duterte fans might click on articles like “D30 Pulse Asia approval rating at 99.9%,” which then prompts the algorithm to surface more pro-Duterte news and maybe some anti-De Lima stuff, too, like news that she received bribes from drug lords. Whether positive or negative, true or false, all of it reinforces the users’ confirmation bias. Rather than balance out their prejudices or correct false information (De Lima denies the allegations), social media reinforces what they already believe.
This contributes to an online environment that’s an echo chamber, where the user encounters only what reflects her preexisting bias. So if you’re an anti-masker, you’re only going to come across stories (fake or real) about how masks are bad for your health. The algorithm doesn’t care about truth or accuracy, and it distorts your perspective until you eventually have difficulty considering another point of view. You slowly become radicalized, unable to discuss complex ideas or engage in nuanced thinking. A few weeks ago in NYC, a woman attacked two people after demanding that they take their masks off. Had she been living in an echo chamber?
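To make that feedback loop concrete, here is a purely illustrative toy sketch in Python (not Facebook’s actual system; the topics, headlines, user model, and scoring rule are invented for this example) of how a feed ranker that optimizes only for predicted engagement keeps narrowing what a user sees.

```python
# Toy illustration only: a feed ranker that scores stories purely by predicted
# engagement. Topics, headlines, and the user model are invented for this
# sketch; this is not Facebook's algorithm.
import random
from collections import Counter

STORIES = [
    {"topic": "pro-duterte", "headline": "Approval rating soars"},
    {"topic": "anti-delima", "headline": "New bribery allegation surfaces"},
    {"topic": "fact-check", "headline": "Fact check: approval figure is false"},
    {"topic": "pro-robredo", "headline": "VP outlines pandemic response plan"},
]

def rank_feed(stories, click_history):
    """Score each story by how often the user clicked its topic before.
    Truth and accuracy never enter the score - only predicted engagement."""
    clicks = Counter(click_history)
    return sorted(stories, key=lambda s: clicks[s["topic"]], reverse=True)

def simulate_user(days=30, feed_size=2):
    """A user who starts with one pro-Duterte click and thereafter clicks
    whatever confirms that preference whenever the feed offers it."""
    history = ["pro-duterte"]
    for _ in range(days):
        feed = rank_feed(STORIES, history)[:feed_size]
        # Confirmation bias: prefer an agreeable story; otherwise click at random.
        agreeable = [s for s in feed if s["topic"] in ("pro-duterte", "anti-delima")]
        choice = random.choice(agreeable or feed)
        history.append(choice["topic"])
    return Counter(history)

if __name__ == "__main__":
    print(simulate_user())
    # Typical output: every click is pro-duterte or anti-delima; with a
    # two-item feed, the fact-check and the opposing story never surface at all.
```

The point of the toy is only that nothing in the scoring rule rewards accuracy, so a correction reaches the user only if it happens to be among the most engaging items, which, once the loop locks in, it never is.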
Ever wonder why people’s stance on Ivermectin (or vaccines) can be so extreme, whether for or against? Or why the DDS (Diehard Duterte Supporters) tend to be rabid, even threatening to rape or kill Patreng Non of the community pantries?
We each have our own biases and, therefore, we all exist in our own echo chambers. I’m not a Duterte supporter, which means I never encounter DDS posts on social media. So although I’m happy about pro-Carpio memes receiving a lot of love, a result of Duterte backing out of the debate, I must remember that only like-minded people are seeing what I’m seeing. The DDS are having a totally different experience. They live in an alternate reality, but of course they think I’m living in Lala Land. We think their news is fake; they think ours is. This is what Maria Ressa refers to as “no shared reality.” She tweeted: “Each second wasted w/o a systemic solution to stop the impunity of both the operators behind disinformation networks and the platform itself means the harm continues. No facts, no shared reality.”
Disinformation travels way faster than factual news, and more and more brains are washed every second of every minute of every hour. In a world growing more and more authoritarian, it’s not an exaggeration to say that, unless all Filipinos gain access to reliable news and information, our democracy could very well be at stake.
Source: https://www.philstar.com/business/2021/05/21/2099669/disinformation-substitute-competence