Posted by Marcia Penner Freedman
When did we start talking about our relationship with technology? (Really. Just search it on Google and you’ll be surprised.)
I believe the notion of a relationship began way back, when we first got the inkling that computers were not simply sophisticated typewriters, that they seemed to have a mind of their own. I believe it was when we started arguing with them and becoming impatient with them. I believe it was when we acclimated to the digitized customer service voice and began calling it ‘she’.
We have come a long way since those early, naïve days of communication technologies, which today seem to be influencing us and affecting our behavior in ways we cannot imagine, distancing us more and more from the natural world, the source of our health and well-being.
As an example, I’ve included here excerpts from a blog, Abu Yehuda, about the extreme influence Facebook is having on our lives:
Facebook’s essential extremism
Posted on August 24, 2018 by Vic Rosenthal
There are 7.6 billion humans on this earth. 2.23 billion of them logged on to Facebook (the number counts “monthly active users”) during the second quarter of 2018.
I don’t know about you, but I found this astounding, considering that Facebook did not exist prior to 2004, and was not open to the general public until 2006. This single “platform” has arguably had a greater influence on human social and political behavior than anything since the invention of radio and television. It may turn out to be as disruptive of the social order as the widespread introduction of movable type in the 15th century.
The sheer speed at which Facebook has spread through world cultures, along with its constantly changing, hidden, proprietary algorithms, means that its effects are difficult to study. Unlike the decentralized publishing industry that grew out of the advances in printing technology, Facebook is tightly controlled by a single private company….
One of the well-known characteristics of Facebook is its encouragement of ideological bubbles. This is by design. The designers understand that the amount of time one spends on Facebook – and therefore the number of ads one sees – depends on the psychic gratification one receives from the content. It’s well-known that such gratification increases when the content includes ideas with which one agrees, while exposure to ideas that challenge one’s beliefs produces discomfort. So the algorithm that decides which posts a user will see chooses those which – according to an elaborate profile created by the user’s own posts and “likes” – it estimates that the user will find congenial….
The platform itself is structured to encourage its users to behave in ways which support its objective of providing a gratifying experience. For example, a user who posts a “status,” photo, or link has control of the comments that other users can make about it. If another user posts a comment that the “owner” of the initial post disagrees with, the owner can delete it. As a result, a Facebook etiquette has developed in which it is considered inappropriate to post a disagreement. “This is my page, and I won’t allow racism (or fascism, transphobia, etc.) on it,” a user will write, and delete the offending comment.
There is also the way Facebook users get “friends.” Friend suggestions are generated in various ways, such as number of common friends, but also by the platform’s evaluation of common interests, which also means ideological agreement. My personal experience illustrates this. I have been a member of Facebook since 2010, and by now have collected several hundred “friends.” After an initial period in which I befriended relatives and real-life friends, I almost never initiated a friend request. But on a regular basis I receive such requests. Some of them are people with whom I share non-political interests or who were my real-life friends in the past. A few are people that I have interacted with in the comments section. But the majority are people with whom I am not acquainted, but who appear (to Facebook) to have a similar ideological profile….
So why is this bad? Of course it means that I won’t be exposed to ideas that I disagree with. That’s bad enough. But there is an even worse problem. It is that in an ideologically homogeneous group, a participant gets respect by reinforcing the ideology of the group. I can become a hero to my group of hawkish conservatives by being even more hawkish. Because there are no doves in my group, thanks to Facebook’s algorithm and natural selection, there is nothing to stop me from moving farther to the right. And the next person that wants to make his mark in the group will attack me from the right, moving the discourse as a whole along with him….
As a result, ideological groups develop which then move more and more away from the center. They emphasize different facts and even develop their own facts. They create their own dialects, with each side using words that the other side never uses…. Members of opposing groups would think each other’s ideas crazy, but they will rarely see them….
Facebook often announces programs to try to distinguish real and fake news, and to remove posts that “violate its community standards,” whatever they are. It certainly does not want to provide a platform for incitement to murder, genocide, sexual violence, racism, or many other undesirable things. But it will never do anything that will significantly impact its primary objective, which is to get people to spend more time scrolling through it and encountering ads.
In short, the platform itself, which is designed to increase ad revenues for Facebook’s shareholders, has the undesired side effect of nurturing and amplifying extremism. Rather than bringing people together, it drives them apart and polarizes them. Unfortunately, this is built into the structure of the platform, and is essential to the attainment of its business objectives. It can’t be fixed with anything other than a wholesale change that would make it unrecognizable, and possibly destroy its ability to make a profit…
So you can see, our ‘relationship’ with technology has taken on a different feel from the days when we argued with our computers. As quoted in an IndieBound review, David Auerbach, author of Bitwise: A Life in Code, wrote:
We engineer ever more intricate technology to translate our experiences and narrow the gap that divides us from the machine. We willingly erase our nuances and our idiosyncrasies—precisely the things that make us human.