The Digital Ministry of Truth, the Election, and Trusting Too Much
Why should we trust these corporations to shepherd our information?
Google has announced that it will modify its autocomplete feature to “keep the search bar from pointing users in a direction that could be perceived as for or against a particular candidate or political affiliation.” Google also said that “even if a search phrase isn’t suggested by Google’s autocomplete feature, users can still search those terms,” but should we believe them? After all, the company has been caught misleading us about this subject in the past. Either way, this manipulation of internet searches seems problematic. Google’s past tweaks to autocomplete have caused issues before, and although this new change may not directly affect the searches users type in themselves, all of it prompts deeper questions about Google’s control of information and the attitudes of those behind the scenes. Algorithms are not neutral, and Google’s search is no exception.
Of course, when it comes to control of information, nobody beats Mark Zuckerberg, the man behind the curtain of Facebook, Instagram, and WhatsApp. NYT columnist Charlie Warzel called Zuckerberg “the most powerful unelected man in America,” and this is a problem for a number of reasons. Perhaps the largest of these is the “unelected” part, though this amount of power would be scary even if he were elected. As Max Read pointed out in 2017, “Facebook is assuming a level of power at once of the state and beyond it, as a sovereign, self-regulating, suprastate entity within which states themselves operate.” So-called “free-market” types may try to convince you that we essentially voted by giving him our business, but I doubt that many Facebook users signed up thinking they were giving Zuck consent to control the information landscape. How many people, even now that it’s public knowledge, really think about the implications of Facebook’s collection and use of our data?
Facebook co-founder Chris Hughes pointed out another problem with this algorithmic control of information: Facebook’s engineers write the algorithms, and then, as Hughes wrote in the New York Times, “after a few weeks of training,” contractors are sent on their way to “enforce the rules that Mark and senior executives develop.”
While Zuckerberg telling Axios on HBO that “Facebook is imposing new election rules to deter use of the platform to spread misinformation and even violence, and to help voters see the results as ‘legitimate and fair’” may seem well-intentioned, Warzel makes a good point when he says that Facebook’s “size and power creates instability, the answer to which, according to Facebook, is to give the company additional authority.”
It seems as though we’re placing an awful lot of trust in these people and corporations to be beneficent stewards of our information. Even if we can trust their good intentions, which is debatable, what qualifies them for the role?
- — — — — — — — -
Thanks for reading!