I’m constantly annoyed by the statements that people ignorant of software development make about “algorithms.” They don’t have the least idea what one is, yet they think they’re competent to declare how evil an algorithm is.
Let me focus on one article, because it’s from Reason, a magazine I expect better things from. The piece is “In Defense of Algorithms,” by Elizabeth Nolan Brown. Her bio shows she has the background to write about many things, “tech” among them, but she doesn’t mention any experience with the computer industry or software development. She should have known better than to take on this topic; it puts a dent in an otherwise excellent record of articles.
She knows what an algorithm is:
An algorithm is simply a set of step-by-step instructions for solving a problem. A recipe is a sort of algorithm for cooking. In math, an algorithm helps us with long division. Those are examples of algorithms meant for human calculations and processes, but machines use algorithms too. For computers, this means taking inputs (data) and using the explicit rules a programmer has set forth (an algorithm) to perform computations that lead to an output.
No quarrel there. She then goes on to discuss machine learning algorithms, which adapt to their inputs. That’s fine too. But gradually she slips into statements that don’t make sense. She says “social media platforms like Facebook and Twitter, which shifted its default from chronological to algorithmic feeds in 2016.” No. You can’t present information chronologically, or in any kind of planned order, without an algorithm. They just changed from a more straightforward algorithm to a more complex and opaque one.
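To make that concrete, here is a minimal sketch in Python (the posts and field names are invented for illustration, not any platform’s actual data model). A “chronological feed” is already a sorting algorithm keyed on timestamps; the 2016-style “algorithmic feed” is the same operation with a different, more opaque key.

```python
from datetime import datetime

# Hypothetical posts; the fields are illustrative, not any platform's API.
posts = [
    {"text": "Post A", "time": datetime(2016, 3, 1, 9, 0),  "engagement": 12},
    {"text": "Post B", "time": datetime(2016, 3, 1, 8, 0),  "engagement": 97},
    {"text": "Post C", "time": datetime(2016, 3, 1, 10, 0), "engagement": 3},
]

# A "chronological feed" is itself an algorithm: sort by timestamp, newest first.
chronological_feed = sorted(posts, key=lambda p: p["time"], reverse=True)

# The "algorithmic feed" just swaps in a different (and more opaque) sort key.
ranked_feed = sorted(posts, key=lambda p: p["engagement"], reverse=True)

print([p["text"] for p in chronological_feed])  # ['Post C', 'Post A', 'Post B']
print([p["text"] for p in ranked_feed])         # ['Post B', 'Post A', 'Post C']
```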
In the next sentence she says, “people began to fear that algorithms were taking control of America’s politics.” That does seem to be true, but only because people have no idea what an algorithm is.
Let’s go back to her definition. She says, correctly, that “an algorithm helps us with long division.” You break down the division of a large number into a series of smaller divisions on subsets of its digits. Are you “taking control of America’s politics” when you do that? I’m pretty sure Brown would say no.
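To see how mundane that procedure is, here is a sketch of grade-school long division in Python (the function name and structure are mine, purely for illustration): work through the dividend one digit at a time, carrying the remainder forward, exactly as you would on paper.

```python
def long_division(dividend: int, divisor: int) -> tuple[int, int]:
    """Grade-school long division: process the dividend digit by digit,
    carrying the remainder forward, just as you would on paper."""
    quotient_digits = []
    remainder = 0
    for digit in str(dividend):                # take the next digit of the dividend
        partial = remainder * 10 + int(digit)  # "bring down" the digit
        quotient_digits.append(partial // divisor)
        remainder = partial % divisor
    quotient = int("".join(map(str, quotient_digits)))
    return quotient, remainder

print(long_division(9876, 7))   # (1410, 6), and 1410 * 7 + 6 == 9876
```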
To her credit, the article is about algorithm panic, the absurd claims that politicians and pundits make about evil algorithms. She refers to bills that “would discourage or outright disallow digital entities from using algorithms to determine what users see.” But she could have answered them with one simple fact: You can’t do software without algorithms! A bill that outlawed algorithms would outlaw software.
Let me break that down a bit. Your computer can’t boot up without algorithms. You can’t interact with a browser without algorithms; it can’t connect to the Internet or log you in securely (or even insecurely) without them. Ban algorithms, and you’ve turned computers into doorstops.
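Even the humble login check is algorithmic: the password is run through a hashing algorithm and compared, in constant time, against a stored hash. A minimal sketch using Python’s standard library (the salt, iteration count, and password are placeholders, not a recommendation for production use):

```python
import hashlib, hmac, os

# Store a salted hash at registration time (parameters are illustrative).
salt = os.urandom(16)
stored_hash = hashlib.pbkdf2_hmac("sha256", b"correct horse battery", salt, 100_000)

def check_password(attempt: bytes) -> bool:
    """Hash the attempt with the same salt and compare in constant time.
    Both the hashing and the comparison are algorithms."""
    candidate = hashlib.pbkdf2_hmac("sha256", attempt, salt, 100_000)
    return hmac.compare_digest(candidate, stored_hash)

print(check_password(b"correct horse battery"))  # True
print(check_password(b"wrong guess"))            # False
```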
Brown claims, “A world without algorithms would mean kids (and everyone else) encountering more offensive or questionable content,” meaning content on the Internet. No. A world without algorithms would be a world without an Internet, if not a world where we’d be stuck in the Neolithic age. (Ancient Greek mathematicians developed numerous algorithms that are important to basic measurement tasks. Algorithms get their name from a ninth-century mathematician, Muḥammad ibn Mūsā al-Khwārizmī.)
A legitimate question is what information selection and ordering algorithms should take into account. The focus should be on the inputs, not on the fact of using an evil a190r1thm. Should what you’ve previously read affect it? Should the semantic content? The author? What your friends are reading? What other sites you’ve visited?
Another question is what metrics the software should try to generate. Should it measure an article’s supposed truthfulness? Which nation or political party it favors? The extent to which it matches the reader’s interests?
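Here is a sketch of what that debate is actually about (every signal name and weight below is invented for illustration, not any platform’s real formula). The interesting policy questions are which signals are allowed in and how heavily they count, not whether a scoring function exists.

```python
# Hypothetical ranking sketch: all names and weights are invented for illustration.
WEIGHTS = {
    "matches_interests": 3.0,   # how closely the item matches the reader's stated interests
    "read_by_friends":   1.5,   # how much the reader's friends engaged with it
    "prior_reading":     1.0,   # similarity to what the reader has read before
    "estimated_truth":   2.0,   # some truthfulness score, however it might be produced
}

def score(item: dict) -> float:
    """Weighted sum of whichever signals we decide are legitimate inputs.
    The debate is over the keys and weights, not the existence of this function."""
    return sum(weight * item.get(signal, 0.0) for signal, weight in WEIGHTS.items())

items = [
    {"title": "Article X", "matches_interests": 0.9, "read_by_friends": 0.1, "prior_reading": 0.5},
    {"title": "Article Y", "matches_interests": 0.2, "read_by_friends": 0.8, "estimated_truth": 0.6},
]
print([i["title"] for i in sorted(items, key=score, reverse=True)])  # ['Article X', 'Article Y']
```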
Selection software that uses all those inputs and metrics takes control away from the user. I would rather read what matches the criteria I’ve set for my feed, in chronological order. (Mastodon does that.) But the debate should be over what information the algorithms use, not over algorithms vs. … randomly slapped-together code, I guess. That’s the only alternative, and it’s already too popular. It wouldn’t surprise me if there are student activists demanding that algorithms not be taught in computer science courses.
Attacking algorithms because they can be used for bad purposes is like attacking design drawings because Russia can use them to produce military weapons. If we didn’t have design drawings, we couldn’t have goods made to exact tolerances. If we didn’t have algorithms, we couldn’t have software that does tasks of any complexity.
Brown is far from the only offender, and she at least makes half an effort to point out that algorithms aren’t some kind of evil black magic. But in the end she concedes too much to the ignorant, and it especially galls me to see that in Reason.