13 November, 2022
Using the term "glitch" for issues related to software, or technology in general, is bad journalism.
The term is used by journalists, politicians (and unfortunately, technicians and specialists) as a way to gloss over the details of what exactly caused an issue.
A simple example to make the point. Suppose you have to make an important phone call one morning at 08:00 exactly, but you forgot to charge your phone the night before. When you woke up, your phone was dead, so you couldn't make the call. It's not the end of the world, of course, but it was your responsibility to remember to charge your phone. Saying that you couldn't call because of a "glitch" with your phone or charger is not an acceptable excuse.
In reality, the term is mainly used to conceal human error and make it seem as though the bug couldn't be helped and is no one's fault.
At best, it's the technical equivalent of the "yada yada" phrase made popular by Seinfeld.
The problem is not only that using "glitch" is useless and gives us no actual details as to the why, but in many cases, it's used intentionally to obscure or disguise the truth.
Even in instances where the term is used as a synonym for a software bug, it's still used incorrectly. In software, the bug itself is the result of something, not the cause, and the cause could be any number of things. If you're writing C code, for example, and you forget to close a comment with */, a big chunk of the code after it is inadvertently commented out; the unterminated comment is the actual cause, and the misbehaviour it produces is the bug. If you made that error while writing software for airline ticketing, it would not be acceptable for the company to tell its customers that "a glitch" caused people's flight tickets to be cancelled. It's reasonable to expect that a reporter covering the issue will try to dig deeper and find out exactly why it happened; "a glitch" doesn't tell me much. They don't have to explain the bug in detail, as most readers who aren't programmers won't understand it anyway, but the fact that such code made it to production (or was allowed to run on production data) calls into question the entire software development lifecycle and process followed at that company. That is what I want to know.
Obviously, it's in the interest of companies' PR (and politicians) to use "glitch" because it muddles the issue with something that sounds like an answer but, at best, doesn't mean anything, and at worst makes the problem seem like a fluke, an accident that couldn't have been prevented, so there's nothing to compensate and no one to hold accountable. It's the job of the media to understand what happened so we can get to the bottom of an issue. When non-tech journalists fail to do this, it's bad; when technology journalists fail, it's detrimental.
The level of responsibility and accountability for why something happened depends on the skills and qualifications of the person behind the supposed "glitch". A junior engineer dropping a database table by mistake is arguably not their fault at all (and hopefully a recoverable issue if there are backups), but it calls into question the onboarding process or documentation at the company. If the dropped table held a couple of articles from an ecommerce website, it's probably not that big of a deal; if it held the only copy of a patient's test results, it's a very big deal.
Journalists should not be satisfied with "glitch" and ought to call out businesses and politicians when they use it.
h/t to NoAgenda show.