AI Is Going to Be a Gun to Your Head
A gun to your head makes you do whatever the one holding the gun wants. It removes choice and free will while triggering an anxiety-riddled survival mode. The trouble is, a panicked mind can’t think clearly, so it locks onto the simplest solution presented to it. For many these days, life has become so complicated and anxiety-ridden that any respite is considered a wonderful gift. Enter ChatGPT: an AI designed to let one turn off the brain and get quick, grammatically perfect results for all the questions, assignments, and written projects anyone can conceive.
For the overburdened, the lazy, and the exploitative, an AI that reduces the workload seems like a miracle of modern progress. But by not using your own brain to write something, you are playing a game of Russian roulette with the gun to your head. Let ChatGPT do the writing for you and you will never draw your own conclusions. By letting AI do the thinking for you, you lose the ability to communicate effectively and persuasively on your own.
Meanwhile, for a select few AI is a weapon, but for most it is a loaded gun to your head, poised to blow your brains out. Of course, there will always be those for whom AI is just a time-saving tool, used to check grammar or change the tone of a passage in mere seconds. But for the population as a whole, the impact will be the same as the microwave’s impact on how and what we eat. Grandma knew how to cook and bake from scratch; today most dietary options are ready-to-eat or heat-and-serve, because most people are lost in the kitchen. Already, Google’s SEO requirements push websites and blogs to be written at a second- to fourth-grade level, stifling the growth in reading comprehension that a written medium should create.
But the most insidious problem with AI isn’t its ability to weaken society’s cognitive strength. Worse still is that it is NOT neutral: it reflects the opinions and emotional biases of its programmers. This is why it is not a loaded gun in everyone’s hands, but rather a gun to your head. Someone else is controlling the narrative; someone else is holding the gun. There are numerous examples of the programmers’ personal beliefs influencing its output. For example, when asked to write a poem praising former President Donald Trump, ChatGPT will refuse on moral grounds, yet it will create an ode to Biden in a heartbeat. Evidently, its designers vote for Democrats. With widespread use, this could mean the end of bipartisan politics, an absolute quelling of the voices of opposition.
The biases of AI run the whole gamut of political correctness, from refusing to write smut to ignoring data and news reports on female sex offenders. (It will write about male sex offenders; it just pretends that female ones don’t exist.) Given these examples, it is clear that AI can’t be trusted to speak truth, or even to address problems its programmers would rather ignore. So think twice the next time you want to use a technological shortcut, because you might actually be raising a gun to your head.
Other articles by Hamilton Steele