WHO LET IN THE BULLY BASTARDS

A society with a strong civil rights tradition can’t allow its citizens to be abused by inhumane algorithms.

 

The AI threats to our rights are moving too fast! We’re not even close to being prepared. Take, for example, the Waze navigation service. 

It started as a docile, helpful voice saying, “Turn right” or “Left” to get you to your destination. It was warm and endearing, like Billy the Boy Scout helping Grandma cross the street. 

But mapping capabilities advanced, and Waze added services. 

Now it warns us about traffic tie-ups, where the cops are waiting, and bad road conditions. You’re no longer simply a passive grandma-style beneficiary; you’re asked to pitch in and report road conditions for others. Soon it will reprimand you for speeding, tossing trash out the window, or stopping to pee unlawfully behind the billboard. Billy would never speak to Grandma like that!

As technology advances, will stern voices inquire: “Does your wife know about that cute girl riding in the passenger seat?” or “Why was your car parked outside a bar last night?” Finally, will the voice one day announce, “If you weave across that median strip one more time, Old Man, we’re telling your kids to pull your keys”? 

Is this the innovation we are seeking?

It’s not just Waze. Yesterday I shot a note about the playoffs to a friend and used the proofreading service Grammarly. An unsolicited, invasive message popped up: 

         “You refused to accept our dangling modifier recommendation. Why?” 

Not waiting for an answer, Grammarly presented me with four options to justify my insubordination. Of course, all I did was ask for Harry’s take on the Super Bowl, not threaten mutiny against the digital grammar police.

Does Grammarly retain all my narratives? Will evolving AI software seek out inconsistency? Am I to be chastised by an algorithm flagging my hypocrisy and deceit every time I ask about a semicolon or ellipsis? 

Categories: Humor
