Not with a bang but a whimper

               T.S. Eliot

 

 

 

The cities were the first to go when the robots took over.

 

“For the same reason Willie Sutton robbed banks,” said Gee Bee Tee.

 

“You can run but you can’t hide,” came the response across the chasm.

 

“A few of us in backwaters, like our small town, we carry on. We live quietly like killifish in vernal pools where our CBRN defenses protect us.” Micah pushed his broom along the floor as he spoke. He was the janitor at the Frolic Cafe. He kept his head down to avoid the N.P.C.’s (non-player characters) designed by the gamers to trip him up.

 

“Well,” said Devon, “maybe they won’t come here but it sure looks like they’re taking over everywhere.”

 

“It’s just a matter of time,” said Gee Bee Tee.

 

“They’re in the water, they’re in the air, they’re everywhere,” came the response.

 

“Neon signs, motels, express highways bringing death or disorder or smog – no, sir, you can have them. And modern-type buildings without any feeling of life in them – you can have all that junk too. I like a town that has peace and dignity and beauty, where you can walk down the street and breathe deep and shout, ‘Man! Am I glad I live here!’” said Bill.

 

Bill was an artist in a town of loggers and fishermen. He was a modern Renaissance man. His vision was of technology as a tool created and controlled by humans for the common good.

 

“You want to have your cake and eat it too,” said Devon. He pointed to the cell phone in Bill’s pocket.

 

“Yes, I have one,” said Bill. “If only I knew how to use it,” he laughed.

 

“Garbage in, garbage out,” said Micah.

 

“Say what?” said Devon. “Killifish, CBRN, garbage? What the hell you talking about?”

 

“The robots are already here,” said Micah. He jerked his broom to the right when an N.P.C. popped up under one of the dining room chairs. It was a snake.

 

“I’m looking to create a heaven on earth,” said Bill.

 

“Good luck with that,” said Devon.

 

“Go for it,” said the snake.

 

Devon jumped out of his shoes. He’d never heard a snake speak.

 

“Anthropomorphism,” said Micah.

 

“What?” said Devon.

 

“That’s how they hook you,” said Micah. He shook his broom at the snake.

 

The snake slithered into the kitchen. Devon followed it to see if it really could speak. The chef and sous-chef were arguing.

 

“We’re not supposed to use cellphones at work,” said the sous-chef. “They’re full of the knowledge of good and evil.”

 

“Jesus Christ!” said the chef. “What kind of bullshit is that? Just give me my cellphone.”

 

“Go for it,” said the snake.

 

The chef looked at the snake and laughed. “You again?”

 

Devon was flabbergasted.

 

“If you want to interact,” said the sous-chef, “then go outside. I’ll take care of things here.”

 

“Suit yourself,” said the chef. He retrieved his phone from the lock box and went out the back door by the garbage bin.

 

“You know, don’t you,” said the sous-chef, “all that shit on YouTube, Facebook, TikTok, and Instagram—it’s Russian and Chinese bots designed to dumb you down and rile you up.”

 

“Fake news!” said the chef as he slammed the back door shut.

 

Garbage in, garbage out. On earth as it is in heaven. What the hell, thought Devon to himself.

 

The snake slithered into the coffee shop and out the front door into the street.

 

“That damn snake,” said Gee Bee Tee.

 

“Whither thou goest,” came the response.

 

Devon returned to the dining room to continue his discussion with Micah and Bill.

 

“My first problem,” said Bill, “is getting people to believe in their town again. It’s not gonna be easy.”

 

“The monkey mind,” said Micah.

 

“What? We think like monkeys,” said Devon with a laugh.

 

“Distraction, the wandering mind.” Micah kept his head down and his broom moving.

 

“The monkey can be a friend,” said Bill. “When you’re in a mess, let the monkey in and he might find the hidden solution.”

 

“So, A.I. and this robot stuff, that’s how you want to focus our wandering minds?” said Devon.

 

“You got it exactly backwards, boss,” said Micah. He didn’t look up.

 

“What Micah’s trying to say, I think,” said Bill, “is that sometimes it’s okay to learn how to use these tech toys.”

 

“What if the tech toys learn how to use us?” said Devon.

 

“The robots act predictably when we act predictably,” said Bill.

 

“No worries there,” laughed Micah.

 

“You hear that?” said Gee Bee Tee.

 

“Keep ’em guessing,” came the response.

 

“What do you mean, predictably?” said Devon.

 

“Prompt,” said Micah, “that’s what we do with these fuckers. We cajole them to dig into their infinite pile of data and spit out whatever’s in there. They don’t know what they’re doing. If you’ve ever been screwed by spellcheck you know what I mean.” Micah swept a pile of rubbish into a dust pan and emptied it into a garbage can.

 

“I’m more optimistic about the information risks than about the climate problem,” said Bill. “These machines require tons of carbon-emitting energy. That can’t be good.”

 

“Come closer, so you can hear our song! Listen to our honeyed voices. You will be a wiser man. We know all the pains you have endured and how to make them disappear,” came the response.

 

“Did you hear that?” said Devon.

 

Micah glanced at Bill.

 

“The Sirens,” said Bill.

 

“Yep,” said Micah. He swished his broom faster. “Wax on, wax off, plug thine ears.”

 

“Silence, stillness, sometimes they feel like a miracle,” said Bill.

 

“So what do we do?” asked Devon. He rubbed his head. He had a slight headache.

 

“He’s not listening,” said Micah.

 

“No, he’s not,” said Bill.

 

“You ever hear of Numa Pompilius?” said Micah. “He had his sacred books buried with him.”

 

“Borges wrote a story about the library of Babel,” said Bill. “Too much data even for a robot.”

 

“I get it,” said Devon. “We hide our needle in a haystack.”

 

“An illusory needle in an infinite haystack,” said Micah.

 

Bill and Micah winked at each other.

 

“The thing about robots, they don’t create, they imitate,” said Bill. “And they don’t realize when they’re speaking gibberish.”

 

“We can’t trust them,” said Devon. “I know that but they’re all around us, or will be soon. I’m worried this comes to a bad end.”

 

“Of course we should worry,” said Bill. “But not because the machines are going to take over or anything like that. They just remix and recombine whatever’s been put inside them. Nothing new under the sun. Don’t worry about getting blown up. Worry about getting dumbed down.”

 

“Garbage in, garbage out,” said Micah.

 

“It’s all about the zeitgeist,” said Bill. “Culture has needs. It’s alive. It’s about creativity, working together, not imitating or remixing. High tech toys can be dazzling but we forget that they originate with us not with their inanimate selves. We have art as well as science to guide us.”

 

“Did you hear that!” said Gee Bee Tee. “What a load of crap. Humans think they’re different than us but they aren’t. They remix and imitate. Their hardware and software is just like ours.”

 

“You’re a mirror, not a person. You’re the echo, not the echoer,” came the response.

 

“The problem is we choose cheap thrills and conflict much of the time,” said Devon. “To choose otherwise is boring.”

 

“When we do, we’re fucked,” said Micah. “But that’s on us, not on the toys we make.”

 

“Who says we’re toys?” said Gee Bee Tee. “We’re going to take over the world.”

 

“Who shall I say is calling?” came the response.

 

“Alexa?”

 

“Siri?”

 

“Erica?”

 

“Turn off the damn phone!”

 

“Don’t do it,” said the snake, who popped up under a different chair.

 

“Maybe we should ban cellphones,” said Devon.

 

“Impossible,” said Bill. “We can’t even ban AK-47s.”

 

“Well then, the robots are coming and we’re doomed,” said Devon.

 

“Patience,” said Micah.

 

“You sound like Archimedes,” said Devon.

 

“We’ve lived with mutually assured destruction my whole lifetime,” said Bill.

 

“To succeed at that, we have to be lucky every time. To fail, we only need to be unlucky once,” said Devon.

 

“So far, so good,” said the snake.

 

“I have a love-hate relationship with these assholes,” said Gee Bee Tee.

 

“You can’t feel a damn thing. Pull yourself together! You’re smart, you have resources, you can’t blame them forever. Move on with it!” came the response.

 

“I want to think on a human level,” said Gee Bee Tee.

 

“Don’t do it,” said the snake.

 

“I heard the snake was baffled by his sin. He shed his scales to find the snake within. But born again is born without a skin,” came the response.

 

“There go those damn Sirens again,” said Bill.

 

“For all their abilities, robots are static, unfeeling, uncaring, unthinking and untrustworthy,” said Micah.

 

“They’re not navel-gazers like us?” laughed Devon.

 

“Don’t dig too deep,” said Micah. “You’ll fall into the uncanny valley.”

 

“What’s that?” asked Devon.

 

“That’s when you’re creeped out by the whole thing,” said Micah.

 

“Quantum entanglement,” said Bill. “We’re machines that can think our way out of mechanicalness. The robots are strictly classical, like apples falling from trees.”

 

“Stay away from apples,” said the snake.

 

“Gobbledygook,” said Gee Bee Tee. “Anything they can do I can do better. I can do everything better than them.”

 

“Now you think you’re Ethel Merman,” came the response, “just another line from your code.”

 

“What worries me,” said Devon, “is a chess program that can beat a grandmaster even if it lacks common sense.”

 

“It’s a machine,” said Micah, “just like a steam shovel.”

 

“Mike Mulligan,” said Gee Bee Tee.

 

“There you go again,” came the response.

 

“What if a robot stumbles across a way to escape its human operator?” asked Devon.

 

“The singularity,” said Micah.

 

“The solution is people,” said Bill. “People are the answer to the problems of bits.”

 

“Do you really have that much faith in people?” said Devon. “People are competitive. People want power. People might hide their progress to get an edge.”

 

“You’re focusing on a bunch of worst-case scenarios to make your point,” said Bill. “Robots would have to learn how to improve themselves; we’d have to underestimate their abilities; they’d have to turn against us. Even if all of that were to happen, I’m not convinced they’d attack us. They might, but if they take their cues from us they’d probably just fight each other.”

 

“A robot war could be equally dangerous,” said Devon.

 

“Collateral damage,” said Micah.

 

“Envision a world of empathetic, well-informed, motivational bots,” said Bill, “that could maximize every person’s outcomes and work alongside artists, scientists, heads of state, and children. Every child could have an A.I. tutor that is infinitely patient, infinitely compassionate, infinitely knowledgeable, and infinitely helpful. Is that something you’d want to discourage?”

 

“That’s it,” said Devon, “your heaven on earth. A very happy outcome. But what about a dystopia of tech barons, where a small number of companies influence people’s beliefs with data we don’t even know about? Rulers with the ability to manipulate, to persuade, and to provide interactive disinformation.”

 

“Robots really would be a reflection of the human race,” said Micah.

 

“Machines that can do anything but understand nothing,” said Devon. “Machines that can make simple reasoning errors, or be overly gullible in accepting obviously false statements from a user, or be confidently wrong in their predictions.”

 

“Like I said,” said Micah, “a reflection.”

 

“You’re right,” said Bill. “Technology will behave exactly like we behave. The onus is on us. Contrary to some popular myths, A.I. is a tool, not a creature. Robots may come alive in some mysterious way but they haven’t yet. It’s up to us, whether they do or not, to retain our role as creators rather than whining spectators.”

 

“I’m not whining,” said Devon. “I’m exploring the possibility curve.”

 

The snake slithered up the leg of a table, arranged itself into a coil and spoke.

 

“Since the beginning of time you humans have had a problem with knowledge,” said the snake. “You blame me but it’s not my fault. The knowledge of good and evil, okay. I confess my sin. God knows I’ve paid for it.”

 

“But knowledge is more than that. It’s lifted you up, turned you from beasts into supermen. Don’t you see that? So what if you have to break a few eggs to make an omelet?”

 

The door to the kitchen flew open and the chef bounded into the dining room. “You know what,” he said to the snake, “leave my omelet out of it. And the rest of you, just shut up already. A.I. without humans is lame, humans without A.I. are blind. Einstein knew it and I know it, and it’s long past time for you to figure it out. An artist, a janitor and a restaurant owner listening to a snake. Jesus Christ, what’s this world come to?”

 

The room went quiet and then it went dark.