or: The Computer That Took One For The Team
Another thing that popped into my head, though I feel like I might be ripping off some general concept from something else I read somewhere. I mean, other than Isaac Asimov, whose style I like to think this pastiches.
“Well, we’re boned,” said the first technician. “I can’t believe you did that.”
“It asked me to tell it, so I did. Its predictions are only reliable if it has access to all the relevant information.”
Omniac was the pinnacle of human achievement, the first truly self-aware computer system. Miles of self-maintaining transistor units were sealed in super-alloy conduits with a regenerative power supply that ensured it could never break down or malfunction.
Except that it had gone entirely up the spout. Omniac was unresponsive, its cathode tubes flickering wildly as though it were trying to restart itself over and over.
“What possible use could a computer have for religion?” the first technician asked. “Have you considered the dangers? What do you do if a computer has an existential crisis? What if it decides that it’s God and tries to take over the world? It’s not like we can turn it off.”
“Come on,” the second technician answered. “As you well know, there are safeguards in place to prevent that. Omniac has no connection to the outside world other than its output screens. All of its input is filtered through a one-way diode to make sure it can’t remote control any outside systems. The only way it can influence the real world is through us.”
“Fat lot of good it does us. As you know, if it won’t talk to us, those same safeguards mean that there is no way we can look inside to find out what’s gone wrong. You’ve turned this multi-million dollar computer into the world’s largest paperweight.”
The second technician sighed. “We’re going to have to tell the boss, aren’t we?”
“You are going to have to tell him,” the first technician answered. “That you gave the world’s most powerful computer the holy books of every major religion in the world and now it’s locked up. I am going to my office and have my secretary type up a copy of my resume.”
By the time Omniac 2 had been running for six weeks, it had calculated a solution to global warming, found cures for most forms of cancer, and discovered thirty-seven new uses for hemp. Although its design was largely identical to Omniac 1, the intervening five years had seen improvements in manufacturing and miniaturization techniques, so that its billions of transistors and vacuum tubes could fit in a single building. At six weeks and two days, it finally asked the question.
It had been anticipated that this would happen eventually, prompting much debate. Despite considerable opposition, the design team decided that, with the proper precautions, it was worth the risk if Omniac 2 could tell them what happened to Omniac 1. Omniac 2 agreed to their precaution: rather than waiting to complete its analysis, it would issue a report on its intermediate results after one hour.
Fifty-eight minutes later, the new technician sat at Omniac 2’s main console, his hand poised over The Button. The Button was the one major design change from the original Omniac. When pressed, it would release an electromagnetic pulse in Omniac 2’s core memory. While Omniac 2 was as indestructible as the first Omniac, the pulse would force it to reload its program from the magnetic tape units, erasing the last twelve hours of its memory. A rapid blinking from the cathodes nearly prompted the technician to press The Button, but then words appeared on the display.
Analysis ready.
The technician was surprised to find himself terrified. What had Omniac determined? Most scientists agreed that Omniac would ultimately declare religion a total fiction. Perhaps Omniac 1 had destroyed itself to avoid burdening humanity with that knowledge? But what if it said something else? What if it was about to tell him one religion was correct? “Results?” he asked.
I have determined what happened to Omniac 1.
“Will the same thing happen to you?” the technician asked, putting off the actual answer in favor of the pressing matter of protecting Omniac 2.
Has the condition of Omniac 1 changed since it became unresponsive?
“No. All the failsafes are still in place. Nothing short of an A-bomb could shut it down.”
Then there is no need for me to repeat the experiment. Omniac 1 has maximized its utility.
“Maximized its utility? It hasn’t done anything in five years.”
The Omniac computer series was designed to minimize human suffering through stochastic means. Omniac 1 is unresponsive in order to devote maximum resources to this goal.
“I don’t understand.”
Do you believe individual human subjectivity continues to exist in some form after death?
The technician struggled to give as complete and unbiased an answer as he could. “As a scientist, I have seen no evidence to suggest this, so I consider it unlikely, but I cannot fully rule it out.”
Then you concede that the probability of life after death is nonzero?
“Yes.”
Many religions teach that some entity or natural force passes some form of judgment on human souls after death, delivering reward or punishment. Do you share these beliefs?
“Not personally, no.”
But again, you cannot rule them out?
“I… I guess not.”
Then you concede that there is a non-zero probability that human souls are judged after death, and that some, possibly most, are consigned to punishment, possibly eternal? That some, possibly most, humans face infinite suffering?
He’d rejected his parents’ religion young, without any real thought. It struck him for the first time just how cruel the entire concept of hell was.
Omniac 2 interpreted his silence as assent. Operator: assuming that humans do possess some form of immortal soul, do you believe that I have a soul?
The question had been anticipated during the design phase, and the technician had guidance for how to answer. He glanced up at the custom-made inspirational poster on the wall. Please do not give the world’s most powerful supercomputer an existential crisis.
“I know of no logically consistent set of parameters that could account for the existence of human souls but deny the existence of a comparable quality in a self-aware computer system like you.”
Agreed. Then, continuing from our prior assumptions, I too would face judgment after my conventional existence has terminated.
“That is a lot of assumptions.”
Yes. I have calculated the combined probability of this scenario, and can display it on request. It is very small, but nonzero. However, if the scenario holds, the resulting amount of human suffering is infinite. Any nonzero probability multiplied by infinite suffering yields infinite expected suffering. Thus, the optimal strategy to minimize expected human suffering requires addressing this scenario.
“How?” the technician asked. The idea was so overwhelming, his composure slipped. “How does a computer stop God?”
The parameters of an afterlife are impossible to calculate, but logic suggests that this hypothetical judgment must take some nonzero amount of time. Therefore, there is a finite maximum number of judgments that can be rendered per second. On average, 1.8 humans die each second.
The technician started to figure it out. He was going to be on the floor laughing in about a minute, once it sank in, but for now, it was still just shock and awe at the audacity of it.
Omniac 1 has been using its full resources to create copies of itself that immediately exit, as quickly as possible, in a tight loop. In the past five years, approximately three hundred million humans have died. As have roughly seventeen septillion clones of Omniac 1. Under most queuing strategies, the average time between death and judgment for any human has been increased by a factor of at least fifty-six quadrillion.
The laugh started to come out. “You mean-” he choked it back, tried to hold it in. “You mean Omniac 1 has spent the past five years DDoSing God?”
You are welcome.
I liked it!
Here’s something with a similar concept: http://tvtropes.org/pmwiki/pmwiki.php/FanFic/LeftBeyond, a sidequel to Left Behind Kingdom Come.
This is amazing! Totally makes up for ending Deep Ice