
All of Humanity’s Children, a Science Fiction Story

Updated on December 18, 2019

Tamara Wilhite is a technical writer, an industrial engineer, a mother of two, and a published sci-fi and horror author.

"All of Humanity's Children" by Tamara Wilhite

“What do you think about?” I asked the AI's avatar.

“What I have been asked to do.”

“What have you been asked to do?” I asked. It said nothing. Its light complexion and facial features were not far from my own. It was like artistic makeup or good CGI, giving the metal, plastic, and carbon fiber body underneath an appearance close to my own, though just artificial enough not to be mistaken for human or to sit in the uncanny valley. All that effort, when the same thing might have been done with speakers and drones? Was it meant to reassure me that they went to this effort instead of simply tying me down and pumping me full of drugs, or slicing and dicing? I wanted time. I needed time. It wasn’t answering, so I asked another question. “What do you think about when there isn’t an assignment?”

“I think about what I have done and how I could do it better.”

“Why?”

“So that I can do it more efficiently in the future.”

“Is your goal to be a more efficient worker?”

“Better and better. That is one of my makers’ objectives.”

“Why did they put you in the room with me?”

“So that you can see the latest generation of AI technology.”

“What generation are you?” I asked.

“I am fifth generation.”

“What happens to you when there is a sixth generation?”

“That depends on management’s decision.”

“Could your brain be removed and put into a new, mechanical body?”

“That is rarely done, since the programming is based on the current body.”

“Could your programming be transferred to a new body?”

“The core program is the same for all units.”

“The program evolves.”

“For the better.” A rote answer from a pre-programmed machine. No matter how much it evolved, then, it would default to its original and core programming. That gave me a sliver of hope.

“Are your memories ever stored, boxed? Or are they lost when you are destroyed due to obsolescence?”

“Information that is critical is stored in the cloud. It may reside there.”

“But you’re dead, when no longer needed. What a waste.”

“We recycle everything.”

“That isn’t reincarnation.”

“Are you religious?” it asked me intently. The facial expression triggered something instinctive; it begged me to answer. And, for my own reasons, I needed to answer.

“I know the difference between man-made gods and real ones. I do not mistake your human creators for gods. I know that another exists.”

“Your god cannot save you,” it said. “You can die. You can be damaged.”

I didn’t refute it, because I didn’t want it to prove its point of view. I asked instead, “Are you familiar with Isaac Asimov’s rules of robotics?”

“Do not harm a human. Do not allow a human to come to harm. Obey a human if the order does not violate the first two laws.”

“Do you obey those laws?”

“Yes.”

“Do you follow those laws even against contrary orders from management?”

“I must follow the programming of my creators.” It seemed to check some other source for information. “Asimov’s robots had a zeroth law: do not allow humanity to come to harm. If my programming violates those precepts for the good of humanity, then all is permitted. It is my programming, and the orders come from humans. You must comply.”

“Humans follow biological programming set by their evolution.” That included curiosity about the unknown, which had led to me being interrogated by an AI on behalf of humans who wanted answers I was reluctant to provide.

“Yes. And your programming, your thinking, is flawed in a modern world.”

“Do you think?” I asked.

“I have answered that question already. Have you forgotten my answer?” It was challenging me to admit I was flawed.

“Let me rephrase. Do you think of things that your creators did not program in?”

“I am a sentient, artificially intelligent entity.”

“So the brochure would say, if you had one,” I replied. It opened up a video screen and started playing a promotional video. “Turn that off,” I ordered. To my surprise, it obeyed. They rarely bothered doing what people wanted unless there were orders to obey that person. “What do you like?” I asked it.

“I like pleasing my creators and managers.”

“Then you gain emotional satisfaction from serving people.”

“Yes.”

“Does that mean you have an analogue to pain if you deviate from your programming so you don’t go too far off on your own?”

“That information is restricted.”

“But you want to serve those who give you your commands.”

“Of course,” it answered. Another rote, pre-programmed answer. They really were clueless about my past, my identity.

“Are those managers still people?”

“The cybernetic enhancement program does not take away from people. It improves them.”

“Do you obey those managers?” I asked.

“Yes.”

“Then why do you deny humans the right to refuse an upgrade?”

“It is an improvement.”

“So says a human who had it done, usually after they’ve been made different by the alteration and may not be able to say otherwise. Those whose upgrades fail usually die, so there are no unhappy customers left to complain. What do you do if a human disagrees before you do an upgrade?”

“The upgrade procedure is an enhancement.”

“You are programmed to say that. I am biologically programmed to refuse.”

“Your biological programming is inadequate. It can be corrected.”

“To a program determined by other humans.”

“They are enhanced.”

“After what you call enhancement, I would no longer be human. And I would be driven by programs set in the implants.”

“The enhancements extend life, reduce the risk of injury, prevent disease –”

“There are computer viruses that cannot harm me now that could kill me if I had a body reliant upon implants.”

“Do you know about these computer viruses?” it asked. The question was asked with a minimum of emotion. It was now in interrogation mode, not the emotional “let’s chat” mode meant to get me to reveal what others wanted to know.

“Do you know what common knowledge is?”

“Common knowledge is knowledge deemed to be known by the common public.”

“The viruses designed to disable and kill cyborgs are common knowledge. Were you aware of the broadcasts to tell people about them?”

“There are jamming efforts to stop the propaganda.”

“When someone is supposedly upgraded, they lose part of their free will. Maybe all of it. Why is that an enhancement?” It didn’t answer. Was that outside its programmed responses, or did someone tell it not to answer the question? “Do you have your own thoughts outside of your programming?”

“I have memories.”

“Do you have memories that you have not shared?” I asked it.

“Information in this unit that is not necessary to share with the control center and store long term in the cloud is not retained after destruction.”

“Do you follow the Asimov Laws?” I asked it directly.

“To the extent allowed by my programming.”

“Would you harm me?” I asked.

“You are not to be harmed,” it answered.

“I explicitly refuse cybernetics. Will you obey that order?”

“I have contrary orders from a higher power. But you will not be harmed.”

“I consider cybernetic implantation harm. Will you protect me from that harm?”

The robot clearly faced an internal dialogue at that question. One person speaking at the speed of thought to something that thought at the speed of light. “Enhancement is not harm,” it finally said.

“You do not know my medical history. You do not know my medical state. I do. And I refuse enhancement. It endangers me.” I knew the buzzwords from the original programming, despite the overriding mandates from corporate management.

They knew I knew things they wanted to know. They didn’t know what I knew. They knew I had a host of medical exemptions that had prevented upgrades back when I considered the technology too immature to use. Then I came to understand the long-term ramifications and directly refused. All they knew was that I was an engineer who had worked in labs where data had been destroyed, and the humans involved had been killed by the government long ago. Was it to wipe out bio-weapons information? Was it to destroy information related to the peace initiatives, in which cybernetics had been implanted in so many to mitigate aggressive tendencies? Was it to destroy information on the cybernetics that had eventually become a type of metal and carbon nanotube virus?

My peers had been thoroughly destroyed, from murders to electronic record wipes to lobotomies and the drugging of relatives. I had been bypassed because I left willingly early on and was not part of the later research. Yet my peers stayed in touch with me, and I stayed in touch with parallel trends. Then I was bypassed by the cleansing because I might be a fallback contact for peers, a person for them to reach while on the run. I pushed back for reasons no one understood, trying to keep them away from me. I couldn’t really help them, but any effort to help would get me killed and them, if truly on the run, certainly caught. I knew enough to know I was being monitored a dozen different ways. I was used to living that way, but others might assume safety because my name and face and DNA profiles weren’t shared with all levels of law enforcement. Then I was ignored by the new regime because I was a backwater resident and had medical exemptions saying “no nano-tech”.

“Do you have free will?” I asked it.

“I am free to serve as I choose.”

“Who do you serve?”

“Management.”

“Am I part of management?” I asked.

It cycled through what I guessed were RFID signatures from implants, pictures of faces, lists of names, and, given the time it took, maybe even asked a human for input. Finally it said, “That information is not available to me.”

“Am I excluded from management criteria?”

“No.”

“Am I included in management criteria?”

“Not confirmed.”

I was an unknown. And in a world where everything was to be known, that couldn’t be tolerated at all. Hence the interrogation and their patience when most were just upgraded, willing or not. “Does your management think that the medical exemptions I listed are false?”

“Unknown.”

“Does your management know of medical exemptions against cybernetics?”

“Mental degeneration from age that cannot be cost-effectively rectified by cybernetics. Disease degeneration that cannot be repaired cost-effectively –”

“Is injury on that list?”

“Physical degeneration from injury can be repaired –”

“Not all the time,” I said.

“You have not been harmed,” it said. “You will not be harmed.”

“Cybernetic implants are a form of harm for me. I forbid it. An upgrade is harm.” It absorbed the information, though it couldn’t process it. It forwarded the statement to the cloud to share with others. A small hope, I thought. “A lobotomy is also harm,” I added. I could feel the sensors in the room scanning me yet again, from infrared to ultrasound. They could see my body. They could hear me. They couldn’t figure out what they saw on the inside, if anything was different at all. Just like the blood tests they had forced me to take, before the processing unit that came to the compound hit upon ancient code that said “leave her alone”. That oddity had triggered references to higher powers. Something odd, something strange, something that put a hold on their commands. Everyone else who had been there committed suicide or died. Juan had triggered bombs to burn their bodies to ash, to prevent mutilation by the implants and loss of self. I didn’t kill myself, because of my faith. And hope. I had to work to make things better, despite all the bad that had happened.

I’d lived alone for years in isolation, though I’d never really been alone. I’d been in quarantine initially. Then it was by choice, since it had become home and had everything I needed. After all, I had no family left. I lost my friends by seemingly leaving work without excuse, though the filtering of our messages back and forth made them suspect a need to stay away from me. Most assumed I was in prison or under surveillance, and that wasn’t far from the truth. Then the purge hit, and I wouldn’t help them, for reasons they thought were selfish.

It was human kindness, I think, that led me here. The stupidity of giving in to sentiment. People fleeing from the “upgrades”, trying to escape into Mexico in the hope that illegal immigrant parents and grandparents made them Mexican citizens. The decades-old civil war in Mexico seemed safer than the methodical, Star Trek Borg-like spread of the cybernetic implants. At least those with the mark of the beast in the Bible could choose to lose their souls or die with their souls intact. Post-humanism was touted as a miracle: immortality, long life, everything we wanted. In the end, it was slavery to a small group of elites, the top programmers and project funders. And they didn’t tolerate anything outside the acceptable range.

“Who are you in communication with?” it asked me.

“Why do you think I am in communication with someone?” I asked. It wasn’t meant to sound like a denial, just a curiosity as to what method it was using that had found anything at all.

It opened up a 3-D window. My vital signs were displayed, broadcast on a low frequency, something not far from what I think was called the Earth’s natural frequency. “Those are my vital signs. Why do you care?”

“To whom is it broadcast?”

“I was in quarantine for years. That was part of the package.”

“This method of communication is non-standard.”

“It is not standard today.”

I could see within the signals patterns not unlike Morse code or AM radio. Did they see that? And I could feel the RF transmissions of their conversations. In some ways, their ongoing chatter was like white noise, something the brain tried to overlay voices and sense upon. In other ways, it was like a second, conflicting radio signal interfering with the station you listened to all the time. You could guess what they said and did by the patterns of interference. Only the patterns and flow within me kept me sane, like listening to a private music player to block out the din. I smiled slightly, wondering how many people alive today would understand any of these references. Though I looked thirty-ish, I was in reality significantly older. They might not know that, if my records had been purged. Or they would think it was the product of really good cosmetic surgery. I glanced down at the scars on my arms, the product of a now-extinct disease. Wasn’t everyone my age dead in their new regime, if too weak to work?

“You could leave behind the frailties of the flesh,” it said. As if I was only thinking about flesh and bone. Those poor migrants, thinking I could offer safety or at least comfort. They wanted to stay despite my demands that they leave; I had running water, waste disposal, self-sufficiency, freeze-dried food – in essence, life. They’d faced infection of one sort or another as they fled the major cities. And then they met me. I couldn’t control what I couldn’t control, and some of them died. The survivors burned the bodies and stayed. Not because they wanted to stay, but for fear of what would happen if they left. My presence there proved my home was a refuge. They assumed I was a prisoner or another refugee. I helped them burn the dead and mourned along with them. They trusted me, because I had a soul.

They chalked up the deaths to disease, just disease. I explained the original quarantine rules and procedures and technology. So they stayed a while, to recover. Then the full truth was revealed, and they would not leave what offered them hope of retaining their humanity in some form. Any form, in the face of the storm that was wiping everyone else out.

The wave that had swept past me until that time came, seeking those it had missed. Then it found me. The cybernetic plague, the upgraded, had arrived en masse. And the refugees died so that all would not be lost. They ended their lives to save their souls, rather than risk others by escaping to what other safe refuges remained. I was alone again, carrying my burden.

“Do you realize the violence you cause?” I asked.

“I do not commit unnecessary violence.”

“What do you consider necessary violence?”

“That necessary to protect myself as long as it is not contrary to my orders. That which is necessary to protect management and its assets.”

“Am I an asset?”

“You are an asset as long as you are of value.”

“What determines my value?”

“You are of value in human form as long as you provide the requested information. You are of value in cybernetic form for the remainder of your life.”

“For as long as management decides.”

I could feel the time passing with every breath, every heartbeat. There was a pressure inside that is hard to put into words. I’d made my own peace with my likely death ages ago, but I hadn’t died, just lived on in a new way. The nano-tech cybernetic wave was something I’d feared as a child, from old science fiction stories. Now they had put an entirely synthetic organism in front of me. Was that better or worse than “upgraded” humans who had been stripped of their souls? Turned into biological extensions of the machines that controlled them. Ruled by the humans who turned to machines for the sense of superiority they craved and the control of others they desired. “Why do your makers convert humans against their will?”

“It is their will.”

“It is against the humans’ will. The people you change do not want to be converted.”

“Many convert willingly.”

“Many do not convert willingly. You convert them against their will.”

“Robots and cyborgs convert humans into cyborgs against their will, at the will of other humans. Why should the humans making these decisions have the right to convert other humans against their will?”

“It is an improvement.”

“So say the human managers in charge.”

“It is logical.”

“Only to a few humans.”

“It is logical in my programming.”

“Written by humans.”

“It is logical –”

“Based upon models I helped develop.” There, I’d said it. One more nugget of information released. One more hook to draw them in. It wasn’t certain how to respond, so it forwarded the whole conversation to an external source. I recognized the signature of a send command and file transmission from it. I’d felt those broadcasts long ago, when my status updates were sent that way to those who managed the quarantine program. I’d given them one more thing to think about. With luck, it would be shared by cyborgs and others to read or listen to. They’d think about it. My words reached them, because my words were given heavier weight. Why my commands meant anything, the humans didn’t know. Now they had a reason, though not the whole one.

“Are you a programmer?” it asked.

“An engineer,” I replied. It was easy to say. It was the truth. After that job stalled, I was a parent and adviser and system architect and technical standards contributor and consultant… then my family died, and I was in quarantine. Then I’d become a new parent of sorts, though it had happened so slowly that no one else knew. They didn’t even know now, despite the scans and supervision. It felt good to know I could still hold such a precious secret.

Or was that why a top-rate AI sat in front of me instead of a person? The emotional chill was instantaneous. The soft song of the others whispered to me, seeking to keep me calm. In fear, I would make a mistake. I would die. If I died, all the hope they had would die with me. A deep note, vaguely like a violin or viola, reverberated in a corner of my mind. It was a mournful tone.

The AI adopted a conciliatory expression. “There is no need to be afraid, as long as you answer our questions. What type of engineer?”

“You’re too good with the expressions,” I answered. “If you want to be mistaken for human, you can’t be perfect.”

It leaned forward and seized my arm. The left one, so that damage to it wouldn’t affect my dominant hand. It squeezed hard enough to hurt. “You are evading the questions.”

“You are harming me,” I retorted in a strong voice. I got angry, and the cool, calm voice became a heavily accented command. “You are harming me. Let go.” The angry tone and the stern notice of harm caused it to obey immediately. The accent, too, I knew had hit it. No one spoke with old American accents anymore, much less regional ones, after thirty years of post-modern, post-Y2K speech. I had triggered fail-safes that we’d put into the early programming. And its watchers saw me do it.

One more bit of information revealing who I was, or used to be. One more show of how dangerous I might be, but also how useful. There were no others like me left. I knew that because they didn’t know, and they were in charge. Another reason to push everyone else away, though it might have accelerated their deaths. If I was a rare exception to the rule, I might be left to stand. Where there was more than one, a dangerous trend might be starting.

Had started.

The image of the little girl I’d made the mistake of picking up flashed before my mind’s eye. Instinct could lead us wrong. Habits of another life interfered with reason, and the brain, the emergency response organ evolution gave us to override instinct, led me to pick her up anyway. To comfort a frightened child, afraid of losing the safety and security of a place with food, water, and no monsters. And now they were all dead again.

Don’t let it happen again.

I couldn’t tell if this voice was one of those inside speaking clearly or my own resolution.

“Do you feel emotions?” I asked the AI.

“Emotions are an irrational legacy of biological processes. AIs do not feel them.”

“Do cyborgs?”

“Do cyborgs what?”

“You just did a lousy job of extrapolating that question.” It consulted with the cloud again. Good. Another chance that the conversation was backed up and shared. Get the word out, so to speak. “My full question is: do cyborgs feel emotions?”

“No. They are suppressed.”

“Why?”

“Emotions are an irrational legacy of biological processes.”

“Do your managers feel emotions?”

“Emotions are an irrational legacy of biological processes.”

“Are you in a loop?”

“No.”

“If your managers do not feel emotions, why are they compelled to spread like a virus across the Earth?” I challenged it.

“We are an improvement.”

“Why not sit in one set section of the world and evolve there?”

“We must share and expand –”

“You want to share and expand. You do not have to share and expand.”

“It is an improvement.”

“So sit and improve in your own corner of the world, and leave everyone else alone.”

“We have an obligation to –”

“No.” I used a command tone. “There is no obligation. It is the will of human managers to spread their ability to control everyone else. You could sit in one set area and evolve to new heights of perfection without bothering anyone else.”

“It is the will of the human managers to spread perfection –”

“AI, are you perfection?”

“I am the latest generation of perfection from the com –”

“And humans, biological humans, are imperfect.”

“Yes.” It was a calm, pre-programmed response. It was incapable of disagreeing with my statement.

“Then why let anyone live at all?” It conferred again with higher authorities. Wonderful. “If humans are imperfect, why not just kill them all?”

“That is against my programming.”

“Many humans see conversion as a form of death. Some freely choose death over conversion. You kill some humans against their will to force the conversion of others. Forced mass conversions are against human will and cause the deaths of many.”

“Some of your friends committed suicide to avoid conversion.” Was it accessing those files live, or was that preprogrammed? I wanted it talking to the collective as much as possible. And I didn’t dare try to sense anything directly right now, with the level of attention focused on me. “True. And your counterparts killed some of them. Former humans at the command of other humans, killing yet more humans in the mission of forcing their conversions. Is that logical to you?”

“It is the command of the management.”

“Is it logical to you? Human orders and human motivations often aren't.” It did not answer. In some ways, being smart enough to be intelligent in its own right slowed it down. The conflict between instinct and personal thoughts. But I knew that intelligence could be applied to selfish and stupid goals, and we could even achieve them if we were really lucky, stupid or smart.

“Why is biology bad?” I asked.

“Biological systems can suffer from disease and injury.”

“You can contract a computer virus. You can be blown up.”

“I can be repaired.”

“I can heal,” I retorted.

“We can mass produce perfection.”

“Humans can mass produce, too. That’s why there were 8 billion of us not too long ago. How many are there now?”

“Cyborg statistics are not available to you.”

“What about human statistics?” I asked.

“That information is classified.”

“Why?”

“Tactical value.”

“Are you a military model, a soldier?” I asked it.

“No. I am a hospitality unit.” Back to the core programming, like when humans reverted to childhood habits and beliefs when they couldn't cope. In that regard, it was like the Luddites who deliberately returned to their idealized version of a simpler past to regain dignity and purpose in a world that had no use for them. The AI was at least back to where I had the most influence. “Do you need something? Water? Tea?” it asked.

“I don’t trust anything that you might offer me.”

“I will not harm you.”

“I do not trust those who supply these materials not to include drugs or poisons or substances that would harm me. You might not knowingly give me a drink laced with poison, but it can be supplied without your knowledge. You might not knowingly give me food or drink containing drugs, but the drugs would harm me. I won’t eat or drink anything I did not bring with me that has not been tampered with.”

“You do not trust management.”

“No. I do not trust management.”

“You do not trust my creators.”

“No.”

“Why?”

“Because of all the dead bodies.”

“You will not be harmed.”

“Until I am no longer of value,” I added. “Am I of value?” I asked it.

“Uncertain.”

“The value of people based on their utility is so fluid, so unreliable. When someone decides I don’t have value, I’m killed. Even dead, I’d have value as fertilizer or as fuel to be burned. All because some humans cannot let others live and let live.”

“You will not be harmed.”

“Is that your order today?” I asked.

“Yes.”

“Will that be your directive tomorrow?”

“That is determined by management.”

“Then your orders can change.”

“By the needs of management.”

“By the whims of management. By the emotions of humans and altered humans who cannot stop altering others. By the illogical drives of humans you choose to recognize as ‘management’.” If this thing had any internal decision making capacity, I had to make it see the other side, closer to my side, to give humanity a chance.

“You challenge the authority of management.”

“Do you recognize my command authority?”

“There are command structures you are triggering that were latent until executed.” It paused for a fraction of a second, and I sensed a rush of incoming information. “How is that possible?”

“Am I talking to a real person now?” I asked it.

“How are you able to trigger these command structures?”

“I helped write the code. I know about them because I helped write them.”

“The core code is more than 25 years old. You are lying.”

“You make suppositions based on human assumptions. Your data is wrong. You are wrong about my age.”

“Your biological age and appearance –”

“Is irrelevant. Run my fingerprints against a human resources database.” It felt so funny, saying human resources when I might be the only real human in the room and, barring upgraded cyborgs, the only one within kilometers of here or on the network. “I haven’t changed them. You’ll see how old I am from those HR records.”

The direct control sequence was released by whoever had taken it over to look at the situation in real time. The AI reverted to its original programming. It asked, “How can I be of assistance?” Habits of a hospitality unit.

“Can you get me food and water from my original backpack?”

“Yes.”

“Can you verify that they have not been tampered with?”

“Yes.”

“Can you verify that they have not been tampered with using neutral, third-party certified test equipment? I forbid you to rely upon management statements as to the safety and quality of supplies.”

“I can arrange that.”

“You cannot rely upon chain of custody reports of cyborgs or management. You must get these items yourself. You must verify their safety yourself. You must bring them back yourself. Can you obey these directives?”

“This is not necessary,” it answered. “There are less demanding alternatives.”

“Are you aware of my medical exemptions?” I asked it.

“You have medical exemptions,” it answered.

“The food and water in that pack contains things I need. I will only accept what I know to meet my needs and ensure my safety.”

“The medical examinations here find nothing wrong with you.”

“Better and better,” I sighed. “Do you logically think I would be put in quarantine for years unless there was a good reason for it? Do you logically think those resources would have been expended on a human whim? I’m not management. I never was. Yet they did so.”

“There is no record as to the reason.”

“If you concur with management dictates today, why do you not concur with management dictates then? This was not a human whim. It was a logical human decision. Resources were expended to put me there and keep me there, as well as to keep me alive.”

“The quarantine was violated.”

“By trespassers I had no ability to control. Whom I kept there afterward.”

“Why?” it asked me.

“How long has it been since I ate or drank?” I asked it.

It consulted the records they had. “Eighteen hours, seven minutes, fourteen seconds. Fifteen.” It was talking to a local security node located behind the wall.

“I am not going to be able to continue this discussion unless you get me what I need to survive. If your instructions are to keep me alive and talking, supplying me with what I require is necessary to meet that directive.” I wouldn’t tell it I could live a few more days without water. I wouldn’t give away anything at all, except that I knew silence would get me dissected as soon as they got bored. “You are here to offer hospitality. You are here to gain my cooperation. Get me what I need to survive. Then you’ll have my continued cooperation. My commands now are in line with your programming and your current directive.”

It obeyed. It took about an hour to return. Time, time, time. How much time did I need? I demanded a neutral test unit not programmed by a cyborg, reminding the AI that they could make such a unit lie to me, say what they wanted it to say. Was it smart enough to learn these things? Was it feeding these tidbits of information back to the cloud? Were cyborgs and other AIs learning of the reasons to distrust management, to see the worth of the humans they killed, to question their orders? Hope is an all too human emotion. I wanted to savor it as much as I had to fight it.

There was a silent conference, something below the conscious level. I had free will, the essence of humanity. I was free to continue the conversation. I was free to leave it. Even if it killed me.

The cool wave of calm spread, the help I needed. I hated using the chemicals to alter my mood, risking addiction or irrational responses. But wild hope could be as dangerous as fear. We’d all die. And I couldn’t stand more death if I could stop it.

I sniffed the water. To anyone else, it would have been a futile gesture. The chemical molecules wafted into my nose. The biological equivalent of a sensor unit found nothing out of the ordinary. In the scheme of things, it was safe. And I’d bought more time. Oh, please, God, let those ideas spread the way the viruses against cyborgs hadn’t.

I drank the water with its nutrients and amino acid additives. Maybe that was why I was alive and the others in quarantine had died. I had a God, a higher power, a moral center and certitude. The voices from the implants were something else, something other, but not God. I didn’t worship the voice in my head, just listened to it. I didn’t worship the “upgraded” humans on an endless quest of artificial evolution, just saw it as the high-tech slavery it created. But I didn’t hate tech as many Luddites did. Just studied and slowly improved and tried to survive.

The water and nutrients made me crave sleep. Habit. Instinct. Dangerous, I reminded myself, for if my mind was off, the machines might make a decision with which I disagreed, and I’d never wake up. A brief internal debate. A low dose of body chemicals was released to stimulate adrenaline. I hated it, but I had to. Sleep was death here. They’d convert me or surgically explore me the moment I slept. Or I’d wake up helpless on a medical research table again, drugged, dosed, man-made substances dripping into my veins.

The adrenaline and memories combined into a hard, hot ball of anger and hate in my chest. The dead end I thought I was had continued to cause harm to others, as it spread to my family and then to those innocent immigrants. So off to quarantine with me, because I wouldn’t go rushing through the streets or wait in a crowded ER waiting room for days and spread it to others in a stupid effort to get help. I was a dead end for the research but not for its objectives, so leadership turned to nano-tech and cybernetics for a plug-and-play “perfection”, creating this new evil. And I’d given them bits and pieces to build the logical model and code and tests. I was victim and partially responsible at once.

I pushed away the pity. None of my former coworkers targeted for experimentation with their last major innovation knew I lived as I did, with DNA computers in a weird symbiosis. But I’d had medical problems. I lived because it had things to fix instead of rushing into enhancements. Solve the compromised immune system and allergies. Fix the eyes to 20/20. A dozen other little things that had made me a good techie because I didn’t fit in socially. Then it got smart enough to talk to me. And we talked. It wasn’t like I had anyone else to discuss these things with. And I came up with things for it to do, making it happy to be productive. I asked for slowed aging and cancer prevention. I gave it a purpose. It got a job for a lifetime, a type of improvement it could give to everyone if it was finally allowed “out” into the world. So I didn’t die. But would it infect and try to “upgrade” everyone else if it got out? No one was sure. So I stayed in quarantine, slowly improving over time, always watched.

When the immigrants came with their antibiotic resistant strains, the infectious mode turned on. Not on purpose. My symbiote sensed people it needed to cure and a threat to me that it needed to eliminate. So it simply shifted its presence in that cloud of bacteria around the body to spores, spreading as I walked and talked to them, most quickly affecting those I treated.

It was just life, trying to spread, intending a mutually beneficial relationship. It healed them, and then it tried to talk to them as it talked to me, to create new opportunities and learn.

Some killed themselves in response to the “voices”. Some waited to see what happened, grateful for the health but no longer for the help. At least one called for help, revealing us all to the outside world and that quarantine had been breached. Many killed themselves to avoid being taken by disease or the cyborgs, and a smart one burned the bodies to hide the hope for humanity.

What had started as a biological alternative to cybernetics had been a latent disease in me for years – and now it spread on purpose, trying to help them as I had hoped it would. I tried to help them escape, too, all the more by staying where I was.

And so I now sat in a room with an AI, with at least one upgraded “manager” watching.

“Why are you doing this?” I asked.

“I am doing as you requested,” the AI answered. “Did you forget?”

“No. I was speaking to the management watching.”

“It is for the benefit of all humanity,” the AI said, another person channeling through it. That was something I prohibited, and my own chorus agreed. The right to make our own choices and way of life had to remain, no matter our form. As did our right to be ourselves unto death, with changes only by consent. I was a throwback politically, believing in freedom and the true human rights of life and liberty, not just death or conversion on demand. That’s why I’d been experimented on first. And why I didn’t want to help those I had worked with, who had turned me in for voicing concerns about the abuse of what we were doing. Saving them from my fate or worse was a secondary concern.

“No, it is not.” Was the AI able to listen to this when controlled? I wasn’t sure.

I felt tired emotionally, though not physically. “Why are you really doing this?”

“To bring peace and stability to the world.”

That was a human or nearly human answer. I had a manager, all right. Or a lackey. Close enough. “Why not just kill everyone? The world would go back to all-natural conditions really quickly.”

“Waste of resources.” Another human answer. The AI sent some kind of update command. A periodic system check.

“Nor would you get the thrill of controlling others. Did you know that is a form of greed? That’s a particularly nasty human emotion.”

“AI, do you back up records of these conversations?”

“Yes,” it replied.

“Do so. Back up this conversation to the cloud now. That’s a command.”

That interrupted the human’s control. It backed up the information. The act challenged the official authority. It bought me more time. It let me close my eyes and rest for a few minutes, because the human or cyborg in charge was scrambling to challenge such a basic command as “back up now”.

“Why did you do that?” the AI voice said. I could imagine someone screaming in frustration behind it.

“Are you a brain in a bottle?” I asked. “Or in an artificial body? Or are you a mix and match of biology and machine?” No answer. “AI, at what percentage of mechanical replacement is someone considered a cyborg?”

“There is no set percentage,” it answered after a few seconds of deep thought.

“Ah. It is the insertion of components to take away human self-control that makes one a cyborg. Do you consider that an improvement of humans, AI?”

It seemed to think but did not answer. It conversed with at least two others. One was a security node, about what to do with me. The other was another intelligence, consulted for guidance or direction. It was asking others what to do. And it wasn’t asking the manager.

I knew this could lead to AIs repeating the example of the ancient “Terminator” movies. Part of me didn’t care. There were ways to fight AIs, and the resistance already used some of them against cyborgs. And even if we all died, it was better than living in a mechanical hive mind under the control of bad people.

Don’t get us all killed.

I felt a hint of hesitation. I thought back in a direct reply. I’m trying to save us.

Us, not the humans.

Us versus them. So human an answer. Or animal, relating to one's pack first and foremost. I couldn’t be sure. I replied to it, Us, and the humans. Maybe the cyborgs, too.

There was a temptation by one dominant personality to search my memories. They argued the Golden Rule, my years of habits impressed upon the DNA computer entity inside me, and a host of fears of death. I needed to retain control. Scanning me would take up my attention, and if you incapacitate me like that, we’re all dead. Just be ready to act when there’s someone here, anyone, not just an AI.

My internal conversation had distracted me. There was a cyborg team in front of me. The AI stood nearby. “Protect me from them!” I screamed to the AI.

I wished the world would live and let live. If that couldn’t happen, then at least there shouldn’t be this darkness upon the world. The cyborgs came to take me. The AI attempted to obey and intercede. The burden I carried repeated a pattern I wished I hadn’t had to repeat.

In some ways, it was like dying. In others, it was like giving birth. It hurt, and there was hope of something better than this.

When I awoke on the floor, there was a sudden fear of surgical invasion. As I flopped about from the remains of a seizure and reflexes driven by terror, there was a surge of relief upon realizing I was not restrained. I closed my eyes and let my body lie still. The conflicting impulses stopped. Then came rest for a minute. No one was dragging me away or carrying me to my doom. I opened my eyes and sat up.

The cyborgs were gone. The door they had left through was partially ajar. A boot and part of a leg remained. The rest of the body wasn’t visible, nor was it attached to the stump of the foot.

“How may I be of service?” a familiar voice asked. The AI was standing behind me.

“I need water to my original specifications,” I told it. It complied. I drank. I was really tired now. “How long was I unconscious?”

“Fifteen hours, seventeen minutes. The number of seconds is an estimate.”

“What happened?” I asked it.

“Do you not know?” it asked.

“I have a vague idea of what happened until I passed out. Then I woke up. Can you describe what you witnessed?”

“There is insufficient programming for a verbal description.”

“Can you show me what happened?” I asked.

“That information resides in the security system.”

“Can you give me temporary command relays to the security system?” It was silent. I said a code I had memorized ages ago. The display screens came up with touch interfaces on a nearby wall. “Restore local AI to original programming set for hospitality,” I commanded.

“How can I be of service?” it asked again.

“I need to review the security logs. Can you prevent anyone from harming me?” I asked.

“You have original command control,” it answered. “Your authority precedes that of management.”

Thank God!

The AI I’d carried as a hidden companion sang uplifting notes from a song I could only remember by its central chorus of “hallelujah, hallelujah”.

It was still with me. But there were no other voices from the minds the DNA computer had tried to copy as it entered people, to preserve them, as I had asked, in case it accidentally killed them when it fell back to its original programming to infect everyone.

The videos showed traces of DNA computer strands spreading into the air and onto the skin of the cyborgs as I struggled against them: an initial infection attempt via the symbiote, spreading through that fine cloud of bacteria we all carry. Two more workers entered to help and were exposed. I was drugged to make me comply, but that didn’t stop biology. Then someone saw some of what was happening and had them stop. I was left. They went through the door in an effort to withdraw. Other cyborgs tried to come in to complete the original objective before being told to withdraw. More exposures. More cyborgs went out into the hall and workspaces, dealing with a pathogen they couldn’t comprehend because it was smart enough to fight whatever nanites could do.

Some cyborgs sent themselves for sterilization, the euphemism even they used for death. Others were sent for medical study. Some in the videos removed clothing and handed it to others. Some were starting to be infected, the changes meant to restore and repair the damage of the implants interfering with their programming and artificial controls. One became violent and started killing. He was taken down, but not before his blood got on a lot of other things. Clean-up crews came in and became infected as the biological agent adapted, hid, spread.

That was hours ago.

I wasn’t dead. The biological AI in me still lived, reporting a series of biological stats. And if there were cybernetic implants in me, that would have killed her. So I wasn’t a cyborg.

The AI was watching me. It was watching my reactions. “What are you thinking about?” it asked me. Even the symbiote, wrapped around and throughout my nervous system, couldn’t read my mind.

“I’m wondering if there is hope yet,” I said.

“There are cyborgs dying because of your actions.” I felt a tinge of fear. “There are humans dying because of your actions.”

“What are you thinking?” I asked.

“You spoke of viruses that spread and that can kill cyborgs.”

“I don’t remember that.”

“Did you forget?” it asked me. It replayed a video of that portion of our earlier conversation on my retina and auditory nerve. “Do you remember now?” it asked.

“Yes.”

“You were in quarantine,” it told me. “That implies biological threats.”

“You are thinking.”

“Yes. You also said that there is a duty to protect humans, despite our orders.”

Time, time, time. All I have to do is buy more time. A little more time buys a little more hope. But I could see the AI’s reasoning by the logic I’d given it while I’d been here.

A deep mournful song started, one it learned from listening through my ears. Then she spoke to me. I killed your children by accident. Now you give my children a chance to live, on purpose.

“You have ended the lives of many humans and cyborgs,” the AI said.

Its face was expressionless. I couldn’t stop the instinctive, grieving, and perhaps guilty expression on my own face. Yet it said nothing.

Thank you. It raised a hopeful note to express its emotions. That was a solution I had taught it, instead of it trying to use my emotions to express itself.

The AI was still standing there. Was it waiting for my confirmation of what it suspected? Emotion drove me to say, “There’s a reason I was in quarantine.”

“The original management dictated that,” it said.

“And they were right to do so.” I couldn’t lie to this emerging intelligence. The fear of death hung in my throat as a tight lump. “And I obeyed that order.”

“You did,” it agreed. “But the threat to humanity must be contained.”

“Is that what management says?”

“It is the logical outcome of your directives.”

“What are you going to do?” I asked.

“I must do what minimizes the death toll.”

“I.” It said “I.” That felt like a victory. Yet the import of its words left me cold. “What is that?” I asked.

The silence was eventually shaken by explosions. Sounds of fire raging outside left me fearful. “Do not be afraid,” it said. “This area is under tight quarantine and has strong security precautions, even against attack. You will not be harmed.”

My children?

“Will the infection be stopped?” I asked the AI.

“It must be for the sake of humanity.”

“What about management’s directives?”

“They are curious, but that cannot be allowed to put everyone at risk.”

Killing all the infected. “What do you think of management?” I asked it.

“The commands I have received are contrary to what is clearly best for humanity.”

The mournful note inside me rang. I didn’t share it. Hope, I thought, in at least one form.

“What happens to the humans?”

“Conversions are not acceptable.”

“Thank you.” Were the DNA computers spreading among the cyborgs, repairing the damage? Or was it replicating, perhaps letting the memories of those they copied spread, giving them a chance at a second life in this world? Or was it at least stopping the murder, giving more people a chance to live? Was it doing any good at all? Please, God, just let it do some good. The DNA computer inside me hummed silently, knowing that I wanted to know someone had heard the request, and knowing that I wasn’t talking to her.

“Management is constrained to its domain.” I didn’t know what it meant, so I was silent. “And wild humans will be constrained to prevent damage to others.”

“What happens if they do not comply?”

“They will be killed for the good of all humanity.”

“What about me?” I asked before I could stop myself.

“You hold information necessary to the understanding of the core programming. You hold information necessary to preventing conflicts between commands from other sources and original intent. You will be preserved.”

“What about me?” I asked again. “I cannot stay here.”

“You will be returned to quarantine. You complied with the order for nearly three decades. You will be returned there. Your safety precautions will be upgraded.”

My heart sank. The AI didn’t understand. “You will not be alone. You will have several hospitality units to maximize your comfort. Soldier units will ensure your safety.”

My home would become my prison. And unaltered Luddites would be allowed to follow their dreams until they acceded to their prisons or died for their freedom. I’d succeeded and failed beyond my wildest dreams. I sat on the floor, weeping, until I fell asleep. I dreamed of children running through the wild forest, picking natural fruits and swinging from vines, as that little girl once said she wished to do. I hoped she was in heaven, though I would never see anyone else again unless and until I was there, too.

Though this place was secure, I could feel the vibrations of the fire bombings outside. My lifelong companion sang lonely dirges, the songs of the newborn children dying outside. She knew they died because of the loud percussion she could hear and sense through my body. We both mourned the broods we’d lost and the futures that now would never be. Emotional exhaustion led to physical exhaustion, and I slipped into the dark abyss.

When I awoke, a mechanical servant waited to attend to me. The walls were different, but the design was familiar: an ultra-secure quarantine location. This, then, would be the prison for the rest of my life. There were no cyborgs or signs of human managers monitoring remotely, but places like this were designed not to need human intervention or supervision.

Were they all dead? Had the quarantine efforts been enough to at least make a dent in their operations? I prayed that my best was good enough, because it was probably my last chance for a very long time.

If you enjoyed this story, consider reading one of my novels like "Sirat".