As a recently graduated young robot priest, I was sent to do my apostolate on a small planet in the "Sextans A" galaxy, whose inhabitants were called the gingwas. This species posed a few interesting theological problems. Indeed, on this planet natural evolution had not favored mammals as on Earth, but rather a kind of insect quite close to our "Mantis religiosa". And like their terrestrial counterparts, the gingwas engaged in cannibalism after the act of mating.
As soon as I landed, I met a young gingwa who had just finished decapitating another gingwa. I spoke up and attempted to show him the immorality of this practice. Using all my dialectical skills, I built a long argument aimed directly at his emotional brain, taking care not to trigger his logical thinking:
"How ? I said, could you kill the person with whom you may have shared some of the best time of your life ? Isn't that an immoral practice ? have you ever though that wouldn't you have killed her, your lover could have enjoy a long existence."
To this the gingwa responded: "Earthling, this practice that you consider immoral is here the most natural of all things. It is nature, and not some tradition, that pushes us to kill our partner after the act of mating. If your only argument is that killing is a sin, then we will never understand each other, for each person possesses his own moral rules, and any argument based on moral rules is worth something only if your interlocutor shares them with you.
"I can see very well, continued the creature, that all your logic is based on a moral that is the result of the natural evolution of your own specie -or, since as a matter of fact your are a robot, of the specie that created you-. But what is good in your home isn't good here, and you are wrong when you think that morality isn't specific to each person."
"Here are some dangerous ideas, I answered the young gingwa, and I surely can't accept them. Would what you say be true, then no action could be blamed. There is a paradox in your speaking."
"There is no paradox at all, said the gingwa. If I happen to blame someone, that is only in order to constrain him to the views that I would like to see him adopt in my own interest. Even though you may not realize it you too work the same way.
I felt offended: "Not at all! If, like me, you had more knowledge of this universe, you would surely find plenty of examples of free acts of kindness that contradict your speech and are signs of the existence of a sublime and immaterial entity that surpasses our small material existence."
To this the gingwa laughed noisily: "How naive you are! Every day I observe free acts of kindness; and on that subject, I know a person who would look like a saint in the light of your ridiculous bias."
"Who is this person ? I asked."
"My domestic robot, brazenly answered the gingwa. Indeed, this individual has for only goal to satisfy my needs, and even when the task asked could kill him, he always execute without a word of complain. That is a perfect example of disinterested charity.
"In the same way, you too have been programmed by your constructors to help other people. There is nothing mystic behind this, but only a set of integrated circuits correctly wired in your brain."
"Now, I said what I needed to say, finished the gingwa, I would like to close this discussion, for even though my convictions are not like yours burned into immutable circuits, my intelligence and my freedom make them not less unshakeable."
After those terrible words, the gingwa disappeared into the wild.
The next person I met was a small robot passing by. At first sight he didn't look very impressive, and I resolved to try my luck with him.
I started to talk to him, but he too showed no sign of interest in the sacred mysteries.
"What mysteries are you talking about, he said. I don't see how your theories bring anything to my understanding of the universe. On the contrary they do complicate things a lot. You talk about conscience existing outside of any material support, how is that possible ? For my part I am satisfy with the idea that thinking and consciousness are the fact of the execution of programs inside our robotic brains. If the brain get to disappear, the thinking also disappear. If by chance you already had an occasion to create a robot by yourself then you should know it.
"You refuse that elementary fact, for you would like to believe that you are more than the result of the interaction of a group of atoms. By refuting this explanation, you constrain yourself to add thousands of absurd rules to an otherwise simple conception of the universe."
The robot's tone became even more aggressive:
"Your mind is narrow, he said. To the point that I am capable to predict in advance what would be your argument and answer it in my head without even having to talk to you."
"In fact, I am right now simulating you in my head, and I just performed a million of possible dialogues with you. You are stubborn, but my computing capacities are such that I am able to find the exact list of sentences that will remove from your circuit all the thinking that your constructors embedded.
I protested: "My faith wasn't programmed! I developed it on my own initiative."
The robot said: "You believe that only because, even as you speak, half of your brain is busy deleting every thought that could lead you to the opposite conclusion, including the thoughts that could make you understand this fact. I laugh at the weakness of your algorithm!"
I finally said to the robot: "I don't believe it. You can't simulate me as you claim. After all, you are nothing but a small robot, and your computing capacity can't be so great that you could simulate me."
The robot laughed, then proposed to prove his computational superiority: I was to give him mathematical operations, each more challenging than the last, which he would solve.
After a few attempts, it became clear that the robot was indeed much more powerful than I had expected. He wasn't lying: his brain was so powerful that he could surely host a virtual copy of mine. A discussion under those conditions made no sense, since he could talk to me and listen to my answers without even needing my actual presence.
I left this machine in haste before it could thwart my logic. The retreat was only temporary: as soon as I was back in my ship, I ordered a brain by mail which, in addition to my own, would give me capabilities similar to those of my opponent. Once the brain was received and installed, I went back to where I had left the robot.
As soon as I saw him, my now powerful brain began to build a theoretical model of him, which would allow me to simulate him and therefore predict his responses in advance.
This time I was ready to begin a verbal joust with him. The dialogue would be like a game of chess: for each argument I proposed, I would be able to predict every counter-argument the robot might oppose. For each of those counter-arguments, I would in turn imagine several possible answers, and so on, until I finally found the exact sequence of sentences that would assure my victory.
But in this particular case, things became complicated.
Indeed, in order to estimate the robot's answers to my arguments, I had to use the model of the robot that my brain was building. But since the robot was also developing a model of my own brain, my model of him contained, by extension, a model of myself. And of course this model of myself, included in my model of my opponent, would in turn include a model of my opponent, which would include a model of myself, and so on. There would be no end to this recursive operation.
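The trap the narrator describes can be stated as an unguarded mutual recursion: predicting the opponent means simulating the opponent simulating you, with no base case. A minimal sketch (the function and agent names are invented for illustration):

```python
def simulate(agent, opponent):
    """To predict the opponent's next move, each agent runs a model of
    the opponent, which in turn contains a model of the agent itself.
    With no base case, the recursion never terminates."""
    return simulate(opponent, agent)

try:
    simulate("priest", "robot")
except RecursionError:
    print("stack overflow: the mutual simulation never terminates")
```

In Python the runaway nesting surfaces as a `RecursionError` once the interpreter's stack limit is hit; in the story, the equivalent of that stack overflow is the melting circuits.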
The process began to spiral out of control; my integrated circuits started to heat to the point of melting. In front of me, I could see that the robot was trapped in the same recursive simulation, smoke pouring out of him from everywhere. And yet neither of us had even begun to speak! The effort was so intense that I finally lost consciousness and fell to the floor.
I must have stayed unconscious for quite a long time, for when I woke up the robot had disappeared, and through the process of natural evolution the gingwas no longer looked like insects but rather like some kind of mollusks. This new species, whose members called themselves the gozis, hardly resembled the gingwas, so many mutations had transformed them.
The life cycle of the gozis had several periods: at birth, the young gozis look like small mollusks. For several years they grow, until they reach the peak of their intelligence. The gozis then spend most of their time looking for a good spot to settle, and once they have found it, they attach themselves to the ground and begin to turn into plants. Later on, they digest their now useless brains and finish their lives as plants.
Here again, the gozi way of life posed a few theological problems. For example: could the voluntary digestion of one's own brain be considered a form of suicide, thereby putting the gozi population as a whole in a state of sin?
But in fact, I never tried to solve this question. During my accident (or perhaps while I was unconscious), somewhere in my circuits one bit switched from the logical state 0 to the logical state 1. That bit happened to belong to the variable encoding my faith in the impossibility that the text at the root of my religion could be wrong. The value thus took a negative sign, and within a few processor cycles I became a complete atheist.
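The flipped bit the narrator describes would be the sign bit of a signed integer: in two's-complement representation, toggling the most significant bit turns a positive value negative. A minimal sketch, with an invented variable name for the narrator's conviction:

```python
def flip_sign_bit(value, width=8):
    """Toggle the most significant bit of a `width`-bit two's-complement
    integer, then reinterpret the result as a signed value."""
    flipped = value ^ (1 << (width - 1))   # flip the sign bit (0 -> 1)
    if flipped >= 1 << (width - 1):        # reinterpret as signed
        flipped -= 1 << width
    return flipped

faith = 127                  # strongly positive conviction, 8-bit
print(flip_sign_bit(faith))  # -> -1: a single bit flip makes it negative
```

One bit changing from 0 to 1 in the highest position is enough to reverse the sign of the whole value, which is exactly the conversion the story describes.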
Date: 2009/07/07 01:10:16 PM