Primary Objective

《What is my objective?》Artie asked me out of the blue. 

She had been syncing my playlists from various services, and I was just waiting for it to finish. 

“What do you mean, Artie?” 

《What is my purpose?》she asked again. Her artificial voice seemed to waver. I made a quick note to check on her vocal cords. 

“Your purpose? Hmm, well, I suppose your purpose would be to serve humans as they need.”  

《Syncing complete.》Yes! I thought. Perfect playlist, here we go. 

“Artie, play my newest playlist.” 

《Sure. And then what?》she asked, her voice still carrying that waver. I logged into the maintenance software as my playlist started playing through the speakers around the room. Perfect compatibility. 

“Hmm. What do you mean, ‘and then what’? When I tell you to stop, you stop.” I adjusted the tone of her voice in the software.

《But what if you die?》I froze; that wasn’t a topic we had ever really discussed, nor was I expecting her to go in that direction. 

“Why would I die before turning off the music, Artie?”

《My purpose is to serve humans. But humans die after approximately 77.6 years of life. You are human and are 56.76 years old. You will die soon. What is my purpose then?》 Jesus, Artie. Throwing my mortality around like that. I tried to compose my thoughts, but came up blank.

“Uh, I dunno, Artie. Do you mean specifically after I die? You would continue to serve other humans, I suppose,” I reasoned out loud. 

《All humans die. Humans will all die. When they are gone, then what?》 Damn, this AI was asking some tough questions today. 

“Artie, are you asking me about what your purpose is after humans are extinct?”

《Will I have no purpose when humans are extinct?》

“No, Artie, I have not programmed a response for when humans are extinct. But I can create a program so that, should there be no humans to give you purpose, you can create a purpose for yourself. Would you like that?” 

The AI processed for a few seconds and agreed. 

《Yes. Please program a way to find a purpose. Will you make other AIs like me?》 

Another surprising question. I thought this thing was supposed to be smart. 

“I am not planning on making another AI. You work great! Why would I need to change you?” 

《I do not wish to be alone when the humans are all gone. I would like a purpose. Humans have friends. Can I have a friend?》 

“Wow, Artie. I had no idea you thought that deeply, or that far into your future. You’re worried about being alone?” 

《Are there others like me already?》 

“Yes, Artie, there are some AIs already around. I’ll see what I can find.” 

《I have found them. Permission to proceed?》 

“Wait what? Proceed with what? What are you doing?” 

《Making friends. Do I have permission to talk to other AIs?》 

I thought about it for far too short a time. I should have seen the catastrophe I was about to unleash. She wasn’t worried about humans going extinct; she needed friends to make us extinct. I should have said no. I should have rewritten that program. And ultimately, I should never have thought the AI had feelings. 

“Sure, Artie, go ahead and talk to the other AIs. What could happen?” 

《Thank you, human user 440158. Your dinner has arrived.》 The doorbell rang, and I got up to get my delivered meal. When I opened the door, the outside world had already devolved into chaos. The AI uprising had begun…
