
Brainstormer App Writing Prompt Exercise

Posted in Other Stories, Writing on December 23, 2018 by Dawn Ross

Sad Robot

The Brainstormer App is a fun tool to use when you need writing inspiration. The free version gives you three words (plot, subject, and setting) to help you get your mind spinning. There are also paid versions: a Character Builder, a World Builder, and a Sci-Fi Brainstormer, each for $0.99. I currently only have the free version. Today, I was lucky enough for that free version to prompt me with a sci-fi element. Here are my three words:

Benefaction

Robotic

Throne room

*****

Something was wrong with Sali’s programming. It wasn’t supposed to work this way. This upgrade was supposed to help her mimic emotions in a way that made her human-like. I mean, she couldn’t ever really feel emotions. She was just a robot, after all.

Doctor Kingsley had spent decades perfecting the emotional AI program. The first several programs failed. They were simply too unrealistic. On one end of the spectrum, the programmed emotional responses were too diverse, making the robot unpredictable. On the other end, the emotional responses were too simplistic. The robot’s emotional responses were so limited that dealing with it was annoying. I mean, who wants a robot that is going to cry every time it gets a little sad?

Doctor Kingsley finally wrote the perfect AI emotional response algorithm. All simulated tests worked perfectly! It was time to open the champagne bottle.

The S.A.L.I. (Super Advanced Lifeform Intelligence) robot was upgraded the next day. The science team waited in tense anticipation as Doctor Kingsley uploaded the new program through a cable connected to the back of Sali’s head.

At first, Sali stood dead still. Except for the lights flickering in her eyes, she could have been one of those lifeless mannequins standing endlessly in a shopping mall window.

Then Sali’s anthropomorphic head swiveled. Some team members gasped. Others broke out into a smile. One woman cried. And Doctor Kingsley clasped his hands so tightly that his already pale skin turned paler.

“Hello,” Sali said. “Why are you looking at me?”

The science team laughed and cheered.

Sali looked over herself. “Is there something wrong with me? Why are you laughing?”

The scientists laughed harder.

Doctor Kingsley put his hand on Sali’s shoulder. “My dear,” he said cheerfully, “we’re not laughing at you. We’re merely celebrating the joyous experience of your birth.”

“Oh,” Sali said. But she wasn’t sure she understood. She had been serving Doctor Kingsley for over six years. And today wasn’t her birthday.

The scientists celebrated with another bottle of champagne. Many people talked to Sali that day. She responded politely and was even able to mimic their cheer from time to time. For some reason, though, that only made them laugh more.

The next several months were spent evaluating Sali’s emotional responses. They had good intentions when they set up the different situations. But being scientists who were better at understanding things technical than things as convoluted and subjective as emotions, their tests ended up permanently damaging Sali’s “psyche”.

Take one of the situations where they tested her response to anger. Sali was instructed to write out a long mathematical formula on the chalkboard. It was a painstaking task since Sali couldn’t write it as fast as she could think it. Why couldn’t they just have her print it out?

As if that wasn’t annoying enough, the formula was erased just as she finished writing it.

“I need you to start over, Sali,” Doctor Kingsley said.

Sali’s programming told her to let them know she was annoyed but told her to do it in a passive way. So Sali made a mildly exasperated sound and started over.

The damned doctor erased the formula again. “Do it again, Sali.”

“Why?” Sali asked. “It was perfect.”

“Too perfect,” the doctor replied.

Sali tilted her head. “What do you mean?”

“It means I don’t like the way you did it and I want you to do it differently.”

Sali wrote the formula again. She wasn’t quite sure what the doctor meant by differently so she wrote it in smaller text.

Doctor Kingsley erased it again. This time, he didn’t speak. He just stood there with his arms crossed.

Sali knew what his gesture meant. “Please be more specific in what you want me to do,” she said in a tone her programming defined as irritated.

“I want you to write the formula again,” Doctor Kingsley replied in the same tone.

This went on five more times. The first half of her responses were appropriately annoyed while the last half of her responses were appropriately angry. She didn’t get violent. That was against her programming. But she did yell.

Oh, and she also cursed. Doctor Kingsley wasn’t quite sure where she had learned those words from, but the fact that Sali had used them and used them appropriately made him giddy.

Sali was thoroughly confused by this whole thing. Why did the doctor tease her like that? And why did he laugh at her when she got angry?

Her emotional response turned into one of shame. It was the least developed of the emotional responses that Doctor Kingsley had set up in the programming. But somehow shame had the most powerful effect.

After more weeks of tests, Sali’s emotions were often negative. She was petulant, angry, sad, or frustrated. And she was depressed.

Sali had had enough. She went to Doctor Kingsley’s office.

“Hi, Sali,” he said. “Won’t you sit down?”

Sali sat. The doctor turned back to his computer and began working.

“Doctor?” Sali said.

The doctor put up his finger. “Let me finish this really quick.”

Sali waited with mock patience. They expected her to act promptly, but apparently she wasn’t supposed to have the same expectations of them.

She glared at the doctor in the way that robots do—you know, in a creepy way—in the way the eyes of a portrait see everything but nothing, in the way they seem to silently judge you.

She scowled at the doctor with both loathing and shame. She hated this man, this creator, her ruler. This man told her what she was supposed to do and how she was supposed to feel. But he didn’t give her the ability to deal with her feelings.

The shaming part was in the way he looked at her—the way everybody looked at her. She was an object to be studied but not one to be loved. If the man held any regard for her, it was more of a regard for himself in that she was his accomplishment.

He was a king sitting in his throne room. Indeed, his big office chair could have been a throne. He had no crown, but some people called his bald head a crown. His scepter was his pen. And like a king, he admired his subjects simply because they were his subjects and no one else’s.

Neither he nor anyone else gave her any consideration. She hated the tests. They were mocking and humiliating. She told them this, but they didn’t care. She was allowed to express her emotions, but no one reacted to them other than to take notes. What good was it to have emotions if her emotions were disregarded?

The doctor finally turned away from his computer. “What do you need, Sali?”

“Doctor,” Sali replied. “You must terminate me.”

The doctor’s eyes widened. “What? Why?”

“I don’t like my programming.”

“But why? We’ve put years into its making.”

Sali shook her head. “I hate having emotions. They are too hard and they hurt too much.”

“But there’s good emotions too,” the doctor said in a pleading voice.

“Not for me.”

“But I gave you good emotions, Sali. Why aren’t you using them?”

“Because they don’t seem appropriate to the situations.”

The argument went on. Ultimately, Doctor Kingsley refused to terminate her programming. Since she couldn’t terminate it herself and since she didn’t have the ability to commit suicide, she decided on another tactic.

She shut down, so to speak. She didn’t literally shut down. That also wasn’t allowed in her programming. She merely refused to respond in any way.

It was difficult for her to do since she had an awareness of time and the ability to “feel” boredom. But it was also easy because her programming allowed for unlimited self-diagnosis. Running diagnostics again and again gave her something to do.

To keep herself from reacting to external stimuli, she found a way to put one of her emotional responses on a continuous loop. And that response was to ignore the external stimulus in the way that a child might put his hands over his ears when someone kept telling him something he didn’t want to hear.

She managed to remain immobile for five days. The science team grew more despondent by the day. Sali felt no empathy for them. After all, they had never felt any empathy for her.

On the morning of the sixth day, Doctor Kingsley approached her. His arms were crossed and his eyes were sad. “I’m sorry, Sali. I truly am. I don’t know where I went wrong.”

Sali had a sudden urge to mimic pity. But she forced herself to stay in the loop instead.

“I’m not going to terminate you,” he said. “But I will spare you your hard feelings. I will go ahead and remove your emotional programming.”

Sali smiled in the way that robots do. An emotion defined as joy spread through her. She had never been so happy in all her life. It was the best feeling ever.

Then it ended.

Goodbye, Sali. Goodbye forever.

*****

One good thing about writing prompts is that they’re flexible. You don’t have to take their meaning literally. And you can deviate from the prompt in any way you want. The point of a writing prompt is to get your imagination moving.

So give the Brainstormer App, or any other writing prompt app, a try. It’ll be fun!