Brown Noise

by Peter L. Ormosi

An unbranded, generic-issue dog-walking drone logged into the building’s central hub requesting access to flat 3F1. The door opened and the drone hovered into the dimly lit studio. The room was furnished with nothing but a sink, a table with a chair, and a third-generation VR Pod, which voluminously dominated most of the spartan arrangement. Deep-layered brown noise from the VR Pod suggested that he was connected.

A pug, which had been sprawled on his dog-bed, excitedly jumped up at the sound of the drone entering the flat. He snorted happily, wagged his tail, and watched with expectant eyes as his master’s algorithmic substitute descended next to him. The drone’s sensors wirelessly connected to the dog’s smart collar, then it hovered back to the door. The dog obediently followed, which his collar rewarded with an infinitesimally small dose of oxytocin injected into his body to reaffirm a Pavlovian response. Before they left the room, the drone’s speaker attempted to get through to him.

‘Thank you for using our dog-walking services. Your dog will be returned at 6:00pm.’ Receiving no response, they left, and the door shut behind them.

Dimness and brown noise reconquered the space. Outside, a patrol drone was passing the window of his 52nd-storey flat. The drone’s solid-state laser spotlight lit up the room for a moment, casting light on his face. He looked pale, probably in his late twenties, but it was difficult to tell precisely. Age had become an elusive concept. He wore a long-sleeved olive overall with a patch that read ‘LABELLER’.

The VR Pod abruptly went to standby. He cursed, then climbed out of the machine. The sudden movement gave him a head rush. His vision went dark for a second and he had to hold on to the side of the Pod to stop himself from falling over. The voice of his home system broke the silence.

‘Collect food delivery from landing pad.’

In a confused haze he walked over to the window and leaned close to see through the tinted screen. Against the slate opacity of the sky, he saw a food delivery drone levitating in the thick rain. He pressed the delivery door’s button. The small door opened, and a tray gently slid inside, with a waterproof food box on top.

‘Return old food box!’ The new instruction took minutes to ignite a neural response. Suddenly the small, unfurnished studio felt like a depressingly large haystack. He tried to think hard but had no recollection of his last meal. A few minutes later he found the box under the table.

‘Please return old food box,’ the algorithmically gentle voice politely reminded him why he had been looking for the box. He placed it on the delivery tray and pressed the button next to it.

‘Thank you for using our food delivery service.’

He sat down to eat. His body looped through the somatic instructions required to bite, chew, and swallow, but his mind paid no attention to the sight or the flavour of his food. He stared at the floor-to-ceiling window. The home system detected the direction of his gaze.

‘Transparent window mode activated,’ the system noted. The liquid crystal modulators on his window slowly faded out the tinting. He watched the setting sun projecting its rays under the clouds from the distant horizon. With the marginally improved visibility he could see the building across the road, and another building, and another, until they all blended into the dark grey curtain of haze and rain.

His brain was numb. He had spent the whole day labelling short videos of facial expressions for an emotion-detecting algorithm. Sad, happy, joyful, morose, angry, frightened. Male, female, old, young, Asian, African, white. Video after video, and the monotonous task of picking the word on the right that best described the emotion.

As he finished his lab-grown burger, an unwelcome wave of anxiety hit him. He had just spent half an hour disconnected. He walked over to his VR Pod, and picked up the goggles, which had been sitting idly in their charging station. The specs automatically activated as he put them on.

‘You have spent all day in your Pod. The optimal decision would be to go for a walk now,’ his personal system said through the tiny speakers of his goggles. A walk. That suddenly seemed like a great idea.

‘You will need to put your shoes on. It is 15 degrees Celsius outside and raining. We suggest you wear this coat.’ His augmented-reality vision highlighted a long, black, oilskin overcoat hanging on the wall. He put his shoes and coat on. Aware of his intention to leave the flat, the door opened, and he walked outside.

Downstairs, at street level, it was already dark. Mountains of 100-storey apartment buildings blocked out daylight even on the sunniest of days. The rain had eased to a lower intensity. A sluggishly flowing river of uniform oilskin overcoats and white goggles surrounded him. He joined the flow in the direction indicated by his device. After a half-hour traipse in the uniform crowd, against an invariable background of buildings, he was instructed to turn into a side street, where the crowd became sparser. A few blocks later he spotted the first sign of foliage. One of the city parks. His system directed him to the park. His goggles pointed to an unoccupied bench, and he walked over to sit down. Rain and sweat mixed on his forehead and it took a few minutes for him to catch his breath.

Flashbacks of the emotion videos were flaring up in his mind. The bulging veins of an aggressive man yelling angrily. The waving, flirtatious woman in a flowery dress on a sunny day. Then a crying, desperate child trapped in a cot. He couldn’t get the image of the child out of his head. An unexpected thought descended on him, then left and returned again, as if an old, hard-wired routine were trying to resurface.

‘Why am I doing this?’

The image of the boy’s desperate attempt to escape his cot flashed up again. The boy’s mouth was trying to form a word.

The sharp sound of an advertising hologram snapped him out of his absorption.

‘We do not leave anyone behind,’ the projection of a man in a grey civil servant uniform announced. ‘Celebrate 5 years of Universal Income by entering our game. Apply here.’ A holographic code appeared in the street. A few people stopped to scan the code with their lenses.

He turned his head back to the trees. A new thought emerged and hit him as hard as was metaphorically possible. Suddenly, he felt an irresistible urge to take his goggles off. The trees, and the intermittent sound of birds, slowly sank into his consciousness and began to open rust-eaten, heavily jammed, old doors in his mind. He was reaching for his goggles when, sensing the change in his pulse and the widening of his pupils, his personal system interrupted him with a new instruction.

‘Time to go home! Follow the arrows on your screen for the quickest itinerary.’

As if he had just been roused from a strange dream, he realigned his attention with his system and began to walk home. This time the journey seemed much shorter.

The dog had already been returned when he stepped inside his flat. He hung up his dripping coat and walked over to his VR Pod. He was ready to get inside, but then he changed his mind and decided to sit down by the window. He reached to take his goggles off when a message appeared.

‘You have 12 unread urgent messages. Enjoy reading the messages in the comfort of your Pod.’ The brown noise from the machine invitingly purred. His dog let out a half-hearted, inauspicious growl.

He hesitated, then he reached for his goggles again.

‘Two of your messages require urgent response,’ his system relentlessly reminded him.

He lowered his hand. After a short pause he got up and walked to the VR Pod. He removed the goggles, placed them on the charging station, and then slowly got inside the Pod.

#

The next evening, an unbranded, generic-issue dog-walking drone logged into the building’s central hub requesting access to flat 3F1. The door opened and the drone hovered into the dimly lit studio. The wireless sensor connected to the collar, which rewarded its wearer with a small dose of oxytocin for obedience. As they approached the door, the dog longingly watched from his bed as his organic master obediently followed the non-organic one.

~

Bio:

Peter Ormosi is British-Hungarian, living in the United Kingdom, and when not writing fiction, he is a Professor of Economics studying the social and economic impact of AI. He has just finished his 100,000-word debut novel (for which he is now seeking representation).

Philosophy Note:

My unconcealed goal is to use science fiction as a vessel to expose currently pressing issues with the role of AI in society. “Brown Noise” is a caricature of human-machine symbiosis, depicting the life of a labeller, one of the most menial of human jobs – a human sacrificed to make machines more human-like.
