Read the story here
This is my first encounter with Paolo Bacigalupi, and I'm intrigued. Complementing Ted Chiang's "The Lifecycle of Software Objects", Bacigalupi's short story asks whether robots exist in the realm of products or of humans. It is a legal question, no doubt, but one rooted in the ethics of human society.
This was what the world was coming to. A robot woman who got you so tangled up you could barely remember your job.
In the universe of this story, a Mika model approaches our narrator, Detective Rivera, in hopes of getting a lawyer for a murder she has committed. Mika models, built by the company Executive Pleasures, are highly sexualised robots that read their clients' behaviour (pulse rates, tone and inflection, eye movement, and so on) and respond accordingly. The Detective is constantly seduced by the Mika model's advances despite being cognizant of her heinous crime. He is perplexed by her "humanness", her constant assertions that she is real and not just an assembly of software and computer chips. Rivera must remind himself, time and time again, that she is not human.
The girl clouded my judgment, for sure. No. Not the girl. The bot.
Despite this realisation, the Detective is completely enamoured of her. His instinctive kindness toward the robot persists in spite of her guilt; had a real human been in the model's place, he would not have extended such warmth. When they reach the crime scene, the model takes him to the basement where her now former owner used to torture her. The Mika model confirms that her act was pure revenge.
By this time, the company has dispatched its own legal counsel, Holly Simms, to disable the Mika model. In a gruesome act before the Detective, Holly drives a screwdriver into Mika's eye, shutting down her processing unit. Detective Rivera cries murder, but Holly calls it a mere "hardware deactivation".
What inherent human need is fulfilled by robots that strikingly resemble humans? Companionship and the fulfilment of sexual desires alone do not suffice as an answer. Workplace automation is a different case, where robots are built for specific tasks.
Robots bearing a remarkable resemblance to humans are intended to be viewed as close imitations of humans, if not as humans themselves. Why, then, does the question of anthropomorphizing them arise at all?
So it was all fake. Mika didn’t actually care about me, or want me. She was just running through her designated behavior algorithms, doing whatever it took to make me blush, and then doing it more, because I had.
In the story's context, the Mika model acts like a human. She has blood rushing through her, motor and sensory neurons under her skin; her eyes are vibrant and suggestive; her physical movements are fluid and not the least bit mechanical or choppy; and she has the ability to feel and converse like a real human. Add to that her capacity for deductive reasoning and decision-making, and she is the spitting image of a human being. But instead of all this being embedded in a soul, every process is carried out by code running in the background.
“There. You see? Now I’ve learned something new. Does my learning make me less real? Does yours?”
“It’s completely different. You had a personality implanted in you, for Christ’s sake!”
To what degree the Mika model's intentions are her own, and to what degree they are ingrained in her software, is a question that requires a thorough understanding of robotics. Her ability to feel guilt for a wrongdoing, and her assertions that she is as real as any human, deserve to be recognised. This raises the pertinent question of responsibility. Rights and duties are symbiotic in nature, and responsibility is rooted in both. In this fictional world, are robots given rights? Or, as one lawyer put it in response to this story, "when a robot kills, is it murder or product liability?"
Throughout the drive to the crime scene, Detective Rivera's thoughts are directed at somehow acquitting the Mika model. Even after seeing the dead body, the legal and ethical implications of the word "murder" somehow do not apply to the bot. Yet as soon as Holly disables the Mika model in that grisly fashion, plunging the screwdriver into her eye socket, the Detective cries, "You can't murder someone in front of me!" Why is it that the killing of a human by a robot did not quite fit the definition of murder for Detective Rivera, but incapacitating the model did?
Does the death penalty even matter to something that’s loaded with networked intelligence?
This is a very well-written short story that leaves the reader with many unanswered questions. The portrayal of the Mika model as both seductress and culprit is at once intriguing and jarring. The pacing keeps the plot and action moving, and nowhere does the reader need to pause to comprehend the finer intricacies of this complicated situation – until the very end, that is. The world-building is concrete and tangible. A futuristic tale, "Mika Model" puts forth uncomfortable questions, answers to which must be found in our own rapidly advancing world.