Netflix has released Season 6 of Black Mirror, and its first episode, “Joan Is Awful”, definitely made the 4-year wait worthwhile, as it once again wove worrisome developments of the last few years, such as deepfakes and artificial intelligence, into an over-the-top cautionary tale.
The Plot
If you’ve already watched the episode or are not interested in reading a synopsis, you can skip this part.
The title character (played by Annie Murphy from Schitt’s Creek) is first shown going about her daily life: she wakes up, goes to work, attends a therapy session, meets and shares a kiss with her ex-boyfriend, and finally comes home to her fiancé to sit down and watch TV. As they scroll through the list of shows on Streamberry, the Netflix of their fictional universe, Joan and her fiancé come across a new series whose protagonist (played by Salma Hayek) bears both her name and a striking resemblance to her.
Perplexed, they start watching the first episode, which - to Joan’s utter horror - is a dramatized replay of her day, replete with inaccurate quotes that highlight the bad moments. This sets off a chain of events that completely upends her life: her fiancé walks out on her, and she is fired from her job the next day.
This pushes her to consult a lawyer, who tells her that she has no case against Streamberry because she agreed to its watertight terms & conditions when she first signed up for the service, thereby giving the company the right to exploit her life story for free.
The lawyer explains that Streamberry is able to do this in real time - releasing episodes on the very same evening - thanks to a quantum computer that generates a motion picture of her using the image and likeness of an actor (in her case, Salma Hayek) and a predictive algorithm, which the lawyer compares to the uncanny ability of cellphones to show ads for products that were only ever mentioned in real-life conversations.
A frustrated Joan asks if she can sue the actor who plays her, and the lawyer again answers in the negative: the episodes are digitally generated from the image of an actor who sold the streaming company the rights to do with it as it pleases - to the extent that her image could, for example, be used to generate explicit videos involving an orangutan.
In her desperation, Joan pulls an outrageous stunt designed to catch Salma Hayek’s attention, in the hope that the actor will put an end to the show. The gamble pays off: Salma Hayek is outraged enough to consult her own lawyer about pulling the plug, only to be told that Streamberry is well within its contractual rights to do whatever it wants with her image.
Joan and Salma Hayek then join forces to destroy the “quamputer” - a plot they manage to pull off.
Deepfakes
Deepfakes have now infiltrated our information ecosystem, made possible by the millions poured into research. We haven’t even figured out how to deal with misinformation, and yet we are already facing a sophisticated technology with the potential to cast doubt on just about everything we see onscreen. We no longer have to deal only with disproving “alternative facts”; they can now be backed by their own digitally fabricated evidence.
In true Black Mirror fashion, “Joan Is Awful” takes the problem of deepfakes to the extreme, mashing elements of our current reality together into an alternate universe that, when we really think about it, isn’t that detached from our own.
This article explores how two themes depicted in the episode come together to show how thoroughly deepfakes can destroy lives at a deeply personal level:
- The commodification of human individuality
- The trivialization of the human experience
1. Human individuality as a commodity
There are many ways to answer the question “What makes us human?”. The episode offers two answers: our physical identity, expressed through our outward appearance, and our actions.
Physical identity for sale
The sale of a person’s physical image is nothing new; it has been around since the invention of photography and film, and models and actors are just the most common examples of people engaged in this kind of commercial activity.
However, the difference between the conventional form of this trade and deepfakes is that, in the former, the person still exerts some control over the images produced of them. There is no such control over deepfakes, which allow the creation of images of a person that they would otherwise have objected to.
When Salma Hayek’s character sold her image for use by Streamberry, she didn’t foresee that it would be digitally made to perform a publicly humiliating and sacrilegious act, something that brought her profound shame and distress even though everyone knew it was just a digitally generated performance.
Salma Hayek: I am Roman Catholic. My grandmother Rosa was going to be a nun. She might die when she sees this. What right do they have to kill my abuela with this deepfake heretic abomination?
Lawyer: Uh, page 39, paragraph eight… of your image rights agreement with Streamberry.
Unfortunately, this isn’t as far from reality as we’d like to think: the images of deceased artists have already been used for concerts, most famously Tupac Shakur’s holographic appearance at Coachella in 2012.
As social animals, we humans rely so heavily on face recognition that our brains tend to see a face even where there isn’t one (think of a Type B power socket). It isn’t too far-fetched, therefore, to say that our individual physical form has inherent social value, regardless of our standing in society. Since our identities as individuals are so inextricably linked to our physical image, it deserves protection and should be outside the commerce of men. It is high time we protected our physical image from trafficking - the same way we have laws against organ trafficking.
Every move potentially monetized
Upon subscribing to Streamberry, Joan unwittingly gave the company the right to use her daily life as material for a TV show. Suddenly, her actions were no longer merely her own; they became fodder for a mighty entertainment machine, to be churned out as grotesque scenes for other people’s savage pleasure.
Joan: But the show is using my life. It’s … my name. It’s my career. It’s me. They’re … they’re using me…
Lawyer: And you assigned them the right to exploit all of that … when you first signed up to Streamberry. And you clicked “Accept”.
Joan: What? I mean, I had no… How was I supposed to know this?
One would be forgiven for thinking that this is too crazy to happen in real life, but our reality is closer to Joan’s than we’d like to admit. To explain why, we need to break this down further:
i. We accept Terms & Conditions without reading them
The vast majority of us accept terms & conditions without reading them, because the alternative is not using the product or service at all.
These take-it-or-leave-it agreements are drafted by only one party, i.e. the tech company; they are, for all intents and purposes, contracts of adhesion. Yet even armed with that knowledge, suing a big tech firm with a formidable, well-paid legal team is not something most of us would want to spend years doing.
On the other hand, we cannot simply stop using such services, because they have become so woven into the fabric of today’s society that we cannot reasonably live and actively participate in it without consenting to those terms & conditions.
Put these together and what we have is a state of collective permissiveness that bestows legitimacy on the dubious acts of big tech companies, perpetuating a vicious cycle that edges us closer to a dystopian way of life - one we are already partly living in, because even people who never agreed to such terms aren’t spared.
ii. Our data is being used against us
Such terms & conditions are merely the means to get us to give away what many tech companies are after - our data.
The same way Streamberry collected data on Joan’s actions and used it to portray her so sensationally that she became an anti-hero, big tech firms compile data on us and use it in ways that, naturally, serve their interests.
There are times when those interests coincide with ours, such as when tech companies use our data to build profiles of us in order to generate targeted ads.
Joan: But how do they… even know what I’m doing? It’s the same… it’s the same day.
Lawyer: Well, you know when you got your phone face down on the table, and you’re in the kitchen, and you’re talking to your friend about, I don’t know, shoe deodorizers, and then you know, you go on your computer and what pops up? A shoe deodorizing ad.
However, there are also times when the interests of tech companies run counter to ours, as Joan found out. The revenue behind those same targeted ads is exactly what pushes big tech companies to keep us hooked on their products - effectively using our own data against us.
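To make that profiling mechanism a little more concrete, here is a deliberately simplified and entirely hypothetical sketch in Python: it tallies crude interest keywords from a made-up event log and picks the ad that best matches them. The events, keywords, and ad inventory are invented for illustration and do not reflect how any real platform’s pipeline actually works.

```python
from collections import Counter

# Hypothetical event log: (user_id, signal) pairs standing in for the
# behavioural breadcrumbs a platform might collect about a user.
events = [
    ("joan", "searched: running shoes"),
    ("joan", "watched: shoe care tutorial"),
    ("joan", "mentioned: shoe deodorizer"),  # the kind of signal the lawyer jokes about
    ("joan", "searched: therapy near me"),
]

def build_interest_profile(events, user_id):
    """Tally crude interest keywords from a single user's event stream."""
    keywords = Counter()
    for uid, signal in events:
        if uid != user_id:
            continue
        # Drop the "searched:"/"watched:" prefix and count the remaining words.
        for word in signal.split(": ", 1)[1].split():
            keywords[word.lower()] += 1
    return keywords

def pick_ad(profile, ad_inventory):
    """Return the ad whose keyword scores highest in the user's profile."""
    return max(ad_inventory, key=lambda ad: profile.get(ad["keyword"], 0))

ads = [
    {"keyword": "shoe", "creative": "Shoe deodorizer, 20% off"},
    {"keyword": "coffee", "creative": "Artisan coffee subscription"},
]

profile = build_interest_profile(events, "joan")
print(pick_ad(profile, ads)["creative"])  # -> Shoe deodorizer, 20% off
```

Even this toy version makes the asymmetry plain: the user produces the signals, but the party that collects them decides which of those interests gets monetized, and how.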
2. The human experience reduced to mere material for shows
Art imitates life, as the saying goes. But “Joan Is Awful” gives that old saying an entirely new (and infinitely darker) dimension.
Another thing that makes us human is that each of us is a product of our past experiences and our environment, which predisposes us to act (or not act) in certain ways. We are complex individuals whose actions are steeped in context. When that context is stripped away, it leaves a gaping, incomplete view of a person’s actions and life decisions, one that invites being filled in with harsh criticism.
“Joan Is Awful” takes this to the extreme by turning Joan’s actions and decisions into mere fodder used to feed a grotesque entertainment machine that churns out episodes of her daily life, to be shown on the same evening in an unflattering light and devoid of the respect due to her as a person with the freedom to live her life the way she chooses.
While it is obvious that reducing a person’s life to a mere show has consequences for them, Black Mirror shows us that it also has consequences for the people around them, giving us the example of Joan’s fiancé, who was also humiliated after seeing Joan’s infidelity glamorously splashed on TV. Uninvolved viewers are also affected, such as the cyclist who turned from being a friendly neighbor into a hostile stranger.
It would be nice to say that no real-world event demonstrates the consequences of portraying someone’s actions without any context, but the McDonald’s hot coffee lawsuit (Liebeck v. McDonald’s Restaurants) provides a very sad example.
Food for thought
Black Mirror has once again captivated audiences by warning us about pressing matters - deepfakes chief among them - that could derail the progress of human civilization.
We should heed the warning.