Machine learning algorithms can be used to create fake videos or pictures of someone else without their knowledge or consent. In fact, in 2017, researchers at the University of Washington created a synthetic video of Barack Obama that appeared to show him discussing important issues. Now, there are software applications such as FakeApp that can help create deepfake pictures or videos for free. FakeApp was built with Google's open-source deep-learning framework, TensorFlow.
The advent of “fake news” has created a new movement in the entertainment and news industries, prompting everyone to question the source and validity of journalistic works. Deepfake creations are now generating new legal predicaments. The relevant issues include, but are not limited to, invasion of privacy, false light, and defamation.
The creator or publisher of a deepfake can put together a seamless video by using a base video and several source images of the target’s face. The computer-generated face can look identical to the original person’s face, which can create confusion, and that confusion can result in monetary damages to the victim. For example, a deepfake video can show the victim saying or doing something wrong, which could cost the victim his or her employment. Or a victim who is running for political office may be shown to have said or done something that could impede the election process.
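For context on how such tools work under the hood, below is a minimal, illustrative sketch in Python (using the open-source TensorFlow library) of the shared-encoder, dual-decoder autoencoder design commonly described behind consumer face-swap applications. The image size, layer widths, and placeholder training data are assumptions chosen for brevity; this is not FakeApp’s actual code.

```python
# Illustrative sketch only: a simplified shared-encoder / dual-decoder autoencoder
# of the kind commonly described behind face-swap tools. Image size, layer widths,
# and the random "training data" below are placeholder assumptions.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

IMG = 64  # assumed size of each cropped face image (pixels)

def build_encoder():
    # One encoder is shared by both people; it learns pose and expression.
    inp = layers.Input((IMG, IMG, 3))
    x = layers.Conv2D(64, 5, strides=2, padding="same", activation="relu")(inp)
    x = layers.Conv2D(128, 5, strides=2, padding="same", activation="relu")(x)
    x = layers.Flatten()(x)
    latent = layers.Dense(256, activation="relu")(x)
    return Model(inp, latent, name="shared_encoder")

def build_decoder(name):
    # Each person gets a separate decoder that reconstructs that person's face.
    inp = layers.Input((256,))
    x = layers.Dense(16 * 16 * 128, activation="relu")(inp)
    x = layers.Reshape((16, 16, 128))(x)
    x = layers.Conv2DTranspose(64, 5, strides=2, padding="same", activation="relu")(x)
    out = layers.Conv2DTranspose(3, 5, strides=2, padding="same", activation="sigmoid")(x)
    return Model(inp, out, name=name)

encoder = build_encoder()
decoder_a = build_decoder("decoder_person_a")
decoder_b = build_decoder("decoder_person_b")

# Two autoencoders that share the encoder but use different decoders.
face_in = layers.Input((IMG, IMG, 3))
auto_a = Model(face_in, decoder_a(encoder(face_in)))
auto_b = Model(face_in, decoder_b(encoder(face_in)))
auto_a.compile(optimizer="adam", loss="mae")
auto_b.compile(optimizer="adam", loss="mae")

# Training (illustrative): each autoencoder learns to reconstruct its own person
# from many source images; random noise stands in for real face crops here.
faces_a = np.random.rand(8, IMG, IMG, 3).astype("float32")
faces_b = np.random.rand(8, IMG, IMG, 3).astype("float32")
auto_a.fit(faces_a, faces_a, epochs=1, verbose=0)
auto_b.fit(faces_b, faces_b, epochs=1, verbose=0)

# The "swap": encode a frame of person A, then decode it with person B's decoder,
# producing person B's face with person A's pose and expression.
swapped = decoder_b(encoder(faces_a[:1]))
print("Swapped face tensor shape:", swapped.shape)
```

The swap works because both decoders read from the same shared representation of pose and expression, so feeding person A’s encoding into person B’s decoder reproduces B’s face wearing A’s expression, frame by frame, over the base video.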
Defamation takes place when someone makes a false factual statement that is not privileged and tends to harm the victim’s reputation. For example, calling someone a “rapist” or “convicted felon” could constitute defamation if it is not true. In most jurisdictions, these types of statements – whether made online or offline – are actionable. However, the result may be different if the defendant creates a fake picture or video of the plaintiff. In some cases, the courts carve out an exception to defamation when the statement constitutes parody – that is, a humorous or satirical imitation designed to mock or ridicule the original work. If the deepfake qualifies as parody, the court may apply that exception to the general rule.
Being the subject of deepfake pictures or videos could also cause emotional distress if the victim exhibits some form of physical symptom from the trauma – e.g., insomnia, loss of appetite, anxiety, or depression. In that case, the court may grant legal or equitable relief if it determines the defendant intentionally or negligently caused the plaintiff emotional distress. The court may also issue injunctive relief against the defendant to prevent further violations. Our internet and technology lawyers can assist clients in these types of cases.
False light is another cause of action that may be available to the victim. In California, false light arises from publicity that places the plaintiff in a false light that would be highly offensive to a reasonable person. This cause of action is distinct from defamation and intentional infliction of emotional distress because it is a privacy tort designed to protect the victim’s privacy rights. In essence, the plaintiff has the burden of establishing the false or misleading nature of the publication.

In addition, the right of publicity may apply where a website, blog, or forum has posted the deepfake pictures or videos – i.e., there may be a potential cause of action against the online platform for hosting the false or fake picture or video. In some cases, the plaintiff may sue the online platform (e.g., Instagram, Reddit, YouTube, Twitter) for intellectual property violations. It is important to note that the federal Communications Decency Act (47 U.S.C. § 230) may provide immunity to online platforms because it states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Nonetheless, the statute contains an exception: it does not limit or expand any law pertaining to intellectual property. Online platforms may also argue that they should not be held liable because it is not practical for them to monitor each and every post. With that being said, there should be legislation that helps victims remove these types of postings, just as they can under the Digital Millennium Copyright Act.
Our law firm assists clients in matters related to privacy and cybersecurity under applicable state, federal, and international laws. Please contact our law firm to speak with an internet attorney at your earliest convenience.