Articles Posted in Technology

For this week’s blog post, we will be continuing with a discussion of another recently decided Supreme Court case.  Specifically, we will cover United States v. Microsoft Corporation, and talk about the ramifications the Court’s decision has for the world of internet technology.

This case involves user data privacy rights and the ability of US-based technology companies to refuse to comply with federal warrants when user data is stored overseas.  The case concerned the extraterritorial (outside of the United States) application of the Stored Communications Act (SCA), and whether warrants issued under the SCA could be effective with regard to newer internet technologies such as cloud storage and data centers.

In 2013, FBI agents obtained a warrant requiring Microsoft to disclose emails and information related to the account of a customer believed to be involved in drug trafficking.  Microsoft attempted to quash the warrant, claiming that all of the customer’s emails and information were stored in Microsoft data centers in Dublin, Ireland.  The district court held Microsoft in civil contempt for refusing to give agents the emails, but that decision was reversed by the Second Circuit.  The Second Circuit held that requiring Microsoft to give federal agents emails stored overseas would be outside the realm of permissible extraterritorial application of the Stored Communications Act (18 U.S.C. § 2703).

This month on our blog we have been talking about internet-law-related legislation and developments. This week we will continue the topic with a discussion of artificial intelligence and privacy issues. Specifically, we will talk about Google’s new Duplex technology, which allows users to make appointments, book reservations, and call businesses via an artificial intelligence robot.

Last month, Google introduced Google Duplex at its annual conference. Google’s CEO presented the technology on stage and played a recorded phone call in which the digital assistant booked a hair appointment for a client.  A second conversation, in which the digital assistant made a restaurant reservation, was also played.  What was immediately striking about these conversations was how realistic the robot sounded.  Upon listening, the robot’s voice and interaction are nearly indistinguishable from a human conversation.  The digital assistant uses filler words such as “uh” and “um,” and pauses before responding as if it is “thinking” of a response.  More often than not, the party on the other end of the line likely has no idea they are talking with a robot, let alone that they are being recorded. This may pose a liability for Google.

Google has been recording the conversations the robot has in order to improve and modify the technology.  By recording the conversations, engineers at Google can see exactly which phrases or questions make the robot function in a less than optimal way.  When engineers and programmers can recognize these challenging variants in conversation, they are better able to program an appropriate response for the robot, making the technology more user-friendly and successful. Problems arise in the legal world, however, when Google fails to tell the party on the other end of the line that they are being recorded.
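
To make the feedback loop described above concrete, here is a minimal, purely hypothetical Python sketch; none of the names or behavior reflect Google’s actual Duplex system.  An automated caller matches the other party’s remarks against a small script and logs anything it cannot handle, together with whether the recording was disclosed, so that engineers could later review the transcript and script a better response.

```python
# Hypothetical sketch of the review loop described above -- not Google's system.
import json
import re
from datetime import datetime, timezone

# Scripted responses for phrases the assistant already understands.
KNOWN_INTENTS = {
    r"\bwhat time\b": "Any time after 10 a.m. works.",
    r"\bhow many (people|guests)\b": "It will be a party of two.",
    r"\b(sorry|repeat|say that again)\b": "Sure, I'd like to book a table for Tuesday.",
}

UNHANDLED_LOG = "unhandled_utterances.jsonl"


def respond(utterance: str, recording_disclosed: bool) -> str:
    """Return a scripted reply, or a fallback while logging the miss for review."""
    for pattern, reply in KNOWN_INTENTS.items():
        if re.search(pattern, utterance, flags=re.IGNORECASE):
            return reply

    # The utterance stumped the assistant: record it for engineers to review,
    # along with whether the other party was told the call is recorded --
    # the legal issue raised in the post.
    with open(UNHANDLED_LOG, "a", encoding="utf-8") as log:
        log.write(json.dumps({
            "utterance": utterance,
            "recording_disclosed": recording_disclosed,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        }) + "\n")
    return "Hmm, let me check on that and call you back."


if __name__ == "__main__":
    print(respond("What time would you like to come in?", recording_disclosed=False))
    print(respond("We only take walk-ins on weekends.", recording_disclosed=False))
```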

In our first June blog post, we discussed a bill passed by the State Senate that would provide net neutrality rules for ISPs in the State of California.  We continue this week with the theme of internet-regulating laws being proposed in our state.

The California Consumer Privacy Act of 2018 (CCPA) is a ballot measure that would provide unprecedented protection for user data in California.  Users would have the ability to prevent companies from selling their data to third parties, as well as to demand full disclosure of all data being collected.  Consumers would also have the ability to sue companies that violate the law.

The CCPA was started by Alastair Mactaggart, a real estate developer in the San Francisco area, along with Rick Arney, a finance executive, and Mary Stone Ross, an attorney and former CIA analyst who has worked on national security matters for the House of Representatives.  The group says they are just three people living in California who want what is best for their kids and the future of Californians.  They believe the “bargaining” that occurs between big companies and users over consumer privacy, which is basically take-it-or-leave-it, is not bargaining at all.  Given the practical necessity of laptops and cell phones today, they want users to have more choice and power over what information is collected and how that information is used.

Mentioned in passing in our first December blog post is another potential pitfall for operators of Internet-based services such as websites or applications. This particular pitfall comes out of the State of California. However, given the role of the internet as a widespread source of information, this is a lesson for anyone marketing to minors online. The law is Business & Professions Code 22580-22582 (“BPC 22580-22582”), otherwise known as “Privacy Rights for California Minors in the Digital World.” What does this law pertain to in general? What kinds of entities need to be concerned about California minors? What are the privacy rights these minors are allowed to enjoy?

What is BPC 22580-22582?

BPC 22580-22582 is a sub-part of the California Business and Professions Code.  It applies to operators of Internet websites, online services, and applications that are directed to children, as well as to operators who know their websites or services are used or visited by children. Here, “directed to” means the site or service was created mainly for children and is not intended for a general audience, including, but not limited to, adults. The law states that, for children with registered accounts, entities must:

Last week we discussed smart toys, and we mentioned “COPPA” in that article.  As such, some of you may be asking, “What is COPPA?” In short, COPPA is a federal law specifically tailored towards children, and it stands for the “Children’s Online Privacy Protection Act.” The law is meant to protect children from overexposure and to prohibit businesses from gathering invasive amounts of analytics on children using their products or services. This remains a legitimate concern, as the law attempts to curtail some of the worst aspects of online life.  What exactly does COPPA prohibit? Are there any limitations? Does it provide guidelines for a business to follow to ensure compliance?

COPPA Prohibitions

The spirit of COPPA can be summarized as follows: it is unlawful for the operator of a website or online service that is directed to children, or that has actual knowledge it is collecting or maintaining a child’s information, to fail to give notice on the website of what information it collects, how that information is used, and how it is disclosed; to fail to obtain verifiable parental consent; to fail to provide reasonable means for parents to review the information or refuse further use of the service or website; to condition a child’s participation in a game, the offering of a prize, or another activity on the child disclosing more personal information than is necessary; or to fail to establish and maintain procedures to protect the confidentiality, security, and integrity of children’s information.
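
As a rough illustration of how the obligations above might translate into practice, here is a minimal Python sketch of a hypothetical registration flow; the field names, helper functions, and thresholds are our own assumptions for illustration, not language from COPPA or FTC guidance.  Collection is blocked until verifiable parental consent is obtained, and any data beyond what the activity requires is discarded rather than stored.

```python
# Simplified, hypothetical sketch of the COPPA obligations summarized above.
# This is an illustration, not a compliance checklist or legal advice.
from dataclasses import dataclass, field

COPPA_AGE_THRESHOLD = 13            # COPPA covers children under 13
REQUIRED_FIELDS = {"display_name"}  # the minimum this hypothetical game needs


@dataclass
class Registration:
    age: int
    parental_consent_verified: bool
    submitted_fields: dict = field(default_factory=dict)


def accept_registration(reg: Registration) -> dict:
    """Return only the data the service may keep, or raise if it may not collect at all."""
    if reg.age < COPPA_AGE_THRESHOLD and not reg.parental_consent_verified:
        raise PermissionError("Verifiable parental consent required before collection.")

    # Do not condition participation on extra personal information:
    # keep the required fields and discard everything else.
    return {k: v for k, v in reg.submitted_fields.items() if k in REQUIRED_FIELDS}


if __name__ == "__main__":
    child = Registration(
        age=10,
        parental_consent_verified=True,
        submitted_fields={"display_name": "Sam", "home_address": "123 Elm St."},
    )
    print(accept_registration(child))  # {'display_name': 'Sam'} -- address dropped
```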

We have finally reached December, and with it comes the time for shopping. Of course, some people will focus on the youngest members of their families, i.e., children.  However, it needs to be emphasized that even with children there are special concerns. The law considers juveniles and their decision-making capabilities, and in the age of the “smart toy,” this could have far-ranging impacts on businesses and the emerging market. What is a smart toy? How might it differ from an average toy? What would a business need to be aware of?  What about a parent?

Smart Toys

Smart toys, alternatively known as “connected” toys, are devices that can be used for play but also connect to the internet or the cloud.  The concept may sound like the Internet of Things, and these smart toys are indeed just another part of it. A good example is something like the “Hello Barbie” dolls from 2015. These dolls were akin to a smart chat program, or a more personable Siri/Cortana/Alexa. While Barbie’s operating system would not allow her to stray significantly from a script, she would remember and adapt to a child’s thoughts, concerns, or desires.
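
For readers curious how a “scripted but adaptive” toy might work under the hood, here is a toy-scale, hypothetical Python sketch; it does not reflect Hello Barbie’s actual software.  The replies never stray far from a script, but the program stores facts the child shares and reuses them later, and that stored “memory” is exactly the kind of child data that, in a real connected toy uploading to the cloud, raises the privacy concerns discussed in this series.

```python
# Hypothetical, toy-scale sketch of scripted-but-adaptive behavior.
class ScriptedToy:
    def __init__(self):
        # Facts the child has shared, e.g. {"favorite_color": "blue"}.
        # In a real connected toy, this is the data that would travel to the cloud.
        self.memory = {}

    def reply(self, child_says: str) -> str:
        text = child_says.lower()
        if "my favorite color is" in text:
            self.memory["favorite_color"] = text.split("my favorite color is")[-1].strip(" .!")
            return "Ooh, I love that color too!"
        if "what do you remember" in text and "favorite_color" in self.memory:
            return f"You told me your favorite color is {self.memory['favorite_color']}!"
        return "That sounds fun! Tell me more."  # default scripted line


if __name__ == "__main__":
    toy = ScriptedToy()
    print(toy.reply("My favorite color is blue."))
    print(toy.reply("What do you remember about me?"))
```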

In a current dispute between Google and a Canadian company over de-indexing a competitor, Google is doing everything in its power to avoid the court order, not necessarily because it believes in the innocence of Datalink, but because complying with the de-indexing order could mean giving up an important immunity under current U.S. law. One may be wondering: what is the immunity that prompted Google’s move? Why could Datalink just pick up and go somewhere else? Should other businesses be concerned about this possible loss of immunity, and why might a business support Google here?

Case History

Equustek Solutions, Inc., a Canadian company, engaged in litigation with Datalink over illicit activities on Datalink’s part (e.g., misappropriation of trade secrets) and its use of those trade secrets to confuse consumers in the market. Due to the similarities resulting from the alleged misappropriation, Datalink led consumers to believe they were purchasing Equustek’s products. Equustek sued in the Canadian courts, resulting in various court orders against Datalink. However, Datalink managed to evade enforcement by fleeing the country and setting up shop elsewhere.

A question for you to consider: imagine a world where music is created by a random set of numbers. Who owns the music? Is it the programmer? Is it the user who gave specifications for the music? It is certainly an odd question to ask, and unsurprisingly, one without a clear answer. The question has been mostly unlitigated, although programs such as the Artificial Intelligence (“AI”) made by DeepMind can produce music after listening to it.  Other programs, for example, can restore or create imitations of Rembrandt paintings. One might wonder: with the increasing role of technology, what are the limits of copyright law? Who is a creator, and hasn’t this issue already been settled in the courts?

Previous Litigation

To determine whether authorship can be attributed to an AI, it’s important to simplify things, as the technology itself is complex. What about monkeys, other animals, or something that occurs naturally?

As the Equifax breach continues to develop into a complicated issue, certain lessons can be learned by other businesses handling personal information, namely, what not to do in their business operations.  In the wake of the cybersecurity breach, it has been reported that Equifax was aware of the security gaps and did nothing to remedy them. So, where exactly did Equifax go wrong in its data security plans? How was it informed about the open holes in its security infrastructure?  What can a business owner do to avoid becoming an encore of Equifax’s folly? Is there any way to determine gaps in security policies and procedures?

Where did Equifax go wrong?

Effectively, Equifax appears to have failed at multiple levels, resulting in this breach. The failures can be summarized as one large mistake: the computer systems Equifax used on its networks were never updated.  This was due to a delayed response to a known vulnerability in the Apache Struts web application framework. Apache Struts is a well-known, open-source framework, widely used in the business community for developing Java web applications. In short, the delay was exacerbated by the company’s failure to detect the vulnerability during a security scan.
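
To illustrate the kind of automated check that might have caught the problem, here is a minimal Python sketch that compares the Apache Struts version each server reports against the patched releases for the Struts vulnerability widely reported to be at issue (CVE-2017-5638).  The server inventory is hypothetical, and the version thresholds are assumptions drawn from public reporting on that fix, not from Equifax’s actual environment or scanning tools.

```python
# Hypothetical sketch: flag servers running a Struts release older than the
# fix for CVE-2017-5638 (commonly reported as 2.3.32 / 2.5.10.1).

def parse_version(v: str) -> tuple:
    """Turn '2.5.10.1' into (2, 5, 10, 1) so versions compare in order."""
    return tuple(int(part) for part in v.split("."))


# Minimum patched release per affected branch (assumed thresholds).
PATCHED = {"2.3": "2.3.32", "2.5": "2.5.10.1"}


def is_vulnerable(installed: str) -> bool:
    """Return True if the installed Struts version predates the fix for its branch."""
    branch = ".".join(installed.split(".")[:2])
    fixed = PATCHED.get(branch)
    if fixed is None:
        return False  # branch not covered by this simple check
    return parse_version(installed) < parse_version(fixed)


if __name__ == "__main__":
    # Hypothetical inventory of servers and the Struts version each reports.
    inventory = {"web-01": "2.3.31", "web-02": "2.5.10.1", "web-03": "2.5.5"}
    for host, version in inventory.items():
        status = "VULNERABLE - patch required" if is_vulnerable(version) else "ok"
        print(f"{host}: struts {version} -> {status}")
```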

As the Equifax breach has developed recently, another issue has come up, namely the arbitration provision on its website, which has caused consumer outrage and confusion. So, why does this provision matter? If consumers want to freeze their credit, or check to see whether they were affected, surely Equifax wouldn’t add insult to injury for the consumers suffering from its mistakes. Certainly, it would appear to be bad business to do so, or at least to attract unwanted attention. And yet Equifax did exactly that: it implemented the arbitration provision, and only later removed it. So again, why would Equifax implement this provision? What impact might it have on consumers? Why might this be important for businesses everywhere to observe?

What is the arbitration provision?

The arbitration provision that angered so many consumers was attached to Equifax’s offer of free credit monitoring. In exchange for the service being performed (after the security breach), Equifax demanded that consumers settle any dispute with it through arbitration. In general, arbitration is a private and less costly way to settle disputes outside of the courtroom. While the results of arbitration may be binding, the process gives the parties broader latitude over discovery and scheduling, and it can be faster and less formal than a trial.  Although Equifax later clarified that the provision would not apply to the current breach, consumers were nevertheless upset at the revelation.