ARTIFICIAL INTELLIGENCE & ROBOTS - October 2023

The Rise of Augmented Reality and its Novel Challenges

By Lau Kok Keng and Zachary Foo (Rajah & Tann Singapore LLP)

I.  Introduction

Augmented reality (AR) is technology that provides an enhanced, composite version of the real-life, physical world through the superimposition of digital audio-visual elements. AR technology is being adopted in an ever-growing number of industries and is rapidly permeating mainstream lifestyles. Apple recently announced that it would be launching the Vision Pro[1] in early 2024, while the Infocomm Media Development Authority (IMDA) announced a project which combines smart glasses with AR technology to enhance the operational readiness of Singapore Civil Defence Force (SCDF) frontliners.[2] These launches have helped bring AR technology to the mass market and have put the global spotlight on AR and the various applications and devices that make use of it. Other examples of devices and applications that make use of AR technology include Microsoft HoloLens, Google Glass, Snapchat and Instagram.

AR blends the physical world that we live in with virtual experiences. It uses digital information to add to what we already perceive in the physical world, and immerses the user in an environment where he or she often finds it difficult to differentiate between the physical and the virtual. Examples of AR that we encounter daily include sponsor logos that appear on playing fields during sports broadcasts, games like Pokémon Go in which virtual avatars of the eponymous monsters appear superimposed on physical reality and react to the user when viewed through AR applications or devices, and social media filters in which images are superimposed over the head of the user.

This article addresses the potential legal issues and disputes that may arise from the mass adoption of AR devices.

II.  Legal Issues that May Arise

The mainstream adoption of AR devices may give rise to novel legal issues in various areas of law. AR creates a new medium of human interaction, one where the virtual and physical worlds intersect in ways that may not have been contemplated or anticipated by existing legal and regulatory frameworks. These novel legal issues include those relating to intellectual property, particularly copyright and trade marks.

New and emerging technologies tend to push the existing boundaries of intellectual property laws, and the increased adoption of AR is no different. One of the main challenges concerns the treatment and management of derivative works. AR users may infringe existing copyright when they combine their own work with a copyrighted work in a way that creates a new work or changes the market for the original work.[3] For example, an AR user might place a virtual Pokémon ‘inside’ Michelangelo’s sculpture of David, or overlay the sculpture with images of different Pokémon, merging a copyrighted work that appears in their AR device with the physical object in front of them. Such derivative works may be created on the spur of the moment and kept only briefly, but case law in the United States has treated such ephemeral works as infringing.[4] The position is similar under section 50(1) of the Singapore Copyright Act 2021. The challenge for the courts will be to determine whether an AR user who creates a derivative work solely for his own viewing through his AR device, one that is not visible to anyone else, would nonetheless be committing copyright infringement.

Another concern is the copyright and trade mark issues that may arise from the generation of content by an AR device over a physical object. For example, if someone creates an AR application that extracts and compiles virtual information from various databases and overlays such information on a real-world object, it is unclear who would own the compiled virtual information generated by the AR device. Would it be the AR user, the developer of the AR device, or the application developer who coded the software that compiled and overlaid the virtual information? Similarly, if an AR device reproduces an image of a trade mark to overlay on a real-world object without the consent of the trade mark proprietor, it is unclear whether such reproductions would constitute trade mark infringement,[5] since the developer of the AR device may not necessarily derive any commercial benefit from them. The trade mark owner may not even be aware of the unauthorised reproduction of its mark in AR, given that the information is generated only for the user of the AR device.

III.  Potential for Abuse of Technology and Legal Issues and Disputes arising from such Abuse

New technologies often emerge and exist in unregulated or yet-to-be-regulated environments. The legal vacuum in which these technologies operate may result in individuals or businesses seeking to exploit gaps in regulation for commercial or even criminal gain.

Criminals are known to make use of AR technology to create “darknets” in the AR space, which offer a unique opportunity to create digital safe havens combining the best aspects of both the digital and physical worlds.[6] Darknets are networks that are accessible only with specific software or authorisation. For example, criminals could use their wearable AR devices to share information over an encrypted AR network without being detected, tagging physical objects or locations with virtual images in the AR space to signal that a location is safe for conducting illicit drug transactions. Such networks would allow criminals to hide in plain sight and conduct their activities in the AR space in a public setting without being detected by law enforcement. Another example is the abuse of AR technology by criminals to place digital bounties on particular places or people, bounties which would be visible only to criminal associates who use the same AR darknet as the user who placed them.[7] Such activity could easily evade detection and would pose significant challenges for law enforcement seeking to combat it.

Another form of potential abuse of AR technology by criminals is “data enhancement”: the collection and generation of information about individuals in the vicinity of an AR device, overlaid on those individuals to enhance the user’s experience.[8] AR devices may be used to record images of people near the user of the device, triangulate those images against data about such people available in various databases, and collate the results into one combined display for the user to view. Criminals would relish the ability to identify and pinpoint anyone in a crowd; this could allow them to target a group that they dislike, or make mass shooters even more dangerous in a public setting.

AR technology may also be misused by businesses to substitute their own brands for the physical brand placements of their competitors.[9] For example, an AR device may be made to display a Mercedes car driving past the user, even when the actual car being driven in the real world at that moment is a Toyota. Likewise, product advertisements displayed at a real-world physical location (such as perimeter advertising at a stadium) may have different, competing advertisements superimposed over them using AR technology. This may lead to legal disputes between parties, as there is currently no clear law regulating how content or advertisements may be portrayed in the AR space. In the “Toyota masquerading as a Mercedes” scenario above, for instance, Toyota may sue Mercedes for misleading AR users into believing that the physical Toyota car is a Mercedes, or Toyota might sue the developer of the AR device for allowing the device to display the image of the Mercedes superimposed over the Toyota. Some companies may even take advantage of AR technology to disparage a competitor’s products or draw comparisons with their own. Imagine shopping for a new car and looking at that same Toyota in a showroom through your AR device, when you suddenly see this sentence overlaid over the car: “This car only has a 200km range, why not consider our Mazda 3 with double the range instead?”. Such possibilities would be tempting for any business that views the AR space as the advertising equivalent of the Wild West.

Given that activity in this area is still in its infancy, it remains to be seen whether governments will introduce legislation to pre-emptively address the many potential abuses of AR technology.

IV.  Data Protection Issues in an AR World

The growth in adoption of AR may also lead to a multitude of data privacy concerns. AR devices contain many different sensors and cameras that are always active. They can also record, synthesise and store new forms of sensitive data that were not previously collected by other, non-AR consumer technology devices, such as eye movement, facial muscle tracking and pupil dilation.[10]

For an AR device to interact directly with a person, the device must recognise the user as well as the user’s location. Based on the current state of technology, the only realistic way to do this in a social setting is through facial recognition.[11] The use of facial recognition technology in AR could allow a computer platform to capture and analyse the faces of people within the vicinity of the AR user without their consent.[12] If AR glasses with cameras and sensors that are “always on” become widely adopted, the scope and scale of the concern increase, since live views would enable the location and identification of individuals present at any location at any time, and the harvesting of data about such individuals and their surroundings without any consent, actual or deemed. This could create a global panopticon-like society in which there is constant surveillance of public and semi-public spaces. The concern was worrying enough for the Electronic Frontier Foundation (EFF) to sound the alarm on the potential surveillance capabilities of AR glasses with “perpetually on” audio and camera recording or powerful 3D mapping sensors, which are, in its view, more invasive than existing technologies such as the Global Positioning System (GPS) or cell-site location information (CSLI).[13]

There are also data privacy concerns over the “data enhancement” capabilities of AR devices,[14] in that information once considered private could become publicly accessible. To facilitate such “data enhancement”, AR devices would collect massive amounts of data in order to provide the AR user with an immersive experience. These devices would be able to triangulate the information collected across various databases, exposing ostensibly private information to the public, such as a person’s voting records, criminal records, sexuality, access to healthcare, insurance details and financial activities.

The data privacy concern extends not just to people within the vicinity of the user of the AR device, but also to the users themselves. The very same sensors and cameras on the AR device can record and store all the data generated by the user, as the AR device relies heavily on the information gathered to generate an immersive experience. The data collected can reveal significant details of a user’s life, of varying sensitivity, either directly or by inference.[15] In a more drastic scenario, businesses could even use the data collected against the user’s interests. For example, the biometric data collected by an AR device might be sold to health insurance companies, and the user could then have his or her insurance premiums raised based on underlying health conditions revealed by that data.[16]

Various non-profit organisations and corporations are trying to anticipate and mitigate these issues, and have proactively created policies to regulate the AR space. For example, the Extended Reality Safety Initiative (XRSI), a global think tank on privacy, safety and security, offers a free, globally accessible baseline policy framework for safety and privacy in AR, which draws on existing laws such as the European Union’s General Data Protection Regulation (GDPR).[17] Meta has also developed Responsible Innovation Principles to address the privacy concerns that arise from the development of new technologies and their impact on data privacy.[18]

V.  Conclusion

The increasing adoption of AR technology across multiple industries and purposes may well prove to be a breeding ground for a litany of novel legal disputes. Many of these issues will likely require the law to adapt existing doctrines to new and emerging technological developments, and to evolve new doctrines where existing ones fall short.

The unique legal, ethical and social challenges that widespread adoption of AR technologies will pose are likely to require a restructuring of existing legal concepts, and even of social norms, in order to reach the desired equilibrium between law and technology.

 

AUTHOR INFORMATION

Lau Kok Keng is a Partner and Head of the Intellectual Property, Sports and Gaming Practice at Rajah & Tann Singapore LLP.
Email: kok.keng.lau@rajahtann.com

Zachary Foo is an Associate in the Intellectual Property, Sports and Gaming Practice at Rajah & Tann Singapore LLP.
Email: zachary.foo@rajahtann.com

 

REFERENCES

[1] Apple, ‘Introducing Apple Vision Pro: Apple’s First Spatial Computer’, Apple.com (5 June 2023) <https://www.apple.com/sg/newsroom/2023/06/introducing-apple-vision-pro/> (accessed 30 August 2023).

[2] Infocomm Media Development Authority, ‘Singapore launches Southeast Asia’s first AR smart glasses and AI visual recognition technologies to empower the nation’s civil defence with 5G’, imda.gov (13 Sep 2023) < https://www.imda.gov.sg/resources/press-releases-factsheets-and-speeches/press-releases/2023/sg-launches-ar-smart-glasses-and-ai-visual > (accessed 14 September 2023).

[3] Mark A. Lemley & Eugene Volokh, ‘Law, Virtual Reality, and Augmented Reality’ (2018) 166 University of Pennsylvania Law Review 1051, 1112.

[4] Lewis Galoob Toys, Inc. v Nintendo of America, Inc. 964 F.2d 965 (9th Cir. 1992); Micro Star v FormGen Inc. 154 F.3d 1107 (9th Cir. 1998).

[5] Tam Harbert, ‘The Legal Hazards of Virtual Reality and Augmented Reality Apps’, IEEE Spectrum (20 Feb 2018) <https://spectrum.ieee.org/the-legal-hazards-of-virtual-reality-and-augmented-reality-apps> (accessed 28 August 2023).

[6] Brian Wassom, Augmented Reality: Law, Privacy, and Ethics (2015), Chapter 8: Criminal, at 222.

[7] Ibid, at 214-215.

[8] Brian Wassom, Augmented Reality: Law, Privacy, and Ethics (2015), Chapter 3: Privacy, at 56.

[9] Sailesh Patel, ‘Protecting Intellectual Property in Augmented Reality’, ipwatchdog.com (31 May 2022) <https://ipwatchdog.com/2022/05/31/protecting-intellectual-property-augmented-reality/id=149227/> (accessed 30 August 2023).

[10] Sara Cabonneau, ‘What makes VR/AR privacy concerns different from existing privacy concerns?’, The Extended Mind (9 Jun 2022) <https://www.extendedmind.io/the-extended-mind-blog/what-makes-vrar-privacy-concerns-different-from-existing-privacy-concerns> (accessed 30 August 2023).

[11] Brian Wassom, Augmented Reality: Law, Privacy, and Ethics (2015), Chapter 3: Privacy, at 50.

[12] Adi Robertson, ‘The Next Privacy Crisis’, The Verge (1 Nov 2021) <https://www.theverge.com/c/22746078/ar-privacy-crisis-rethink-computing> (accessed 4 September 2023).

[13] Katitza Rodriguez & Kurt Opsahl, ‘Augmented Reality Must Have Augmented Privacy’, Electronic Frontier Foundation (16 October 2020) <https://www.eff.org/deeplinks/2020/10/augmented-reality-must-have-augmented-privacy> (accessed 5 September 2023).

[14] Brian Wassom, Augmented Reality: Law, Privacy, and Ethics (2015), Chapter 3: Privacy, at 56.

[15] Ellysse Dick, ‘Balancing User Privacy and Innovation in Augmented and Virtual Reality’, Information Technology and Innovation Foundation (4 March 2021) <https://itif.org/publications/2021/03/04/balancing-user-privacy-and-innovation-augmented-and-virtual-reality/> (accessed 5 September 2023).

[16] Sara Cabonneau, ‘What makes VR/AR privacy concerns different from existing privacy concerns?’, The Extended Mind (9 Jun 2022) <https://www.extendedmind.io/the-extended-mind-blog/what-makes-vrar-privacy-concerns-different-from-existing-privacy-concerns> (accessed 5 September 2023).

[17] Extended Reality Safety Initiative, ‘The XRSI Privacy and Safety Framework’, xrsi.org (September 2020) <https://xrsi.org/publication/the-xrsi-privacy-framework> (accessed 30 August 2023).

[18] Meta, ‘Responsible Innovation Principles’, Meta (2021) <https://about.meta.com/metaverse/responsible-innovation/?utm_source=about.facebook.com&utm_medium=redirect> (accessed 5 September 2023).