by Rachel Treasure, The Legal Forecast

Computer Vision Threatens Current Notions of Privacy

Traditional notions of privacy are being challenged by advancements in technology, particularly computer vision. This technology has grown in popularity, accuracy and performance in recent years, to the point where computer vision surpasses human performance in some instances.[1] Computer vision is also increasingly being applied in the commercial sector. The issue with these rapid advancements is that the extent of the private sector’s development and deployment of this technology is unknown, and a more thorough understanding is required to regulate it effectively.

The Privacy Act 1988 (Cth) (‘Privacy Act’) was introduced as Australia’s response to its international obligations under the International Covenant on Civil and Political Rights (‘ICCPR’). The Privacy Act only applies to private organisations with an annual turnover of more than $3 million, creating a ‘small business’ exemption. This leaves a concerning gap in the framework, as 94% of businesses do not meet this threshold.[2] The predominant issue with the current framework is that while it imposes a ‘transparency’ framework and a consent requirement for the collection of sensitive information, computer vision challenges these concepts: it changes the way data can be collected, and it can collect data seamlessly, without the individual’s knowledge or consent.


Commercial Uses of Computer Vision

Computer vision can now be used to track a customer walking past a store by identifying them on camera, allowing a company to send a notification to that person’s phone about potential sales in the shopping centre.[3] Computer vision also has applications in billboard customisation, where the technology can identify a customer’s gender, age and other specific attributes, and use this information to create personalised advertisements in real time.
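To make the billboard example concrete, the following is a minimal sketch of such a pipeline. It assumes OpenCV’s bundled Haar cascade face detector; the attribute estimator and the advertisement catalogue are hypothetical placeholders, not any particular vendor’s system.

```python
# Minimal sketch of a real-time billboard customisation pipeline.
# Face detection uses OpenCV's bundled Haar cascade; the attribute
# estimator and ad catalogue below are hypothetical placeholders.
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def estimate_attributes(face_image):
    """Placeholder for a trained age/gender model (assumed, not real)."""
    return {"age_band": "25-34", "gender": "unknown"}

def select_advertisement(attributes):
    """Pick an ad from a hypothetical catalogue keyed by demographic."""
    catalogue = {"25-34": "sports_promo.mp4", "default": "generic_brand.mp4"}
    return catalogue.get(attributes["age_band"], catalogue["default"])

def process_frame(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        attributes = estimate_attributes(frame[y:y + h, x:x + w])
        ad = select_advertisement(attributes)
        print(f"Displaying {ad} for viewer profiled as {attributes}")
```

The point of the sketch is how little is needed: a camera frame, an off-the-shelf detector and a lookup table are enough to profile and target a passer-by without any interaction from them.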


Going Beyond Privacy Principles

It is likely that additional privacy principles are required to safeguard against emerging technologies.[4] The three most prominent additional principles are:[5]

  1. Transparency;
  2. Dynamic Consent; and
  3. Privacy by Design.


1. Transparency

Companies can facilitate transparency in two important ways: by developing and publishing privacy policies, and by providing notice that computer vision is being used.[6] The principle of transparency also requires that any information addressed to the public be easy to understand and easily accessible.[7]

2. Dynamic Consent

The large amount of personal information collected by vision sensors challenges existing concepts of consent and notification in surveillance and privacy. It is difficult to gain consent where individuals are identified or profiled against other data sets.[8] To achieve dynamic consent, businesses should try to disclose all conceivable future uses of the data and, when new ways to use the data arise, seek permission before using the data in that way.[9]
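As an illustration only, dynamic consent can be thought of as purpose-bound permission records that must be refreshed whenever a new use is proposed. The sketch below assumes a hypothetical ConsentRecord structure and purpose names; it is not drawn from any statute or existing system.

```python
# Illustrative sketch of "dynamic consent" record-keeping: permission is
# tied to specific purposes, and any new purpose triggers a fresh request.
# All names and structures here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    individual_id: str
    consented_purposes: set = field(default_factory=set)

    def is_permitted(self, purpose: str) -> bool:
        return purpose in self.consented_purposes

    def request_new_purpose(self, purpose: str) -> str:
        if self.is_permitted(purpose):
            return "already consented"
        # In practice this step would notify the individual and await a response.
        return f"fresh consent required before using data for '{purpose}'"

record = ConsentRecord("customer-123", {"in-store analytics"})
print(record.request_new_purpose("targeted advertising"))
# -> fresh consent required before using data for 'targeted advertising'
```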

3. Privacy by Design

Privacy by design means building in reasonable privacy and security controls at all stages of product development. It includes promoting consumer privacy and data security throughout one’s organisation.
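One concrete example of such a control, offered as a minimal sketch rather than a prescribed method, is redacting faces before any frame is retained, so identifiable images never enter storage. It assumes OpenCV’s bundled Haar cascade detector; the file names are illustrative.

```python
# Sketch of a "privacy by design" control: detected faces are blurred
# before a frame is stored, so identifiable images are never retained.
# Uses OpenCV's bundled Haar cascade; file names are illustrative.
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def redact_faces(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.1, 5):
        region = frame[y:y + h, x:x + w]
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(region, (51, 51), 0)
    return frame

frame = cv2.imread("camera_frame.jpg")  # illustrative input
if frame is not None:
    cv2.imwrite("camera_frame_redacted.jpg", redact_faces(frame))
```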


Conclusion

The tension between technological advancement and the right to privacy is a constant battle in law. Computer vision demonstrates that personal information can now be obtained from individuals seamlessly and without their knowledge, and amendments to the privacy framework are required to provide adequate protection. Australia, as an innovation nation, plans to be at the forefront of technological advancement, and it is critical that outdated laws do not hinder this ambitious goal.


The Legal Forecast

[1] Anupam Das et al., ‘Assisting Users in a World Full of Cameras: A Privacy-Aware Infrastructure for Computer Vision Applications’ (2017) IEEE Conference on Computer Vision and Pattern Recognition Workshops 1387, 1388.
[2] Charles Power, ‘Eye on the Spy’ (2008) Monash Business Review 10, 11.
[3] Office of the Privacy Commissioner of Canada, Automated Facial Recognition in the Public and Private Sectors (2013) 7.
[4] Ann Cavoukian, Privacy by Design and the Emerging Personal Data Ecosystem (October 2012) Privacy by Design <https://www.ipc.on.ca/wp-content/uploads/Resources/pbd-pde.pdf>.
[5] Federal Trade Commission, Facing Facts: Best Practices for Common Uses of Facial Recognition Technologies, An FTC Staff Report (2012).
[6] Stacey Gray, ‘Privacy Principles for Facial Recognition Technology’ (2015) Future of Privacy Forum 1, 13.
[7] Norberto Nuno Gomes de Andrade, Aaron Martin and Shara Monteleone, ‘“All the Better to See You With, My Dear”: Facial Recognition and Privacy in Online Social Networks’ (2013) 11(3) IEEE Security & Privacy 21, 24.
[8] Carl Gohringer, ‘Face Recognition: Profit, Ethics and Privacy’ (2013) Allevate Limited 1.
[9] Timothy Pilgrim, Submission to Office of the Privacy Commissioner of Canada, Discussion Paper – Consent and Privacy, July 2016, 6.
