NZ Herald Reported Two Tertiary Institutions
Facial recognition technology is becoming an increasingly prevalent part of everyday life. In addition to the more obvious security, public safety and surveillance applications
(which have led, for example, to Hong Kong protesters wearing masks and felling
lampposts equipped with facial recognition CCTV cameras in order to avoid
identification), the technology is increasingly being used as a retail tool to track customer
behaviours and movements to generate predictive consumer analytics. And New Zealand is following suit. Last year the NZ Herald reported that a major supermarket chain had deployed facial recognition CCTV cameras in some of its stores, and more recently that two tertiary institutions in New Zealand were trialling the technology to monitor student attendance. But how does this increasingly common technology fit within global trends
towards greater privacy, control and transparency over data collection and use?
There are three key privacy issues that tend to arise in discussions of facial recognition:
First, facial recognition software can only work alongside a rich database of facial images, so that the recognition algorithm can be trained to detect faces and then match a detected face to an identity in the database. Populating and using facial image databases raises a range of questions about the source of those images, the extent of any consent or authorisation obtained, the purposes for which the images may be used, and how they will be stored and secured. There have been
reports that some facial recognition software products already in use have been trained
on images of individuals mined from the internet and obtained without consent. In the
absence of appropriate consents, these products may simply breach the law.
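To make the first concern concrete, the Python sketch below shows the database-matching step in its simplest form. The embed() function and the enrolment database here are hypothetical placeholders (a real system would run a trained face-embedding model over detected face crops); the point is simply that every identification depends on a pre-populated store of reference images, so the provenance, consent status and security of those images are baked into every use of the system.

import numpy as np

# Hypothetical stand-in for a trained face-embedding model. A real system
# would run a neural network over a detected face crop; here we only need
# a deterministic fixed-length vector to illustrate the matching step.
def embed(face_image: np.ndarray) -> np.ndarray:
    rng = np.random.default_rng(abs(hash(face_image.tobytes())) % (2**32))
    vector = rng.standard_normal(128)
    return vector / np.linalg.norm(vector)

# "Enrolment" database: identity label -> embedding of a reference image.
database = {
    "person_a": embed(np.zeros((64, 64), dtype=np.uint8)),
    "person_b": embed(np.full((64, 64), 255, dtype=np.uint8)),
}

def identify(probe_image: np.ndarray, threshold: float = 0.6):
    """Match a detected face against the enrolment database.

    Returns (identity, score) for the closest reference embedding, or
    (None, score) if nothing is similar enough to clear the threshold.
    """
    probe = embed(probe_image)
    best_id, best_score = None, -1.0
    for identity, reference in database.items():
        score = float(np.dot(probe, reference))  # cosine similarity of unit vectors
        if score > best_score:
            best_id, best_score = identity, score
    return (best_id, best_score) if best_score >= threshold else (None, best_score)

print(identify(np.zeros((64, 64), dtype=np.uint8)))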
Second, independent tests of facial recognition technology have repeatedly shown that it
is far from perfect or reliable, presenting a meaningful challenge to compliance with
Information Privacy Principle 8 under New Zealand’s (current) Privacy Act and
equivalent legislation around the world. Principle 8 requires that “An agency that holds
personal information shall not use that information without taking such steps (if any) as
are, in the circumstances, reasonable to ensure that, having regard to the purpose for
which the information is proposed to be used, the information is accurate, up to date,
complete, relevant, and not misleading.” If an agency were collecting images and using unreliable technology to match those images to individuals, it is difficult to see how it could comply with this principle. Facial recognition technology relies heavily on the quantity and quality of the data fed into it. Even the best software of its kind carries a meaningful risk of misidentification, and gender and racial biases in the underlying algorithms are known to produce disproportionately higher error rates for certain ethnic and racial groups. Depending on the function, flaws in facial recognition software could have catastrophic impacts.
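The kind of accuracy analysis that Principle 8 points towards can be illustrated with a short Python sketch: given similarity scores for trial comparisons, ground-truth labels and a demographic label for each trial, false match and false non-match rates can be computed per group and compared. The function name, threshold and data below are illustrative assumptions, not figures from any real audit; independent testing programmes work from large labelled benchmark sets.

import numpy as np

def error_rates_by_group(scores, same_person, groups, threshold=0.6):
    """Per-group false match rate (FMR) and false non-match rate (FNMR).

    scores: similarity score for each trial comparison
    same_person: ground truth (True if the pair really is the same person)
    groups: demographic label for each trial
    """
    scores = np.asarray(scores, dtype=float)
    same_person = np.asarray(same_person, dtype=bool)
    groups = np.asarray(groups)
    predicted_match = scores >= threshold
    results = {}
    for group in np.unique(groups):
        mask = groups == group
        impostor = mask & ~same_person   # different people: a match here is an error
        genuine = mask & same_person     # same person: a non-match here is an error
        fmr = predicted_match[impostor].mean() if impostor.any() else float("nan")
        fnmr = (~predicted_match[genuine]).mean() if genuine.any() else float("nan")
        results[str(group)] = {"FMR": float(fmr), "FNMR": float(fnmr)}
    return results

# Toy, made-up trials for two demographic groups.
print(error_rates_by_group(
    scores=[0.70, 0.40, 0.65, 0.72, 0.55, 0.30],
    same_person=[True, False, True, False, False, True],
    groups=["A", "A", "B", "B", "A", "B"],
))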