One of the benefits of the lockdown has been the chance to read more, and I recently finished the excellent Tools and Weapons: The Promise and the Peril of the Digital Age by Brad Smith. Smith has been Microsoft’s Chief Legal Officer for nearly 20 years: he took the company through the huge anti-trust battles of the late nineties and continues to be involved, as an industry leader, in all of the key topics that affect the software industry, from privacy and security to biometrics and diversity.
I would recommend the book to anyone in the software industry, as it provides a great overview of, and insight into, many of the regulatory and compliance issues that we as an industry need to address.
I am not going to give a full review of Tools and Weapons, but there was one chapter that particularly sparked my interest, Chapter 12: AI and facial recognition. The chapter discusses the challenges of face recognition biometric readers. This is a subject we at ShopWorks spend a lot of time debating with customers: we sold thousands of face recognition biometric scanners for the T&A (time and attendance) module of ShopWorks in 2019, and GDPR has made it a hot topic.
We are often asked to respond to media articles, some of which are sensationalist, in order to allay customer concerns. Those concerns almost all relate to the issues covered by Brad Smith in Tools and Weapons, and hardly any relate to our time and attendance tool.
Face recognition is, at a certain level, a controversial subject. Smith explains how Microsoft wrestled with that controversy and decided to publish the six principles that guide the company’s involvement in face recognition.
I am pleased to say that ShopWorks complies with all of Microsoft’s principles, and did so before we even knew they existed, because what we do is such a restricted use of the technology compared with the applications Microsoft were reacting to when they drew up their principles.
For a time and attendance system to work, a reader is installed locally and the employee is registered on the device. When that person attends work, they simply present their face to the reader and they are “clocked in”. The reader then sends a simple message to the time and attendance software, in this case ShopWorks, recording that the employee has clocked in and the time.
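To illustrate just how little the software receives, here is a minimal sketch of that clock-in message. The field names and format are my own assumptions for illustration, not ShopWorks’ actual protocol; the point is what is absent — no photograph or face template ever leaves the reader:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
import json

@dataclass
class ClockInEvent:
    """A hypothetical clock-in message: an employee ID, a reader ID
    and a timestamp. No biometric data is included - the face match
    happens locally on the reader itself."""
    employee_id: str
    reader_id: str
    timestamp: str

    def to_json(self) -> str:
        return json.dumps(self.__dict__)

# The reader recognises a registered employee and emits the event.
event = ClockInEvent(
    employee_id="EMP-0042",           # illustrative identifier
    reader_id="STORE-7-BACKDOOR",     # illustrative reader name
    timestamp=datetime.now(timezone.utc).isoformat(),
)
print(event.to_json())
```

The time and attendance software only ever sees who clocked in and when.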
Now consider some of the concerns that are represented in the press, that Brad Smith addresses in his book and that Microsoft’s principles are designed to combat: bias, mass surveillance and the collection of data without permission. I thought I would use this blog post to explain to existing and future customers of ShopWorks why these discussions are a million miles away from a time and attendance system that uses biometrics.
The main public and media concerns largely relate to artificial-intelligence-driven systems that can access enormous numbers of digital images of faces, collected either online (say, via social media) or offline (say, using CCTV). Issues with analysing huge collections of face images include:
Mass surveillance: This is the underlying cause of public concern – systems that analyse millions or billions of photos, often without permission. They could be looking for terrorists, or trying to find out if you have been in a store before in order to offer you a special deal. With a time and attendance system, the face recognition reader only records the faces of people who work in that location or department. We have readers with just five people registered. It is definitely not mass surveillance.
A time and attendance system is a long way from this – the person being recorded on the face recognition scanner has to actively register and give their consent. The whole process is regulated under the General Data Protection Regulation (GDPR), and at ShopWorks we share our legal advice with all our customers so they can be sure they are compliant. No data is collected without permission.
Bias: There have been many examples where the artificial intelligence (AI) used to sort and categorise photos – for instance, to identify criminals – has shown a bias against a race or gender. These biases are thought to be caused by researchers from similar backgrounds introducing their own biases into the AI when training it to recognise what a criminal supposedly does and does not look like.
The public is quite rightly concerned that large organisations will then use this against them unfairly. When you compare this with a biometric reader for time and attendance, you can see we just don’t have this issue. Firstly, there is no AI making a judgement: the reader is giving a simple yes or no answer to the question “is this the person who registered their face with me and told me they were called John Smith?” It asks that question against a database of tens of people who work in the same location and have given their consent by self-registering.
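To make that yes/no answer concrete, here is a simplified sketch of the kind of 1:N match a reader performs against its small, self-registered enrolment database. The template representation (a short feature vector), the names and the threshold are all illustrative assumptions, not how any particular reader is implemented:

```python
import math

# Hypothetical enrolment database: a handful of employees who
# self-registered on this reader, each with a stored face template
# (represented here as a short feature vector for illustration).
enrolled = {
    "John Smith":  [0.12, 0.85, 0.43, 0.91],
    "Priya Patel": [0.77, 0.10, 0.66, 0.20],
    "Marco Rossi": [0.05, 0.54, 0.88, 0.33],
}

def euclidean(a, b):
    """Distance between two templates."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(captured, threshold=0.25):
    """Return the enrolled name whose template is closest to the
    captured one, or None if nobody is close enough. A simple
    yes/no match against tens of consenting people - no inference
    about character, intent or anything else."""
    best_name, best_dist = None, float("inf")
    for name, template in enrolled.items():
        d = euclidean(captured, template)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None

print(identify([0.11, 0.84, 0.45, 0.90]))  # close to John Smith's template
print(identify([0.99, 0.99, 0.01, 0.01]))  # a stranger: no match
```

Note that the only possible outcomes are “this is a registered employee” or “no match” – there is nothing here for a bias to act on in the way the mass-analysis systems above are criticised for.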
The time and attendance system is a closed loop of people; the mass surveillance systems causing concern are looking at random, anonymous faces and trying to make a decision about some aspect of their character. There is no bias in a time and attendance biometric reader.
A lot has been written about the concerns around facial recognition, and hopefully we have given you a sample of the issues here. These concerns are often applied to facial recognition as a whole, so that people are in danger of believing that all face recognition is bad. As we have hopefully explained, closed-loop, consent-based time and attendance systems do not share any of the risks that are creating so much debate. If you would like to know more about our biometric T&A (time and attendance) systems, please contact us at firstname.lastname@example.org.