School of Human Sciences
Institute of Lifecourse Development
University of Greenwich
London SE10 9LS
New research: Face recognition accuracy with and without face masks
Professor Josh P Davis (University of Greenwich)
We regularly create new face recognition tests, as we only use each test once with super-recognisers (if we use a test a second time, super-recognisers often remember the photos). For about five years we have had an open invitation on the website to supply face images and recorded voices (depending on the research).
Without stimuli we cannot conduct research. We are always grateful to anyone who helps us.
We need loads more images in 2021 for the new research below, please!
Most volunteers who provide stimuli are also volunteers taking part in our research. At the bottom of this blog is information about ethics, GDPR, privacy, and anonymity, and about how we protect participants' rights if they supply such stimuli.
Most important to us is that anyone who takes our tests can be assured that the people depicted provided full informed consent. There has been a lot of publicity around some computerised face recognition research being conducted without this guarantee. We want to make sure we are following best research and ethical practice.
We are planning new research examining the impact of face masks on face recognition accuracy. We would very much appreciate your help: please take a few selfies with and without face masks, and supply some other non-selfie photos as comparison images (e.g., Facebook/social media photos or printed photos, even wedding photos; we can crop out anyone else depicted).
All participants who submit at least 8 images of suitable quality will be entered into a monthly £20 Amazon prize draw.
Please click HERE to participate, where you will find more ethics information.
Thank you for your support. Josh Davis
Any questions please e-mail: email@example.com
Our participant database and stimuli database: Ethics
Professor Josh P Davis is a Chartered Psychologist and an Associate Fellow of the British Psychological Society (BPS) and is therefore bound to follow BPS ethics policy (https://www.bps.org.uk/news-and-policy/bps-code-ethics-and-conduct). He is also a member of staff of the University of Greenwich and all projects are approved by the University of Greenwich Research Ethics Committee (UREC) (https://www.gre.ac.uk/research/governance-and-awards/research-ethics-committee), although collaborations with other universities may first be approved by their ethics board and noted by UREC.
All research data stored by the University of Greenwich is retained on a password-protected database and is compliant with General Data Protection Regulation (GDPR) requirements, as tailored by the Data Protection Act 2018 (https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/) and the University of Greenwich data protection policy (https://docs.gre.ac.uk/rep/vco/data-protection-policy).
The University of Greenwich will not share any details about anyone (e.g., test scores) with any other organisation unless participants provide explicit consent each time and the sharing has been approved by the other university's ethics committee and UREC.
If your data is stored by us and you wish us to delete your data and/or stop sending you e-mails, please contact firstname.lastname@example.org. We would need your anonymous code and/or e-mail address to identify your data.
Data protection and anonymous codes
Please note that under GDPR (and normal ethics processes) we are only allowed to store data that is freely consented to and has an express purpose (in other words, if there is no reason to store data we are not allowed to, and even where there is a good reason, data should not normally be identifiable). We must inform anyone whose data we store exactly how long and where we will store it. However, we normally anonymise most research data soon after initial analyses, so even if you entered our university-issued anonymous code when completing a project, we might not be able to identify your data.
There are almost 45,000 people on our volunteer database (which has two parts) from all around the world. Each is only “identifiable” by an anonymous code. In other words, they are not identifiable.
Our two databases are password-protected and only Josh Davis and his research assistants have access. They are university employees who have signed appropriate contracts and been trained in data protection. We change passwords regularly.
E-mail addresses (which can contain names) are matched only with the anonymous code in one database. These are kept separate from all other stored information to protect privacy.
In the second database we have scores on up to four face recognition tests (all of which contain white faces only): Cambridge Face Memory Test, Glasgow Face Matching Test, Could you be a Super-Recogniser Test?, 30-60 old-new Adult Face Recognition Test.
If voluntarily provided in addition: gender, ethnicity, age (we update age each year automatically), and country. About four-fifths of participants have provided us with demographic information. This means we can target certain groups for specific research invites, although this is rare.
Note: here we also retain information about which recent projects participants have been invited to, so that we do not send invites to the same volunteers repeatedly or, indeed, miss any volunteers out. However, we do not store information about which research projects volunteers contributed to.
Fewer than 5,000 volunteers meet our super-recogniser criteria. We also need to invite controls to our research. Many participants might best be described as slightly below super-recogniser level, but we have volunteers with prosopagnosia on the database too; they are interested in our research.
We sent everyone on the database an e-mail when GDPR came into law in 2018 to ensure they were happy with us retaining their data. Anyone not responding was removed.
At any time, volunteers can e-mail us to ask for their data to be removed.
Invites to research (to retain anonymity)
We always randomly select a subset of participants to receive an invite to any research project (sometimes, but rarely, based on specific inclusion criteria; even then, invitations of those meeting the criteria are still random). From one database, we generate a list of anonymous codes. This database must then be closed before we open the second database to match the anonymous codes to a list of e-mail addresses.
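The two-step selection procedure above can be sketched in code. This is a minimal illustration only: the database structures, field names, sample sizes, and score threshold are hypothetical, not the lab's actual schema.

```python
import random

# Hypothetical stand-ins for the two separate databases: one holding
# test scores against anonymous codes, one mapping codes to e-mails.
scores_db = {f"SR{i:05d}": {"cfmt": random.randint(40, 72)} for i in range(1000)}
email_db = {code: f"volunteer{i}@example.com" for i, code in enumerate(scores_db)}

# Step 1: with only the scores database open, apply any inclusion
# criteria and draw a random subset of anonymous codes.
eligible = [code for code, row in scores_db.items() if row["cfmt"] >= 65]
invited = random.sample(eligible, k=min(100, len(eligible)))

# Step 2: with the scores database closed, open the e-mail database
# and resolve only the sampled codes to addresses for sending invites.
invite_emails = [email_db[code] for code in invited]
```

The key design point is that at no stage are scores and e-mail addresses open side by side: the selection step sees only codes and scores, and the mailing step sees only codes and addresses.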
In some research projects, we ask participants to enter anonymous codes (note that some people enter invalid codes, so we cannot always match them up). In these projects, we always ask for additional consent to access old data stored on the database. This is always specified in advance. We access the data in one go at the end of the project, using the method described above in the opposite direction. In other words, we cannot match e-mail addresses to the data we need for the project.
We never add information acquired in new research to the database unless we specifically ask for an extra level of consent in advance. We have not done this for a year or two, although we intend to ask in a couple of new projects in 2021, as new, more reliable tests have been developed.
Once the data is matched up and extracted, and the project finished, we completely anonymise our project-specific data as described above (in other words, we remove the anonymous code from each project-specific database). This means that if a volunteer asks for their score, we will not be able to help.
Importantly, in other projects we do not ask participants to enter their anonymous code; indeed, we specifically ask them not to. This is because we have no reason to match up old data with new data, and it is a requirement of GDPR and ethics. These projects often involve non-face-recognition research. In these, we ask participants to make up a brand-new personal code, increasing their privacy while allowing them to retain their ethical rights (i.e., to withdraw data later).
Current photo upload project
By 18 December 2020, all 45,000 volunteers on the database had been invited to provide images (many as part of an invite to contribute to other research; it is a little extra box at the bottom of the invite). We also have an invite on the website. Because there are so many people on the database, there is virtually zero possibility of us identifying anyone who provides images.
We still need loads more images.
Participants make up a personal code when uploading photos – we do not use their normal anonymous codes.
If someone wants to enter the £20 draw, they can enter their e-mail address at the end. However, this is at an entirely separate URL, which makes it impossible for us to match the e-mail address back to their photos.
All images are anonymously coded on the database, linked only by a common code used to match the images supplied by each individual (e.g., someone's face mask images could be stored as FM001_1, FM001_2, etc.). We will retain age, gender, and ethnicity information against FM001 for quick searching, if, for instance, we want to find two people from the same demographic who might be mistaken for one another.
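The coding scheme described above can be sketched as follows. This is a hypothetical illustration: the function name, storage layout, and demographic fields are assumptions, and the example entries are invented, not real contributor data.

```python
# Hypothetical sketch of the anonymous image-coding scheme:
# image filenames carry only a contributor code and an index.
images = {}        # filename -> contributor code
demographics = {}  # contributor code -> voluntarily supplied info

def store_images(contributor_code, n_images, age=None, gender=None, ethnicity=None):
    """Record n images under codes like FM001_1, FM001_2, ..."""
    for i in range(1, n_images + 1):
        images[f"{contributor_code}_{i}"] = contributor_code
    demographics[contributor_code] = {
        "age": age, "gender": gender, "ethnicity": ethnicity,
    }

# Invented example contributors.
store_images("FM001", 3, age=34, gender="F", ethnicity="White British")
store_images("FM002", 2, age=35, gender="F", ethnicity="White British")

# Quick demographic search: find contributors from a similar
# demographic who might plausibly be mistaken for one another.
matches = [code for code, d in demographics.items()
           if d["gender"] == "F" and abs(d["age"] - 34) <= 2]
```

Because nothing in the stored records links a code such as FM001 to a name or e-mail address, a demographic search like this can be run without any identity ever being recoverable.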
Images then may be used in various online tests in future – there are different levels of consent (e.g., we sometimes show images in the media for educational purposes and we ask for extra consent for this in advance).
Most of the other face recognition tests developed around the world display students. There are probably tens of thousands of research projects conducted every year by psychology students using such stimuli, let alone the hundreds, if not more, that are published. In other words, those who created them know exactly who contributed. We believe that our processes provide far greater anonymity and privacy guarantees. As noted in the instructions, "only someone who knows you might recognise you. However, we are only interested in unfamiliar face recognition".
We really do not want to know the true identity of anyone who supplies us with images. If images are ever viewed by, for instance, police officers taking part in our research, they are fully aware that volunteers supplied the images; none are criminals.
Please ask questions at email@example.com if any of the above is unclear.