The challenge of bias in policing data, tools and technologies has been well documented15, and has more recently entered the public consciousness, for example through growing debate around body-worn cameras16 and facial recognition technologies. Participants in our research showed high awareness of these issues, but generally felt the question was not whether to avoid certain technologies, but how to choose when to use them. This is despite high-profile exits from the policing facial recognition market in 2020 by vendors such as IBM, Microsoft and Amazon17.
This attitude also leaves unaddressed the challenges of bias in areas such as policing algorithms. Some participants did begin to identify these challenges, with one saying, “If we had algorithms that said 'Joe Bloggs, he has come from this underprivileged background, he lives in this area, he's been excluded from school, he has been involved in non-crime anti-social behaviour previously, therefore he's going to be on our systems as a potential criminal' - if that was then disclosed to [authorities] and potentially impacted someone's life, I think that's ethically wrong. Whereas directing police resources to a general area… I'd say that's very different because it's not targeted at any one individual”. However, this consideration still fails to recognise the potential for bias against groups or areas within the community, and the impact this may have on the relationship between the force and its citizens, particularly for groups that have historically felt targeted and over-policed.
Other technologies appear to be receiving closer scrutiny. For example, most forces we spoke to mentioned their use of body-worn cameras, which can result in bias, or perceptions of bias, depending on decisions such as the camera’s point of view and when the camera is turned on and off. Three forces in particular spoke about taking their body-worn video footage to independent advisory groups to be assessed by members of the public. These forces focused on getting feedback on how officers deal with situations, and on working with citizens to improve satisfaction with the force’s conduct. One participant said, "Body cams are really key for investigations, and they've reduced complaints massively." It is important that feedback from these advisory groups is taken on board within forces and embedded in their ways of working.
Overall, our research shows an inconsistent approach to examining potential bias in policing tools and technologies, with some technologies receiving considerable scrutiny and others comparatively little. Furthermore, there appears to be inconsistency in how ethical considerations of bias in technology and tools are managed, and in how decisions are implemented and communicated; we discuss this in more detail in the next sections.