Most of the work we do falls outside data protection legislation because it does not involve processing personal data. However, when people express concerns about our work, “data protection” is a phrase we frequently hear. For this reason we’ve created this page to set out how we operate and how we manage risk and privacy. We’re a business that handles data, and we’ve assessed the impact of that work carefully against a set of firm principles.
1/ We are open about what we do, and we actively work to inform the public about our work. Quite possibly you’re reading this because you’ve seen some information or imagery that we’ve published and have questions or concerns. If you still feel this way after reading the rest of this page, please talk to us. We are never secretive about what we do or why.
2/ When we talk about data we use plain English. We don’t use acronyms or jargon, because we don’t think that helps. Occasionally this can make us sound a little blunt, but in person we do our best to be approachable and to make time to talk to the public. Collecting information to help manage cities is a necessary and appropriate thing to do, but we appreciate that at times people will want to ask questions.
3/ We don’t publish or produce personal data. Our business is about taking images, usually video, and turning them into anonymous information as close as possible to the source. Most of the information we publish supports transport projects or the management of town centres; it generally records that “a person” at “a time” moved through “a space”. It isn’t about you. Occasionally we publish images or small video clips to explain to the public or our customers what the data is. When we do this we ensure people can’t be recognised, and we remove any timestamp so the images can’t be tied back to any other information. If for any reason you think we haven’t done a good enough job at this, please tell us. We assume that in a connected world it is impossible to stop data or images crossing borders; for that reason we work very hard to anonymise everything we produce.
4/ We don’t share responsibility for handling data. We work hand in hand with many other organisations, but we don’t pass on half-processed data or rely on others to get things right. When we take in data, we take responsibility for storing it securely and removing any information about individuals as quickly as possible before sending it back out again; sometimes this takes only a few seconds.
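To make the “anonymise at the source” idea above concrete, here is a minimal, purely illustrative sketch, assuming a simple counts-per-mode output; the record fields, mode labels and function names are invented for this page and are not our actual software.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Illustrative only: a hypothetical sketch of reducing one video frame's
# detections to anonymous counts. The frame itself is never stored, so no
# personal data leaves this function. All names here are invented.

@dataclass
class AnonymisedRecord:
    interval_start: datetime  # coarse time bucket, not an exact timestamp
    mode: str                 # "person", "cyclist", "car", ...
    count: int                # how many moved through the space

def process_frame(detections, interval_start):
    """Reduce raw per-frame detections (a list of mode labels) to
    counts per mode, discarding everything else."""
    counts = {}
    for mode in detections:  # e.g. ["person", "person", "cyclist"]
        counts[mode] = counts.get(mode, 0) + 1
    return [AnonymisedRecord(interval_start, m, c) for m, c in counts.items()]

records = process_frame(
    ["person", "person", "cyclist"],
    datetime(2024, 1, 1, 16, 0, tzinfo=timezone.utc),
)
```

The point of the sketch is the shape of the output: a time bucket, a mode and a count, with nothing that could identify anyone.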
5/ Our objective is privacy by design: creating information to support a better public realm, where possible without touching anyone’s personal data at all. Cameras sited at height, covering a large area top-down and linked directly to software that doesn’t record footage, are in use today, providing real-time data to help manage city centres.
When we install permanent cameras we site them very carefully to avoid, where possible, capturing anything that would identify individuals. Cameras mounted lower down, for example on street furniture, are more likely to capture personal data, and we take this responsibility very seriously.
6/ We don’t process biometric data, nor do we perform “facial recognition”. Our software can recognise “a person” or “a cyclist” from the shape of their body without needing high-resolution imagery, but it can’t recognise individual people or individual vehicles. When we “track” a person or a vehicle through a space, we do so only for a few metres to establish direction, speed or overall movement patterns for a street. We don’t track you, and we wouldn’t, for example, record the fact that you, or anyone else, had moved from one particular shop to another.
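The short-range “tracking” described above amounts to taking a handful of positions and reducing them to a speed and a heading, after which the track itself is discarded. The following is a hedged sketch, assuming ground-plane coordinates in metres and an invented frame rate; it is illustrative only, not our production software.

```python
import math

def summarise_track(points, fps):
    """Reduce a short track of (x, y) ground-plane positions (in metres),
    sampled at `fps` frames per second, to an overall speed in m/s and a
    heading in degrees from the x-axis. Only this summary is kept; the
    positions themselves are then thrown away."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    seconds = (len(points) - 1) / fps
    speed = math.hypot(dx, dy) / seconds
    heading = math.degrees(math.atan2(dy, dx)) % 360
    return speed, heading

# A pedestrian crossing a few metres of street, sampled once per second:
speed, heading = summarise_track([(0.0, 0.0), (1.4, 0.0), (2.8, 0.0)], fps=1)
# speed is 1.4 m/s, heading 0 degrees
```

Nothing about who the person is survives the summary, which is the point: a few metres of positions become one speed and one direction.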
7/ “Artificial intelligence” is used to reduce images to nothing more than a pattern of anonymised movement. In some circumstances siting a camera to capture data is difficult, or we are asked to use footage collected from cameras originally installed for surveillance purposes. We use technology to remove personal data, or to ensure we never capture it; for this reason we have assessed the use of smart software as very low risk.
8/ For short-term traffic surveys we will not record footage for longer than seven days, and we treat that footage as potentially containing personal data. When we have been asked to carry out a short-term survey of an area to establish movement patterns, once we have finished processing the original footage into anonymised data there is unlikely to be any need to retain it, in which case it is disposed of.
For short surveys, cameras are typically deployed around 8 metres off the ground. Footage from these cameras is more likely to contain personal data, for example when people and vehicles are close to the camera.
9/ We do not ask the consent of people who move through the areas we monitor; to do so would be impossible and intrusive. For those with experience of using video in the film production industry this can seem strange. We operate within the legal framework applicable to our industry and take our responsibilities very seriously. Whilst a film editor’s job is to enhance and highlight images of individuals, ours is the opposite. Our legitimate interests as a business supplying data on transport, mobility and the management of public space are sufficient to allow us to do what we do: we support public and private bodies in the management of public space and transport infrastructure, and we raise awareness of the use and benefits of machine vision in the built environment.
10/ We discourage merging our data with personal data. Sometimes when we deliver data to a client, we know that somewhere there may be someone with the ability and resources to turn it into personal information; for example, they might hold other information, such as stored video footage, which could be recombined to create something different. The responsibility to control this, and if necessary prevent it, sits with them. If you are considering taking data produced by us and attempting to do this, you should think very carefully about the responsibilities this entails. Given the types of activities happening in the areas we work in, we assess the risk of this happening as low.
Example 1:
We install cameras with a view of a public space. These do not capture recognisable images of individuals, nor do they record footage, but they do record data on time, speed and mode of travel. Cycling is banned in the area, although some people do ride through the street, and a regular fast and furious 4pm cyclist is observable in the data. The risk exists that, if the data is openly available, someone with access to their own video footage will attempt to identify the fast 4pm cyclist. Photographs of Jane, who rides through the area regularly at 4pm, are extracted from third-party footage, and she is featured on social media as an example of antisocial British youth. We would assess the likelihood of this happening as low and the consequences to Jane as relatively mild, but we would still accept that our technology creates the risk of this happening, even though we ourselves never handle or process any of Jane’s personal data. We recognise that openly available data covering longer periods, or recording behaviour which might be perceived badly, needs careful consideration by our clients. We also recognise that the availability of large datasets created using methods such as ours puts an additional burden on those who do control personal data which might be linked back to it, such as a CCTV operator.
Example 2:
Whilst reviewing some sample footage from one of our rooftop cameras, we spot a highly distinctive orange car with a number 9 on the roof moving through the street, followed closely by a man riding a custom-built “tall” bike. Although both of these street users are clearly comfortable attracting attention to themselves, we recognise that we have inadvertently captured personal data from both vehicle users, as they are well-known local characters recognisable from their choice of vehicle. If we switch on statistical collection for this equipment, covering car and bicycle users, then this work would come under data protection legislation: even though our software works to anonymise the information, this is still classed as processing. However, we assess this as low risk. We would not retain or publish the sample footage showing these unusual street users; our review process would identify it as not suitable for retention, and it would be deleted.