
Vicky is one of the creators of CLGdotTV, where she also produces and presents programmes. She has 25 years' experience of delivering projects in and for public sector organisations, including government departments, local authorities, the NHS, and professional associations. Much of her work has been around digitally enabled innovation and improvement.

@vickysargent

Public services should be mindful of bias and discrimination in tech and data deployments


Image: © CLGdotTV

Former Camden councillor Sally Gimson, digital consultant Alex Fefega, and Adelade Adade of Sleuth Co-op discuss some unintended consequences of tech and data deployments in public services.

All too often, tech has been used by public services to achieve back-office efficiencies while overlooking the citizen, so that the resulting digital services actually make people's lives more, rather than less, difficult. And for people who do not know how to use digital tools, or who lack the means to connect to them, tech creates discrimination and disadvantage.

Equally, public services like councils tend to collect the data that might lead to savings (waste collection, parking) but not the data that reveals the need for costly interventions (domestic violence, child grooming).

The most difficult issues, however, are those around the data that digital systems collect and potentially share. People are rightly suspicious of the state holding their data and of what it might do with it, especially data that can be used to predict an individual's future health or the likelihood of criminal and other behaviours.

Organisations are pressing ahead with systems that raise many ethical issues without having the necessary, and difficult, conversations about their use with the public that they serve. The approaches adopted to deal with the challenging issues around human fertility could be a model for addressing concerns about the use and abuse of personal data. Existing safeguards provided by measures like GDPR do not go nearly deep enough.
