Why charities must grasp the nettle of digital ethics

As we launch the Charity Digital Skills Report survey asking charities to give their views on the ethical challenges posed by technology, Zoe Amar considers how the charity sector should tackle some of its trickiest ethical questions around digital.

Guest Writer | 8th May 19

This article was written by Zoe Amar, Chair of the Charity Digital Code of Practice, Founder of Zoe Amar Digital and a trustee of Charity Digital.

Take The Charity Digital Skills Report survey. There are £200 of Amazon vouchers to be won by participants, and the deadline for responses is 24 May. The results will be released in June.


Carole Cadwalladr, the journalist who broke the story of how Cambridge Analytica harvested data from 87 million Facebook users and influenced the Brexit vote and the election of Trump, didn’t pull any punches in her recent TED Talk. She said that the technology which the tech giants had invented was “amazing,” but condemned Silicon Valley as a crime scene where liberal democracy had been broken.

Why should this matter to charities? If you’re running a social welfare charity in Birmingham, the world of big tech companies in California may seem like another planet. Yet the technology which these companies invented is interwoven into the fabric of our lives, from the smartphone in your pocket to the laptop on your desk. And it’s already affecting the people that you help. For example, if you work for a debt advice charity, algorithms will be involved in deciding whether your beneficiaries can get a loan or a mortgage.

That’s why in the latest Charity Digital Skills Report we’re asking charities to share their views on the ethical challenges posed by digital innovation, such as social media platforms’ use of data, digital inclusion and algorithmic bias. We also want to hear where you see risks in tech. Everything you tell us will help us map where charities see the opportunities and challenges in digital, and create a resource to help the sector.

The platforms and tools invented by big tech have changed the world, helping charities fundraise, communicate with more people and campaign at scale. I’m not suggesting we all burn our iPads and go off-grid. But we should still ask questions about how these organisations are using our data and influencing behaviour, and what this means for how we work and the people we support. Silicon Valley is under pressure to tighten up its ethics, with Facebook’s Mark Zuckerberg admitting that there should be stronger regulation of the internet.

So what are the key issues and how can charities deal with them? I asked two experts to share their thoughts.

Data, people and platforms

Rhodri Davies, Head of Policy & Programme Director, Giving Thought at Charities Aid Foundation, says that charities need to ask how they can give supporters “the benefit of personalised services based on data, but still give them confidence their data will not be used in ways they aren’t comfortable with.” He also thinks charities should be aware of how tech is affecting social interaction and communication, such as online bullying.

Tracey Gyateng, Data Science Manager at Datakind UK, says that, “Charities need to be aware that whilst social media companies provide fantastic platforms for engaging with service users and supporters, they are first and foremost private sector organisations, and whilst there is no/low financial cost to using their platforms, their business model is based on using the information that we willingly share to bring in advertising revenue.”

She highlights how small charities often rely on the platforms as their main communication gateway. Think about how many small organisations you know who are dependent on Facebook to manage volunteers or fundraise in their local communities. What would your charity do if a social media platform started charging, or deleted a service?

Gyateng also believes charities must understand how emerging technology could increase social inequality. She told me that, “As data-driven decision making becomes more embedded into society, charities need to be aware of how algorithms can lead to increased disparity, discrimination and bias. We’ve seen this in facial analysis software unable to detect a range of skin tones and facial structures, as highlighted by Joy Buolamwini. ProPublica’s Machine Bias article showed a racial bias against black people in risk assessment tools used by US courts. We also see this in algorithmic credit scoring.”

A good place to start with digital ethics is working your way through the risk and ethics principle in The Charity Digital Code of Practice.

What can charities do about these issues?

The first step is to acknowledge the challenge, and then manage it like any other risk. Davies advises charities to review their data and tech policies with ethics in mind, asking whether they cover how data can be used ethically, and how to assess whether digital products meet ethical standards.

Similarly, Gyateng thinks that due diligence is essential to ensure that partners on tech initiatives meet charities’ standards. Some charities are already working to improve this. Datakind UK have recently collaborated with the Association of Medical Research Charities on a framework of questions to help charities ensure their ethical positions are understood by tech partners. Meanwhile, the Wellcome Trust are looking at the ethics of data science and practical ways to help charities apply digital ethics to their work.

Tech offers huge opportunities to charities, and inevitably this comes with some risks. If we are to make good on our promise as a sector to create equality and fight for social justice, then we need to ask ourselves tough questions about how we are using digital, and push tech companies for the change we want to see.