Over the past twelve months or so, we have seen a big shift in the public attitude towards new technology. More people are becoming aware of the potential abuses of data and other emerging technologies. Scandals involving Facebook and other companies have been headline news.
Security professionals have been pushing the idea of security by design for ages, and the push to comply with GDPR has made a lot of people aware of privacy by design. Responsibility by design (RbD) represents a logical extension of these ideas to include a range of ethical issues around new technology.
Here are some examples of the technologies that might be covered by RbD, together with their benefits, dangers, and the principles that might govern them.
| Technologies such as | Benefits such as | Dangers such as | Principles such as |
|---|---|---|---|
| Big Data | Personalization | Invasion of Privacy | Consent |
| Automation | Productivity | Fragmentation of Work | Human-Centred Design |
| Internet of Things | Cool Devices | Weak Security | Ecosystem Resilience |
| User Experience | Convenience | Dark Patterns, Manipulation | Accessibility, Transparency |
Ethics is not just a question of bad intentions; it also includes bad outcomes through misguided action. Here are some of the things we need to look at.
- Unintended outcomes – including longer-term or social consequences. For example, platforms like Facebook and YouTube are designed to maximize engagement. The effect of this is to push people into progressively more extreme content in order to keep them on the platform for longer.
- Excluded users – this may be either deliberate (“we don’t have time to include everyone, so let’s get something out that works for most people”) or unwitting (“well, it works for people like me, so what’s the problem?”).
- Neglected stakeholders – people or communities that may be indirectly disadvantaged – for example, a healthy politics that may be undermined by the extremism promoted by platforms such as Facebook and YouTube.
- Outdated assumptions – we used to think that data was scarce, so we grabbed as much as we could and kept it forever. We now recognize that data is a liability as well as an asset, and we now prefer data minimization – collect and store data only for a specific and valid purpose. A similar consideration applies to connectivity. We are starting to see the dangers of a proliferation of “always on” devices, especially given the weak security of the IoT world. So perhaps we need to replace a connectivity-maximization assumption with a connectivity-minimization principle. There are doubtless other similar assumptions that need to be surfaced and challenged.
- Responsibility break – the potential for systems to be taken over and controlled by less responsible stakeholders, or for the chain of accountability to be broken. This occurs when the original controls are not robust enough.
- Irreversible change – systems that cannot be switched off when they are no longer providing the benefits and safeguards originally conceived.
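The data-minimization principle mentioned above can be made concrete in code. Here is a minimal sketch (the field names, retention period, and function names are hypothetical, purely for illustration): collect only the fields you have a declared purpose for, and delete data once its retention period has passed.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical schema: the only fields we have a specific, valid purpose for.
ALLOWED_FIELDS = {"user_id", "email"}
RETENTION = timedelta(days=30)  # illustrative retention period

def minimize(record: dict) -> dict:
    """Keep only the fields collected for a declared purpose."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

def is_expired(collected_at: datetime, now: datetime) -> bool:
    """Data held past its retention period should be deleted."""
    return now - collected_at > RETENTION

# Everything else (location, browser fingerprint, ...) is dropped at the door.
raw = {"user_id": 42, "email": "a@example.com", "location": "NYC", "browser": "Firefox"}
stored = minimize(raw)  # {'user_id': 42, 'email': 'a@example.com'}
```

The point of the sketch is that minimization is enforced structurally, at the point of collection, rather than relying on downstream good behaviour.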
Wikipedia: Algorithmic Bias (2017), Dark Pattern (2017), Privacy by Design (2011), Secure by Design (2005), Weapons of Math Destruction (2017). The date after each page shows when it first appeared on Wikipedia.
Updated 12 June 2018