Ethics can be defined as the moral principles governing the behaviour of a person or an organisation. Ethical standards help us to understand and guide our behaviour, providing a system of moral principles and perceptions about right and wrong.
Digital transformation has brought about new ethical challenges related to how we interact with each other and with digital technologies, and how digital technologies interact with and affect us. It is important to understand that any technology, at its core, was created by humans. That is why the design of any piece of code, an app, a piece of wearable technology, or a smartphone should be grounded in ethical standards. We need a system of moral principles about what is right and what is wrong in the context of digital transformation.
Digital ethics (also known as information ethics) is defined as a branch of ethics that focuses on the relationship between the creation, organisation, dissemination, and use of information, and the ethical standards and moral codes governing human conduct in society.
You might think that digital ethics primarily relate to our behaviour online or when using digital technology. Indeed, digital ethics are connected to how we communicate, treat others, portray ourselves, and protect ourselves online. It is true that we are responsible for our own ethical behaviour and so-called netiquette. However, digital ethics should also be seen as an overarching set of principles for creating digital technologies in an ethical way.
You might have heard of examples of unethical practices such as algorithmic profiling, misinformation, or manipulative design. Ethical technology design should be driven by ethical standards and grounded in the best interests of young people.
As the 5Rights Foundation argues:
‘Companies must design digital services that cater to the vulnerabilities, needs, and rights of children and young people by default. To fulfil its potential, digital technologies must be directed towards helping children and young people to flourish. Retrofitting safety features into a service only after under 18 year-olds have experienced harm or allowing their rights to be routinely undermined is simply not good enough. Meeting the needs of childhood development or delivering on children’s rights is not optional. Governments and policymakers need to prioritise the development of robust standards for the design and development of digital technology, and regulate to require that children’s safety, rights, and privacy are upheld by design and default.’
We believe that ethics are central to digital transformation. As the European Commission argues, digital and AI systems need to preserve and promote:
- respect for human agency;
- privacy, personal data protection and data governance;
- fairness;
- individual, social, and environmental well-being;
- transparency;
- accountability and oversight.
The aim of this section is to explore the importance of digital ethics and its impact on the future of youth-centred digital transformation. Additional resources on digital ethics can be found in the Resource Pool.