Should technology be based on some set of moral values?
Actually, technology is always based on some set of values.

There is no denying that, one way or another, technology is a vehicle for some kind of values. Whether these are capitalist, ordoliberal, digital Leninism, some form of digital sovereignty, or something else entirely is a different question. Those values may simply be human rights, or European values, or something else. But let’s just hope the set of values is appropriate for the end use. The question is what that set is supposed to be. At their core, the values in question may prove useful at the level of technology standards.

It’s something we tackle (with Amelia Andersdotter) in our latest scientific paper, published in Internet Policy Review. We speak of technology standards, cybersecurity, privacy, technology policy, and regulation. And we put it all in the context of values.

As we write:

“Technology standards define processes for or features of technologies. They support interoperable implementations by different producers, and may serve as security or safety guarantees.
Formal, government-driven standards developing organisations (SDOs) traditionally serve as platforms for compromise between existing national standards. In the past decade, the recognition of the driving role of SDOs in innovation has made them surface into the public debate. The influence of the work overseen by non-formal, industry-driven standards bodies including the W3C, the IETF or the IEEE is increasingly acknowledged as having both direct and indirect impacts on societies, modes of work, technology policy or politics.”

When you use technology, it is always an expression of some set of values.

It may uphold freedom of expression. It may support security and privacy. It may help in tackling criminality, or in suppressing dissent. It may allow the decryption or hijacking of communications.

Finally, it may technically enable censorship, or it may not. It’s all a question of values.

While there exists some criticism of values-driven technology design, our core point is that technology is already a vehicle for values, whether we want it or not. In Europe, such European values are human dignity, freedom, equality, solidarity, citizens’ rights, and justice (according to the Charter of Fundamental Rights); the European Commission also adds democracy, the rule of law, and human rights to this mix, and we should of course keep universal human rights in mind as well. These values find their expression, for example, in regulation: the GDPR, the Digital Services Act, or the Artificial Intelligence Act. It’s again all a question of values. It so happens that the core internet technology standards organisations, the IETF and the W3C, are also well aware of the question of values and ethics. There is also the important issue of the priority of constituencies: according to the IETF, the internet is for the end users; in the case of the W3C, users are likewise the priority.

So we conclude: “We reflect that this focus on the end-user - specifically the human - is a vehicle which lends itself to shaping technologies with human dignity, and more generally with human rights and values”.

While we are keenly aware of criticism of embedding values into technology design, we easily set such criticism aside because, whether we want it or not: “this connection is already the practical reality. It also has strong historical precedence, for instance in cases where technology standards are used to facilitate free trade (we return to this point below). The link between values and technologies is assumed in formal standardisation organisations (such as ETSI, CEN/CENELEC, ISO) in order to accommodate for geopolitical and corporate diversity. The European framework for formal standards expressly revolves around the realisation that technical standards are used to achieve industrial policy goals (for example, they can be used to maintain non-tariff barriers to trade between EU member states) … Links exist also in other areas such as sustainability, where technical standards that assume policy goals are energy classifications for electrical equipment”. In other words, technologies are already shaped by values. The question is: what values, and how.

Sometimes, such values-inspired thinking fails and we all suffer: “The W3C Do Not Track standard (Tracking Protection Working Group Charter (disablement of the group), 2019; Safari 12.1 Release Notes (removal of Do Not Track), 2019) is a less successful example of values shaping technologies. In spite of a European law enacted to specifically encourage the type of privacy-protections planned for development in this working group (Directive 2009/136/EC, Art 3.5), the endeavour did not successfully engage industry, policymakers, or the public”.

This can be summarised with: “Inclusion of privacy at the design phase of technologies is a positive example of cooperation between regulation and technologies. The case of Do Not Track is a negative example of what happens when technology and policy do not work in sync. These real-world technology policy examples highlight that furthering the case for values in technology design relies not only on legal text, but also on regulatory backing”.
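To make the Do Not Track example concrete: technically, the standard boiled down to a single HTTP request header that any server was free to honour or ignore. Here is a minimal sketch of honouring it (my own illustration in TypeScript on a Node.js server, not code from the paper or from the standard itself):

    // The W3C DNT mechanism: browsers sent "DNT: 1" when the user opted out
    // of tracking. Nothing in the protocol forced a server to comply; the
    // value (privacy) was easy to express, hard to enforce.
    import { createServer } from "node:http";

    const server = createServer((req, res) => {
      const dnt = req.headers["dnt"]; // "1" means the user asked not to be tracked
      if (dnt === "1") {
        // Honour the preference: skip analytics, tracking pixels, etc.
        res.end("Tracking disabled for this request.");
      } else {
        res.end("No DNT preference expressed.");
      }
    });

    server.listen(8080);

The simplicity is the point: the value was trivial to express technically, but without industry and regulatory backing nothing obliged anyone to respect the header.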

All of this has consequences for cybersecurity and privacy, but also for the accessibility of technology.

For example, below I repeat some of the values-inspired interventions in technologies discussed in the paper:

  • Access to telecommunication content: ensuring lawful intercept capabilities in telecommunications networks by influencing the use of encryption in hops between networks, or the use of identifiers inside networks
  • Accessibility: upholding the W3C Web Content Accessibility Guidelines (WCAG); technologies and websites designed with disabled users in mind; assistive technologies (see the short sketch after this list)
  • On-demand operating system modifications (French policymaker pressure over contact tracing; US requests to unlock iOS devices in a judicial probe): no effect
  • Privacy in the design phase
  • Reversible encryption / on-demand decryption
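
As an illustration of the accessibility item above, here is a small browser-side sketch (my own, not from the paper) for one of the simplest WCAG requirements, text alternatives for images (success criterion 1.1.1):

    // Hypothetical TypeScript sketch: list <img> elements without a text
    // alternative, a basic WCAG 1.1.1 check. Real audits use far more
    // complete tooling; this only illustrates how a value (accessibility)
    // becomes a concrete, testable technical property.
    function findImagesWithoutAlt(root: Document = document): HTMLImageElement[] {
      return Array.from(root.querySelectorAll("img")).filter(
        (img) => !img.hasAttribute("alt")
      );
    }

    console.log(`${findImagesWithoutAlt().length} image(s) missing alt text`);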

Politics has an increasingly complex impact on technologies. Two examples:

  • In 2016, a California court ordered Apple to modify its operating system for law enforcement purposes, in a case that eventually drew the attention of the entire world.
  • In 2020, French politicians tried to convince Apple to modify its operating system in order to allow the pursuit of French sovereign ideas for digital contact tracing applications (link). Notably, those politicians invoked the concept of technological sovereignty.

All of the above are values-based attempts to influence technologies. Sometimes such links are quite complex and reach deep into technology design:

The GDPR foresees a place for the active involvement of industry players in defining their own data protection norms that preserve the high levels of data protection mandated by the law. At the same time, industry players lack the mandate (or legitimacy) to establish what these norms should be. The specific interpretation of norms codified in the GDPR is subject to decisions by data protection authorities in the EU member states and ultimately the Court of Justice of the European Union. Subject matter experts in protocol, web browser, or radio technology design are not legal experts and may not be helped by high-level process standards for assessing data protection features when developing new technologies. They could be assisted by templates that are developed for the purposes of helping standards development in their respective organisations.
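What such a template could look like in practice is left open; purely as a hypothetical illustration (the field names below are my assumptions, not the paper’s proposal or any SDO’s actual form), it might resemble a simple structured checklist:

    // Hypothetical data-protection review template for a standards working
    // group, expressed as a TypeScript type. Illustrative only.
    interface DataProtectionReviewTemplate {
      standardName: string;         // the draft protocol, API, or feature
      dataCollected: string[];      // identifiers, content, metadata, ...
      purpose: string;              // why the data is needed at all
      minimisationApplied: boolean; // is collection limited to that purpose?
      retention: string;            // how long data is kept, and by whom
      legalReviewNeeded: boolean;   // escalate to legal experts / regulators?
    }

    const example: DataProtectionReviewTemplate = {
      standardName: "Example-Telemetry-Extension",
      dataCollected: ["device identifier", "usage counters"],
      purpose: "interoperability debugging",
      minimisationApplied: true,
      retention: "30 days, implementer-controlled",
      legalReviewNeeded: false,
    };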

What we do not exactly like is how the European Union approaches the harmonisation of technology standards. Specifically, it takes a long time to pick up work done at standardisation bodies, re-standardise it, and then bless it with a regulation. European technology standardisation regulation can do better than this.

We end with a sober realisation:

“The apparent failure of the W3C Do Not Track effort highlights the shortcomings stemming from the lack of European coordination, unity, and ability of rapid work. The EU continues moving at a slow pace while the outside efforts accelerate … The previous major success of European technology standardisation approach in the field of technologies, the popularisation of the 2G telecommunication standard, dates to the 1990s, and could only occur in a radically different technological landscape that was much more nationally fragmented than it is today”

So 2G, and now GDPR. Those are the success stories. Now what?

Some recommendations:

  • The European Union must simplify its current policy of re-standardising already accepted standards developed by other stakeholders
  • Other simplification measures could consist of relying less on formal certification and certification bodies, and more on self-certification initiatives
  • The European Union must create a dedicated, independent, long-term, and resilient approach to technology standardisation. The US virtually created the internet itself and maintains influence over the technical internet architecture through powerful American technology companies, despite the lack of an official state approach to technology policy. The Chinese medium-term technology standardisation plan is a recent example of a state prioritising technology standards as a strategic area of interest, while European influence on 2G standardisation is today all but forgotten.
  • The European Union must work out how to structure its influence over technology standards in practice. The composition of working groups within technology standards organisations follows a completely different model from that of elective democracies. This highlights the importance of early, consistent, and long-term involvement. Promoting activities of this kind would require a policy of encouraging participation from individuals or organisations well positioned to analyse proposed technology standards with European values in mind. Such a policy of active participation should build on a strong understanding of how standards, technology, and technology assessment operate, including in specific domains such as security, privacy, and accessibility, or in even broader ones such as human rights assessments. It must also encompass the adaptation of Regulation 1025/2012 on European standardisation.

The main take-away for Europe:

“Likely, a dedicated technology standards unit or agency must oversee or drive the activities. Such an agency should not be tasked with any particular enforcement tasks. Rather, it should focus on oversight, research, development, design, and coordination of activities with the EU member states”

Summary

Technologies have an impact on policy and politics.

But so do policies and politics - they have an impact on technologies.

This impact is based on some values. The key question is how to choose the right set of values. What happens when values clash is another story.

And this was a great piece of research work.

Did you like the assessment and analysis? Any questions, comments, complaints, or offers of engagement? Feel free to reach out: me@lukaszolejnik.com