By Yvonne Hofstetter
Photo: Anton Uniqueton / Pexels
Each of us has heard these questions: “What are your values? How do your values contribute to the success of your company? How have your values changed over time?”
Indeed, the topic of values is very popular today. A value embodies something good, something positive. Values can be moral, such as “non-discrimination against people”, or non-moral, such as “clean oceans” and a “healthy global climate”.
In technology, too, values are discussed more than ever and have stepped out of their niche existence. Addressing the question of values in technology is by no means a new discipline. Software and other system designers should at least have heard of value-based design, even if it seemingly lost relevance in everyday engineering practice. Today, our democratic societies suffer heavily from the earlier nonchalance of software programmers. Social media allow frontal attacks on users’ minds, and since the rapid rise of artificial intelligence (AI), the consequences of digital progress cannot be ignored any longer.
The demand for ethical AI has become ubiquitous
Since 2016, hundreds of multinational organizations, national governments, NGOs and corporations have spoken out about AI. The documents on ethical AI, too, now number in the hundreds.
“There have been ethical AI initiatives since 2015,” notes Professor Sarah Spiekermann from the Vienna University of Economics and Business. “They all do the same thing. They write down some value principles without context and then say: Those are important to us.”
In fact, the hundreds of papers on ethical AI converge on largely the same claims. Representative of many, the EU’s High-Level Expert Group on Artificial Intelligence (AI HLEG) in 2019 called for seven characteristics of AI systems: subject to human supervision, technically robust and secure, respectful of privacy, transparent, non-discriminatory, socially and environmentally compatible, and accountable. This leaves many questions unanswered. Because most AI ethics committees set up by governments are made up of business representatives, with hardly any philosophers or ethicists, they do not make a clear distinction between system functionality and values. The result: a colorful jumble of claims that lacks any systematic approach.
“Privacy, transparency – these are hygiene factors. When I build a robot, it goes without saying that it should be safe. An autonomous drone should be reliable. For me, these are standard and hygiene values. I don't need an ethical standard just to realize that standard and hygiene values are important,” says Prof. Spiekermann, commenting on the quality of many initiatives.
The new IEEE 7000™ standard establishes a value-centric design process
The new IEEE 7000™ standard for value-based design sets itself apart from other initiatives. Coming into effect in September 2021, it implements a value-based system design process. IEEE 7000™ offers some genuine novelties, such as its claim to bring together what does not naturally appear to belong together:
"IEEE 7000TM is about value-based engineering to create a good, beautiful world, that is, about the rehabilitation of the good, the true, and the beautiful in our world, which can be promoted, among other things, by technology," explains Prof. Spiekermann, who chaired the standardization committee in her role as co-chair. "And it is about the question: How can we make technology fruitful for the good in the world?"
This triad, a model of Western thinking that already appears in Socrates, is to be put at the service of technological development. A new job profile is meant to ensure this: the Value Lead, who plays a crucial role in the value-based design process.
"People who have studied ethics or philosophy - maybe also literary scholars or lawyers - but who also have a certain technical understanding are best suited as Value Leads," says Prof. Spiekermann, motivating this new occupation. And she also explains why one cannot do without an education in humanities. IEEE 7000TM requires that the Value Lead structures values and explores them in the light of three theories of ethics before they are being prioritized and translated into system functionalities. Utilitarianism, virtue ethics and duty ethics go hand in hand to meet compliance requirements of IEEE 7000TM.
“Discourse ethics, value-sensitive design, ethics canvas… everyone formulates statements or puts sticky notes on the wall. Maybe the sticky notes are even being structured through an ethics canvas – all nice, but this is kid stuff.”
IEEE 7000™ pioneers meet in Vienna
IEEE 7000™ aims to professionalize this process, and so Prof. Spiekermann is offering training as a Value Lead for the first time in February 2022. The first session of the Pioneer Training: Value-based Engineering with IEEE 7000™ will take place from 16 to 18 February in Vienna. The pioneers’ group consists of highly experienced product and project managers from AWS, DATEV, Siemens and HPI. 21strategies sends Prof. Yvonne Hofstetter, who will also assess IEEE 7000™ as part of a research project. With its claim to revive a classical ideal, IEEE 7000™ may not be applicable to AI-powered systems for national security. But it is precisely that user group that has the greatest need for ethical AI.
“When writing the standard and looking at its DNA, the one thing I really didn't have in mind at all was that national security organizations could use IEEE 7000™,” admits Prof. Spiekermann.
But national security agencies are actively pushing the industry to make systems “smarter” and are urging greater and faster AI adoption. This makes the question of IEEE 7000™’s applicability increasingly urgent. Whether, in a global context, democratically legitimate states can ultimately be held to values that other nations do not share is not a task for system engineers, though, but for politicians. Still, the fact that a standard such as IEEE 7000™ was established globally is trend-setting, since standards are increasingly being defined and shaped by the new players of a multipolar world order.