Toward a pragmatic and social engineering ethics

Abstract This paper centers on a five-month ethnographic field study among engineers in a Danish collaborative industrial robotics project, to examine how the everyday work of engineers intersects with existing, formally-adopted engineering ethics approaches. Methods included a literature review of engineering ethics, participant observation in a technical research institute and in machine workshops, document and visual media analysis, object elicitation, and qualitative interviews. Empirical findings from this investigation are used to evaluate existing formalized engineering ethics in relation to engineering praxis. Juxtaposed with engineers’ everyday ethical decision-making practices, professional ethics approaches are shown to be based in deontological and virtue ethics, narrowly focused on the individual engineer as a professional, and thus inappropriate and insufficient for the very practical field of engineering. The author argues for an alternative direction toward a situated pragmatic and social ethics in engineering that disrupts the current social arrangement around robot development through ethnographic intervention in the engineers’ negotiation of values in the design process.

In February 2017, the European Parliament adopted a resolution [2] proposing civil law rules governing robotics and the makers of robot technologies. Years of research and debate preceded the resolution which proposed an insurance or taxation plan for robots, assigned liability to robot makers, and suggested a new code of ethics for robot engineers. During this same period, the European Commission dedicated 700 million euros to robotics research and development projects [3]. With such economic and political interest in robotics and relatively high stakes, the moment is right for a reevaluation of engineering ethics approaches.
This paper is not a critique of engineering education, nor of engineers' own practices, but of the political and regulatory efforts that inappropriately impose a philosophical ethics on robotics and engineering practitioners.
I propose a transition from engineering ethics rooted in professional ethics traditions to a practicable social ethics rooted in shared values and critically dependent on collaboration between engineers and ethnographers. By assessing the social arrangements of decision-making in design processes, and by disrupting these arrangements through provocative ethnographic inquiry, it might be possible to bring diverse groups into dialogue, to blend disciplinary understandings, and to spark ethical reflection and the integration of human values into everyday design activities, thus bringing about more ethical realities.

Historical approaches
Contemporary professional ethics in engineering hark back to the earliest codes of ethics, established at the beginning of the twentieth century [4]. In the decades that followed, there was a long, slow process of revision as engineering associations sought to establish engineering as a profession rather than a trade [4,5]. By the 1970s, engineering ethics had emerged as a field [6], but it was criticized already in the early 1980s for being too oriented toward philosophy and morals and not sufficiently oriented toward engineering practice. Philosopher and STS (Science and Technology Studies) scholar Heinz Luegenbiehl (1983) argued that the narrow focus on engineers as professionals neglected to consider engineers as practitioners: Underlying the numerous attempts to develop an adequate code of ethics for the engineering profession, however, has been an idea that has become almost an article of faith, namely, that engineering, in order to be considered a profession, must have a code of ethics very much in line with the format of the early codes. It would thus appear that, in the quest for unification and refinement of the codes, a more fundamental concern is being widely overlooked, that being the issue of whether or not a code of ethics is appropriate for the engineering profession at all. [5]

Luegenbiehl was perhaps ahead of his time. A new conversation around ethics in engineering burst onto the scene in the mid-1980s after the Challenger disaster, a case which remains a staple example in engineering ethics today [7]. The Challenger space shuttle, launched in the United States in 1986, broke apart just over a minute after liftoff, due to a known engineering vulnerability and the decision to proceed with the launch despite engineers' objections. The disaster, televised live, killed all seven crew members and was a national tragedy in the United States, resulting in multiple investigations, public hearings, and a broad call for attention to ethics in engineering.
In the years leading up to the Challenger disaster, Luegenbiehl had taken up the case against engineering codes of ethics, suggesting instead that codes of ethics be replaced with guides for ethical engineering decision-making [5]. This was a departure from the traditional codes and oaths, which targeted the individual engineer's moral compass and attempted to affect a person's ethical thinking or orientation rather than their everyday practices and the structures in which they are embedded.
From the ashes of the Challenger rose a renewed interest in professional codes of ethics, oaths, and edicts, filled with principles, canons, and the like, targeted at the engineer's professional conduct [8], which caused the pragmatist movement of Luegenbiehl and others to lose traction. These codes of conduct were (and continue to be) based in part on situations like the Challenger, in which a conflict occurs between an engineer's personal morality and structural elements, such as cultural pressures or hierarchical decisions, situations that William Lynch and Ronald Kline (2000) term "whistle-blowing" cases [8,9]. These approaches, with their overly narrow focus on such David-and-Goliath, make-or-break situations, overemphasize big decisions, major risks, and, most importantly, the onus on the individual engineer. Such an engineering ethics combines aspects of deontological, consequentialist, and virtue ethics, wherein the engineer's adherence is dependent on rules, consequences, and moral character, respectively [10].
For example, the Institute of Electrical and Electronics Engineers (IEEE), a major international professional association for robotics engineers, has the following canon under its code of ethics: We, the members of the IEEE [. . . ] agree: to hold paramount the safety, health, and welfare of the public, to strive to comply with ethical design and sustainable development practices, and to disclose promptly factors that might endanger the public or the environment. [11] The American Society of Civil Engineers' (ASCE) nearly identical canon reads: Engineers shall hold paramount the safety, health and welfare of the public and shall strive to comply with the principles of sustainable development in the performance of their professional duties. [12] Again, the canon appears, simplified, in the National Society of Professional Engineers' (NSPE) code: Engineers shall hold paramount the safety, health, and welfare of the public. [13] These aims are hardly actionable. Holding something paramount amounts to a feeling or an orientation; there is no reference to concrete practices or to the social arrangement in which engineering work is situated. The ASCE code of ethics, however, also contains "guidelines to practice under the fundamental canons of ethics," which seem to be the operational principles for each moral canon. For the aforementioned ASCE canon, one guideline reads: Engineers shall recognize that the lives, safety, health and welfare of the general public are dependent upon engineering judgments, decisions and practices incorporated into structures, machines, products, processes and devices. [12] Although intended as a practicable guideline, it seems difficult to enact the verb "recognize", a word which points more to a mental activity than to a material or practical one.
The other guidelines for both the ASCE's and the IEEE's codes of ethics are directed at whistle-blowing prescriptions, declarations against fraud and discrimination, and tenets such as "be honest and realistic" [11] or "be dignified and modest" [12]. In a 2011 review of engineering organizations' codes of ethics, the American Association of Engineering Societies (AAES) found many similarities (as demonstrated above) [14].
These professional organizations have in common a focus on the virtuous engineer who subscribes to particular moral principles and behaves according to prescribed norms. These professional edicts have dominated engineering ethics from their inception.¹ This period has been marked by a neglect of collective decision-making processes, pragmatic everyday engineering activities, and the cultural and social factors in these processes [5,9,15]. Recently, there has been a new push away from normative, rule-based deontological and consequentialist ethics [10]. Faith in such approaches has fallen as empirical research has shown that engineers do not modify their decision-making processes when instructed to adhere to a particular code of ethics [16]. Contemporaneously, there have been arguments for a more virtue-based engineering ethics, relying on an engineering morality rather than on rules or oaths about right and wrong. However, existing approaches have notably failed to produce 'good engineers' or 'moral experts' [10,17]. While a move away from deontology is in order, I argue that it ought to be a step toward a more pragmatic ethics, not toward virtue ethics. Virtue ethics relies too heavily on individual ethical thinking, and not enough on the social praxis that is engineering.
In line with this attitude, there have been calls for a professional engineering ethics that does pay attention to everyday practices, targeting the ethical decision-making behaviors of engineers [5,18–20]. These approaches recognize that engineering ethics is more than professional ethics. To move from the professional in the workplace to the practitioner in the workshop, engineering ethics ought to extend to the ethics of the design and execution itself, addressing the more mundane but still important ethical issues that occur in regular practice. Necessary to the shift from ethical orientation to ethical practice is a shift from individual to social ethics. Social ethics is: . . . an examination of structure and process, [which] involves social relations, their structure, and the norms and policies that characterize them. [21] Such an approach would allow for a focus on hierarchy, power, materiality, and other aspects of situated engineering practices that might affect the end product in design (and thus the ethical impact it has in the world). Nevertheless, new and continuing efforts at formalized engineering ethics still center on the individual and their ethical orientation and professional conduct, through edicts, codes of ethics, and regulations [22]. In the recent EU parliamentary resolution to govern robotics [2], top-down measures with nondescript goals were once again proposed: [W]hereas the Union could play an essential role in establishing basic ethical principles to be respected in the development, programming and use of robots and AI and in the incorporation of such principles into Union regulations and codes of conduct, with the aim of shaping the technological revolution so that it serves humanity and so that the benefits of advanced robotics and AI are broadly shared, while as far as possible avoiding potential pitfalls. 
[2] The EU parliament further suggests the creation of: [a] framework in the form of a charter consisting of a code of conduct for robotics engineers, of a code for research ethics committees when reviewing robotics protocols and of model licences for designers and users. . . based on the principles of beneficence, non-maleficence, autonomy and justice. [2] The European Commission has invested 700 million euros in robotics research and development under Horizon 2020 from 2014 to 2020 [23] (including a spin-off of the HECHO project described in this paper). Amid diverse concerns about the growing and rapid implementation of robotics [24], and a large portion of a total 100 billion euros expected to be invested in robotics and other digital technologies under the upcoming Horizon Europe framework programme from 2021 to 2027 [25], the impact of robotics and AI will not be insignificant. Therefore, I argue for an urgent reevaluation of the traditional deontological or virtue-based approaches which have thus far been unsuccessful [10,17,26]. In this paper, I suggest a departure from hypothetical, procedural frameworks that strictly target the engineers' moral reasoning, and a cultural shift toward incorporating shared human values [27] into the pragmatic and social design activities of engineers that I have observed ethnographically. One way to go about achieving this shift is through ethnographic interventions which, from a social ethics perspective, would examine and restructure the social arrangement in a particular design setting to allow for the negotiation of ethnographically identified values in everyday decision-making practices.

Methodology
Ethnography, in the anthropological tradition, is a research methodology consisting of theoretical approaches, research methods, and a descriptive practice of documentation. Martyn Hammersley and Paul Atkinson describe how ethnography is used to study a group of people closely: The task is to document the culture, the perspectives and practices, of the people in these settings. The aim is to 'get inside' the way each group of people sees the world. [28] I conducted five months of ethnographic fieldwork in order to understand the decision-making processes and practical everyday activities of engineers, and to examine how these practices related to their ethical orientations and actions.
There exist different approaches to ethnographic research in anthropology. I worked from a sociomaterial perspective to define the empirical field, consisting of people (or actors), places, things, and practices. I used a multisited "follow the thing" approach [29] where I built my research around the 'thing' at the center of social practices -in this case, an industrial robotic cell being developed in the HECHO project in Denmark -and then traced the human activity (the engineers' practices) tied to the thing in order to construct the field of inquiry. Following the thing led me into conference rooms, electrical engineering workshops, demonstration halls, factory floors, and lunch rooms. In these places, I found mechanical and electrical engineers, programmers, software developers, system integrators, factory managers, and production engineers.
Qualitative research methods are a cornerstone of ethnography; the use of such methods provides a kind of close knowledge that is especially useful when asking questions about human behavior (ethical decision-making, for example). In this study, I relied on participant observation, visual and object elicitation, discourse and document analysis, and ethnographic interviews, among other methods. Through participant observation, I learned about the everyday choices, interactions, and other activities at various sites tied to the collaborative robot development project. These observations brought to my attention particular questions of culture (such as jargon, dress codes, and education) which affected communication and collaboration among the robot developers. By involving the participants in a method of object elicitation, mapping the robotic cell and its components, I learned about the participants' separate object-worlds [30]: the different ways they thought about and related to the robot, viewing it simultaneously as a product, a research object, and a source of labor. These differences in motivations and interpretations of the object seemed to have an impact on their decisions in the cell's development. Through media analysis and qualitative interviews [31], I learned more about the participants' views on ethics and decision-making, which directed my attention to my main finding: the discrepancies between the way the engineers approached decision-making and ethics, and the way their design decisions actually unfolded in their everyday work, leaving ethics relatively unexplored.
Finally, the anthropologist ethnographer traditionally presents research results as detailed analytical descriptions. The ethnographer describes the human situation as situated phenomena, specific to time and place, subject to the researcher's own interpretations, and affected by the researcher's presence, as opposed to scientific truths. The researcher's own position within the field is particularly important in critical ethnography, an applied research methodology [32] in which ethnographers acknowledge themselves as actors within the field of inquiry, rather than as objective observers. The research output of this fieldwork was primarily descriptive, but I take the position of critical ethnographer in that I acknowledge my own motivations and political orientation toward the research field. The aim of this investigation was to evaluate the way decisions are made in a collaborative research and innovation project, with the hope of identifying opportunities for more ethical or value-oriented decisions in the design process, and to evaluate the potential role of the ethnographer in bringing these opportunities to fruition by challenging the social arrangement of the development from a social ethics perspective.

Findings: Everyday engineering ethics
In this ethnographic study, I observed engineering as a situated practice that is both pragmatic and social: it involves a process of decision-making that goes between conception and action, and processes of social negotiation mediated by design artefacts. As such, a social and pragmatic design process deserves a social and pragmatic ethics (see the full ethnographic study [1] for a discussion of engineering as pragmatic and social). From a review of engineering and design literature, including an examination of professional engineering association membership documents, I found that existing formalized attempts at engineering ethics have been neither pragmatic nor social. The canons and principles contained in the codes of conduct that professional engineering associations promote are not actionable. These formalized ethics approaches push an ethical orientation or a way of being, rather than a way of acting.
Not surprisingly, such oaths and edicts have proven ineffective and have clashed with the very pragmatic nature of design activities observed in the HECHO project. Further, I have found that professional engineering ethics is prescriptive and narrowly focused on the individual engineer, a poor fit for the dynamic, socially negotiated context of development that I observed amongst the engineers in this study. The HECHO engineers approached ethics abstractly, rarely (if ever) engaging with ethics in practice. In interviews, most of the participants expressed a sense of morality and empathy, but did not locate ethics within their own design practices. After I shared some narratives about the ethical consequences of implementation, the engineers began to understand ethics in a new way and could identify ethical issues within their own projects. Nevertheless, they either felt it was not their responsibility to account for these ethical issues in the design process, or that their ethical decision-making agency was constrained by the design plan, by hierarchy, or by the customer. In the following section, I present empirical findings from the ethnographic study that demonstrate why existing virtue-based and individual-oriented approaches are particularly unsuitable for engineering praxis.

Ethics do not apply here
Early in the fieldwork, it became clear that social scientists were not a part of the social arrangement of the HECHO project. The project involved a diverse group of what I have collectively termed 'engineers', including those from the computer sciences (software engineers, programmers) and other engineering disciplines (mechanical, robotics, mechatronics, electrical, etc.), but also other professionals (machinists, a production engineer, a factory manager). When I asked about ethics, it became clear that this was not a conversation the participants were used to having. My questions about ethics were a disruption to the normal social arrangement, wherein tasks are delegated by disciplinary expertise.
One robotics engineer, responsible for overseeing all robot development at his institute, did not think ethics was relevant to the design of industrial robotic cells at all. He held that safety regulations and CE markings served as the markers of ethical design in industrial robotics, saying that if you design within these parameters, your machine would be ethical. Other engineers also equated ethics with physical safety, but not with other forms of well-being, like emotional or social welfare.
"What we're considering, mainly, is: Is it safe to use? ...The one issue is safety. If there is something safety-related, we have to change it. And we do. We consider that very deeply." (Software engineer) Many of the engineers shared the opinion that ethics was important, but that it was not a part of their work to consider the ethical aspects of the design. The very nature of engineering, their problem-solving orientation, was often used against ethical thinking. This thinking presumes that the technical and the ethical aspects are separate, and that ethics is not a part of engineering. "We're just trying to solve a technical problem." (Software engineer) "I think, for most of those actually physically building the solutions, their motivation is the technical aspects. We just want to make cool robots, no matter what happens with them [emphasis added]. And cool solutions. Whereas, I think that most companies would also be run by people who just want to make cool solutions for people. . . .Yeah, so there's a difference in those making a robot, building it, and those actually using it. . . .I'm sure that those actually developing the robot would be technical people, just wanting to create a cool robot." (Mechanical engineer) "When you are trying to solve this problem, you're not thinking about the ethics -you're just trying to figure [out] a solution to something technical. How do I get the robot to move this part from here, to here?" (Robotics engineer) If pressed on where in the design it would even be possible to incorporate ethical decision-making, many of the engineers thought it belonged in the beginning with the decision of whether to develop the robot or not. Indeed, some ethical problems, like job loss, were tied to the automation of the task itself.
"Some of these aspects, like reducing social interaction and maybe someone losing their job, are intrinsic to these projects that we do. So if we are doing a project, some of these questions have already been decided. . . .[And for other types of projects], if you're not in on the design [planning] process, it's pretty much locked in." (Software engineer) One reason that the engineers had trouble locating ethics within their work processes is that they held a very narrow view of ethics, built around automation and job loss (as evident in a national and industry-wide discourse about saving jobs [33]): "Hah! Let me just start by saying, as you are probably aware, that we are preserving jobs rather than losing jobs." (University professor) "An automated workplace is a competitive workplace. And a competitive workplace is a well-staffed workplace." (Technological institute website) With this point of view, the choice to automate is the ethical issue. To reiterate what one software engineer pointed out: ". . . if we are doing a project, some of these [ethical] questions have already been decided." They also had trouble understanding the ethical implications of their work because they only thought about ethics in the hypothetical, and could not envision negative outcomes aside from job loss (which they had already dismissed as an issue) and physical safety (which is already regulated). A possible explanation for this disconnect between their work and the greater impact of their designs could be a lack of training and experience in ethical thinking. The engineers reported that while they'd had coursework on regulations, safety, and legal issues, they had not had any education in ethics. They simply lacked the language and experience for engaging with the topic.

Probing for ethics with ethnographic data
In order to bring the engineers' ethical thinking out of the hypothetical, I conducted another round of interviews, using narratives from other HECHO engineers and from professional literature to provoke ethical reflection about the HECHO engineers' own work. The engineers had had trouble extending their views of ethics beyond discursive issues, like replacement and safety. However, when given real-life examples of ethical issues with implementation, the engineers were able to understand ethics in a broader sense and to look beyond the ethical dilemma of whether or not to automate. They told stories of industry workers whose work or livelihood was negatively affected by the engineers' own work.
"Back in the shipyard, I also heard something about that. I was not directly involved in this project, so I only heard about it. But it was a robot for welding pipes together. If you have one pipe connecting to another in a T, you have a very special welding seam, which the robot was made for. However, the trouble was that the workers were paid more for doing this complicated welding than for doing the end-welding when they have the flange. So, they indeed sabotaged the robot. Only because they were paid more for doing what the robot was doing! That was completely stupid, because if the company had done something about the payment, of course there would be no problem. Only because the loss of money that was seen from the workers' side." (Robotics engineer) Another engineer shared a story about a process he'd helped to automate, in which a small data-entry task was automated in a process of bending sheet metal from CAD drawings to create parts. Three years after implementation, the workers were still not using the software, but were instead inputting the numbers manually.
"The people on the floor, they still wanted to use the old-fashioned way because that was what they knew, and actually they thought that they put value to the product. By doing things, they could use their skills to do it, even if it was just typing in some numbers that came from a drawing. . . . The leader said that he had seen that it was difficult to take off this responsibility because then they thought that their work was not as valuable. Even if they maybe could do some more metal sheets because they were not using time to enter the numbers. But it was difficult to get people to work like that. If you forced them to do it, they'd think their work was not that interesting anymore. Because putting the sheet metal in was not the interesting part. They thought it was nice when they had a new product: 'Now we have to make this work,' type in numbers, see that it actually forming some part, and say, 'Yes! We did this.' Instead of just starting the machine and putting in sheet metal." (Robot consultant) Both engineers could see that their work negatively affected work processes and the lives of the workers, making their work less interesting, less valuable, or less profitable. Both engineers acknowledged that considering the effects on the workers beforehand might have prevented these outcomes. So although the engineers did not generally see a place for ethics within their practices, they were able to engage with a personal or professional ethical orientation toward design, when provoked. One young engineer, when asked whether he would feel ethical responsibility for a negative outcome resulting from his design, said that he wouldn't feel responsible, but that it would trouble him personally. He would like not to be involved in projects that might cause someone to lose their job, but he felt constrained by his position within the company.
Although the engineers typically expressed empathy for the workers, they did not feel responsible or liable for any ethical aspects or outcomes: "Of course I would feel bad, but I wouldn't feel responsible." (System integrator) When, as an ethnographer, I put forth ethnographic data from prior research and recalled specific scenarios in the HECHO project itself, the engineers were able to have a conversation about something previously thought to be inaccessible, lying somewhere beyond their competences. In adding my own expertise and language around ethics to the existing social arrangement, we crossed disciplinary boundaries and opened up new ways of thinking about the ethical implications of ongoing design practices. Despite the significant headway made with these engineers, they still denied all ethical responsibility. A closer look at the social arrangement of HECHO showed that this was because the engineers associated ethical responsibility with decision-making power, something often formally designated in development processes (even if informally negotiated in practice).

Assessing social arrangements around decision-making
The HECHO engineers felt constrained by the social arrangements and procedural frameworks in which decisions were made. Many of the engineers subscribed to a prescriptive project management framework (e.g., Scrum), in which particular decisions are formally assigned to certain persons, and design processes are meticulously planned, following the procedures set forth in the frameworks. Even informal decisions were delegated to particular actors, in theory. Most of the participants presented a neat design process in which the initial stages involved planning, decision-making, and sometimes ethical considerations, and later stages involved the actual execution of these decisions. Contrary to these accounts, the decisions observed in practice were continuous, shared, and negotiated throughout the design process. Components as simple as the type of LEDs used to communicate information about the cell to the operators were discussed repeatedly, changed, and then discussed again. Although the HECHO project had formally ended, the design continued to evolve as the original cell was implemented and as new processes (tasks) came up.² STS scholar Lucy Suchman has examined this disjunction between theory and practice in her work on situated plans and actions among engineers and designers. She writes that "plans are best viewed as a weak resource for what is primarily ad hoc activity" [34]. The dynamic and ad hoc decision-making activities observed in practice unfortunately did not explicitly involve discussions of ethics or ethical issues. When design issues came up, the discussions always pertained to the cell's function, not to the effects on operators or on the work environment, for example. The engineers did not even recognize these minor social negotiations as a mode of decision-making, even though this activity was characteristic of much of the actual development activity occurring in HECHO.
In a way, the engineers were constrained by their linear or schematic thinking about the design process.
The way the HECHO engineers thought about and approached decision-making was consistent with the rigidity of prescriptive commands in the professional codes that govern engineering and with the procedural logic that permeates engineering practices. Generally, the engineers thought that ethics belonged at the very beginning of a design process, in the decision of whether or not to develop the robot (or in determining whether the robot's purpose is ethical), echoing the make-or-break thinking of existing engineering ethics frameworks [5,8]. The engineers' relegation of ethics to the hypothetical first stages of a presumed formal design process left no space in their thinking for addressing ethics elsewhere. Ethics was simply not integrated into the everyday decisions of the continuous design process. This is an essential point, because it is in the minute decisions that the opportunities to mitigate risks and consider ethical issues may occur.

Distributing ethical responsibility
Overall, the engineers did not feel that the consideration of ethics was a part of their existing practices, nor did they feel it was their responsibility to integrate it into their practices, despite having seen and sympathized with workers who were negatively affected by the engineers' designs.
One reason for denying responsibility was a perceived lack of agency in decision-making, especially when decisions were formally assigned or limited by hierarchical, material, or structural aspects. The robot consultant whose project involved sheet-metal bending said he felt frustrated with the company for not having gathered its workers' opinions on which tasks to automate. However, he did not feel that was his decision to make and felt constrained by the agential hierarchy of commercial projects: "It is the customer who takes the last decision. The man with the money decides." Other engineers felt that their ethical decision-making agency was constrained by structural elements like workplace hierarchy, as in the example above, or by budgets. As several engineers explained, everything they do in the design process has to add economic value. In the end, 'productivity' rules: the point of the robotic solution is to make the total human-machine production process more efficient and cheaper.

"Sometimes they [operators] are annoyed with the robot, because it doesn't do it the way that they would have liked it to be done. And now they even have to provide it with resources the way that it says it wants it. So, that can sometimes give some negative reactions. . . 'Why do we need the robot if it's slower than us?' Well, because it doesn't have lunch. Or it doesn't go for lunch breaks. It works at night as well. It doesn't complain. It just does what it's told." (System integrator)

"That's the logic in industry. Does it make money? No, it doesn't. Then we can't." (Software engineer)

Here, where agency is limited, traditional professional ethics might suffice with moral imperatives to take a stand against one's superior, employer, or the customer; social ethics, however, would instead suggest a renegotiation of agency and responsibility within the group and a restructuring of the social practices, rather than placing the impetus on the already disadvantaged actor.
When asked who did have a responsibility to ensure an ethical product or ethical outcomes, the engineers deferred responsibility. The software developers pointed to the system integrators, who pointed to the client (i.e., the factory manager), who pointed to the government.

"We don't think about the social perspective of the operator. But that is also because we have these specifications that this machine has to perform against. And if that is causing a bad day for the operator, that responsibility lies not with us, but with the person who specifies and buys the machine: the customer." (Software engineer)

"It's the customer's problem, whether the worker is happy or isn't happy." (Robot consultant)

"I don't know, you can say that the government is setting the direction, saying that we should install more robots. And they know about the societal ethics, effects that it will have. So they are somehow responsible." (System integrator)

Thus, the HECHO engineers may have had an ethical orientation toward design and toward users, but they felt no responsibility for the harm their designs might cause. They thought ethics was important, and came to realize that ethics was relevant to their projects, but still did not consider it part of their practice. This is symptomatic of their individual moral orientation (their professional ethics) coexisting with their practical abstinence (their lack of pragmatic ethics). The engineers' decision-making agency, limited by the social arrangement of the project, affected their assignment of ethical responsibility. They not only felt that they had no responsibility to consider ethics; they also felt it was not feasible to do so.

Discussion: A negotiation of values
For the HECHO engineers, it was difficult to engage with ethics on a practical level. This may have something to do with how ethics has been approached within the engineering profession thus far. Devon and Van de Poel, advocates for a social ethics of engineering, argue:

Despite the considerable recent growth in the literature and teaching of engineering ethics, it is constrained unnecessarily by focusing primarily on individual ethics using virtue, deontological, and consequentialist ethical theories. [21]
Ethnographic observations of the HECHO engineers, however, showed that it may actually be possible to bring ethics into design practices by using ethnographic data collected in the project to bring forth existing values. Current engineering ethics approaches focus on ethical or moral orientations, theoria, rather than value-driven practices, praxis [17]. Engineering ethics presupposes that an orientation will effect a consideration for human needs in design. This has not been the case in this empirical study, nor in experimental studies of ethics in design [16]. A value-oriented engineering, however, would flip the situation, bringing human values first into practice and perhaps transforming the ethical orientation over time. Rather than aiming to produce ethical engineers, the aim would be to reform practices to ensure ethical engineering. Therefore, I propose the use of ethnography to identify and challenge values in development practices.
Value comes from the Latin verb valere ('to be able', 'to be worth') [35], denoting an ability and a quality. Ethics, by contrast, assumes a field of knowledge, a set of values, and perhaps a philosophical orientation toward those values [36]. Ethics and morality derive from the respective Greek and Latin terms ethos and moralis ('character', 'manner', or 'conduct'), denoting a way of being. As shown in the previous sections, a particular engineer might identify with an ethics or a moral orientation that does not necessarily effectuate the consideration of values in design practices [16]. Ethics is rather theoretical until operationalized into the values it subsumes.
Values are the practicable element of ethics. They can be tangible and specific, such as flexibility in modular robotics design, which describes the ability to adapt the robotic cell. Whereas ethics typically concerns just one type of value (human values, or moral rights and wrongs), there exists a plurality of value types (e.g., functional, economic) which are socially and contextually determined [37-40]. Engineers often have experience incorporating some of these value types into their work and negotiating their worth. In the HECHO project, the meaning and worth of flexibility was negotiated by the engineers, buyers, and funding organizations involved in the cell's development and implementation, from its introduction into the project via the HECHO work-package (with the goal of developing "hyper-flexible automation") to the selection of particular design features and components. While flexibility is a functional value describing the changeable configuration or applicability of a robotic cell, I argue that engineers might also incorporate flexibility's underlying human values. The human value inclusion, for example, underlies the functional values flexibility and accessibility; by designing the cell to be more flexible or accessible, the engineers might include SMEs or operators with less experience working with a robotic cell. Existing engineering ethics credos tend to call for individually internalized principles, while values, like flexibility here, are socially negotiated norms. The ethnographer has the opportunity to act as a broker in these negotiations, helping to identify and communicate the values that matter to operators and to engineers. Thus, a focus on identifying and negotiating values can help shift engineering toward a more pragmatic and more social ethics.
Moreover, values are already a part of design discourses. Batya Friedman and Peter Kahn (2006) have already created a value-sensitive design framework [41]. I am not the first to suggest a pivot to a pragmatic engineering ethics [9,18]; I simply suggest a direction, toward values, for realizing this move. Because values and engineering are both pragmatic, and because engineers already engage with functional values such as flexibility, agility, and efficiency, the significant shift will be in expanding the types of values incorporated into their design processes. Here, I suggest that ethnographers might play a role in provoking such a shift. Perhaps ethnographers can provoke engineers to consider ethics as human values, rather than moral principles, alongside existing functional values as a part of their everyday design practices. If engineering ethics can be translated into human values, and then embedded in tangible design values, ethics may finally be enacted in the sociomaterial practices of design.

Conclusion: Ethnography as provocation
I have suggested a shift from professional engineering ethics approaches to a more pragmatic and value-oriented social ethics. I propose that by expanding the values that engineers incorporate into their everyday design practices, we might open the way to more ethical outcomes in engineering. One way to realize this shift is through ethnographic research in robotics development. Here, I suggest that ethnographers can enter the social arrangement of design to play a role in bringing forth particular human values: by bridging object-worlds (between engineer and operator), by provoking ethical reflection to identify relevant human values, and by identifying opportunities for value-oriented technical decisions. Ethnography has been used in the past as design provocation. Jacob Buur and Larisa Sitorus (2007) used ethnographic materials to "provoke engineers to reframe their perception of new designs" [42]. In a participatory-design project, Buur and Sitorus brought machine operators into the design process to engage in direct dialogue with engineers. This helped the engineers to better understand how the operators worked, and how the engineers' new technologies might shape or support their work. They also used ethnographic material (videos) to prompt engineers to discuss how they understood the design product itself. Wynn argues that

by creating openings within the boundaries that form such practices, one diminishes the distance between these practices. These openings take place when designers are willing to be more sensitive towards the boundaries. Ethnographic material can help these practitioners expose, exchange and reframe their understandings. [43] (as paraphrased in [42])

As discussed further in my full account of the HECHO project [1], engineering involves a social negotiation across object-worlds.
If an ethnographer were to provide ethnographic material that provoked a dialogue about ethics and human values, perhaps it would open a space in which the engineers could negotiate values as part of their everyday design practices. Indeed, EunJeong Cheon and Norman Makoto Su (2018) are currently attempting to elicit values in design processes through the use of futuristic autobiographies developed from empirical narratives [44]. While this study is a step in the right direction, it focuses on invoking values within an individual, not within the social context of design. From a sociomaterial perspective, agency (and by extension, moral agency) does not exist a priori, but is negotiated within a situated practice [34,45]. Therefore, such a practice of ethnography as provocation or as interference should be done with a social ethics approach, taking into account the entire assemblage.
What is the active role of the critical ethnographer [32] beyond describing the engineers, their object-worlds, and their social practices? What is their role as a participating actor in the social assemblage?
In conducting ethnographic research in the HECHO project, I entered the design process carrying my own object-world [30]: I came with different experiences and understandings of technologies as artefacts, as cultural products, and so on. My object-world changed significantly as I became familiar with the jargon, the parts, and the practices involved in the HECHO project. However, I, too, brought with me some expertise and experiences which might have changed the engineers' own understandings. I asked questions about ethics, morals, and consequences. I drew attention to operators' experiences and to societal needs, and I caused the engineers to reflect on their own practices.
Whereas I had observed that the HECHO engineers' technical decision-making was marked by social negotiations, this was not the case with ethical decision-making. The individualized ethical principles they held, however loosely, never translated into actionable values and did not come through in their work. Whatever their moral orientations, they did not discuss, share, or argue over ethical decisions. However, through conversations with the engineers around ethnographic data, their understandings of ethics expanded, and they were able to connect the ethical effects of their products to their own design processes. My ethnographic research may have altered their object-worlds: a disruption of their relatively structured approach to design might have opened the process up to new ways of understanding the operators and the robotic cells themselves.
The contribution that ethnography may make is to enable designers to question the taken-for-granted assumptions embedded in the conventional problem-solution design framework. [46] (as cited in [42])

In this way, there was some learning occurring between the engineers and myself, signaling that ethnography itself might play a role in the design process: in defining the new realities produced when engineers and ethnographers work together.
Therefore, ethnography might be seen as a form of world-making or knowledge production [47]. John Law and John Urry (2004) suggest that social scientists create certain realities when they select particular methods, perspectives, and inquiries. Citing Donna Haraway, they call social science a system of interference, and argue that if our investigations build new realities, then we social scientists can make political choices about what types of realities we want to contribute to [48]. As Lucy Suchman writes:

The representations ethnographers create, accordingly, are as much a reflection of their own cultural positioning as they are descriptions of the positioning of others. [49]

In choosing to pay attention to ethics, to draw attention to human values, and to write an ethnography of ethics and decision-making in engineering, I have already instigated a new reality. Just by examining the HECHO sociomaterial assemblages through the lens of social ethics, by raising issues of values and ethics with the engineers, and by drawing out ethnographic data and working together around it, I noticed an increase in their attention to ethics, in their ability to locate clashes of values or interests in design, and in their orientation toward the design process.
So, if there is to be a move from ethical engineers to ethical engineering, a reform of engineering ethics from hypothetical, procedural, or moral frameworks toward ethical engineering practices, perhaps it can come through ethnographic provocation and social ethics. By examining the social arrangements and decision-making processes involved in a particular design, by provoking engagement with human values in the design process, and by crossing object-worlds and working interdisciplinarily, ethnographers might play a role in a more critical design process.