

Journal of Interactive Media

Editor-in-Chief: Ziegler, Jürgen

Volume 16, Issue 3


Common Challenges in Ethical Practice when Testing Technology with Human Participants: Analyzing the Experiences of a Local Ethics Committee

Stefan Brandenburg (corresponding author)
Technische Universität Berlin, Department of Psychology and Ergonomics, Chair of Cognitive Psychology and Cognitive Ergonomics, Marchstrasse 23, Sekr. MAR 3-2, 10587 Berlin, Germany, phone: +49 30 31 42 48 38

Michael Minge
Technische Universität Berlin, Department of Psychology and Ergonomics, Chair of Cognitive Psychology and Cognitive Ergonomics, Marchstrasse 23, Sekr. MAR 3-2, 10587 Berlin, Germany

Dietlind Helene Cymek
Technische Universität Berlin, Department of Psychology and Ergonomics, Chair of Work and Organisational Psychology, Marchstrasse 12, Sekr. F7, 10587 Berlin, Germany, phone: +49 30 31 42 59 97
Published Online: 2017-11-24 | DOI: https://doi.org/10.1515/icom-2017-0023


Ethical aspects are of key importance in research and the development of technical systems. They play a major role when the societal impact of innovative products and new technologies is considered. However, ethics are essential already during technology development, especially when testing these technologies with human participants. The latter is becoming increasingly important when applying for project funding and when publishing peer-reviewed journal papers. Responding to these needs, a local ethics committee was founded at the Department of Psychology and Ergonomics at Technische Universität Berlin in 2009. In this paper, we present an analysis of common pitfalls and blind spots that were detected by reviewers of this ethics committee. We studied the reviews of 98 applications for ethical approval. Results show that researchers (a) often lack concrete knowledge about potential ethical issues of their research and (b) might benefit from convenient tools to address relevant ethical challenges at early stages of product design. Based on the results of our analysis, we propose a set of six simple rules that can help to detect and overcome most of the frequently occurring ethical issues.

Keywords: Ethics; human participation in technology research and development; Man-Machine Interaction (MMI); research and development practice


  • We empirically investigated common challenges in ethical practice when humans are included in the development and evaluation of technical products.

  • We show that most ethical issues are easy to overcome.

  • We formulated a set of six easy-to-grasp rules of thumb that can be used to uncover most ethical blind spots and avoid common pitfalls.

1 Introduction

Morality can be described as a set of very basic standards of a society that distinguish between good and bad, right and wrong, fair and unfair [15]. Ethics, in contrast, often refer to standards of conduct of specific groups or individuals within society that, for example, hold the same profession or work within the same institution. Medical ethics, political ethics, and engineering ethics are good examples of profession-related ethics. A member of a specific profession is ethically obliged to act in accordance with the specific standards of his or her profession. If these standards are not met, the quality of the service delivered can be unsatisfying, and the entire group or institution may suffer a loss of trust.

Being an engineer demands a high ethical standard. After the Second World War, leading physicists like Albert Einstein and Max Born discussed the merits of an oath for engineers [16]. The idea was that engineers should swear to use their knowledge only for the good of humankind, that their work should respect human dignity, and that they should oppose those who disregard people's rights (reprinted in [10]). Since the time of Einstein and Born, many guidelines, standards, and codes of conduct have evolved to guide engineers through the process of technology research and development. In 1996, more than 151 of them existed for engineers alone [16], and more have been formulated in the decades since, for example the software engineering code of ethics [7].

Modern (computer) technology development is an interdisciplinary and rapidly evolving endeavour [3]. Emerging trends in today’s technology development, like automation, data mining, networking of systems, smart objects, and the Internet of Things, bring new challenges to ethics and render ethical conduct increasingly important [13]. Data security, privacy, legality, personal autonomy, self-determination, and the question of how we want to live in the future are only some of the most important ethical aspects of today’s highly technologized world (cf. [5], [12]). These questions need to be addressed by politicians, lawyers, engineers, institutions and by each member of our society.

Research by Mundt, Krüger, and Wollenberg [14] illustrates well what kind of new challenges in engineering ethics may arise from new technologies. The authors demonstrated that everyday technology can be used to easily draw inferences about people's behaviour in private situations. They used data provided by modern house installation networks, i.e. movement-sensitive lighting on an office floor, to track people's movements. By collecting and merging different data, such as individual walking-speed patterns and knowledge about who works in which office, it was possible to identify individuals and even determine who did not wash their hands properly after using the restroom. Could the engineers who developed the house installation networks and the movement-sensitive lighting foresee this kind of analysis? Are they responsible for privacy issues that might arise from their products? If so, should they abandon their plans and return to selling only manual light switches? Or should they redesign their product in a way that guarantees data privacy?

To complicate things further, an engineer's ethical obligation does not begin with the decision whether to make an invention public or keep it under wraps for the safety of society. The request to consider ethical aspects of technology development at the earliest stages of product development, the stage of research, has been voiced from many directions [1], [9], [13]. Yet few methods have been developed and implemented to guide the development team in its reflection on ethical issues during the product development process. Manzeschke, Weber, Rother, and Fangerau [11], for example, developed workshops for the development of ambient assisted living (AAL) appliances that include a professional ethicist for ethical guidance. Furthermore, most technological artefacts are intended to be highly user-centred and usable. To uphold this aspiration, users are often involved from an early design stage: user needs are assessed, and prototypes are built based on ideas and suggestions from customers or participants. Burmeister [4] pointed out that the treatment of participants, the protection of intellectual property, the freedom to participate in usability tests, and the privacy of participants are the main ethical points in usability engineering.

In 2009, the Department of Psychology and Ergonomics at Technische Universität Berlin formed an ethics committee to ethically support the research phase of product development and to protect the rights, safety, and well-being of research participants. The committee's major responsibility is to review research proposals and to give investigators suggestions that bring their research into compliance with ethical standards before it starts. Literature, checklists, and manuals are publicly available to guide the preparation of proposals. A full ethics proposal to this committee comprises a filled-out checklist and, as attachments, all study material the participants will be confronted with. The checklist covers five areas:

1. General information regarding the proposed research project (e.g., study name, grants, recruitment procedure),

2. Conditions of the proposed study (e.g., research aims, sample characteristics, information on the participants' physical and psychological strain),

3. Detailed explanation of the information that is given to the participants (e.g., study duration, handling of incentives, participants' exit options),

4. Information regarding data privacy (e.g., type of data recorded, participants' possibility to withdraw their data), and

5. Information regarding the use of (incomplete) participants' information and a transparent debriefing procedure.

Each ethics proposal is handled by three reviewers. A positive statement that the research and development project is ethically unobjectionable is only given when all three reviewers agree. The presented study aims at identifying the most common ethical issues that the local ethics committee detected during its work. The main objective of this paper is to identify the most frequent blind spots and pitfalls in proposals for research projects investigating human behaviour and experience in man-machine interaction. Based on our analysis, we will formulate six rules of thumb that help to identify and overcome frequent ethical problems when conducting research with human participants.

2 Method

We analysed anonymized reviews of 98 ethics proposals submitted to the ethics committee at the Department of Psychology and Ergonomics (Technische Universität Berlin) between August 2009 and September 2016. All projects involved human participants at some stage. The formal descriptions of the research projects were reasonably standardized due to the structured checklist provided by the ethics committee. Eleven proposals were excluded from analysis because applicants terminated the review process or the ethics committee refused to review the projects because of formal errors. The final dataset consisted of N=87 cases. Each project description was independently reviewed by three referees who agreed to provide their reviews for data analysis. The three reports per proposal were the basis for data analysis.

Figure 1: Categories and number of substantial problems per category.

Data analysis covered the number and type of ethical issues found by the referees. The types of ethical issues were grouped into two classes: substantial issues and transparency issues. Each class consisted of multiple categories into which the ethical problems were sorted. Substantial issues comprised referees' remarks on ethical aspects that were forgotten or inadequately implemented in the formal part of the project proposal, for example an anonymization procedure for user data or a way to ensure that subjects participate voluntarily. Transparency issues comprised comments indicating that applicants had thought of essential ethical aspects but forgot to tell their participants about them before, during, or (in the case of debriefings) after testing. For example, applicants may have proposed an anonymization procedure for the participants' data but forgotten to mention it in the consent form. The two classes of ethical issues were mostly unrelated. For example, it happened that applicants did think of an anonymization procedure (no substantial problem) but did not tell their participants about it (transparency problem). It also occurred that researchers guaranteed the deletion of participants' data at any time in the participant information sheet (no transparency problem) but stated in the formal part of the ethics proposal that they would not delete any data (substantial problem). Multiple comments on the same issue by different referees were counted once.
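The counting scheme described above can be sketched in a few lines. The records and category labels below are purely illustrative, not the committee's actual data; the point is the deduplication rule that counts repeated mentions of the same issue by different referees only once.

```python
from collections import Counter

# Each reviewer comment is tagged with a proposal id, a class, a category,
# and the concrete issue (all labels here are invented for illustration).
comments = [
    ("P01", "substantial", "anonymization", "no anonymization procedure"),
    ("P01", "substantial", "anonymization", "no anonymization procedure"),  # second referee, same issue
    ("P01", "transparency", "data handling", "deletion not mentioned in consent form"),
    ("P02", "substantial", "description", "procedure not described"),
]

# Deduplicate: the same issue flagged by several referees counts once.
unique_issues = set(comments)

per_class = Counter(cls for (_pid, cls, _cat, _issue) in unique_issues)
per_category = Counter((cls, cat) for (_pid, cls, cat, _issue) in unique_issues)

print(per_class["substantial"])   # 2
print(per_class["transparency"])  # 1
```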

3 Results

The total number of substantial problems over all proposals was N=318. In addition, reviewers identified N=198 transparency problems. On average, reviewers found about six ethical issues per proposal (M=5.93, SD=5.55). Most problems per proposal belonged to the class of substantial issues (M=3.66, SD=4.10), fewer to the class of transparency issues (M=2.28, SD=2.42). Eleven of the eighty-seven proposals (12%) were ethically uncritical and received no comments. Thirty-six (41%) received 1 to 5 comments, thirty (34%) received 5 to 10 comments, and ten (11%) received more than 10 comments. Figure 1 depicts the frequency of substantial problems divided into topical categories. The description of the planned research (30%), the handling of personal data (21%), the warranty of participants' anonymity (16%), and the strain that could be imposed on participants (15%) were the categories that captured most of the reviewers' comments. Fewer deficits were detected in the categories incentives (8%) and voluntariness of participation (7%). A χ2-test showed that the number of substantial problems was not evenly distributed over the categories of this class, χ2(5)=71.11, p<0.001.
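The χ2 statistic reported above tests the observed category counts against a uniform distribution. A minimal pure-Python sketch follows; the counts are reconstructed from the reported percentages of the N=318 substantial problems and are therefore approximate, so the statistic comes out close to, but not exactly at, the published value.

```python
# Approximate counts per category, reconstructed from the reported percentages
# (description 30 %, data handling 21 %, anonymity 16 %, strain 15 %,
# incentives 8 %, voluntariness 7 % of N = 318). Illustrative, not exact.
observed = [95, 67, 51, 48, 25, 22]

# H0: substantial problems are evenly distributed over the six categories.
expected = sum(observed) / len(observed)
chi2 = sum((o - expected) ** 2 / expected for o in observed)

# With df = 5 the critical value for p < 0.001 is 20.52 (standard chi-square
# table), so a statistic of this magnitude clearly rejects uniformity.
print(f"chi2(5) = {chi2:.2f}")
```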

Common description problems included missing statements of the research objectives and the procedure. Applicants often found it difficult to describe in detail what would happen to their participants and when. Sometimes reviewers were also unable to find relevant information such as the starting date of the planned research or sample characteristics. Anonymity issues resulted from missing information regarding anonymization procedures. Applicants, for example, stated that their institution complies with German data protection rules but neglected to say how they would anonymize their participants' data. In addition, researchers and developers intended to use video and audio recordings in lectures and presentations without considering possible anonymity issues. Data handling issues arose when researchers and developers did not describe how they wanted to store participants' data; this category also covered information about giving other researchers access to study data and about deleting (raw) data. The voluntariness of study participation was questionable when students were required to take part in a specific study as part of their course credit. Reviewers also commented on proposals in which employees (e.g. in a hospital) were required to become test participants. The category of strain comprised comments on test durations of more than an hour without breaks; asking people to execute multiple tasks at once for a longer period of time also drew comments. Finally, reviewers criticized proposals in which participants did not receive partial incentives for partial completion of the study, and it was also discussed whether researchers and developers may lower participants' incentives in case of bad performance.

Figure 2: Categories and number of transparency problems per category.

Figure 2 shows the most important categories in the class of transparency problems. These were due to missing or incomplete descriptions of the test procedure for participants (47%), insufficient explanation of data handling procedures (20%), and deficient participant information on the voluntariness of participation and the freedom to exit the test at any time (12%). Explanations regarding the handling of incentives (8%), the preservation of participants' anonymity (6%), and the strain participants might be exposed to during testing (5%) were mentioned more rarely by the reviewers. A χ2-test showed that the number of transparency problems was likewise not evenly distributed over the categories of this class, χ2(5)=153.33, p<0.001.

4 Discussion

In this paper, we presented evidence that it is necessary to consider ethical aspects in all stages of product research and development whenever humans are involved. However, readers should keep some limitations in mind when interpreting the results. First, the paper's empirical basis builds upon the work of only one local ethics committee. In addition, the qualitative content analysis has its own limitations. Both issues limit the generalizability of the findings to other research and development projects; future studies are needed to show whether other local ethics committees report similar findings and to validate our results. Furthermore, the ethics proposals that were analysed are probably not a random sample. It can be assumed that most applicants have a strong interest in getting their projects approved. They might have read up on research ethics in advance and therefore avoided some ethical pitfalls even before applying to the ethics committee. Many projects that did not require approval were not part of the analysis, and it remains an open question whether researchers and developers who do not seek ethical approval know even less about the ethically relevant aspects of their research.

Despite its limitations, the evidence presented in this paper shows that there is a need for ethical guidance in projects that explore the attitudes, needs, opinions, judgments, and behaviour of human beings. Only a small number of ethics proposals passed the review process without any comments. Two-thirds of the proposals received between one and ten remarks. Most of these did not address cosmetic issues but basic ethical aspects of research with human participants. This means that most researchers and developers did not know which aspects of their research were ethically relevant, a finding that indicates a lack of knowledge when it comes to ethics in product development ([1], [2]; cf. [16]). Most ethical issues fell into the categories of the project description and the handling of test data. Applicants therefore need support in describing their projects to the (scientific) ethics committee and to their (mostly non-scientific) human volunteers in a way that both groups understand the procedure, the methods, and the possible risks of an encounter with the technological system and the testing environment in general.

It seems that the huge amount of existing informative literature on research ethics either does not educate people adequately (cf. [1]) or overwhelms researchers because of its quantity and length. A practical approach to ethics might overcome the shortcomings of extensive rules, standards, codes of conduct, etc. [11]. We believe that developing and disseminating hands-on material for practitioners that provides ethical guidance for research processes including human participants could be of great value. The ethics committee, for example, uses checklists [8]. The Deutsche Gesellschaft für Psychologie (DGPS) freely offers templates that can be used to communicate with test participants [6].

Based on the results of the present study, we formulated six simple rules of thumb that might help to avoid common ethical issues when testing technology with human participants. The issues per category were analysed using qualitative content analysis. We grouped the issues in each category in such a way that we could inductively agree upon rules that represent most of the issues in that category. We then condensed these rules per category in an iterative discussion among the three authors until one remained. This final rule should represent the essence of the issues in its category.

1. Category Description: Describe your study in a way that any person understands what you are doing and what is going on, even a non-specialist, a less educated person, or a child.

This rule points out that information sheets must be well structured and use short, comprehensible sentences. In some cases, it might be helpful to read the information aloud to all participants to avoid misunderstandings. A further option is to provide frequently asked questions written from a participant's point of view. In a pre-study, the comprehensibility of all information should be tested and optimized.

2. Category Anonymization: Provide full anonymization, in every case and always.

Ideally, study data are anonymized by a code that the participants generate themselves following a given rule, for example a combination of personally meaningful letters and numbers (e.g., the first letter of the father's name, the second letter of the mother's name, and the month of birth). Such a code does not allow study data to be linked with individual information. A random code or consecutive numbers are problematic, since participants cannot reconstruct them, e.g. when withdrawing consent after completion of the study.
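The participant-generated code from rule 2 can be sketched as a small helper. The function name and the specific rule (father's first letter, mother's second letter, birth month) merely follow the example above; any rule works as long as participants can reproduce it later.

```python
def participant_code(father_name: str, mother_name: str, birth_month: int) -> str:
    """Build a pseudonymous code only the participant can reconstruct:
    first letter of the father's name, second letter of the mother's name,
    and the two-digit month of birth (the illustrative rule given above)."""
    return f"{father_name[0].upper()}{mother_name[1].upper()}{birth_month:02d}"

# A participant can regenerate the same code later, e.g. to withdraw their data
# without the researchers ever storing a name-to-code mapping.
print(participant_code("Hans", "Maria", 7))  # HA07
```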

3. Category Data Handling: Be aware that data belong to people. The data are not yours; treat them as a loan and tell people what you are going to do with them.

Participants must be informed about the purposes and the use of data acquisition before the experiment starts. This includes the presentation and publication of fully anonymized data as well as making it available to other researchers (e.g., open data or reanalyses). Separate consent should be obtained for video/audio data and their use after the study is completed. Data must be stored confidentially and securely at all times. Personal data such as name and address should be stored separately from anonymized study data such as performance measures, usability ratings, and user experience ratings. Ideally, applicants should provide a period of time (e.g. two weeks) during which participants can withdraw from the study after completing it. The time of data deletion should be addressed in the participants' information sheet as well.

4. Category Voluntariness: Never force your participants. Always provide an opt-out without any negative consequences for them.

The consent form should include a statement such as "Your participation is voluntary. You may withdraw your consent at any time during the study without giving any reason and without expecting any negative consequences." Voluntariness is a basic right from a participant's point of view and essential to good scientific practice. Researchers should also give participants the possibility to withdraw their consent, with full deletion of all their data, within an acceptable time interval (e.g., two weeks) after study completion.
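The withdrawal window mentioned in rules 3 and 4 amounts to a simple date check. A minimal sketch, assuming a two-week grace period as in the example above; the function name and default are illustrative, not part of the committee's checklist.

```python
from datetime import date, timedelta

def may_withdraw(completed_on: date, request_on: date, window_days: int = 14) -> bool:
    """True if a withdrawal request still falls inside the grace period.
    The two-week default mirrors the example interval suggested above."""
    return request_on <= completed_on + timedelta(days=window_days)

# Request nine days after completion: inside the window.
print(may_withdraw(date(2017, 1, 1), date(2017, 1, 10)))  # True
# Request a month later: the window has closed.
print(may_withdraw(date(2017, 1, 1), date(2017, 2, 1)))   # False
```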

5. Category Strain: Handle humans with care. Provide breaks, set up a comfortable setting, etc.

Many studies in human-computer interaction investigate complex environments and are therefore demanding for participants (e.g., simulator, EEG, and fMRI studies). Breaks are essential for recovery and well-being. Typically, participants do not ask for breaks, even when they are necessary. Therefore, the study procedure should be standardized with regard to breaks. Depending on the actual task, a break of at least five minutes per hour should be implemented.

6. Category Incentives: Show your participants that you appreciate their collaboration (e.g., reward them). Partial support receives partial reward.

Empirical research without participants is hardly possible. Beverages and/or snacks (e.g. a glass of water) should be provided during the study. In our understanding, an acceptable level of monetary compensation for participation ranges between 8 and 10 Euro per hour. Vouchers, discounts, or a raffle are alternative approaches. Be aware that many participants are also interested in your findings; it can be a reward for them to learn about the results through an appropriate communication channel such as e-mail, letter, or oral presentation.

These six rules are simple, easy to understand, and easy to remember. They might help to avoid common ethical blind spots and pitfalls. However, they can only be regarded as rules of thumb or heuristics: they might guide technology researchers in some, but not all, cases. This easy-to-use toolbox cannot replace ethical reflection or the benefits of ethicists who are embedded in the research and product development process. The list should rather be understood as a low-level starting point that helps researchers to check their work on a regular basis. Our rules could help them increase their practical knowledge about the meaning and application of ethics.

5 Conclusion and Final Remarks

The present study revealed first results from the work of a local ethics committee. It provides valuable information for engineers, human factors specialists, and designers about the ethical challenges that need to be addressed when testing technology with human participants. The study also shows that empirical research can contribute to the discussion of ethics in technology research and product development. The six rules of thumb formulated here add a practical perspective to a rather philosophical discussion of ethics and technology. We hope that these simple rules motivate researchers and developers to consider basic ethical principles even when they do not need the approval of an ethics committee. As authors, we strongly encourage further empirical research on the understanding of ethics in practice. This knowledge helps researchers and practitioners to identify the ethical questions relevant to specific research projects and to discuss critical issues concerning human participation at early stages of product development.


We would like to thank all reviewers and members of the ethics committee of the Department of Psychology and Ergonomics of the Technische Universität Berlin for their conscientious and highly valuable work.


[1] Brandenburg, S. (2015). Ethik in Technikforschung und Technikentwicklung: Erfahrungen. In Mensch und Computer 2015 Tagungsband, 299–302.

[2] Brandenburg, S. (2016). The ethics of technology development – a human factors practitioners' perspective. Poster presented at the annual meeting of the HFES Europe Chapter, Prague, Czech Republic. DOI: 10.13140/RG.2.2.29548.97921.

[3] Brynjolfsson, E. & McAfee, A. (2012). Technology's influence on employment and the economy. In E. Brynjolfsson & A. McAfee, Race Against the Machine: How the Digital Revolution Is Accelerating Innovation, Driving Productivity, and Irreversibly Transforming Employment and the Economy. MIT Press: USA.

[4] Burmeister, O.K. (2001). HCI professionalism: ethical concerns in usability engineering. In J. Weckert (ed.), Proc. Selected Papers from the 2nd Australian Institute of Computer Ethics Conference (AICE2000), Canberra. CRPIT, 1. ACS. 11–17.

[5] de Montjoye, Y.-A., Radaelli, L., Singh, V.K. & Pentland, A.S. (2015). Unique in the shopping mall: on the reidentifiability of credit card metadata. Science 347(6221), 536–539.

[6] DGPS (2017). Informationen und Vorlagen für Antragsteller. https://www.dgps.de/index.php?id=186 (retrieved 6 March 2017).

[7] Gotterbarn, D., Miller, K. & Rogerson, S. (1997). The software engineering code of ethics. Communications of the ACM 40(11), 110–118.

[8] IPA (2017). Ethik-Kommission des Instituts für Psychologie und Arbeitswissenschaft der Technischen Universität Berlin. http://www.ipa.tu-berlin.de/menue/einrichtungen/gremienkommissionen/ethik_kommission/ (retrieved 6 March 2017).

[9] Kaiser, M. (2004). Ethics, science and precaution: a viewpoint from Norway. Report of the National Committees for Ethics Research (NENT), Oslo: Norway.

[10] Lenk, H. & Ropohl, G. (1993). Technik und Ethik. Reclam Verlag: Stuttgart.

[11] Manzeschke, A., Weber, K., Rother, E. & Fangerau, H. (2013). Ergebnisse der Studie "Ethische Fragen im Bereich Altersgerechter Assistenzsysteme". Thiel Gruppe: Ludwigsfelde.

[12] Manzeschke, A. (2014). Lebensqualität und Technik – Ethische Perspektiven auf einen biopolitischen Diskurs. In M. Coors & M. Kuhlehn (eds.), Lebensqualität im Alter. Kohlhammer: Stuttgart. 111–125.

[13] Manzeschke, A. (2015). Angewandte Ethik organisieren: MEESTAR – ein Modell zur ethischen Deliberation in soziotechnischen Arrangements. In M. Maring (ed.), Vom Praktisch-Werden der Ethik in interdisziplinärer Sicht: Ansätze und Beispiele der Institutionalisierung, Konkretisierung und Implementierung der Ethik. KIT Scientific Publishing: Karlsruhe.

[14] Mundt, T., Krüger, F. & Wollenberg, T. (2012). Who refuses to wash hands? – Privacy issues in modern house installation networks. In Proceedings of the IEEE 2012 Seventh International Conference on Broadband, Wireless Computing, Communication and Applications. 271–277.

[15] Resnik, D.B. (2005). The Ethics of Science: An Introduction. Routledge: New York.

[16] Ropohl, G. (1996). Ethik und Technikbewertung. Suhrkamp: Frankfurt a. Main.

About the article

Stefan Brandenburg

Stefan Brandenburg, PhD, studied psychology at Technische Universität Chemnitz and the University of Oklahoma, USA. Since 2008 he has been a research assistant at the Chair of Cognitive Psychology and Cognitive Ergonomics at Technische Universität Berlin. He is a co-founder and chair of the IPA ethics commission. His research interests include the integration of ethical aspects into human factors research, temporal changes of affect and emotion, and the design of highly automated driving systems.

Michael Minge

Michael Minge studied psychology at Freie Universität Berlin and Human Factors at Technische Universität Berlin, where he received a doctoral degree in engineering. His research interests focus on user-centered design, usability engineering, user experience, gamification, and emotions in HCI. He is currently a postdoctoral researcher at Technische Universität Berlin working on the development of medical devices that support mobility.

Dietlind Helene Cymek

Dietlind Helene Cymek studied business psychology at Leuphana University of Lüneburg and Human Factors at Technische Universität Berlin, where she is currently pursuing her PhD in psychology. Her research interests include human-automation interaction and human redundancy. She has been a member of the ethics committee since December 2014.


Published in Print: 2017-12-20

Citation Information: i-com, Volume 16, Issue 3, Pages 267–273, ISSN (Online) 2196-6826, ISSN (Print) 1618-162X, DOI: https://doi.org/10.1515/icom-2017-0023.


© 2017 Walter de Gruyter GmbH, Berlin/Boston.
