BY-NC-ND 4.0 license Open Access Published by De Gruyter March 8, 2017

Analysis of ergonomic and unergonomic human lifting behaviors by using Inertial Measurement Units

Jan Kuschan, Henning Schmidt and Jörg Krüger


This paper presents an analysis of two distinct human lifting movements with respect to acceleration and angular velocity. For the first, ergonomic movement, the test persons produced the lifting power by squatting down, bending at the hips and knees only; for the second, unergonomic one, they bent forward and lifted the box mainly with their backs. The measurements were taken with a vest equipped with five Inertial Measurement Units (IMUs) with 9 Degrees of Freedom (DOF) each. In the following, the IMU data captured for these two movements is evaluated statistically, visualized, and discussed with respect to its suitability as features for subsequent machine learning classification. The motivation for observing these movements is that occupational diseases of the musculoskeletal system reduce the workers' quality of life and cause extra costs for companies. Therefore a vest, called CareJack, was designed to give the worker real-time feedback on his ergonomic state while working. The CareJack is an approach to reducing the risk of spinal and back diseases. This paper also presents the idea behind it as well as its main components.

1 Introduction

Inertial Measurement Units (IMUs) are sensor systems that derive a relative position from acceleration, angular velocity and magnetic field measurements. Modern IMUs can be worn on textiles in many everyday situations without special preparation. Nowadays, action recognition is often realized with cameras, but their use is limited by the flexibility and complexity of the environment, which leads to occlusion of the persons, restrictions of the camera's field of view and limits on the number of persons. While static camera setups are by now manageable, the complexity of wearable camera setups remains very high. Using IMUs instead makes a new set of applications in the field of action recognition possible. To name a few:

  • ergonomic workspace classification

  • teacher feedback module for training ergonomic motions in work situations

  • sports workout feedback

  • intention-based control of exoskeletons or prostheses

  • health monitoring by recognition of changes in actions.
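As a concrete illustration of how an IMU's raw signals yield relative orientation, a minimal integration sketch follows; the sample rate and gyroscope values are assumptions for illustration, not measurements from this study:

```python
import numpy as np

# Hypothetical illustration: a 9-DOF IMU delivers acceleration, angular
# velocity and magnetic field; a relative orientation can be obtained by
# integrating the angular velocity over time (sensor drift ignored here).
dt = 0.03                         # assumed sample period in s (~33 Hz)
gyro_z = np.full(100, 0.5)        # constant 0.5 rad/s about the z-axis

# Cumulative integration: angle[k] = angle[k-1] + gyro_z[k] * dt
angle = np.cumsum(gyro_z) * dt

# 100 samples of 0.5 rad/s at 30 ms steps rotate the sensor by 1.5 rad.
print(round(float(angle[-1]), 2))
```

In practice the accelerometer and magnetometer are fused with the gyroscope to correct this drift, which is why 9-DOF units are used.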

The safety and health at work report released in 2014 by the Federal Institute for Occupational Safety and Health (BAuA) [1] has shown that back-related diseases caused by long-term holding or carrying of heavy loads were the most frequently reported diseases in the field of mechanical impact. 5410 suspected cases were recorded in 2014, and 237 new pensions were paid to the persons concerned. Since the study only takes into account cases in which the person concerned could no longer work in their job, the actual number of people with back issues is probably many times greater, as many cases remain undetected. Occupational groups with increased back injury risk include assembly line workers, nurses, butchers, masons, construction workers and deliverymen. These types of jobs are hard to automate, if at all possible: often they are non-stationary, or they require sensory and motor abilities, flexibility or cognition. The work of Simon et al. [2] shows that 54% of employees in the automotive industry report pain in their lower back and 43% report neck and shoulder pain. Studies on jobs in the field of patient and geriatric care show that 82–86% of the employees deal with musculoskeletal pain [3]. Deliverymen are also at high risk, since the volume of goods ordered over the internet keeps growing; especially people who carry heavy loads like beverages are at great risk. While lifting is required in these fields and cannot easily be reduced, it is still possible to counter back pain by working ergonomically. Because the ergonomic and the unergonomic movement are distinct and very well distinguishable, they serve as sample movements in this paper.

2 Related work

Improving ergonomics in manufacturing environments is vital to companies that want to retain human resources and establish a good work–life balance. The ergonomics of workplaces are typically rated with standardized sheets such as the Ergonomic Assessment Worksheet (EAWS), which covers topics like force, workload, manual material handling and posture. While most people working in such environments are familiar with the guidelines for correct posture and workload, employees often adapt to wrong movements and postures over time. This leads to a decrease in workplace ergonomics that goes unrecorded. To handle such problems, various camera-based systems are the subject of current research: typically, the worker is tracked by a camera system, the pose is estimated and the environment is adapted to a more ergonomic scenario, as done for example by Nguyen et al. [4]. The idea of tracking human poses or motions is an old one, but its realization is still not good enough for the industry and care sectors. Especially for camera systems there are many different approaches, each with its own strengths and weaknesses. The use of IMUs for motion tracking is also a frequently studied topic; one differentiates between pose estimation and motion tracking. For motion tracking, Mannini et al. [5] give a good overview of the performance of different features and classifiers for recognizing typical human movements with IMUs. In [6] and [7] the authors present an algorithm that automatically generates a regression function for the movement and checks an incoming signal for correlation with this function. This method is not very robust against time-related variances; with additional features, tracking should also become possible for time-varying signals.

3 System

To capture, support and give feedback on the movements of the worker, a system called CareJack was designed, which is shown in Figure 1 and comprises the following components:

  • Vest (orthetic or flexible)

  • Sensors

  • Signal processing

  • Action recognition

  • Feedback

Figure 1: Prototype with five IMUs (red), two strain gauges (purple), one battery (orange), the feedback module (blue) and a processing unit (yellow) [6]. (A) CareJack front view, (B) CareJack back view.


Since the main features and advantages were already presented by Kuschan et al. in [6] and [7], the figure and component list are only meant to give a rough idea; the hardware itself is not explained in detail.

4 Movement

In theory it is possible to distinguish with IMUs between movements that show a significant difference in acceleration or angular velocity. The absolute position, and especially the relative position between the sensors, can be useful for pose estimation, which is not part of this paper. The main idea is to recognize the current action of a worker and give him real-time feedback on whether he is in an ergonomic or unergonomic sequence. To realize this idea the system has to meet the following requirements: it must track, recognize and properly distinguish between movements, otherwise the cry-wolf effect occurs, leaving the user unsatisfied. One might think it sufficient to detect only the unergonomic behavior, but especially for indicating ergonomics over a long period, and also for training, it is important to be able to predict the quality of movement.

The ergonomically correct and incorrect lifting of boxes is observed and a set of interesting features is presented. The two sequences are shown in Figure 2; they are snapshots of the critical moment when the person grabs the box. In this very moment the user needs instant feedback, if possible even earlier. One problem that arises is that movement execution depends on external factors like the size and agility of the user, which has a direct influence on the signal. This was addressed by choosing eight different persons with the following characteristics:

  • five males and three females

  • body height from 1.58 to 1.85 m

  • weight from 64 to 89 kg

Figure 2: Observed movements in final position [6]. (A) Ergonomically correct box lifting, (B) ergonomically incorrect box lifting.


Further, the persons were asked to perform the actions at different execution speeds: slow, medium and fast. A total of approximately 560 lifting sequences was gathered with the five IMUs, each with a 3-DOF accelerometer, 3-DOF gyroscope and 3-DOF magnetometer. In Figures 3 and 4 an exemplary acceleration profile is plotted; each cycle is separated by a black line. The graphs show the variance of the movement for one person as well as the importance of a sensor-specific feature set.

Figure 3: Acceleration profile of 13 ergonomic lifting cycles.


Figure 4: Acceleration profile of 16 unergonomic lifting cycles.


To evaluate the acceleration and velocity profiles, a histogram was created for every profile, as shown in Figure 5. The breaks between the cycles were not deleted and have to be considered; for this sensor and axis the acceleration of the pause states lies in the region of a ≈ 0 m/s². The histogram shows that the acceleration on the y-axis of sensor two has a high density between 4.5 m/s² ≤ a ≤ 6.5 m/s² for ergonomic and 7.5 m/s² ≤ a ≤ 9 m/s² for unergonomic movements. The density itself is a good discriminator between these two actions and shows vast differences between the ergonomic and the unergonomic movement, even across different persons and execution speeds.
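The density contrast described above already suffices for a rough rule-based separation. A minimal sketch with synthetic data — the two acceleration bands are taken from the text, while the generated samples and their distributions are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for the y-axis acceleration of sensor 2 (m/s^2);
# the real recordings cluster around 4.5-6.5 m/s^2 for ergonomic and
# 7.5-9 m/s^2 for unergonomic lifts, so we sample around those bands.
ergonomic = rng.normal(5.5, 0.5, 500)
unergonomic = rng.normal(8.2, 0.4, 500)

def band_density(samples, lo, hi):
    """Fraction of samples falling inside [lo, hi]."""
    return np.mean((samples >= lo) & (samples <= hi))

def classify(samples):
    """Label a lifting cycle by which acceleration band holds more mass."""
    e = band_density(samples, 4.5, 6.5)   # ergonomic band (from the paper)
    u = band_density(samples, 7.5, 9.0)   # unergonomic band (from the paper)
    return "ergonomic" if e > u else "unergonomic"

print(classify(ergonomic), classify(unergonomic))
```

Such a fixed-band rule is only a starting point; the paper's intent is to feed these densities as features into a learned classifier.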

Figure 5 Histogram of Sensor 2 Y-Axis for ergonomic and unergonomic box liftings of different persons.


In Figure 6 only one cycle is plotted, allowing a more detailed view. It reveals yet another issue that has to be considered: since it is necessary to give feedback before the person performs the unergonomic movement, the feedback has to happen before the box is lifted. The region of interest is therefore visualized in this graph. The window length for calculating the mean and standard deviation depends highly on the type of movement. If the length is chosen too small, the influence of noise is too high; if it is chosen too large, the number of features becomes too small, eventually causing two movements to blur into each other. For this movement a good length is in the area of n ≈ 22 samples or Δt ≈ 660 ms. Furthermore, the observed signal length varies between 2.0 s ≤ t ≤ 4.5 s, due to the different execution speeds of the persons.
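The windowed mean and standard deviation described above can be computed as follows; a sketch assuming a roughly 33 Hz sample rate (so that n = 22 samples ≈ 660 ms) and a synthetic acceleration trace:

```python
import numpy as np

def window_features(signal, n=22):
    """Mean and standard deviation over non-overlapping windows of n
    samples (n = 22 samples ~ 660 ms at ~33 Hz, as suggested above)."""
    m = len(signal) // n
    windows = signal[:m * n].reshape(m, n)
    return windows.mean(axis=1), windows.std(axis=1)

# Synthetic acceleration trace: a quiet phase followed by a lifting burst.
rng = np.random.default_rng(1)
quiet = rng.normal(0.0, 0.1, 66)
burst = rng.normal(5.0, 1.0, 66)
means, stds = window_features(np.concatenate([quiet, burst]))

# 132 samples / 22 per window -> 6 feature pairs; the later windows
# carry the higher mean of the lifting phase.
print(len(means))
```

Non-overlapping windows are the simplest choice; a sliding window with overlap would trade more features for higher correlation between them.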

Figure 6 Exemplary mean values and standard deviation of one ergonomic movement cycle.


5 Conclusion

For two distinct movements, several approaches for feature extraction and classification were presented. A first feature list should include the acceleration density, mean and standard deviation of the last t ≈ 4.5 s for slow down to t ≈ 2 s for fast movements. As discussed, this long time range is necessary to track slow as well as fast motions. For standard statistical features it seems reasonable to classify not only into ergonomic and unergonomic, but also into at least three execution speeds. Due to the real-time and preventive requirements, it was shown that only the first part of the signal is relevant for the classification.
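One hypothetical way to assemble the feature list named above (band density, mean and standard deviation over the trailing window) into a single vector per axis; the sampling rate is an assumption, and only the two band limits are quoted from the text:

```python
import numpy as np

def lift_features(accel, fs=33.0, t_max=4.5):
    """Hypothetical feature vector for one acceleration axis: mean, std
    and band densities over the trailing t_max seconds of the signal."""
    tail = accel[-int(t_max * fs):]
    dens_ergo = np.mean((tail >= 4.5) & (tail <= 6.5))   # ergonomic band
    dens_unerg = np.mean((tail >= 7.5) & (tail <= 9.0))  # unergonomic band
    return np.array([tail.mean(), tail.std(), dens_ergo, dens_unerg])

# Synthetic trace resembling an ergonomic lift around 5.5 m/s^2.
rng = np.random.default_rng(2)
fv = lift_features(rng.normal(5.5, 0.5, 200))
print(fv.shape)
```

Concatenating such vectors over all five sensors and axes would yield the input representation for the classifier comparison announced in the further-work section.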

6 Further work

In addition to this work, further features like gradients or relations between the sensors will be explored in more detail. The performance of the features presented in this paper will also be evaluated for different classifiers.

Author’s Statement

Research funding: This research is funded by the German Federal Ministry of Education and Research (BMBF) in the joint project Care-Jack. Conflict of interest: Authors state no conflict of interest. Informed consent: Informed consent has been obtained from all individuals included in this study. Ethical approval: The research related to human use complies with all the relevant national regulations, institutional policies and was performed in accordance with the tenets of the Helsinki Declaration, and has been approved by the authors’ institutional review board or equivalent committee.


References

[1] BMAS/BAuA. Auszug aus dem Bericht “Sicherheit und Gesundheit bei der Arbeit 2014”.

[2] Simon M, Tackenberg P, Hasselhorn HM, Kümmerling A, Bücher A, Müller BH. Auswertung der ersten Befragung der NEXT-Studie in Deutschland; 2005.

[3] Kleina T, Brause M, Horn A, Wingenfeld K, Schaeffer D. Qualität und Gesundheit in der stationären Altenhilfe – Eine empirische Bestandsaufnahme; 2012.

[4] Nguyen TD, McFarland R, Kleinsorge M, Krüger J, Seliger G. Adaptive Qualification and Assistance Modules for Manual Assembly Workplaces. Procedia CIRP. 2015;26:115–20. doi:10.1016/j.procir.2014.07.117

[5] Mannini A, Sabatini AM. Machine learning methods for classifying human physical activity from on-body accelerometers. Sensors. 2010;10:1154. doi:10.3390/s100201154

[6] Kuschan J, Schmidt H, Jung E, Oestermann U, Winkler U, Schreivogel A, et al. Verbesserung der Ergonomie am Arbeitsplatz mittels einer intelligenten Orthesen-Weste; 2016.

[7] Kuschan J, Schmidt H, Krüger J. Improved ergonomics via an intelligent movement and gesture detection jacket; 2016.

Published Online: 2017-3-8
Published in Print: 2017-3-1

©2017 Jan Kuschan et al., licensee De Gruyter.

This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.