
Open Engineering

formerly Central European Journal of Engineering


A survey of telerobotic surface finishing

Thomas Höglund / Jarmo Alander / Timo Mantere
Published Online: 2018-05-31 | DOI: https://doi.org/10.1515/eng-2018-0018

Abstract

This is a survey of research published on the subjects of telerobotics, haptic feedback, and mixed reality applied to surface finishing. The survey especially focuses on how visuo-haptic feedback can be used to improve a grinding process using a remote manipulator or robot. The benefits of teleoperation and reasons for using haptic feedback are presented. The use of genetic algorithms for optimizing haptic sensing is briefly discussed. Ways of augmenting the operator’s vision are described. Visual feedback can be used to find defects and analyze the quality of the surface resulting from the surface finishing process. Visual cues can also be used to aid a human operator in manipulating a robot precisely and avoiding collisions.

Keywords: surface finishing; telerobotics; remote manipulator; visuo-haptic feedback; mixed reality

1 Introduction

Surface finishing is a complex process: to achieve high quality, it may require the dexterity of human hands and intelligent use of the feedback a worker gets by touching and looking at the surface being treated. On the other hand, automation may provide superior repeatability and lower costs. The process can be difficult to automate if the surface finishing tool does not perform consistently or if machine vision is difficult to implement for analyzing the result [1]. Automation can also be prohibitively expensive in short-series production, where the production line frequently needs to be reprogrammed for new products. In such cases it may be more feasible to perform the surface finishing manually, but this has disadvantages: a human operator cannot operate heavy, vibrating machinery for extended periods of time, and the environment can be very uncomfortable and harm the operator's health. Telerobotics would therefore be useful for allowing manual control of the surface finishing process at a distance.

This paper is organized as follows: Robotic surface finishing is briefly introduced in Section 2. The benefits of using a teleoperated robot versus a fully manual or a fully automated process are presented in Section 3. The means of providing visual information to the teleoperator using mixed reality are discussed in Section 4. Section 5 deals with haptic feedback and force control for teleoperation of surface finishing. In Section 6, methods of optimizing haptic control are discussed. Finally, Section 7 proposes directions for future research and concludes the paper.

2 From manual to robotic surface finishing

Common surface finishing tasks include chamfering, deburring, grinding and polishing. Hashimoto et al. [2] described a wide range of commonly used abrasive fine-finishing processes in their recent, extensively referenced article. In general, these processes can also be applied in robotic applications. Kuhlenkoetter and Zhang [1] gave a good introduction to how the surface finishing process can be manual, partially automated or fully automated.

A grinding machine is strenuous to move around and press against a surface manually, and the vibrations can cause long-term damage to hands and arms. Safety regulations are in place to protect workers from exposure to harmful vibrations. The European directive 2002/44/EC on vibration [3] describes how mechanical vibrations should be considered to ensure the health and safety of workers. The directive refers to the ISO 5349-1:2001(E) standard [4] for measurement and evaluation of human exposure to hand-transmitted vibration. As an example, an operator using a handheld orbital sanding machine should not be exposed to vibrations of more than 2.5 m/s² RMS over eight hours [5]. Placing such a tool in a teleoperated manipulator frees the operator from the uncomfortable and potentially harmful mechanical vibrations, while still allowing the operator to use his or her senses to guide the process. It also removes the limits on how long the operator can operate the tool and on the allowed instantaneous vibration.
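
To make the evaluation concrete, ISO 5349-1 combines the frequency-weighted RMS accelerations of the three axes into a vibration total value and normalises it to an eight-hour reference period, the A(8) value, which is then compared against the 2.5 m/s² action value. The following Python sketch illustrates this arithmetic with hypothetical figures:

```python
import math

def vibration_total_value(a_x: float, a_y: float, a_z: float) -> float:
    """Vibration total value a_hv: root sum of squares of the frequency-weighted
    RMS accelerations measured along the three axes (m/s^2)."""
    return math.sqrt(a_x**2 + a_y**2 + a_z**2)

def daily_exposure_a8(a_hv: float, exposure_hours: float) -> float:
    """Daily vibration exposure A(8), normalised to an 8-hour reference period."""
    return a_hv * math.sqrt(exposure_hours / 8.0)

# Hypothetical example: a sander transmitting a total value of about 3.0 m/s^2,
# used for 4 hours, gives an A(8) of roughly 2.1 m/s^2.
a8 = daily_exposure_a8(vibration_total_value(1.5, 2.0, 1.7), 4.0)
print(f"A(8) = {a8:.2f} m/s^2, below 2.5 m/s^2 action value: {a8 < 2.5}")
```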

3 Benefits of teleoperation

Teleoperation is the manual or automatic control of a remote machine, and telerobotics deals with teleoperated robots. Conway et al. [6] categorize methods of teleoperation (teleautonomous control) into the following, going from direct to more automatic methods: direct continuous control, shared continuous control, discrete command control, supervisory control, learning control and guidance of remote nonexpert humans by local experts. All these schemes, except the last one, are feasible for telerobotic surface finishing. Milgram et al. [7] described a taxonomy for classifying human-mediated control of remote manipulation systems based on three dimensions: the degree of machine autonomy, the level of structure of the remote environment and the extent of knowledge, or modellability, of the remote world. Figure 1 presents an example of the components a telerobotic surface finishing system could consist of.

Figure 1: Diagram of a telerobotic surface finishing system

Traditionally, telerobotics has been mostly applied to dangerous environments and critical tasks, such as space robotics, subsea robotics, and telesurgery [8], but also to tasks that require human skill and supervision [6], such as operating forestry cranes. If a telerobotic system is used instead of a manual tool, the process can be made semiautomatic to relieve the operator of mundane and repetitive tasks and to increase precision or speed when necessary.

The teleoperator can choose when to control the robot manually and when to invoke automatic functions. The operator's inputs can also be corrected automatically, for example to smooth movements or to align the tool with the surface normal. It is difficult to program a robot to follow a surface with a complex shape if the surface finishing machine has to be controlled in a very constrained manner so as not to collide with the object or touch parts that must be avoided. For example, when grinding a small concave part of a surface, care must be taken not to damage the surrounding ridges. As Nagata et al. [9] point out, automating the sanding process after machining is difficult because it requires delicate and dexterous skill to avoid spoiling the surface. A skilled teleoperator using a good haptic interface and audio and video feeds could probably outperform a simple robot program that relies only on a fixed path and force control, with no knowledge of the surface finishing result and vibrations.
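
As an illustration of such input correction, the sketch below shows two simple, generic corrections that are not taken from any of the cited systems: exponential smoothing of the operator's commanded position, and a rotation (Rodrigues' formula) that turns the tool axis onto the measured surface normal. The names and sign conventions are assumptions made for the example:

```python
import numpy as np

def smooth_command(prev_cmd: np.ndarray, raw_cmd: np.ndarray, alpha: float = 0.2) -> np.ndarray:
    """Exponential smoothing of the operator's commanded tool position."""
    return (1.0 - alpha) * prev_cmd + alpha * raw_cmd

def align_tool_to_normal(tool_axis: np.ndarray, surface_normal: np.ndarray) -> np.ndarray:
    """Rotation matrix (Rodrigues' formula) that turns the tool axis onto the
    direction pressing against the surface, i.e. opposite the outward normal."""
    a = tool_axis / np.linalg.norm(tool_axis)
    b = -surface_normal / np.linalg.norm(surface_normal)
    v, c = np.cross(a, b), float(np.dot(a, b))
    if np.isclose(c, -1.0):  # anti-parallel: rotate 180 degrees about any perpendicular axis
        axis = np.cross(a, [1.0, 0.0, 0.0])
        if np.linalg.norm(axis) < 1e-8:
            axis = np.cross(a, [0.0, 1.0, 0.0])
        axis /= np.linalg.norm(axis)
        return 2.0 * np.outer(axis, axis) - np.eye(3)
    vx = np.array([[0.0, -v[2], v[1]], [v[2], 0.0, -v[0]], [-v[1], v[0], 0.0]])
    return np.eye(3) + vx + vx @ vx / (1.0 + c)
```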

A human teleoperator could be located at a remote facility and be tasked with intermittently controlling a fleet of machines in different locations [6, 10]. In this case, the operator has to rely on limited visual, auditory, and haptic feedback from sensors and cannot perform other manual actions such as changing and servicing tools. Teleoperation can also be performed locally, allowing the operator to inspect the process and tools directly with his or her own eyes and ears. If the control does not involve a remote-control system, it can be called augmented manual fabrication [11]. It is easier for a local operator to sense and react to faults and errors in the process than for a remote teleoperator. Advanced sound processing could be useful in telerobotics for giving the operator appropriate auditory feedback of how the machine is performing, and for detecting unusual sounds that may indicate faults such as broken mechanical parts, bad alignment of the surface finishing tool with the surface, or a wrong magnitude of force.
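
A minimal, purely illustrative sketch of such sound-based fault detection could compare the band energies of the current audio frame against baseline band energies recorded during normal operation; the band layout and threshold below are arbitrary assumptions:

```python
import numpy as np

def band_energies(frame: np.ndarray, n_bands: int = 16) -> np.ndarray:
    """Energy of a windowed audio frame in n_bands equally wide frequency bands."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame)))) ** 2
    return np.array([band.sum() for band in np.array_split(spectrum, n_bands)])

def sounds_unusual(frame: np.ndarray, baseline: np.ndarray, threshold_db: float = 3.0) -> bool:
    """Flag a frame whose band energies deviate strongly (average ratio in dB)
    from baseline band energies recorded during normal, fault-free operation."""
    deviation_db = np.abs(10.0 * np.log10(
        (band_energies(frame) + 1e-12) / (baseline + 1e-12)))
    return float(deviation_db.mean()) > threshold_db
```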

A surface finishing process that handles small series of products is costly to automate because the system frequently needs to be reprogrammed when changing products. In this case, a manual or teleoperated process could adapt to the varying tasks without incurring any additional costs. On the other hand, employing an operator all the time may be as expensive as reprogramming a robot in the long run. When new techniques and tools are to be implemented for increasing the quality of the surface finishing, a human operator can immediately start working with these, whereas an automated system would require the process to be halted for reprogramming. Checking the condition of a sanding disc in an abrasive process and changing the disc when worn out or clogged is difficult to automate, but could easily be performed manually.

Having a human operator instead of automation can also make it safer to collaborate with other human workers on the production line. A human operator can detect dangerous situations and prevent accidents. On the other hand, control restriction is needed for safety to limit the speed and force of a teleoperated robot collaborating with humans.

According to Kuhlenkoetter and Zhang [1], grinding using an abrasive belt is difficult to simulate and automate because the removal rate and cutting properties of the abrasive vary continuously. An elastically deformable contact wheel allows the abrasive to curve slightly around surfaces, producing a higher quality surface finish, but this elasticity introduces further variation that is unreliable to calculate. Because of these and many other factors, estimating the removal rate is a rather experience-based process. Kuhlenkoetter and Zhang go on to present methods for simulating the removal rate in a robotic grinding process, but the experience-based nature of the process also makes it suitable for manual control or telemanipulation, where the operator can sense the results [1].

4 Augmenting the operator’s vision

Machine vision can be used for analyzing a surface. One or more cameras and light sources can be employed for detecting light in different directions and in different parts of the spectrum, for example visible, infrared, or thermal radiation. Machine vision can be superior to manual visual inspection for many applications, but it can be difficult to set up, and for some applications manual visual inspection is more reliable. Instead of teaching a machine to classify surface finish defects correctly, a human operator can perform the inspection while teleoperating the surface finishing tool. For example, if a small defect is being treated with an abrasive, the actual type of defect may become clear only after treating it for some time. An automatic imaging system can take pictures of such a defect, which the operator analyses while treating it. Live camera feeds and graphics can be displayed to the user in the form of augmented reality (AR) or mixed reality (MR), as described in [10]. Mixed reality is a mix of reality and virtual reality, generally a combination of real pictures and computer-generated graphics. Augmented reality is mixed reality where most of the content is real and is only augmented by virtual content.

Already in the 1990s, virtual reality (VR) telerobotics was a popular research subject. Bejczy [12] describes how VR, in combination with CAD, can be applied to robotic manufacturing. VR can provide accurate 3-D situational awareness, giving the user real-time information about the model and the surface finishing process while handling large data sets. Milgram defines a reality–virtuality continuum, where the real environment and the virtual environment are at the extremes, and the region in between is termed mixed reality, which contains AR and augmented virtuality [13, 14]. AR and MR are also described well by Chintamani in [15], where he analyzes how they can be applied in telerobotics. Bejczy [12] gives a broad introduction to telerobotics, MR and associated interfaces, but the focus is on space applications, not manufacturing. He also proposes ways of optimizing the remote-control system to be as intuitive and efficient as possible. Different camera configurations are analyzed for their suitability for visual feedback in teleoperation, and image-guided navigation techniques are summarized.

The teleoperator's vision can be augmented by overlaying useful information, such as numerical displays, graphs, indicators and color maps, on top of a live video feed or onto AR glasses. For example, the applied force could be shown numerically and the temperature of the object being processed could be overlaid as a heat map based on input from a thermal camera. A heat map would be useful in grinding applications for avoiding overheating the object. Chin, Madsen and Goodling [16] used infrared thermography for sensing the arc welding process. Such sensing would be very useful as a live overlay while, for example, welding aluminum, whose heat is difficult to see. Rivers [11] developed a system that projects guidance graphics directly onto the sculpture while it is being sculpted. The same technique could be used for indicating which parts of an object need machining.
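
As a sketch of how such a thermal overlay could be rendered, the following function blends a false-colour temperature map onto a live camera frame using OpenCV. It assumes the thermal image has already been registered (aligned) to the camera view, and the temperature range and blending factor are arbitrary choices for the example:

```python
import cv2
import numpy as np

def overlay_heat_map(camera_frame: np.ndarray, temperature_c: np.ndarray,
                     t_min: float = 20.0, t_max: float = 200.0,
                     alpha: float = 0.4) -> np.ndarray:
    """Blend a false-colour temperature map onto a live camera frame.
    `temperature_c` is a per-pixel temperature image (deg C) assumed to be
    registered to the camera view; range and blending factor are arbitrary."""
    norm = np.clip((temperature_c - t_min) / (t_max - t_min), 0.0, 1.0)
    heat = cv2.applyColorMap((norm * 255).astype(np.uint8), cv2.COLORMAP_JET)
    heat = cv2.resize(heat, (camera_frame.shape[1], camera_frame.shape[0]))
    return cv2.addWeighted(camera_frame, 1.0 - alpha, heat, alpha, 0.0)
```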

Chintamani [15] proposes an advanced method of end-effector guidance for avoiding collisions and achieving smooth manual control. A planning agent produces visual aids for the operator as AR. Chintamani points out that AR is to be preferred over VR because AR allows the operator to see the actual environment and the visual aids in a single view. AR also makes visible the translational and rotational matching accuracy of the virtual content with the actual world.

If there is an automatic machine vision system for detecting surface defects such as particles, scratches, blobs, dents, or a thin paint layer, these defects can be indicated by graphics. The operator can also benefit from a graphical overlay showing which areas of the object have already been treated, so that all areas are covered once and only once, or showing which areas required the most treatment.
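
A simple way to realize such a coverage overlay, sketched below under the assumption that tool contact positions are available in a 2-D workpiece frame, is to accumulate the tool passes in a grid and render the counts as colors:

```python
import numpy as np

class CoverageMap:
    """Grid over the workpiece surface counting how many times the tool has
    passed each cell; rendered as an overlay, it shows untreated (0),
    treated (1) and over-treated (>1) areas."""
    def __init__(self, width_m: float, height_m: float, cell_m: float = 0.005):
        self.cell = cell_m
        self.counts = np.zeros((int(height_m / cell_m), int(width_m / cell_m)), dtype=int)

    def mark(self, x_m: float, y_m: float, tool_radius_m: float) -> None:
        """Increment every cell within the tool radius of the contact point."""
        r = int(tool_radius_m / self.cell)
        ci, cj = int(y_m / self.cell), int(x_m / self.cell)
        for i in range(max(0, ci - r), min(self.counts.shape[0], ci + r + 1)):
            for j in range(max(0, cj - r), min(self.counts.shape[1], cj + r + 1)):
                if (i - ci) ** 2 + (j - cj) ** 2 <= r * r:
                    self.counts[i, j] += 1
```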

5 Haptic feedback for teleoperation

A teleoperator can make use of a haptic feedback device for feeling and controlling the forces a telerobot is applying and the vibration it is subjected to, thereby gaining a sense of touch. Pagilla and Yu [17] classify methods of force control for surface finishing into three categories: hybrid position/force control, impedance control and constrained mechanical systems modeled as differential–algebraic equations. Accurate force control is imperative for precise surface finishing. If, for example, an abrading tool touches a surface with an initial force peak, the surface can get dented or scratched. According to [17], switching from motion in free space to constrained motion on the workpiece surface can cause significant force control stability problems due to the nonzero velocity of impact, which can lead to discontinuities in the system equations.
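
As a concrete illustration of one of these categories, the sketch below shows a generic one-dimensional impedance law in admittance form along the tool's contact normal. It is a textbook scheme with arbitrary gains, not the specific controllers of Pagilla and Yu:

```python
class ImpedanceAxis:
    """Generic one-dimensional impedance (admittance-form) law along the tool's
    contact normal: the position offset reacts to the contact-force error like
    a mass-spring-damper.  Arbitrary gains; not the controllers of [17]."""
    def __init__(self, mass: float = 1.0, damping: float = 50.0,
                 stiffness: float = 500.0, dt: float = 0.001):
        self.m, self.d, self.k, self.dt = mass, damping, stiffness, dt
        self.x = 0.0  # offset along the outward surface normal (m)
        self.v = 0.0  # offset velocity (m/s)

    def step(self, measured_force: float, desired_force: float) -> float:
        """Return the offset to add to the reference trajectory.  Pressing too
        hard (measured > desired) drives the offset outward, off the surface."""
        force_error = measured_force - desired_force
        acc = (force_error - self.d * self.v - self.k * self.x) / self.m
        self.v += acc * self.dt
        self.x += self.v * self.dt
        return self.x
```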

Pagilla and Yu [18] developed an algorithm for estimating the coefficient of grinding friction. The normal and tangential forces at the robotic tool are related by this coefficient, but it varies and is difficult to estimate. The coefficient of grinding friction would be a useful parameter for optimizing the grinding result. The normal force can be controlled by a force control actuator, such as the Active Contact Flanges by FerRobotics Compliant Robot Technology GmbH, mounted between the robot and the surface finishing tool, so that the force can be controlled without having to control the robot joints exactly. Controlling the applied force using only the robot can lead to vibrations and a slow control loop because of the weight of the robot parts that have to move rapidly to compensate for vibration and uneven surfaces. Pagilla and Yu [17] described the dynamic model of a robot for surface finishing operations and presented control algorithms for each phase of a robotic surface-finishing task.
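
The relation the coefficient captures is simply F_t ≈ μ F_n. As a toy illustration only (not the adaptive algorithm of [18]), μ can be tracked online with an exponentially weighted least-squares estimate:

```python
class FrictionCoefficientEstimator:
    """Exponentially weighted least-squares estimate of the grinding friction
    coefficient mu in the relation F_t ~ mu * F_n.  A toy illustration only;
    Pagilla and Yu [18] derive a proper adaptive estimation algorithm."""
    def __init__(self, forgetting: float = 0.99):
        self.lam = forgetting
        self.num = 0.0    # weighted sum of F_n * F_t
        self.den = 1e-9   # weighted sum of F_n^2 (small value avoids division by zero)

    def update(self, f_normal: float, f_tangential: float) -> float:
        self.num = self.lam * self.num + f_normal * f_tangential
        self.den = self.lam * self.den + f_normal * f_normal
        return self.num / self.den  # current estimate of mu
```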

The teleoperator requires low-latency force feedback to be able to perform haptic control. In the case of an abrasive or polishing process, the force will strongly influence the resulting surface finish. If too much force is applied the surface may be damaged when the abrasive cuts too deeply or the temperature becomes too high.

If the tool is very large or very small, the operator may need to apply a force that is scaled up or scaled down, respectively, compared with what the operator feels from the force feedback device. Telerobotics enables operators to control much heavier tools than they could manually handle. The force input from the operator can be combined with a function for applying, for example, a sinusoidal force component to generate a structured surface finish. The force could also be controlled automatically and, for example, be ramped up or down over time while the operator controls the location and direction of the tool.
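
A minimal sketch of such force shaping, with hypothetical parameters, combines the scaled operator input with an optional sinusoidal component and clamps the result so the tool never pulls on the surface:

```python
import math

def commanded_force(operator_force: float, scale: float, t: float,
                    mod_amplitude: float = 0.0, mod_freq_hz: float = 0.0) -> float:
    """Tool-side normal force command: the operator's input force scaled up or
    down, optionally superposed with a sinusoidal component (e.g. to produce a
    structured finish).  Clamped so the tool never pulls on the surface."""
    modulation = mod_amplitude * math.sin(2.0 * math.pi * mod_freq_hz * t)
    return max(0.0, scale * operator_force + modulation)

# e.g. a 5 N operator input scaled 4x with a 2 N, 10 Hz ripple:
# commanded_force(5.0, scale=4.0, t=0.01, mod_amplitude=2.0, mod_freq_hz=10.0)
```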

6 Optimizing haptic control

Haptic or tactile sensors are useful in various robotic applications. A haptic sensor measures physical interaction with the environment, such as forces and temperature, and is used for providing a user with haptic feedback. Force information can, for example, be provided by a load cell and used as feedback to a manual force control device for controlling the force that a robot exerts on an object. In this survey, we examine haptic sensors and their use in robotics only from the artificial intelligence perspective, and especially from the computational intelligence and soft computing perspective, when they are used to control robot behavior.

Haptic sensors provide information whose nature differs in many respects from that of other sensors, but which also has much in common with other sensor signals. For surface finishing applications, vision and haptic sensing can, for example, both be represented as two-dimensional information, which opens up many kinds of shared 2-D mapping possibilities and map processing algorithms. Like machine vision and sound recording, haptic signal processing corresponds to a human sense and therefore has a natural link to artificial intelligence and the modeling of human behavior.

Girão et al. [19] published a review of haptic sensing in the context of robotics. They present several types of haptic and tactile sensors and their uses in robotics. According to [19], the growth of the use of haptic sensors in robotics has been limited by the lack of a market-oriented driving force and by low-cost, established alternatives. Haptic sensing adds complexity to both hardware and software. Girão et al. [19] consider the future of haptic sensors to be unclear: “The number of robots with dexterous grippers and manipulators requiring tactile and haptic sensing is expected to increase but it is almost impossible to foresee up to what extent.”

Recently, Bimbo et al. [20] used global optimization with a genetic algorithm (GA) to estimate the pose of a given object by tactile sensing. Their application is gripping objects, but a similar approach might also be useful in other robot operations such as surface finishing. Pahlavan et al. [21] have likewise used GA optimization for laparoscopic gripping of the aorta. There is also work on mobile robots in which haptic sensors and evolutionary algorithms are used to evolve control of robot behaviors (Hoffmann et al. [22]). Surface finishing has some parallels to mobile robots moving in a more or less known environment.

Pattananupong et al. [23] have used genetic programming and neural networks in identifying the state of contact in an experimental haptic sensing system for medical applications, i.e., working on soft materials. Kis [24] has presented a dexterous telemanipulation textile quality control system based on tactile sensing and image processing such as DSP methods.

González-Quijano et al. [25] used a human-based GA for robot learning of simulated in-hand manipulation tasks. A human-based GA here means a GA that the user can control interactively, i.e., the GA receives feedback from the user. In the context of grinding tasks, their approach could become a system in which the operator teaches grinding via a GA and selects the best cases for further fine-tuning, while the computer (and the GA) keeps track of the various technical parameters involved in the control. Note that the fitness function in GA-based optimization can flexibly combine clearly defined digital functions with feedback from the user or from auxiliary devices.
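
To make the idea concrete, the following sketch shows a minimal interactive GA of this kind. The parametrisation of a grinding candidate and the way the fitness mixes measured quality with an operator rating are hypothetical, not the method of [25]:

```python
import random

def evolve_grinding_parameters(rate_candidate, population_size: int = 12,
                               generations: int = 20, mutation_sigma: float = 0.1):
    """Minimal interactive GA in the spirit of [25].  A candidate is a vector of
    three normalised parameters (hypothetically: normal force, feed rate,
    oscillation amplitude); `rate_candidate` returns a fitness that may combine
    measured surface quality with the operator's interactive rating."""
    n_params = 3
    pop = [[random.random() for _ in range(n_params)] for _ in range(population_size)]
    for _ in range(generations):
        ranked = sorted(pop, key=rate_candidate, reverse=True)
        parents = ranked[: population_size // 2]          # truncation selection
        children = []
        while len(children) < population_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_params)           # one-point crossover
            child = [min(1.0, max(0.0, g + random.gauss(0.0, mutation_sigma)))
                     for g in a[:cut] + b[cut:]]          # Gaussian mutation, clamped
            children.append(child)
        pop = parents + children
    return max(pop, key=rate_candidate)

# The operator's judgement can simply be folded into the fitness, e.g.
# fitness(p) = 0.5 * measured_surface_quality(p) + 0.5 * operator_rating(p)
```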

7 Conclusion and directions for future research

It can be concluded that telerobotics has mostly been used in dangerous environments, for critical tasks such as space robotics, subsea robotics and telesurgery, and for tasks that require human skill and supervision, but not so much for manufacturing or surface finishing. There would be clear benefits in using telerobotics for surface finishing, such as more advanced motion planning, the possibility to react to unusual situations and rapid adaptability to changes in the manufacturing process.

Augmented reality displays can be used to aid a human operator in quality control, precise robotic manipulation and supervision of process parameters.

Different force control methods can be used for surface finishing in cooperation with a human operator, in a range of control modes going from direct to automatic motion or force control. Teleoperation can allow the human operator to control forces more precisely than is possible manually. Heavier tools can be operated, and the operator is protected from the noisy and dusty processing environment. The dynamics of a telerobot can be optimized to allow precise handling of objects, whereas it is difficult for a human to move very heavy or very light and small objects precisely.

The authors plan to conduct further research on technically and economically feasible applications of mixed reality telerobotic surface finishing, especially abrasive fine-finishing. No research was found that combines mixed reality (augmented reality, virtual reality), telerobotics and surface finishing in a single publication.

Acknowledgement

We acknowledge the financial support from KAUTE Foundation and Mirka Ltd for the research reported in this paper.

References

[1] Kuhlenkoetter B., Zhang X., A robot system for high quality belt grinding and polishing processes, Cutting Edge Robotics, 2005, 755-770
[2] Hashimoto F., Yamaguchi H., Krajnik P., Wegener K., Chaudhari R., Hoffmeister H., et al., Abrasive fine-finishing technology, CIRP Ann. - Manufacturing Technology, 2016, 65, 597-620
[3] Directive 2002/44/EC – vibration, 25 June 2002, https://osha.europa.eu/en/legislation/directives/19 [accessed Dec 12, 2016]
[4] Mechanical vibration – Measurement and evaluation of human exposure to hand-transmitted vibration – Part 1: General requirements, ISO 5349-1:2001, May 2001, http://www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_detail.htm?csnumber=32355
[5] myMirka, http://www.mirka.com/mymirka/, 2016 [accessed Dec 12, 2016]
[6] Conway L., Volz R.A., Walker M.W., Teleautonomous systems: Projecting and coordinating intelligent action at a distance, IEEE Trans. Robot. Autom., 1990, 6, 146-158
[7] Milgram P., Rastogi A., Grodski J.J., Telerobotic control using augmented reality, Proc. 4th IEEE Int. Workshop on Robot and Human Communication, 1995
[8] Freund E., Rossmann J., Projective virtual reality: Bridging the gap between virtual reality and robotics, IEEE Trans. Robot. Autom., 1999, 15, 411-422
[9] Nagata F., Kusumoto Y., Fujimoto Y., Watanabe K., Robotic sanding system for new designed furniture with free-formed surface, Robotics and Computer-Integrated Manufacturing, 2007, 23, 371-379
[10] Tzafestas C.S., Virtual and mixed reality in telerobotics: A survey, Industrial Robotics: Programming, Simulation and Applications, 2006, 437-470
[11] Rivers A., Augmented manual fabrication methods for 2D tool positioning and 3D sculpting, PhD thesis, MIT, Cambridge, MA, 2013
[12] Bejczy A.K., Virtual reality in manufacturing, Re-Engineering for Sustainable Industrial Production, Springer, 1997, 48-60
[13] Florins M., Trevisan D.J., Vanderdonckt J., The continuity property in mixed reality and multiplatform systems: A comparative study, Computer-Aided Design of User Interfaces IV, 2005, 323-334
[14] Milgram P., Colquhoun H. Jr, A taxonomy of real and virtual world display integration, Mixed Reality - Merging Real and Virtual Worlds, Ohmsha, Ltd., Springer Verlag, 1999
[15] Chintamani K., Augmented reality navigation interfaces improve human performance in end-effector controlled telerobotics, PhD thesis, WSU, Detroit, MI, 2010
[16] Chin B.A., Madsen N.H., Infrared thermography for sensing the arc welding process, The Welding Journal, 1983, 227-234
[17] Pagilla P.R., Yu B., Robotic surface finishing processes: Modeling, control, and experiments, J. Dynamic Systems, Measurement, and Control, 1999, 123, 93-102
[18] Pagilla P.R., Yu B., Adaptive control of robotic surface finishing processes, Proc. American Control Conf., 2001, 630-635
[19] Girão P.S., Ramos P.M.P., Postolache O., Pereira J.M.D., Tactile sensors for robotic applications, Measurement, 2013, 46, 1257-1271
[20] Bimbo J., Kormushev P., Althoefer K., Liu H., Global estimation of an object’s pose using tactile sensing, Advanced Robotics, 2015, 29, 1-15
[21] Pahlavan P., Najarian S., Afshari E., Moini M., Artery cross-clamping during laparoscopic vascular surgery; a computational tactile sensing approach, Biomedical Materials Engineering, 2013, 23, 423-432
[22] Hoffmann F., Zagal Montealegre J.C.S., Evolution of a tactile wall-following behavior in real time, Soft Computing and Industry, 2002, part VII, 747-755
[23] Pattananupong U., Tongpadungrod P., Chaiyaratana N., Genetic programming and neural networks as interpreters for a distributive tactile sensing system, Congr. Evol. Computation, 2007, 4027-4034
[24] Kis A., Tactile sensing and analogic algorithms, PhD thesis, Péter Pázmány Catholic University, Budapest, Hungary, 2006
[25] González-Quijano J., Abderrahim M., Bensalah C., Al-kaff A., A human-based genetic algorithm applied to the problem of learning in-hand manipulation tasks, Proc. of the IROS 2012, 2012

About the article

Received: 2017-08-19

Accepted: 2018-03-07

Published Online: 2018-05-31


Citation Information: Open Engineering, Volume 8, Issue 1, Pages 156–161, ISSN (Online) 2391-5439, DOI: https://doi.org/10.1515/eng-2018-0018.


© 2018 Thomas Höglund et al. This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License (CC BY-NC-ND 4.0).
