5G Based Machine Remote Operation Development Utilizing Digital Twin

Abstract: Remote operation of mobile machinery requires reliable and flexible wireless communication. 5G networks will provide ultra-reliable, low-latency wireless communications upon which remote operation, real-time control, and data acquisition can be implemented. In this paper we present a demonstration system and first experiments for a remote mobile machinery control system utilizing 5G radio and a digital twin with a hardware-in-the-loop development system. Our experimental results indicate that, with a suitable edge computing architecture, an order of magnitude improvement in delay and jitter over existing LTE infrastructure can be expected from future 5G networks.


Introduction
Our goal has been the development of methods supporting remote operation of autonomous machines in different situations, including real-time and autonomous, semiautonomous, and manual remote control. Even in autonomous systems, manual intervention may be needed in special situations that may be dangerous to human beings or to their property. High quality video feedback is essential, but haptic feedback further improves the user experience and reliability, as it includes force feedback (providing kinesthetic information of force, torque, position, and velocity) and tactile feedback (providing information about surface texture and friction).
In remote operation, whether haptic or based on video feedback only, the end-to-end delay is crucial. With mobile machinery, wireless connectivity is necessary, and new 5G technologies provide great potential for low latency wireless communication. The International Telecommunication Union - Radiocommunication Sector (ITU-R) M.2083 has defined three usage scenarios for 5G, one of them being ultra-reliable (99.999%) and low-latency (1 ms) communications (URLLC) [1]; this is comparable to some human-machine applications, where the sensations of sight and touch must have an end-to-end delay of 1 ms to avoid cybersickness. High reliability is relevant for critical applications. Many of the topics discussed here are covered in more detail in [2,3].
ITU-R has set the requirement for the delay to 1 ms [1,11]. However, the delay requirement depends on the task. The most demanding user interface tasks in this respect are dragging on a touch screen (1 ms requirement) and inking or line drawing using a stylus (7 ms requirement) [12,13]. On the other hand, mouse control tolerates a maximum delay of 60 ms [14], and the conventional requirement for real-time operation is 100 ms [13,15]. The effect of delays in bilateral control has also been studied earlier, first by Ferrell in 1966 [4]. In haptic feedback the problems include instability, slowness of convergence, and cybersickness [5][6][7], which is a form of motion sickness [8,9]. McCauley and Sharkey [10] published the first paper on cybersickness caused by the delay of the sensations of sight and touch.
Here the use case relates to future smart machinery operations. The main application scenario considers remote haptic control of an excavator. Similar earlier work has been done in the mining industry [16,17] and with other heavy machinery [18,19]. Related Finnish efforts include [20] on a haptic joystick for remote-operated mobile machinery and [21] on digital feel for accurate operation of work machines. Our results have been demonstrated using a testbed with a wireless network with 5G technologies, joystick actuators, and a digital twin environment with hardware-in-the-loop control of the digital machine model.

Delay and reliability
Delay plays an essential role in the performance and speed of remotely controlled machine operations. In the current 4G mobile networks, the actual delays in the data link layer are in the order of 50-300 ms [22]. The delay and reliability requirements vary by application: real-time gaming requires a delay of 50 ms with a packet loss rate of 10⁻³, interactive gaming 100 ms with a packet loss rate of 10⁻³, and streaming and file downloading 300 ms with a packet loss rate of 10⁻⁶. These are user plane (data plane) delays, which can be as low as 10-20 ms thanks to the short sub-frame of 1 ms [22].
Reliability is part of the larger concept of dependability [23][24][25]. Low reliability may have effects in tactile interfaces similar to those of a large delay [26]. Reliability is a measure expressing when the system is available [27]. It may have different definitions, but in [27] reliability is defined as the number of sent packets, or more generally protocol data units (PDUs), successfully delivered to the destination within the time constraint required by the targeted service, divided by the total number of sent packets. Some required QoS classes for different applications are given in Table 1.
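As an illustration, the reliability measure of [27] reduces to a simple ratio; the sketch below (function names and values are illustrative, not taken from our measurements) counts the PDUs delivered within the service's time constraint.

```python
def reliability(delivery_delays_ms, sent_count, deadline_ms):
    """Reliability per [27]: the fraction of sent PDUs successfully
    delivered to the destination within the service's time constraint."""
    delivered_in_time = sum(1 for d in delivery_delays_ms if d <= deadline_ms)
    return delivered_in_time / sent_count

# Example: 10 PDUs sent, 9 arrive, and one of those misses a 5 ms deadline,
# so 8 of 10 count as successfully delivered.
delays = [1.2, 0.9, 2.1, 4.8, 1.5, 3.3, 0.7, 6.2, 2.0]  # the lost PDU has no entry
print(reliability(delays, sent_count=10, deadline_ms=5.0))  # 0.8
```

Note that a lost packet and a packet arriving after the deadline are counted the same way, which is exactly why low reliability can mimic a large delay from the application's point of view.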

Test environment
The goal in our case is to operate a machine manually over a remote connection, either fixed or on a mobile platform, for object handling and processing operations. The control interval of remote manual operations should be as short as possible, even down to 5 ms. If the reliability of the connectivity is reduced, it affects the usability of the system: lost control packets cause jitter-like behaviour, which in practice may lead to non-deterministic behaviour and in any case implies slower usage of the system. In remote operation, visual feedback, i.e., video, is also indispensable, though the video need not be of HD quality. It should also be noted that, in general, these applications cannot be based on a public network.

The architecture of our test environment is shown in Figure 1. The 5G test network (https://5gtn.fi/) is used as the test environment. The communication between the remote operator and the mobile machine may use one of several communication options:
a) over the public fixed Internet
b) over the wireless 5G test network using 4G LTE
c) over 5G proof-of-concept (PoC) radios
d) over 4G LTE multi-access edge computing (MEC)

The 5G PoC radios and LTE pico cells used in the 5G test network are depicted in Figure 2. The 5G PoC radios from Nokia consist of a base transceiver station (BTS) unit and a user equipment (UE) unit forming the actual wireless 5G radio link between the units. This first evolution concept of 5G radios optimises the system communication paths and exploits massive multiple-input multiple-output (MIMO) beamforming. The radio part runs at a sub-6 GHz frequency, enabling a link delay as low as 1 ms and an increased capacity of up to several Gbps, exceeding that of present LTE.
The connection to the desired network is obtained via Ethernet, a wireless access point, or a mobile connection using test network SIMs. Virtual private networks (VPNs) can thus be used for accessing the network from public networks. As can be seen from Figure 1, two PCs are needed as gateways connecting the network between the remote controller and the machinery. The 5G PoC radios can be considered a pipeline connection via Ethernet input and output. The LTE MEC option is currently regarded as future research on transferring the data processing to the mobile edge.
The controlled devices are connected to the IP network with a gateway, in our case via a laptop computer. The user interface is based on an industrial joystick controller or a haptic controller. In the former case, an industrial joystick controller sends Controller Area Network (CAN) frames according to the CANopen protocol through a CAN-USB adapter to a PC, which forwards them to the 5G network. In the latter case, a haptic control device with a direct USB interface to the PC is connected to the controllable target, and measurements from the target are returned with an ad hoc CAN protocol. The CAN frames of the haptic control signals are received from the 5G network through a gateway PC and relayed via a USB-CAN adapter further to the machine controller. A real machine controller controls the digital twin, i.e., the simulation model, but it can also control a real machine or an industrial robot simply by reconnecting the control signals to a physical machine instead of the simulation PC. A video signal from the simulated or real machine can also be sent to the controller.
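The gateway role described above can be sketched as a minimal CAN-over-IP relay. The frame layout, addresses, and function names below are illustrative assumptions for a UDP encapsulation, not the exact bridging used in our demonstrator; on a real system the bus side would typically be handled by a CAN library such as python-can.

```python
import socket
import struct

# Illustrative encapsulation: an 11-bit CAN ID, a data length code, and up to
# 8 data bytes are packed into a fixed-size record and relayed over UDP.
FRAME_FMT = "<IB8s"  # CAN ID, data length code, data (padded to 8 bytes)

def pack_can_frame(can_id, data):
    assert len(data) <= 8, "classic CAN payload is at most 8 bytes"
    return struct.pack(FRAME_FMT, can_id, len(data), data.ljust(8, b"\x00"))

def unpack_can_frame(payload):
    can_id, dlc, data = struct.unpack(FRAME_FMT, payload)
    return can_id, data[:dlc]

def relay_once(remote_addr, can_id, data, sock=None):
    """Forward one CAN frame from the local bus side to the remote gateway
    (remote_addr is a (host, port) tuple; the port choice is an assumption)."""
    sock = sock or socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(pack_can_frame(can_id, data), remote_addr)
```

Tunneling over UDP keeps per-frame latency low at the cost of reliability; in the open loop demonstrator the signals were tunneled over TCP/IP, which guarantees delivery but can add retransmission delay.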

Telerobotic schemes
Remotely operated machinery can be used in hazardous environments to keep human operators safe and substantially reduce operating costs. In the future, and in some cases even today, machine fleets are operated from central operating rooms. In robotics this is represented by "telerobotics", meaning literally robotics at a distance. This is generally understood to refer to robotics with a human operator in the control loop. Any high-level, planning, or cognitive decisions are made by the human operator, while the robot is responsible for their actual execution [28]. Many telerobotic systems involve at least some level of direct control and accept the user's motion commands via a joystick or similar device in the user interface. The local and remote robots are called the master and the slave. To provide direct control, the slave robot is programmed to follow the motions of the master robot, which is positioned by the user. Some master-slave systems provide force feedback, such that the master robot not only measures motions but also displays forces to the user. The user interface becomes fully bidirectional, and such telerobotic systems are often called bilateral. Both motion and force may become the input or output to/from the user, depending on the system architecture. The bilateral nature of this setup makes the control architecture particularly challenging: there are multiple feedback loops, and even without environment contact or user intervention, the two robots form an internal closed loop. The communication between the two sites often inserts delays into the system and this loop, so that the stability of the system can be a challenging problem [28].
Remote and local sites represent their own characteristics in telerobotics. The local site includes the human operator and all the elements necessary to support the system's connection with the user, e.g. joysticks, monitors, keyboards, and other input/output devices. The remote site includes the robot, supporting sensors and control elements, and the environment to be manipulated. The haptics technologies consider bidirectional user interfaces, involving both motion and force, though they more commonly interface the user with virtual instead of remote environments [28].
Two basic control architectures couple the master and slave devices together: position-position control and position-force control. In the position-position architecture, the master and slave devices are each commanded to track the other's position and velocity. Both sites implement a tracking controller, often a proportional-derivative (PD) controller, which may also be interpreted as a spring and damper between the tips of the two devices. If the position and velocity gains are the same, the two forces are the same and the system effectively provides force feedback. While this is very stable, it also means the user feels the friction and inertia internal to the slave robot, which is obviously not desirable.

In the position-force architecture there is typically a force/torque sensor at the flange of the slave device, and force feedback is acquired from there. This allows the user to feel only the external forces acting between the slave and the environment and presents a clearer sense of touch to the environment. However, this architecture is less stable: the control loop passes from master motion to slave motion to environment forces, and back to master forces. Stability may be compromised in stiff contact, and many systems exhibit contact instability in these cases [28].

A general control system, originally presented by Lawrence [29], captures both of the architectures described above. This teleoperator control system equalizes both the operator and environment forces as well as the master and slave motions. It is desirable to measure both force and velocity (from which position may be integrated, or vice versa) at both sites. With this complete information, the slave may, for example, start moving as soon as the user applies a force to the master, even before the master itself has moved [28].
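The spring-damper interpretation of the position-position architecture can be shown in a minimal single-axis sketch; the gains and values below are illustrative assumptions, not parameters of any real teleoperator.

```python
def pd_coupling_force(x_self, v_self, x_other, v_other, kp=100.0, kd=5.0):
    """PD coupling force on one device toward the other device's state,
    i.e. a virtual spring (kp) and damper (kd) between the two tips.
    With equal gains on both sides the forces are equal and opposite,
    which is what gives the user a sense of force feedback."""
    return kp * (x_other - x_self) + kd * (v_other - v_self)

# Master is 5 cm ahead of the slave: the slave is pushed forward while the
# master feels an equal and opposite reaction force pulling it back.
f_on_slave = pd_coupling_force(x_self=0.00, v_self=0.0, x_other=0.05, v_other=0.0)
f_on_master = pd_coupling_force(x_self=0.05, v_self=0.0, x_other=0.00, v_other=0.0)
print(f_on_slave, f_on_master)  # 5.0 -5.0
```

Note that the same expression also makes the stability problem visible: any communication delay means each side reacts to stale `x_other` and `v_other` values, injecting energy into the closed loop.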
In the following paragraphs we present two different remote control schemes and their applications: an open loop control application, where the remote device is controlled in a position/velocity control mode, and a closed loop control application, where the remote device is controlled in a force control mode with a haptic control device. The goal was to set up development environments where different communication solutions, especially including 5G technologies, can be tested against the requirements of different control solutions.

Remote open loop control of a virtual prototype of a heavy duty machine
A remote operator controls a virtual machine in an open loop control mode. In the target system, a real machine control system is connected to a digital twin, a virtual machine implemented in a dynamic simulation system for hardware-in-the-loop development. The simulated machine is an excavator, and its actuators are remotely and manually controlled. The actuators are in practice represented by the joint positions or joint velocities of the hydraulic boom of the excavator, or the related cylinders via the valves. The control signals for these are created remotely and manually with a multi-joystick control device. The feedback from the remote machine to the remote operator is provided as video. The control principle is illustrated in Figure 3. A test system was developed based on a real embedded machine control system relying on the CAN bus. A remote operator controls the virtual machine with an industrial CAN-bus compatible wireless radio transmitter (by Hetronic; www.hetronic.com). The CAN-bus control signals from the wireless transmitter UI were bridged and tunneled over TCP/IP and the 5G PoC radio to the CAN bus of the real machine control system (see Figure 4). The whole remote control setup and system was tested and demonstrated at the European Robotics Forum (ERF) in Tampere in March 2018, with the remote operator in Tampere controlling the virtual machine, with the real control system hardware, running in Oulu (Figure 5). Several persons tested the system successfully, though a round-trip delay on the level of 100-120 ms was observed due to the public internet.

Remote haptic control of a virtual prototype of a heavy duty machine
A haptic user interface can extend the user experience of the remote operator substantially. Haptic devices recreate a sense of touch by applying forces, vibrations, or motions to the user. Here we used a low-cost commercial haptic device, the Novint Falcon, initially intended to replace the 2D mouse in video games with a 3D counterpart. Following the user's motions in 3D (x, y, z), the device keeps track of where the handle is moved and creates forces by sending currents to its motors. The sensors track the position at sub-millimeter resolution, and the motors are updated at a 1 kHz frequency, resulting in a realistic sense of touch. The control principle is illustrated in Figure 6.
In this application Novint Falcon was used as a haptic interface to control a virtual excavator with a real machine control system, connected to the Mevea dynamic simulation system for hardware-in-the-loop development (https://mevea.com). Virtual force sensors implemented in the dynamic simulator and connected to the CAN control bus of the real machine control system were used to provide the force feedback from the remote virtual machine to the operator. In addition, the remote operator was also provided with a video feedback. The control scheme is illustrated in Figure 7.
The joint velocity targets were acquired from the haptic user interface device and transmitted to the CAN bus and further on used as control signals in the real machine control system. The force signals from the virtual sensors were transmitted via the CAN bus adapter to the haptic user device as target force signals of the haptic device. All the communication between the haptic user interface and the machine control system was going via the internet and 5G connectivity provided by the 5G PoC radios (see Figure 7).
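One cycle of this bilateral loop, a joint velocity target going out and a scaled virtual-sensor force coming back, can be sketched as follows; the gains, scaling, and force limit are illustrative assumptions, not the parameters of our setup.

```python
def haptic_cycle(handle_displacement_m, measured_force_n,
                 velocity_gain=2.0, force_scale=0.1, force_limit_n=8.0):
    """One control cycle of the bilateral loop: map the handle displacement
    to a joint velocity target (to be sent over CAN to the machine
    controller) and scale the virtual force sensor reading back into a
    target force for the haptic device, clamped to what the device
    can physically render."""
    velocity_target = velocity_gain * handle_displacement_m
    feedback_force = max(-force_limit_n,
                         min(force_limit_n, force_scale * measured_force_n))
    return velocity_target, feedback_force

# Handle pushed 2 cm forward while the boom meets a 120 N resistance:
# the scaled feedback (12 N) is clamped to the device's 8 N limit.
v, f = haptic_cycle(handle_displacement_m=0.02, measured_force_n=120.0)
```

In the real system this cycle must keep up with the 1 kHz update rate of the haptic device, which is why the end-to-end communication delay dominates the achievable quality of the force feedback.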
The haptic remote control setup and system was tested and demonstrated at the Mevea Seminar 2018 in Vantaa in October 2018 (https://mevea.com/seminar-2018/), with the remote operator in Vantaa controlling the virtual machine, with the real control system, running in Oulu (Figure 8). The haptic control introduced the feeling of "touch" to the operator, though the negative effect of the communication delay, again due to the public internet, was clearly observed, degrading the level of user experience.

Experimental results
In order to assess the benefit of future 5G connectivity over the existing LTE infrastructure, we instrumented the 5G PoC and LTE pathways of Figure 1 and created traffic generators approximating the video and control signals of the remote control system in Figure 4. The measurement points were at the application layer, between the network elements and the control or video devices. Table 2 presents the measured delay and jitter together with the bandwidth parameters used for video and control traffic. The 5G PoC pathway provides a significant improvement in both delay and jitter over LTE. The measured 5G PoC pathway delay is very close to the 1 ms specified for URLLC; it should be noted, however, that this figure is specified for the radio network only. In this case our experimental network architecture includes only minimal edge functionality for the 5G PoC pathway but full EPC core functionality for the LTE pathway, putting the latter at a disadvantage. The comparative benefit from commercial 5G networks will depend on the details of the deployed network architecture and its implementation. A more detailed presentation of our methodology and results can be found in [30].
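For reference, application-layer delay and jitter figures of the kind reported in Table 2 can be computed along the following lines from per-packet timestamps; this simple mean-variation jitter estimate is an illustrative choice, not necessarily the exact estimator used in [30].

```python
def delay_and_jitter(send_times_ms, recv_times_ms):
    """Per-packet one-way delay and a simple jitter figure, taken here as
    the mean absolute variation between consecutive packet delays."""
    delays = [r - s for s, r in zip(send_times_ms, recv_times_ms)]
    mean_delay = sum(delays) / len(delays)
    diffs = [abs(b - a) for a, b in zip(delays, delays[1:])]
    jitter = sum(diffs) / len(diffs) if diffs else 0.0
    return mean_delay, jitter

# Synthetic 4-packet trace (milliseconds), for illustration only:
# delays are 1.0, 1.5, 1.0, 1.5 ms -> mean delay 1.25 ms, jitter 0.5 ms.
send = [0.0, 10.0, 20.0, 30.0]
recv = [1.0, 11.5, 21.0, 31.5]
print(delay_and_jitter(send, recv))  # (1.25, 0.5)
```

One-way measurements like this assume synchronized clocks at both measurement points; round-trip measurements avoid that requirement at the cost of conflating the two directions.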

Conclusions
Our goal was the development of methods and techniques supporting the operation of autonomous machines in different situations, including real-time and autonomous, semiautonomous, and manual remote control through the new 5G wireless infrastructure of the 2020s. A 5G test network was harnessed into a virtual demonstration environment, and virtual private networks were used for extending the laboratory tests to real-life scenarios. Two application scenarios were outlined for remote control and operation of machines and robots, and pilots were designed and implemented for both. The control schemes included feedforward velocity control with video feedback to the remote operator, and feedback control with haptic feedback. Both control schemes were implemented on a virtual mobile machine. The effect of delays was clearly observed, degrading the level of user experience, especially in the case of haptic control.
Excluding the cases using the public internet, our experimental results indicate that, with a suitable edge computing architecture, an order of magnitude improvement in delay and jitter over the existing LTE infrastructure can be expected from future 5G networks.
Both the virtual mobile machine and the real robot pilot systems will serve as excellent development and testing environments for future research and development work.