Microelectronics Journal 45 (2014) 1603–1611
Reliable orientation estimation for mobile motion capturing in medical rehabilitation sessions based on inertial measurement units

Hans-Peter Brückner, Benjamin Krüger, Holger Blume
Institute of Microelectronic Systems, Appelstraße 4, 30167 Hannover, Germany
Article history: Received 27 December 2013; Accepted 16 May 2014; Available online 23 June 2014

Abstract
Fully mobile and wireless motion capturing is a mandatory requirement for undisturbed and non-reactive analysis of human movements. Inertial sensor platforms are used in applications like training session analysis in sports or rehabilitation, and allow non-restricted motion capturing. The computation of the required reliable orientation estimate based on the inertial sensor RAW data is a demanding computational task. Therefore, an analysis of the computational costs and achievable accuracy of a Kalman filter and a complementary filter algorithm is provided. Highly customized and thus low-power, wearable computation platforms require low-level, platform-independent communication protocols and connectivity. State-of-the-art small-sized commercial inertial sensors lack either an open, platform-independent protocol, wireless connectivity or extension interfaces for additional sensors. Therefore, an extensible, wireless inertial sensor called the Institute of Microelectronic Systems Inertial Measurement Unit (IM)²SU, featuring on-board inertial sensor fusion, for use in home-based stroke rehabilitation is presented. Furthermore, a quaternion-based, singularity-free error measure for orientation estimation accuracy is proposed and applied. To evaluate the orientation estimation accuracy, an optical system is used as golden reference. Orientation estimation based on a Kalman filter and a complementary filter algorithm is evaluated. The proposed IMU provides high orientation estimation accuracy, is platform independent, offers wireless connectivity and extensibility, and is low cost.
© 2014 Elsevier Ltd. All rights reserved.
Keywords: Wireless inertial sensor; Inertial sensor fusion; Platform independence; Wireless sensor network
1. Introduction

Motion capturing is used in a wide range of applications, ranging from rehabilitation to virtual reality [1,2]. Highly reliable camera-based systems require a complex stationary setup and a permanent line of sight between the tracked objects and multiple cameras. In contrast, mobile and wearable inertial sensors can be easily attached to the human body [2]. This is mandatory for human motion capturing in sport training sessions or medical applications. They also overcome the spatial limitations of a stationary camera setup. The target application of the proposed wireless inertial sensor platform is motion capturing during rehabilitation sessions, where the captured movement data is used to generate an audio feedback [3]. Recent research showed that patients in stroke rehabilitation remarkably benefit from this so-called sonification of movements [4]. Rehabilitation training tasks include drinking from a glass of water. The established inertial sensor system comprises a wireless sensor network of up to 10 inertial sensors fixed to the patient's body to capture complex upper body movements, and force sensors at the fingers to detect grasping activities. Orientation estimation based on the data of the IMUs attached to the upper arm and forearm allows
Corresponding author. Tel.: +49 511 762 5039. E-mail address: [email protected] (H.-P. Brückner).
http://dx.doi.org/10.1016/j.mejo.2014.05.018
0026-2692/© 2014 Elsevier Ltd. All rights reserved.
the computation of upper limb positions, angles between limbs, and velocities using forward kinematics and a connected rigid-chain body model. The system architecture shown in Fig. 1 incorporates sensors attached to the upper arm and forearm, the wearable computation platform, and wireless audio feedback via headphones or hearing aids. Additional force sensors for the assessment of the performed grasping tasks could be integrated via expansion interfaces in the future. The software architecture shown in Fig. 1 incorporates a graphical user interface. IMU data processing tasks are data acquisition, sonification and sound synthesis. For sound synthesis the Sound Synthesis Toolkit (STK) [5] framework, providing a variety of signal generators, is used. Available sensor platforms are highly integrated and hence small-sized and lightweight [6,7]. As discussed in Section 2, state-of-the-art commercial and academic inertial sensor platforms are limited regarding extensibility, platform independence, wireless connectivity or accuracy, respectively. Therefore, a wireless, low-power sensor platform using a platform-independent protocol is presented in this paper. The following aspects highlight the innovation of the approach:

- low-power RF module and 8-bit MCU,
- full access via platform-independent interfaces,
- real-time on-board orientation estimation.
Fig. 1. Movement sonification system architecture [3].
In human motion capturing, common algorithmic approaches for orientation estimation like integration, vector observation, complementary filtering, or Kalman filtering are applied. These algorithms work on IMU RAW data acquired by triaxial gyroscopes, accelerometers and magnetometers. Orientation estimation accuracy is commonly evaluated based on Euler angles, thus suffering from singularities. Therefore, a quaternion-based measure is proposed and additionally provided within the orientation estimation accuracy evaluation. Furthermore, the following topics regarding orientation estimation are presented:

- a quaternion-based error measure for reliable orientation estimation accuracy assessment,
- a reduced-complexity Kalman filter,
- a comparison of the orientation estimation accuracy of the reduced-complexity Kalman filter and the complementary filter.

The paper is organized as follows: Section 2 presents related work, highlighting previously developed wireless IMUs in academia and industry. The hardware and software architecture of the proposed (IM)²SU are presented in Section 3. An introduction to Kalman filter and complementary filter based IMU orientation estimation and a comparison of the computational costs of two exemplary implementations are given in Section 4. Section 5 presents the Euler angle and the proposed quaternion error measure for orientation estimation accuracy evaluation. The results of the system and sensor fusion latency and the orientation estimation accuracy evaluation of the (IM)²SU sensor system are shown in Section 6. Conclusions are drawn in Section 7.
2. Related work

From the application point of view, sonification on mobile platforms based on inertial sensors is explored in [8,9]. A mobile system for improving running mechanics is developed in [8]. The system comprises a mobile phone and triaxial accelerometers and gyroscopes connected via Bluetooth. The sensor is worn at the sacrum and accelerometer data is captured. The computed runner's average center of mass is provided as objective running technique feedback. Due to the limited computing capabilities of the hardware platform, sonification is based on the playback of sound files. A custom processing platform comprising accelerometers is
utilized in [9] for sonification of rowing boat velocity for use in elite rowers' training. The sonification is used to display the actual boat velocity trend in order to achieve a continuous velocity profile. The heterogeneous RISC/DSP platform presented in [3] does not meet the power consumption limitations for the presented movement sonification application. Therefore, a custom ASIP-based platform will be designed in the future, requiring low-level sensor data access. The Xsens MTw [6] wireless inertial sensor system provides a sampling frequency of 50 Hz for up to 12 sensor modules and battery operation of ≈3.5 h. In contrast to the sensor system proposed here, a line-powered base station is required for communication with a Windows-PC-based host platform. Orientation estimation based on gyroscope, magnetometer and accelerometer sensor fusion provides a dynamic accuracy of ≈2° RMS. Philips [10], Shimmer [7] and Intersense [11] also provide comparable wireless inertial sensor systems. In comparison to the proposed (IM)²SU, the Shimmer system lacks on-board sensor fusion. The 40 ms latency of the Intersense system is inappropriate for real-time, low-latency feedback in rehabilitation exercises. In [12] the authors present a wearable motion capturing system providing orientation estimates based on gyroscope, accelerometer and magnetometer data using an extended Kalman filter for inertial sensor fusion. The wireless link to a host platform is achieved via a ZigBee transceiver module. For on-board sensor fusion a 32-bit RISC MCU at 72 MHz is used. The orientation estimation evaluation shows a worst-case accuracy of 6.06° RMS. Update rates of 25 Hz for the roll and pitch coordinates and 10 Hz for the yaw coordinate are achievable. Preliminary development results of this platform are presented in [13]. In contrast, the (IM)²SU proposed in this paper achieves higher update rates and precision.
A wireless inertial sensor unit equipped with gyroscopes, accelerometers and magnetometers is presented in [14]. The wireless transceiver allows communication in the 433 MHz and 868 MHz ISM bands. For on-board data processing the dsPIC 30F3014 16-bit MCU is used. A 120 mAh LiPo battery allows up to 140 min of operation. The on-board orientation estimation provides insufficient accuracy, with results of ≈10° RMS on average. In summary, there are various IMU types differing in the sensors used, the achievable update rates and, above all, the orientation estimation accuracy. Besides commercial platforms there are continuous academic research activities in developing and optimizing IMUs and sensors. The platform-independent accessibility,
a major application requirement, is also a differentiator. At present, there is no IMU fulfilling all required aspects.
3. Mobile, wireless inertial measurement unit

The system architecture of the Institute of Microelectronic Systems Inertial Measurement Unit (IM)²SU is split into a hardware part and a firmware part. In the next two sections the hardware aspects and the firmware implementation are presented.

3.1. Hardware architecture

Comparable to state-of-the-art inertial sensors, the (IM)²SU comprises triaxial accelerometers, magnetometers and gyroscopes. The hardware architecture is shown in Fig. 2. A fully detailed description of the hardware architecture and detailed information on the utilized sensors is presented in [15]. Therefore, only the key features of each sensor are shown in Table 1 and the hardware and software architecture are shown in Fig. 2. The Atmel ATZB24A2 performs communication tasks with the on-board sensors and also communicates wirelessly with the central node attached to the processing platform. The board is powered by a lithium-ion battery with a maximum charge of 650 mAh, chargeable via the USB interface. The device can also be used as a bridge between the wireless sensor network and a wired host platform via USB or UART. The (IM)²SU prototype is shown in Fig. 3. For debug and development purposes, JTAG and ISP interfaces are provided. Custom expansion headers enable the connection of force sensors for grasping task assessment. The PCB dimensions are 60 mm by 60 mm; the area-optimized PCB dimensions are 23 mm by 31 mm.

3.2. Software architecture

Motion capturing applications require the acquisition of movement data at specific positions of the observed body. For the processing of that measurement data, the exact timing relations of the incoming samples have to be known. The software of the (IM)²SU is designed to guarantee synchronous motion capturing for up to 10 wireless devices, deterministic latency behavior and robust communication performance while exhibiting low latency. The (IM)²SU modules constitute a point-to-multipoint wireless network according to the 2.4 GHz IEEE 802.15.4 standard. For interfacing the MAC layer in the RF IC, the open-source µracoli (µController Radio Communication Library) [16] is used. In the proposed network topology, one additional (IM)²SU device acts as coordinating master. This master bridges between the IMU network, comprising a configurable number of sensor devices, and a custom host platform. The software processing steps of the (IM)²SU devices are shown in Fig. 4. Deterministic communication is achieved by implementing a TDMA protocol on a preconfigured 802.15.4 channel. The master and each slave start a slot timer after sending or receiving the broadcast message, respectively. Each slave associates its network address with a distinct time slot #i. The captured data is transmitted in the corresponding time slot. A more detailed description of the firmware can be found in [15].
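The TDMA slot scheme described above can be sketched as follows. The slot length and the address-to-slot mapping are illustrative assumptions for this sketch, not the exact firmware implementation of [15]:

```python
# Sketch of the TDMA slot timing: after the broadcast message, each
# slave waits for its own time slot before transmitting.
# T_SLOT_US is a hypothetical value (1.2 x quaternion TX time, cf. Sec. 6.2).
T_SLOT_US = 1453

def slot_start_us(slot_index: int) -> int:
    """Offset of time slot #i relative to the received broadcast message."""
    return slot_index * T_SLOT_US

def slot_for_address(network_address: int, n_slots: int) -> int:
    """Assumed mapping: each slave derives a distinct slot from its address."""
    return network_address % n_slots

# Example: the slave with address 2 in a 4-slot network transmits in slot 2.
i = slot_for_address(2, 4)
start = slot_start_us(i)
```

Because every slave restarts its slot timer on the same broadcast, the slot offsets alone keep the transmissions collision-free and the latency deterministic.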
Fig. 2. Hardware and software architecture of the (IM)²SU.
Table 1
Key features of the utilized sensors.

Sensor         Manufacturer         Type     Resolution   Interface  Max. sampling rate
Gyroscope      InvenSense           ITG3200  0.45°/s      I²C        8 kHz
Accelerometer  ST Microelectronics  LIS3DH   0.606 m/s²   I²C        5 kHz
Magnetometer   Xensor Integration   XEN1210  15 nT        SPI        3.9 kHz
4. Reliable orientation estimation based on IMU data

Besides sensor fusion techniques for orientation estimation based on inertial sensor data, like gyroscope integration and vector observation, Kalman filters are optimal estimators with respect to the minimization of the error covariance [17]. In the literature, there are various proposals for Kalman filters used in inertial sensor fusion [18]. The filters basically differ in the state vector size and the preprocessing steps. An alternative approach to Kalman-filter-based sensor fusion in the literature is the complementary filter algorithm [19]. In general, both algorithms compute the orientation estimate based on cyclic prediction and correction steps according to Fig. 5.
4.1. Kalman filter

Based on the results of a previous study [20] evaluating the performance of eight different sensor fusion algorithms, the reduced-state-vector Kalman filter according to Lee [21] is chosen for sensor fusion on the proposed sensor platform. The evaluation results in [20] show a good trade-off between accuracy and computational complexity for this Kalman filter. Kalman filter based orientation estimation has been explored in detail in [17,21].

4.1.1. Reduction of computational demands

The authors in [22,23] propose reducing the computational complexity by using precomputed a posteriori and a priori error covariance matrices in their proposed Kalman filter. It has been shown that this modification has a negligible influence on the orientation estimation accuracy. This concept is applied to the Kalman filter according to Lee and Park [21], enabled by knowledge about the constant sensor noise characteristics. The influence on the filter structure is shown in Fig. 6, highlighting the reduction of the required computation steps. Instead of computing the corresponding filter matrices, highlighted in yellow, in each filter step, fixed values are assigned based on a previous filter analysis and offline computations. Using precomputed matrices results in fewer required operations. The Kalman gain matrix also becomes independent of the actual filter step and can be computed offline due to the constant a priori and a posteriori error covariance matrices. The precomputation of the Kalman gain matrix achieves a significant reduction of the computational demands, as it eliminates the computation of a 4×4 matrix inverse in each filter step.
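The effect of the precomputation can be sketched in a few lines. The state, measurement model and gain below are illustrative values, not the quaternion filter of [21]; the point is that with a precomputed gain K, the per-step measurement update reduces to one innovation and one correction, with no covariance update and no matrix inversion:

```python
# Minimal sketch of a steady-state (precomputed-gain) Kalman measurement
# update: x_post = x_prior + K * (z - H * x_prior).
# H and K here are illustrative; in the modified filter K is computed
# offline from the constant noise statistics.

def matvec(M, v):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def steady_state_update(x_prior, z, H, K):
    """A posteriori state with a fixed gain; no covariance update needed."""
    innovation = [zi - hxi for zi, hxi in zip(z, matvec(H, x_prior))]
    correction = matvec(K, innovation)
    return [xi + ci for xi, ci in zip(x_prior, correction)]

# 2-state example: only the first state is observed.
H = [[1.0, 0.0]]
K = [[0.6], [0.2]]            # hypothetical offline-computed gain
x = steady_state_update([1.0, 0.0], [2.0], H, K)
# x[0] = 1.0 + 0.6 * (2.0 - 1.0) = 1.6; x[1] = 0.2
```

The full filter would additionally propagate and invert covariance matrices in every step; fixing them offline is exactly what removes the 4×4 inversion mentioned above.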
Fig. 3. (IM)²SU prototype (a) and area-optimized (IM)²SU unit preview (b).

Fig. 5. General orientation estimation algorithm structure.
Fig. 4. Communication and processing steps of the (IM)²SU unit.
Fig. 6. Proposed modified two-step Kalman filter structure. (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)
Table 2
Required floating-point operations per filter step.

Operation               'ADD'/'SUB'  'MUL'  'DIV'  'cos⁻¹'
O2OQ                    147          197    31     1
Original Kalman filter  579          524    46     0
Modified Kalman filter  60           37     20     0
Complementary filter    36           39     1      0
Fig. 7. Structure of the complementary filter algorithm.
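As a rough illustration of the filter structure in Fig. 7, a simplified complementary-filter step is sketched below: the gyroscope propagates the quaternion, and a small scalar gain pulls the result toward the accelerometer/magnetometer orientation q_am (assumed here to be already computed, e.g. by O2OQ). This linear-blend form is a sketch, not the exact algorithm of [24]:

```python
import math

def quat_mul(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def normalize(q):
    n = math.sqrt(sum(c * c for c in q))
    return tuple(c / n for c in q)

def cf_step(q, gyro_rad_s, q_am, dt, gain):
    # quaternion rate from angular velocity, then first-order integration
    q_dot = tuple(0.5 * c for c in quat_mul(q, (0.0, *gyro_rad_s)))
    q_gyro = tuple(qi + qd * dt for qi, qd in zip(q, q_dot))
    # blend with the static accel/mag orientation using the scalar gain
    blended = tuple((1.0 - gain) * qg + gain * qa
                    for qg, qa in zip(q_gyro, q_am))
    return normalize(blended)

# Stationary sensor: zero gyro rate, q_am agrees with the current pose,
# so the estimate stays at the identity quaternion.
q = cf_step((1.0, 0.0, 0.0, 0.0), (0.0, 0.0, 0.0),
            (1.0, 0.0, 0.0, 0.0), 0.01, 0.05)
```

The single scalar gain is what keeps the per-step cost so low compared to the Kalman gain matrix.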
A comparison of the floating-point operations required for a single filter step of the modified Kalman filter and of the initial version is given in Table 2. The modification significantly reduces the number of operations. The values for the predefined a priori and a posteriori error covariance matrices and the Kalman gain matrix were established by tracing the matrices in a PC-based implementation using captured (IM)²SU RAW data.
4.2. Complementary filter

The goal of the complementary filter algorithm is to combine the static accuracy of the accelerometer and magnetometer with the short-term accuracy of the gyroscope during dynamic movements. In contrast to the Kalman filter, there is a constant gain factor. Recently, in [24] a movement-situation-dependent switching of the gain factor has been proposed to adapt the gain factor to the actual movement. Fig. 7 shows the structure of the complementary filter algorithm for orientation estimation based on IMU data. In contrast to the Kalman gain matrix, the gain value is a scalar quantity. The computational requirements, measured as the number of floating-point operations per iteration, are listed in Table 2.
4.3. Comparison of computational demands

Table 2 lists the required number of floating-point operations per iteration for orientation estimation based on the different sensor fusion algorithms.
For the computation of an orientation estimate based on the accelerometer and magnetometer, the O2OQ algorithm according to [21] is utilized in both Kalman filter versions and in the complementary filter. Therefore, the total number of operations for both Kalman filter versions and the complementary filter is the sum of the number of operations required for the O2OQ computation and the number of operations required for the residual filter computations. For example, the total number of floating-point operations per filter step for the modified Kalman filter is 493. The results presented in Table 2 show the reduced computational cost of the complementary filter algorithm compared to the modified Kalman filter. In sum, the number of floating-point operations of the complementary filter is about 10% less than for the modified Kalman filter.
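The totals above can be cross-checked directly from the operation counts in Table 2; with these numbers the complementary filter needs roughly 8% fewer operations in total, consistent with the "about 10%" figure:

```python
# Cross-check of the per-step totals: O2OQ preprocessing plus the
# residual filter computations (counts taken from Table 2).
o2oq        = {"add": 147, "mul": 197, "div": 31, "acos": 1}   # 376 ops
modified_kf = {"add": 60,  "mul": 37,  "div": 20, "acos": 0}   # 117 ops
comp_filter = {"add": 36,  "mul": 39,  "div": 1,  "acos": 0}   #  76 ops

total_kf = sum(o2oq.values()) + sum(modified_kf.values())  # 493 ops per step
total_cf = sum(o2oq.values()) + sum(comp_filter.values())  # 452 ops per step
saving = 1.0 - total_cf / total_kf                         # fraction saved
```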
5. Error measure for IMU orientation estimation accuracy

A common way of analyzing the orientation estimation accuracy of IMUs is to use an optical tracking system as golden reference [19,20]. Therefore, the first step is to attach a sufficient number of markers to the sensor in order to achieve reliable tracking by the optical system. The number of markers depends on the optical system configuration, e.g. the number of cameras and the 3D measurement volume. The wooden measurement piece used for tracking the Xsens MTx and the proposed (IM)²SU is shown in Fig. 8; 14 reflective markers and both IMUs are attached to the wooden board. Initial orientation offsets of both IMUs were corrected in a preprocessing step before evaluating the orientation estimation accuracy. In general, the achievable accuracy is highly dependent on the movement and on the quality of the motion capturing of the optical reference system. In particular, the occlusion of multiple markers may result in inaccurate reference data; therefore, a large number of markers is used. Maintaining a sufficient distance between the individual markers avoids using multiple markers to compute a single marker position. Computing a root mean square (RMS) value by subtracting the orientation estimated by the IMUs from the optical reference using Euler angles is the common way of measuring accuracy in motion capturing.

Fig. 8. IMUs and markers attached to the wooden measurement object.

5.1. Euler angles as error measure

In case of Euler angles the error measure comprises roll, pitch, and yaw RMS components. Optimizing the overall quality might therefore lead to a trade-off between achieving a high accuracy on average and having a high accuracy for a single axis. The Euler angle RMS error measure is very common in the literature [13,14,20–23]. Despite the intuitive understandability of Euler angles, the representation suffers from singularities. Therefore, movements resulting in orientations close to the poles might result in 180° jumps of the dependent axis. This is especially problematic when tracking complex multi-axis movements, like in the proposed application.

5.2. Angle between quaternions as error measure

To overcome the deficiencies of the Euler angle quality measure, this paper proposes the use of a quaternion-based error measure. In contrast to the Euler angle representation, the quaternion orientation representation does not suffer from singularity issues. To compute the error measure based on quaternions, first the distance between the IMU quaternion and the optical system quaternion in 3D space is computed according to (1). Finally, the RMS value of the quaternion distance over the N captured samples of the dataset is computed according to (2).

w_3D = 2 · acos(|quat_IMU · quat_Optical|)    (1)

RMS_Quat = sqrt((1/N) · Σ_{k=0}^{N−1} w_3D(k)²)    (2)

Fig. 9. Quaternion orientation representation abstracted as unit-length vectors pointing to the unit sphere surface. (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)

Fig. 10. Movement of the displaced sensors (yaw, pitch, and roll angles over sample index k).
In general, a quaternion comprises four scalars (x, y, z, w). The vector part (x, y, z) represents the rotation axis and the scalar part (w) corresponds to the rotation angle. A descriptive, abstract display model of the quaternion orientation representation, reduced to 3D, is used to understand the unit quaternion as a vector pointing from the origin to a point on the unit sphere surface, as shown in Fig. 9. This vector represents the rotation axis, or the given orientation represented by the quaternion. The red and the green solid circle represent the traces of two IMUs rotated 360° around the Y axis. An initial and constant orientation displacement around the Y axis results in the constant distance between the two circles. The red and green dash-dotted arrows indicate exemplary orientations.
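The error measure of Eqs. (1) and (2) can be sketched directly; the quaternion values in the example are illustrative:

```python
import math

def quat_distance_deg(q_imu, q_opt):
    """w3D = 2 * acos(|<q_imu, q_opt>|) in degrees, cf. Eq. (1)."""
    dot = abs(sum(a * b for a, b in zip(q_imu, q_opt)))
    dot = min(1.0, dot)  # guard against rounding slightly above 1
    return math.degrees(2.0 * math.acos(dot))

def rms_quat_deg(imu_quats, opt_quats):
    """Root mean square of the per-sample distances, cf. Eq. (2)."""
    dists = [quat_distance_deg(a, b) for a, b in zip(imu_quats, opt_quats)]
    return math.sqrt(sum(d * d for d in dists) / len(dists))

# Identical orientations give zero error; a 90-degree yaw offset gives 90.
q_id  = (1.0, 0.0, 0.0, 0.0)
q_y90 = (math.sqrt(0.5), 0.0, 0.0, math.sqrt(0.5))  # 90 deg about Z
err = rms_quat_deg([q_id, q_id], [q_id, q_y90])
```

Taking the absolute value of the dot product makes the measure insensitive to the double cover of rotations by quaternions (q and −q describe the same orientation).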
5.3. Comparison of Euler angle and quaternion error measure

This section highlights the reliability of the quaternion orientation measure based on a showcase comprising two sensors with a constant displacement of approximately 20° on the yaw axis. During the captured movement, the displaced sensors are rotated along the pitch axis by approximately 90° and back to the starting position.
The Euler angle representation of both IMUs is shown in Fig. 10. After a resting position, the rotation along the pitch axis starts and afterwards the sensors are rotated back to the starting position. In general, due to the fixed displacement of both IMUs, the expected error value is the displacement itself, i.e. approximately 20°. In the initial orientation in Fig. 11, the Euler angle representation and the quaternion measure both show a displacement of 20°. During the movement, only the quaternion measure fulfills the expectation of a constant difference between both fixed sensors, as also previously shown in Fig. 9. In contrast, the Euler angle measure shows a non-static behavior during the captured movement. Therefore, in contrast to the commonly used Euler angle measure, the quaternion measure is more suitable for the evaluation of multi-axis movements. Furthermore, the quaternion error measure is a single error indicator, while the Euler angle error comprises three individual parameters. Therefore, the quaternion error measure simplifies further filter optimization. Nevertheless, the results in Section 6.3 will provide both measures for comparability. However, the evaluation of IMU performance is highly dataset dependent, as already mentioned in Section 5.
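The expected constant difference can be verified numerically for an idealized version of this showcase (synthetic quaternions, not the captured dataset): applying a fixed 20° yaw offset in the body frame leaves the quaternion distance at exactly 20° for every pitch angle, including the 90° pose where the Euler representation degenerates:

```python
import math

def quat_mul(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def quat_dist_deg(a, b):
    dot = min(1.0, abs(sum(p * q for p, q in zip(a, b))))
    return math.degrees(2.0 * math.acos(dot))

# fixed 20-degree yaw displacement between the two sensors
yaw20 = (math.cos(math.radians(10)), 0.0, 0.0, math.sin(math.radians(10)))

distances = []
for pitch_deg in (0.0, 30.0, 60.0, 90.0):
    h = math.radians(pitch_deg) / 2.0
    q1 = (math.cos(h), 0.0, math.sin(h), 0.0)  # sensor 1: pure pitch rotation
    q2 = quat_mul(q1, yaw20)                   # sensor 2: same pitch + offset
    distances.append(quat_dist_deg(q1, q2))
# every entry of distances is 20 degrees
```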
6. Evaluation of the proposed IMU

This evaluation section considers the sensor fusion algorithm latency and the latency induced by the applied communication protocol. Furthermore, the orientation estimation accuracy is evaluated based on a dataset comprising consecutive movements along the roll, pitch, and yaw axes, sampled at 100 Hz. Fig. 12 shows the captured movement.

Fig. 11. (a) Euler angle error representation of the performed movement; (b) evaluation of the movement using the quaternion RMS error measure.

Fig. 12. Orientation estimation of the proposed IMU and the Xsens MTx compared to Vicon as golden reference over 7600 samples.

Table 3
Sensor fusion latency.

Filter                  AT1281       AT32UC3A    Cortex A8ᵃ  PCᵇ
Original Kalman filter  24061.50 µs  1899.81 µs  42.21 µs    1.05 µs
Modified Kalman filter  9430.25 µs   853.60 µs   13.55 µs    0.65 µs
Complementary filter    8400.00 µs   788.24 µs   12.35 µs    0.09 µs

ᵃ Cortex A8 AM335x at 1 GHz.
ᵇ Core i5 at 2.8 GHz using Windows 7 64-bit.
The sensor is rotated within a range of approximately ±50° along each axis, with significant time in a resting pose between the rotations. Furthermore, in the time period between samples 7000 and 8000 there is a movement affecting all three rotation axes.
6.1. Sensor fusion latency evaluation

Table 3 shows the computational latency introduced by the Kalman filter and the complementary filter for sensor fusion on different hardware platforms. In order to enable real-time operation on the Atmel MCU, the Kalman filter [21] is modified to estimate the orientation based on static error covariance matrices, as described in Section 4.1.1. This filter is denoted the modified Kalman filter (KF) in the evaluation section. Alternatively, the complementary filter algorithm can be used for real-time on-board sensor fusion. The Kalman filter modification demonstrates a significant speedup compared to the original Kalman filter. Sensor fusion based on the complementary filter provides a further speedup compared to the modified Kalman filter.
6.2. Protocol and sensor platform latency evaluation

According to Fig. 4, the latency of the overall acquisition cycle is composed of the processing steps in the master and n time slots, which are used for measuring, processing and transmission by the slaves. The time t_btx for setting up and sending a broadcast request by the master is 697 µs. For each time slot, response decoding and host communication take the additional time t_uart of 233 µs. The processing time at each slave is composed of the time t_brx for broadcast decoding of 17 µs, the measuring time t_meas of 3251 µs, the bus communication time t_bus of 233 µs, the time for sensor fusion based on the modified Kalman filter t_sf,KF of 9430 µs, or based on the complementary filter t_sf,CF of 8400 µs, and the time for sending the resulting quaternion t_qtx of 1211 µs. The computation time for sensor fusion t_sf is listed in Table 3. Excluding the transmit time t_qtx, the latency of one slave can be expressed as given by Eq. (3). The critical latency path depends on the computation time of the first slave t_slave and the number of time slots n, as given by Eq. (4).

t_slave = t_brx + t_meas + t_bus + t_sf    (3)

t_lat = t_btx + t_slave + n · t_qtx + t_uart    (4)

The minimum length of one time slot t_slot is equivalent to t_qtx. In order to improve reliability, a security margin of 20% has been added. Sensor fusion based on the original Kalman filter limits the sampling rate for on-board sensor fusion to 33.3 Hz (single IMU). Therefore, the modified Kalman filter algorithm or the complementary filter is used to achieve the application-demanded sampling rate of 50 Hz. Utilizing the modified Kalman filter results in a slot time t_slot of 1.45 ms. For an application with n = 4 time slots, the latency per acquisition cycle is 19.63 ms. Thus, the acquisition rate is 50.9 Hz utilizing the modified Kalman filter algorithm for on-board sensor fusion. Computing the orientation estimate based on the complementary filter algorithm does not reduce the slot time t_slot, as an orientation quaternion still has to be transmitted for each sensor. Due to the lower sensor fusion time t_sf of the complementary filter compared to the modified Kalman filter, a sampling rate of 52.4 Hz is achieved.
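Plugging the timings above into Eqs. (3) and (4) reproduces the quoted acquisition-cycle latency and rate up to rounding (in this sketch, the margin-padded slot time t_slot is substituted for t_qtx, which matches the ≈19.6 ms figure):

```python
# Numerical check of Eqs. (3) and (4); all values in microseconds.
t_btx, t_brx  = 697, 17
t_meas, t_bus = 3251, 233
t_sf_kf       = 9430.25          # modified Kalman filter (Table 3, AT1281)
t_qtx, t_uart = 1211, 233
t_slot        = 1.2 * t_qtx      # 20% security margin -> ~1.45 ms
n             = 4                # number of time slots

t_slave = t_brx + t_meas + t_bus + t_sf_kf        # Eq. (3)
t_lat   = t_btx + t_slave + n * t_slot + t_uart   # Eq. (4), t_qtx -> t_slot
rate_hz = 1e6 / t_lat                             # ~50 Hz acquisition rate
```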
Table 4
Accuracy comparison.

Filter                                 Roll [°RMS]  Pitch [°RMS]  Yaw [°RMS]  Quaternion [°RMS]
Original Kalman filter (proposed IMU)  1.9          1.5           2.9         3.4
Modified Kalman filter (proposed IMU)  1.9          1.5           2.8         3.4
Complementary filter (proposed IMU)    2.6          1.3           4.2         4.8
Xsens Kalman Filter (Xsens MTx)        1.8          1.9           5.3         5.8
6.3. Sensor fusion performance evaluation Sensor fusion based on the original Kalman ﬁlter limits the sampling rate for onboard sensor fusion to 33.3 Hz (single IMU). Therefore, the modiﬁed ﬁlter algorithm or the complementary ﬁlter can be used to achieve the application demanded sampling rate of 50 Hz. For evaluation the orientation estimation performance of the different ﬁlter algorithms based on a single dataset the RAW data of a (IM)²SU and a Xsens MTx sensor is synchronously sampled at 100 Hz. The dataset comprises 10,700 samples. Furthermore, a Vicon Nexus [25] optical motion tracking system is used as golden reference. The wooden rigid body shown in Fig. 8 with 14 markers is used to capture the orientation data based on the optical system. In the evaluation dataset a movement rotating the sensor three times in each axis at about 7 501 and a movement affecting all axes is captured. Between each rotation a period in default position was introduced to depict longterm stability. Fig. 12 demonstrates the sensor fusion performance for orientation estimation of the proposed IM2SU IMU and the Xsens MTx IMU compared to the Vicon system data. The ﬁgure shows the comparable accuracy of the two IMU systems and a drift free operation for the measurement period. In comparison to the Xsens MTx sensor the (IM)²SU provides a more reliable orientation estimation on the yaw axis, while having less precision on the pitch axis. Comparing both IMUs in capturing roll axis movements, there is no signiﬁcant difference. In general, both IMUs are sufﬁcient for mobile motion capturing in stroke rehabilitation session. The root mean square error (RMS) comparison of the Euler angle and Quaternion error measure in Table 4 shows an improved accuracy of the IM2SU IMU utilizing the complementary ﬁlter compared to the Xsens MTx system using the Vicon Nexus system as reference. The complementary ﬁlter shows improved orientation estimation accuracy compared to the Kalman ﬁlter. 
The difference in the orientation estimation accuracy is caused by a slightly different parameterization.
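The Quaternion error measure used above avoids the gimbal-lock singularities of Euler angle differences. The paper's exact formulation is not reproduced here; a minimal sketch, assuming the common singularity-free choice of the rotation angle of the relative quaternion between estimate and reference:

```python
import numpy as np

def quat_error_deg(q_est, q_ref):
    """Singularity-free angular distance between two unit quaternions.

    q_est, q_ref: arrays of shape (N, 4) in (w, x, y, z) order.
    Returns the rotation angle (in degrees) of the relative rotation
    between each estimated and reference sample.
    """
    dot = np.abs(np.sum(q_est * q_ref, axis=1))  # |<q_est, q_ref>| handles q vs. -q
    dot = np.clip(dot, 0.0, 1.0)                 # guard arccos domain
    return np.degrees(2.0 * np.arccos(dot))      # relative rotation angle

def rms(err_deg):
    """Root mean square over the per-sample angular errors."""
    return np.sqrt(np.mean(np.asarray(err_deg) ** 2))

# Identical orientations yield zero error.
q_id = np.array([[1.0, 0.0, 0.0, 0.0]])
print(quat_error_deg(q_id, q_id))    # [0.]

# A 90 degree rotation about z: q = (cos 45, 0, 0, sin 45).
q_90 = np.array([[np.cos(np.pi / 4), 0.0, 0.0, np.sin(np.pi / 4)]])
print(rms(quat_error_deg(q_90, q_id)))  # 90.0
```

Taking the absolute value of the inner product makes the measure insensitive to the double cover of rotations (q and −q encode the same orientation), which is what makes a direct per-component quaternion difference unsuitable as an error metric.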
7. Discussion and conclusions

Due to the bandwidth limitation, the achievable sampling rate for up to four (IM)²SU units using on-board sensor fusion is ≈50 Hz. Compared to the Xsens MTx sensor, the proposed (IM)²SU achieves slightly increased orientation estimation accuracy, while providing full access to the sensor data at a significantly lower price. Additionally, the proposed IMU allows the connection of further force sensors required for grasping task assessment. In general, an orientation estimation accuracy of 2.1° RMS (Euler) on average or 3.4° RMS (Quaternion) is achieved.
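The ≈50 Hz limit for four units sharing one radio channel can be made plausible with back-of-the-envelope arithmetic. All protocol numbers below are illustrative assumptions, not measurements from the (IM)²SU: the IEEE 802.15.4 raw rate, an assumed per-frame overhead, and an assumed fraction of the channel usable after CSMA back-off and acknowledgment gaps.

```python
# Rough feasibility check of ~50 Hz for four wireless IMU nodes.
# All constants are assumptions for illustration only.
RAW_RATE_BPS = 250_000      # IEEE 802.15.4 raw data rate at 2.4 GHz
QUAT_PAYLOAD_B = 4 * 4      # one quaternion: four 32-bit floats (w, x, y, z)
OVERHEAD_B = 23             # assumed PHY/MAC header + CRC per frame
USABLE_FRACTION = 0.25      # assumed channel share after CSMA/ACK overhead
NUM_NODES = 4               # four (IM)2SU units share the channel

packet_bits = 8 * (QUAT_PAYLOAD_B + OVERHEAD_B)
packets_per_s = USABLE_FRACTION * RAW_RATE_BPS / packet_bits
rate_per_node = packets_per_s / NUM_NODES

print(f"{rate_per_node:.0f} Hz per node")  # 50 Hz under these assumptions
```

The point of the sketch is only that the limit is set by shared-channel airtime and protocol overhead rather than by the radio's raw bit rate, which would naively permit far higher sample rates.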
Furthermore, the proposed singularity-free Quaternion error measure used for the analysis confirms the Euler angle based accuracy evaluation results. The performance decrease of the modified Kalman filter of ≈0.1° RMS (Euler) on average compared to the original Kalman filter [21] is negligible. Therefore, partitioning the sensor fusion task becomes a design option for offloading this demanding task from the sensor platform. Alternatively, a higher sampling rate can be achieved, if required by the application. This also enables exploring future sensor fusion algorithms to achieve improved performance. In general, the achievable sensor fusion accuracy for on-board or host-platform computation is within the range of the angular resolution of human perception, which is 2° to 6°. Hence, as confirmed by the reference measurements against the optical Vicon Nexus capturing system, the proposed wearable and wireless IMU is well suited for human motion capturing in stroke rehabilitation applications. The proposed (IM)²SU achieves an overall sensor system latency of 19.63 ms using four (IM)²SU sensor units in quaternion data mode. The wireless nodes can operate for over 300 min on the attached battery and are therefore suitable for motion capturing in mobile, body-attached sessions. By removing the debug and extension interfaces from the prototype PCB layout, a dimension of 23 mm × 31 mm is achievable, a 20% reduction compared to the current design. In summary, the proposed (IM)²SU inertial sensor enables wireless, platform independent and low latency capturing of human movements. Sensor fusion on an 8-bit MCU is enabled by algorithmic modifications of the utilized Kalman filter. The sensor system meets the demanding latency requirements of real-time acoustic movement feedback.
Acknowledgments

The authors thank Rochus Nowosielski and Henning Kluge for their contributions to the design and initial operation of the proposed IMU. Furthermore, the authors thank Prof. Rosenhahn and Florian Bauman from the Institut für Informationsverarbeitung at Leibniz Universität Hannover for their support and for making the Vicon optical motion tracking system available.

References

[1] E.R. Bachmann, R.B. McGhee, X. Yun, M.J. Zyda, Inertial and magnetic posture tracking for inserting humans into networked virtual environments, in: Proceedings of the ACM Symposium on Virtual Reality Software and Technology, 2001.
[2] H. Zheng, N.D. Black, N.D. Harris, Position-sensing technologies for movement analysis in stroke rehabilitation, Med. Biol. Eng. Comput. 43 (4) (2005) 413–420.
[3] H.P. Brückner, M. Wielage, H. Blume, Intuitive and interactive movement sonification on a heterogeneous RISC/DSP platform, in: Proceedings of the 18th Annual International Conference on Auditory Display, 2012, pp. 75–82.
[4] I. Wallis, T. Ingalls, T. Rikakis, L. Olson, Y. Chen, W. Xu, H. Sundaram, Real-time sonification of movement for an immersive stroke rehabilitation environment, in: Proceedings of the 13th International Conference on Auditory Display, 2007.
[5] G. Scavone, P. Cook, RtMidi, RtAudio, and a synthesis toolkit (STK) update, in: Proceedings of the International Computer Music Conference, 2005.
[6] Xsens Technologies BV, 〈http://www.xsens.com〉.
[7] Shimmer Research, 〈http://www.shimmerresearch.com〉.
[8] M. Eriksson, R. Bresin, Improving running mechanics by use of interactive sonification, in: Proceedings of the 3rd International Workshop on Interactive Sonification (ISon 2010), 2010.
[9] N. Schaffert, K. Mattes, Designing an acoustic feedback system for on-water rowing training, Int. J. Comput. Sci. Sport 10 (2) (2011) 71–76.
[10] Philips Research, 〈http://www.research.philips.com〉.
[11] Intersense, 〈http://www.intersense.com〉.
[12] C. Brigante, N. Abbate, A. Basile, A. Faulisi, S. Sessa, Towards miniaturization of a MEMS-based wearable motion capture system, IEEE Trans. Ind. Electron. 58 (2011) 3234–3241.
[13] N. Abbate, A. Basile, C. Brigante, A. Faulisi, Development of a MEMS based wearable motion capture system, in: Proceedings of the 2nd Conference on Human System Interactions, 2009, pp. 255–259.
[14] A.D. Young, M.J. Ling, D.K. Arvind, Orient-2: a realtime wireless posture tracking system using local orientation estimation, in: EmNets '07: Proceedings of the 4th Workshop on Embedded Networked Sensors, ACM, 2007, pp. 53–57.
[15] H.P. Brückner, R. Nowosielski, H. Kluge, H. Blume, Mobile and wireless inertial sensor platform for motion capturing in stroke rehabilitation sessions, in: Proceedings of the 2013 5th IEEE International Workshop on Advances in Sensors and Interfaces (IWASI), 2013, pp. 14–19.
[16] µracoli – The µController Radio Communication Library, 2012, 〈http://uracoli.nongnu.org/〉.
[17] G. Welch, G. Bishop, An introduction to the Kalman filter, in: Proceedings of the International Conference on Computer Graphics and Interactive Techniques, 2001.
[18] D. Roetenberg, H.J. Luinge, C.T.M. Baten, P.H. Veltink, Compensation of magnetic disturbances improves inertial and magnetic sensing of human body segment orientation, IEEE Trans. Neural Syst. Rehabil. Eng. 13 (3) (2005) 395–405.
[19] A. Young, Comparison of orientation filter algorithms for realtime wireless inertial posture tracking, in: Proceedings of the Sixth International Workshop on Wearable and Implantable Body Sensor Networks (BSN), 2009, pp. 59–64.
[20] H.P. Brückner, C. Spindeldreier, H. Blume, E. Schoonderwaldt, E. Altenmüller, Evaluation of inertial sensor fusion algorithms in grasping tasks using real input data, in: Proceedings of the Ninth International Conference on Wearable and Implantable Body Sensor Networks (BSN), 2012, pp. 189–194.
[21] J.K. Lee, E. Park, A minimum-order Kalman filter for ambulatory real-time human body orientation tracking, in: Proceedings of the IEEE International Conference on Robotics and Automation, 2009.
[22] S. Sabatelli, F. Sechi, L. Fanucci, A. Rocchi, A sensor fusion algorithm for an integrated angular position estimation with inertial measurement units, in: Proceedings of the European Conference on Design, Automation Test and Exhibition (DATE), 2011.
[23] S. Sabatelli, M. Galgani, L. Fanucci, A. Rocchi, A double stage Kalman filter for sensor fusion and orientation tracking in 9D IMU, in: Proceedings of the IEEE Sensors Applications Symposium, 2012.
[24] J. Calusdian, X. Yun, E. Bachmann, Adaptive-gain complementary filter of inertial and magnetic data for orientation estimation, in: Proceedings of the IEEE International Conference on Robotics and Automation, 2011, pp. 1916–1922.
[25] Vicon Motion Systems, 〈http://www.vicon.com/products/nexus.html〉.