
Lehrstuhl für Steuerungs- und Regelungstechnik
Technische Universität München
Univ.-Prof. Dr.-Ing. (Univ. Tokio) Martin Buss
High-Fidelity Haptics in Multimodal
Human-Robot Interaction
Zheng Wang
Full reprint of the dissertation approved by the Faculty of Electrical Engineering and Information Technology of the Technische Universität München for the award of the academic degree of
Doktor-Ingenieur (Dr.-Ing.)
Chairman: Univ.-Prof. Gordon Cheng, Ph.D.
Examiners of the dissertation:
1. Univ.-Prof. Dr.-Ing. (Univ. Tokio) Martin Buss
2. Univ.-Prof. Dr.-Ing., Dr.-Ing. habil. Alois Knoll
The dissertation was submitted to the Technische Universität München on 21.06.2010 and accepted by the Faculty of Electrical Engineering and Information Technology on 29.10.2010.

Foreword
This dissertation concludes four years of my research conducted at the Institute of Automatic Control Engineering (LSR), Technische Universität München. The work was supported by the European Union FP6 project Immersence.
First of all, I would like to thank my Doktorvater, my supervisor Prof. Dr.-Ing. (Univ. Tokio) Martin Buss, who not only offered me a research position in one of the best-established robotics groups, but also showed me the path through academia to where I am today. He always shared with me his vision on research topics while leaving me enough freedom for exploration and innovation.
My next most sincere appreciation goes to Dr. Angelika Peer, with whom I shared countless scientific discussions. She always helped me with her experience and wisdom during the last two years of my research at LSR.
Special thanks go to Prof. Louis Phee, my current supervisor at Nanyang Technological University, Singapore, whose kindness and help made it possible for me to finish writing this dissertation.
Much of the work in this dissertation was conducted in collaboration with my Immersence partners within and outside LSR. My sincere thanks to Jens Hoelldampf, Raphaela Groten, and Ansgar Bittermann for all the inspiration and help at LSR; to Elias and Mel at UPC; to Nicola, Pasquale, Mario, and the other members of the Italian team at UNIPI, whom I cannot all name; to Benjamin, Juan, Paul, and Abder at LSC; to Christos at UBIRM; to Max and Marc at MPI; and to Javier and Manuel at UPM, for all the good collaboration and achievements in Immersence.
A brief but very special appreciation goes to Mr. Andreas Schweinberger. He offered me so much help when I first started my life in Germany, and remained a friend and supporter during my hardest times. A little kindness goes a long way.
Next I would like to thank my LSR colleagues, with whom I shared the past four years of my life. Special thanks go to Chih-Chung Chen, Tingting Xu, Tianguang Zhang, Hao Ding, and Haiyan Wu, for lunches, dinners, and countless other events and help. Thanks to Kwang-kyu Lee, my first roommate and best friend, who shared with me beers and wisdom. I would also like to thank Daniela Feth, Carolina Passenberg, Thomas Schauss, and Nikolay Stefanov for the fruitful discussions in the haptics group, and Tobias Goepel, Matthias Rungger, Iason Vittorias, and Andreas Schmid for sharing an office with me.
My next appreciation goes to Dr. Dirk Wollherr, who always kindly helped me with administrative issues, and to the secretaries and technicians of LSR, Fr. Schmid, Fr. Werner, Fr. Renner, Hr. Jaschik, Hr. Gradl, Hr. Kubick, and Hr. Lowitz: without your help, nothing would have been possible.
I would also like to thank all the students who worked with me during the past years, especially Jun, Ziqing, Ji, Yang, Lei, Qixun, Mingxiang, Licheng, Rubens, Hong, and Shuning, who contributed their hard work to the development of the systems as well as the experiments.
My final and most sincere thanks go to my family, and to the ones who always put their faith in me, no matter how remote they might be. Xièxiè nǐmen.

Singapore, June 2010 Zheng Wang

To life...
Contents
1 Introduction
  1.1 Problem definition
  1.2 Main contributions and structure of the dissertation
2 Haptic rendering of arm dynamics 1: modeling and replay
  2.1 Physicality and challenges to haptic rendering
  2.2 A framework for human haptic modeling
    2.2.1 Handshake: a process-oriented human motor skill
    2.2.2 Related works
    2.2.3 A framework for human haptic modeling
  2.3 Developing a handshake robot with realistic arm behavior: overview
    2.3.1 Pilot study
    2.3.2 Robotic interfaces
  2.4 Basic controllers
    2.4.1 The first modeling iteration
    2.4.2 The second modeling iteration
    2.4.3 The third modeling iteration
  2.5 Conclusions
3 Haptic rendering of arm dynamics 2: towards an interactive controller
  3.1 Human behavior model
  3.2 Interactive controller
    3.2.1 Fast online parameter estimation
    3.2.2 Symbol abstraction and HMM intention estimation
    3.2.3 Trajectory planning and parameter adaptation
    3.2.4 Refined trajectory planning
  3.3 Performance validation tests
    3.3.1 HBP estimation
    3.3.2 HMM estimator
    3.3.3 Overall system performance
  3.4 Conclusions
4 Haptic rendering of hand dynamics
  4.1 Gesture data acquisition
    4.1.1 The CyberGlove
    4.1.2 Pisa glove
  4.2 Haptic data acquisition
    4.2.1 Tactile sensing technology
    4.2.2 Glove design and implementation
    4.2.3 Measuring handshakes with TSG gloves
  4.3 Robotic hand actuation
    4.3.1 BarrettHand
  4.4 Conclusions
5 Visual and sound rendering
  5.1 Visual rendering
    5.1.1 The rendering workflow
    5.1.2 Virtual human characters
    5.1.3 Virtual hand model
    5.1.4 Virtual environments
    5.1.5 Events
  5.2 Auditory rendering
  5.3 Conclusions
6 System integration and optimization
  6.1 User dependency
    6.1.1 User dependency in haptic subsystem
    6.1.2 User dependency in vision subsystem
  6.2 Real/virtual world integration
    6.2.1 Incorporating real world data
    6.2.2 Registration of real world object
  6.3 Integration of a second human input
    6.3.1 Problem definition
    6.3.2 Proposed remedy
  6.4 Optimization
    6.4.1 Improving natural haptic interaction
    6.4.2 Minimizing computational load
    6.4.3 Time delay
  6.5 Conclusions
7 Experiments and evaluation studies
  7.1 Plausibility and questionnaire
  7.2 Experiment 1: Robotic control algorithms comparison
    7.2.1 Haptic rendering algorithms for HLC controller
    7.2.2 The experiment and results
    7.2.3 Analysis and discussions
    7.2.4 Conclusions
  7.3 Experiment 2: Evaluation with haptics and audition
    7.3.1 Introduction
    7.3.2 Experimental setup
    7.3.3 Experimental procedures
    7.3.4 Analysis and results
    7.3.5 Conclusions
  7.4 Experiment 3: Study of haptic-visual integration
    7.4.1 Introduction
    7.4.2 Experimental setup
    7.4.3 Experimental design
    7.4.4 Results and discussion
    7.4.5 Conclusions
  7.5 Experiment 4: Evaluation with vision, haptics, and audition
    7.5.1 Introduction
    7.5.2 Experimental design
    7.5.3 Experimental procedure
    7.5.4 Results and analysis
    7.5.5 Conclusions
  7.6 Discussions and conclusions
8 Conclusions and future directions
  8.1 Concluding remarks
  8.2 Vistas
Bibliography
Notations
Abbreviations
DOF Degree of freedom
HBP Human behavioral parameter
HLC High-level controller
HMM Hidden Markov model
HRI Human-robot interaction
LLC Low-level controller
LS Least-squares
LSMW Least-squares with moving window
fMRI Functional magnetic resonance imaging
OI OpenInventor
PAC Position-based admittance control
PbD Programming by demonstration
PC Position control
RLS Recursive least-squares
SOA State-of-the-art
VE Virtual environment
ViSHaRD7 Virtual Scenario Haptic Rendering Device with 7 actuated DOF
ViSHaRD10 Virtual Scenario Haptic Rendering Device with 10 actuated DOF
VR Virtual reality
Conventions
x variable scalar
X constant scalar
x vector
X matrix
f(·) scalar function
f(·) vector function
ẋ time derivative
ẍ second order time derivative
x̂ estimate of x
x̃ value of x after calculation
x