Speech I: Day 2 (Nov. 15) 9:00-10:00
- Mandayam Srinivasan
- Director, MIT and UCL TouchLabs
- Professor of Haptics, Department of Computer Science, University College London, UK
- Title: Hurdles in the Hunt for a Haptic Killer App
Abstract & Biography
Finding a haptic killer app would greatly accelerate investment in, and growth of, research and development in all aspects of haptics. Despite decades-long efforts around the world, success has been limited, owing to multiple hurdles that span scientific, technological, and business-related issues. In this talk, I will trace our adventures in this hunt for a haptic killer app and identify the hurdles in human, machine, and computer haptics that have been overcome as well as those that still remain. I will illustrate my reasoning with autonomous, virtual reality, and teleoperator systems we have developed for a variety of application areas such as robotics, entertainment, training, rehabilitation, and healthcare.
Dr. Srinivasan’s research over the past three decades on the science and technology underlying information acquisition and object manipulation through touch has played a pivotal role in establishing the multidisciplinary field of modern haptics. He has been recognized worldwide as an authority on haptic computation, cognition, and communication in humans and modern machines such as computers and robots. His pioneering scientific investigations of human haptics, involving biomechanics, neuroscience, and psychophysics, have led to significant advances in our understanding of how nerve endings in the skin enable the brain to perceive the shape, texture, and softness of objects through the sense of touch. His work on machine and computer haptics, involving the design and development of novel robotic devices, mathematical algorithms, and real-time control software, has enabled touching, feeling, and manipulating objects that exist only virtually as programs in the computer. He has also demonstrated novel haptic applications such as virtual reality-based simulators for training surgeons, real-time touch interactions between people across continents, and direct control of robots from brain neural signals. More recently, he has been working on developing haptic aids for blind people, smartphone-based healthcare for underserved populations, novel robotic fingertips, and teleoperation systems for micro/nano manipulation capable of performing surgery on a single cell with micron precision.
Speech II: Day 2 (Nov. 15) 13:30-14:30
- Katherine J. Kuchenbecker
- Director, Haptic Intelligence Department, Max Planck Institute for Intelligent Systems, Stuttgart, Germany
- Title: Telerobotic Touch
Abstract & Biography
I define a haptic interface as a mechatronic system that modulates the physical interaction between a human and his or her tangible surroundings. After describing three archetypal haptic interface designs and explaining how such systems typically function, this talk will trace the trajectory of my research on telerobotic touch from 2002 to the present. Motivated by applications from robot-assisted surgery to household robotics, my co-authors and I have looked for clever ways to enable a human to feel what a teleoperated robot is touching. The haptic interfaces we have created tend to focus on providing naturalistic tactile cues, such as high-frequency vibrations and fingertip contact, rather than more commonly studied kinesthetic cues like force and torque. We have repeatedly found that well-designed tactile cues enhance system usability and increase operator performance because they convey rich manipulation-relevant information without compromising the teleoperator’s closed-loop stability.
Katherine J. Kuchenbecker directs the Haptic Intelligence Department at the Max Planck Institute for Intelligent Systems in Stuttgart, Germany. She earned her Ph.D. in Mechanical Engineering at Stanford University in 2006, did postdoctoral research at the Johns Hopkins University, and was an engineering professor at the University of Pennsylvania before she moved to Max Planck in 2017. Her research centers on haptic interfaces, which enable a user to touch virtual and distant objects as though they were real and within reach, as well as haptic sensing systems, which allow robots to physically interact with objects and people. She delivered a TEDYouth talk on haptics in 2012 and has been honored with a 2009 NSF CAREER Award, the 2012 IEEE Robotics and Automation Society Academic Early Career Award, a 2014 Penn Lindback Award for Distinguished Teaching, and various best paper and best demonstration awards. She co-chaired the IEEE Technical Committee on Haptics from 2015 to 2017 and co-chaired the IEEE Haptics Symposium in 2016 and 2018.
Speech III: Day 3 (Nov. 16) 9:00-10:00
- Dong-Soo Kwon
- Director, Human-Robot Interaction Research Center and Center for Future Medical Robotics, KAIST, Korea
- Title: Journey of KAIST Research in Haptics
Abstract & Biography
The Telerobotics and Control Laboratory (TCL) at KAIST was established in 1995 and has produced 23 doctoral and 46 master’s graduates. Its haptics work began with research on a 6-DoF haptic master for telerobotics, and the lab has since developed a wide range of technologies, from actuators that generate various tactile sensations in mobile devices to tactile displays that render texture and shape on touch surfaces. Building on this research know-how, TCL is broadening the use of its technology by developing a haptic braille pad and a braille display module for the visually impaired, in collaboration with a startup company. Over the course of TCL’s growth, graduates of the program have also planted the “seed” of haptics research, which has grown and borne fruit in diverse fields: research on telerobotics has become a “seed” of rehabilitation and surgical robots, and research on tactile displays has become a “seed” of soft robotics. Much of the research that originally started in haptics has expanded into fields such as virtual reality interfaces and deep learning-based haptic interaction, and the journey has been remarkable. I would therefore like to take this opportunity to introduce the research history of TCL, its haptics group graduates, and their research journeys.
Dong-Soo Kwon is a Professor in the Department of Mechanical Engineering at the Korea Advanced Institute of Science and Technology (KAIST), Director of the Human-Robot Interaction Research Center, and Director of the Center for Future Medical Robotics. He serves the IEEE Robotics and Automation Society (RAS) as a member of its Administrative Committee (AdCom). In addition, he is the founding CEO of EasyEndo Surgical Inc., Chairman of the board of directors of the Korea Institute of Robot and Convergence (KIRO), and a member of the National Academy of Engineering of Korea (NAEK).
His research deals with medical robotics, haptics, and human-robot interaction. He has contributed to the advancement of several robot venture companies through technology transfer, and he recently founded a start-up company based on the results of his medical robot research.
He worked as a research staff member in the Telerobotics Section at Oak Ridge National Laboratory from 1991 to 1995. Before that, he was a Graduate Research Assistant in the Flexible Automation Laboratory at the Georgia Institute of Technology from 1985 to 1991, and a section chief and manager in the R&D group of Kanglim Co., Ltd. from 1982 to 1985. He received his Ph.D. in Mechanical Engineering from the Georgia Institute of Technology in 1991, his M.S. in Mechanical Engineering from KAIST in 1982, and his B.S. in Mechanical Engineering from Seoul National University, Korea, in 1980.