Sander Olson Interviews Charlie Kemp About the Future of Healthcare Robotics

Here is the Charlie Kemp interview by Sander Olson. Sander Olson has conducted dozens of first-hand interviews for Nextbigfuture. Sander is an online journalist who has been working part-time with Nextbigfuture. Dr. Kemp is a professor in the biomedical engineering department at Georgia Tech and Emory University. His lab focuses on developing robots capable of aiding the sick and elderly. His lab has developed several custom robots and has also received a PR2 robot from Willow Garage. Dr. Kemp believes that the healthcare industry could become one of the primary drivers of robotics development.

Healthcare robotics website

Question: You founded the Healthcare Robotics Lab at Georgia Tech. What is the focus of your lab?

Answer: I founded my lab in 2007 to investigate how mobile robots that physically manipulate the world can improve healthcare. The type of robot we work with is called a mobile manipulator, because it can move from place to place in an environment and physically manipulate the world. Three of the mobile manipulators in my lab (EL-E, Cody, and GATSBII) have wheels and arms, and are approximately the size of a person. Human-scale mobile manipulators are exciting, because they have the potential to perform a wide variety of helpful tasks in human environments. For example, their arms and human scale let them reach important locations, such as tabletops. Their size also makes it easier for them to have the strength to perform tasks, such as opening a refrigerator door. My lab is especially interested in how this type of robot can assist older adults, people with disabilities, people incapacitated by illness or injury, and healthcare workers.

Question: Do you believe healthcare will become one of the major drivers of robotics?

Answer: I don’t know if healthcare will be a major driver for robotics, but it definitely shows great promise. Medical robots, such as the da Vinci robot from Intuitive Surgical, are already playing an important role in the growth of the robotics industry. In the long-term, I’m confident that mobile manipulators can provide valuable and economical assistance to people. Millions of people across the world require daily physical assistance that is currently provided by healthcare professionals, family members, and service animals. Due to aging populations in the US, Europe, and Japan, the number of people who require assistance is increasing and there are worsening shortages of healthcare workers. So, there’s a real opportunity for robots to help both care receivers and caregivers. Robots could offer advantages over other forms of care, such as greater privacy, greater independence, extreme vigilance, 24/7 personalized care, and more consistent performance. In the long term, I’m optimistic the cost will be reasonable, too, since mobile manipulators are a general purpose technology which should enable economies of scale.

Question: Your team recently won a PR2 robot from Willow Garage. What are you going to do with it?

Answer: We’re investigating how human-scale mobile manipulators, like the PR2, can benefit older adults at home. One of our motivations is that many older adults would prefer to live at home, instead of moving to an assisted living facility. Robots like the PR2 may be able to help people live at home longer. At Georgia Tech, we’ve named our PR2 GATSBII, which stands for GATech Service Bot with Interactive Intelligence. Our project is multi-disciplinary with faculty from biomedical engineering, psychology, and interactive computing. It’s an exciting blend of expertise. For example, Prof. Wendy Rogers from the School of Psychology is an expert in technology for older adults. We’re all working together to better understand what older adults would want, and how to enable the PR2 to do useful things for them. We’ll be working a lot with both older adults and GATSBII. GATSBII is currently in my lab, but will spend 3 months a year in the Aware Home.

Question: What is the Aware Home?

Answer: The Aware Home is a free-standing, two story house located on the Georgia Tech campus. We will be using it as a realistic environment in which to test GATSBII with older adults. We’ll also be using it to test the robustness of our methods in a realistic home environment. It’s not uncommon for robots to work in a particular lab and then fail when they’re moved to a different environment. Enabling robots to work robustly outside of the lab under real-world conditions is one of the great challenges for researchers right now. I believe that working with GATSBII in a real house for 3 months a year will help us address this challenge.

Question: How does the PR2 robot compare with your robot Cody?

Answer: Cody is a human-scale mobile manipulator that my lab designed and assembled using components from various manufacturers, including Segway and Meka Robotics. Cody and GATSBII have comparable capabilities. For example, each robot has two compliant arms with 7 joints, and an omnidirectional mobile base. GATSBII has a richer sensor suite for visual perception, but Cody has better force sensing using the force-torque sensors at its wrists. I think they’re both good examples of human-scale mobile manipulators, and representative of the types of capabilities that are desirable for this type of robot.

Question: What are the major differences?

Answer: We’re still learning about the strengths and weaknesses of each robot. Cody is a one-of-a-kind robot. My lab had to build Cody and write much of its software. In contrast, GATSBII is one of 16 commercially produced PR2s currently at research organizations around the world. This is one of GATSBII’s biggest strengths, since it comes with an active research community that’s producing open source software. GATSBII is also well tested and comes with a full suite of documented software out of the box. After having built and programmed 3 different custom mobile manipulators (EL-E, Dusty, and Cody), my lab found it very refreshing to be able to productively use GATSBII for research from day one. This is a great step forward for robotics!

At the same time, I think we’re finding that Cody’s arms, which were produced by Meka Robotics, may have some advantages over the PR2’s arms. It’s still too early to definitively discuss their differences, since we’ve only been working with the PR2 for 6 months. I can say that we especially like using Cody’s wrist-mounted force-torque sensors, since they enable Cody to use haptic sensing (sense of touch) while manipulating the world and interacting with people. The PR2 has tactile sensors on its gripper, but they lack some of the information we’ve used in our research with Cody. I believe that haptic sensing is a very important modality for mobile manipulators, and my lab has shown that robots can use haptic sensing to perform a variety of tasks, such as opening a door or drawer with moderate forces without knowing its geometry in advance, or softly making contact with a person’s body in order to clean a patch of skin.
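
As a rough illustration of what haptic, geometry-free manipulation can look like, here is a minimal force-limited pulling loop. It is only a sketch, not Cody’s actual controller: the read_wrist_force and command_arm_velocity interfaces, the force limit, and the pulling speed are all assumptions made for the example.

```python
import numpy as np

FORCE_LIMIT_N = 20.0   # assumed cap, in newtons, on how hard the robot may pull
PULL_SPEED_MPS = 0.05  # assumed end-effector speed along the pulling direction

def pull_mechanism_open(read_wrist_force, command_arm_velocity, is_open):
    """Force-limited pulling loop (illustrative sketch, not Cody's real controller).

    read_wrist_force()      -> np.array([fx, fy, fz]) from a wrist force-torque sensor
    command_arm_velocity(v) -> commands a Cartesian end-effector velocity (3-vector)
    is_open()               -> True once the door or drawer has opened far enough
    """
    pull_direction = np.array([-1.0, 0.0, 0.0])  # start by pulling straight back
    while not is_open():
        force = read_wrist_force()
        magnitude = np.linalg.norm(force)
        if magnitude > FORCE_LIMIT_N:
            command_arm_velocity(np.zeros(3))    # too much force: stop pulling
            break
        if magnitude > 1e-3:
            # Re-aim the pull opposite the sensed reaction force, so the hand
            # follows the mechanism's motion without a model of its geometry.
            pull_direction = -force / magnitude
        command_arm_velocity(PULL_SPEED_MPS * pull_direction)
```

The point of the sketch is simply that the robot reacts to the forces it senses rather than relying on a prior model of the mechanism.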

In general, we’re at the dawn of commercially available, human-scale mobile manipulators. I expect that this type of robot will continue to evolve as researchers and practitioners learn what features are important. I’m also confident that we haven’t begun to approach the limits of what GATSBII and Cody can do with the right software.

Question: What other sensors would you like to see in robots?

Answer: I think force sensitive skin covering the entire body of the robot would be very useful. It’s advantageous for a robot operating in an unstructured environment, like the home, to directly sense the forces it is applying to the world. Humans take this type of sensing for granted, and tend to injure themselves without it. Without whole-body force sensing, I suspect that robots will be prone to make errors with bad consequences that could otherwise be avoided with ease.

I’m also interested in the potential for novel types of sensors to be used by mobile manipulators. For example, my lab has been researching how to use short and long-range RFID antennas to better perform tasks. The idea is to apply inexpensive self-adhesive RFID tags to locations and objects, and then let the robot use these tags to better perceive and manipulate the world. We’ve already added two articulated RFID antennas to GATSBII, which enable GATSBII to move toward an RFID tag worn by a person in order to deliver medicine. More generally, the tags’ unique identifiers can be used by robots to look up information in a database that tells the robot information about the tagged object, such as how to manipulate it.
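
As a toy illustration of that last idea, a tag-to-object lookup might look like the snippet below. The tag IDs, fields, and helper name are hypothetical and are not the lab’s actual database.

```python
# Hypothetical mapping from RFID tag UIDs to information the robot can use.
TAG_DATABASE = {
    "E200-3411-B802": {"name": "medicine bottle", "grasp": "cylindrical wrap",
                       "max_force_n": 15.0, "deliver_to": "tagged wristband"},
    "E200-3411-C917": {"name": "TV remote", "grasp": "top-down pinch",
                       "max_force_n": 10.0, "deliver_to": "side table"},
}

def lookup_tagged_object(tag_uid):
    """Return stored knowledge about a tagged object, or None if the tag is unknown."""
    return TAG_DATABASE.get(tag_uid)

# Example: a long-range antenna reports a tag, and the robot looks up how to handle it.
info = lookup_tagged_object("E200-3411-B802")
if info is not None:
    print(f"Grasp the {info['name']} with a {info['grasp']} grasp, applying at most "
          f"{info['max_force_n']} N, then deliver it to the {info['deliver_to']}.")
```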

Question: What is Dusty?

Answer: Dusty is a specialized robot capable of picking up objects from the floor and delivering them to a seated person. In contrast to the three expensive human-scale mobile manipulators in my lab (EL-E, Cody, and GATSBII), which are designed to be general purpose, Dusty is a small, inexpensive mobile manipulator that’s designed for a specific task. Dusty helps people with physical disabilities retrieve objects that they’ve dropped. My lab has collaborated with the Emory ALS Center to better understand the needs of people with disabilities. We found that people tended to drop small lightweight objects, and that they would like help recovering these objects. It can sound strange at first to an able-bodied person, but imagine frequently dropping objects you’re using and being unable to recover them without assistance. Everyday activities like taking a pill or using a TV remote could be frustratingly interrupted. Retrieving dropped objects is an example of how robots can help give people greater independence and a higher quality of life.

Question: How does Dusty work?

Answer: Dusty uses a special robot hand that works like a dustpan and brush. A flat plate with a leading wedge moves along the ground and under the object, while a finger sweeps the object onto the plate. The finger holds the object on the plate, while a scissor lift raises the plate up to a height that’s easy for a seated person to reach. Dusty picks up objects from the floor extremely well. We’ve rigorously tested it on a variety of floors with objects that people with ALS have told us are important. In our tests, the robot has successfully grasped these objects over 97% of the time. Dusty easily picks up objects that are very difficult for much more expensive robots to grasp, such as credit cards and small pills.

Question: Is Dusty tele-operated?

Answer: Yes, you drive the robot with a joystick and use two buttons. One button tells Dusty to pick up the object in front of it, and the other button tells Dusty to lift the tray up to a comfortable height. Dusty is now in its second generation, and we’ve recently completed a user study with 20 ALS patients with very promising results. We’re excited about the possibility of commercializing Dusty.
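
A control interface that simple can be sketched in a few lines. The event format and command functions below are placeholders invented for illustration, not Dusty’s actual software.

```python
# Placeholder robot commands; a real system would send these to Dusty's hardware.
def drive_base(forward, turn):
    print(f"drive base: forward={forward:+.2f}, turn={turn:+.2f}")

def pick_up_object():
    print("sweep the finger to pull the object onto the plate")

def raise_tray():
    print("raise the scissor lift to a comfortable height")

def handle_input(event):
    """Dispatch one operator-interface event to the matching behavior."""
    if event["type"] == "joystick":
        drive_base(event["y"], event["x"])   # joystick axes drive the mobile base
    elif event["type"] == "button" and event["id"] == 1:
        pick_up_object()                     # button 1: pick up the object in front
    elif event["type"] == "button" and event["id"] == 2:
        raise_tray()                         # button 2: lift the tray to the user

# Example session: drive forward, pick up the dropped object, then lift the tray.
for event in [{"type": "joystick", "x": 0.0, "y": 0.8},
              {"type": "button", "id": 1},
              {"type": "button", "id": 2}]:
    handle_input(event)
```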

Question: Many robotics researchers are excited about Microsoft’s Kinect sensor. Why is that?

Answer: The Kinect sensor is exciting to roboticists, because it’s the first affordable sensor to produce high-quality color and depth information at video rates. In other words, it provides high-quality 3D video that can be used by robots to better perceive the world. Until now, the available sensors were either expensive, slow, low quality, or some combination of the three. For example, for much of my lab’s research, we have mechanically tilted an expensive laser range finder using a motor, and then taken a picture with a nearby color camera in order to slowly obtain data similar to what the Kinect provides. Many labs, including my own, have now bought numerous Kinects. We have high hopes! I think the commercial availability of these sensors is an excellent sign for the future of robotics.
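
One way to see why cheap depth sensing matters is to note how directly a depth image becomes a 3D point cloud under the standard pinhole camera model. The intrinsic parameters below are round illustrative numbers, not the Kinect’s calibrated values.

```python
import numpy as np

# Illustrative pinhole intrinsics (focal lengths and principal point, in pixels).
FX, FY, CX, CY = 570.0, 570.0, 320.0, 240.0

def depth_image_to_point_cloud(depth_m):
    """Back-project an HxW depth image (meters) into an Nx3 array of 3D points.

    A pixel (u, v) with depth z maps to x = (u - cx) * z / fx, y = (v - cy) * z / fy.
    """
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    points = np.dstack((x, y, depth_m)).reshape(-1, 3)
    return points[points[:, 2] > 0]  # discard pixels with no depth reading

# Example with a synthetic frame: a flat surface 1.5 meters away.
cloud = depth_image_to_point_cloud(np.full((480, 640), 1.5))
print(cloud.shape)  # (307200, 3)
```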

Question: How far are we from fully autonomous robots?

Answer: Many roboticists have the ultimate goal of creating fully autonomous robots. In well-controlled environments, such as factories, robots already perform well-defined tasks autonomously with remarkable consistency. The problem is that many environments, such as the home, vary a great deal and the tasks are more open ended. Even a task as seemingly simple as retrieving an object in a home can be made arbitrarily complex by introducing challenges, such as a stuck door, obstructing furniture, fragile surrounding objects, a locked safe, and a vague description of the object to be retrieved. As such, even seemingly simple tasks might require human-level AI (or better) in order to autonomously perform them in the general case. Fortunately, robots don’t have to solve the general case in order to be useful. For many tasks, it’s probably sufficient for robots to operate autonomously in statistically common situations, and to work with humans otherwise.
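
That division of labor, autonomous in the common case and human-assisted otherwise, can be written as a simple dispatch policy. The threshold and helper functions here are illustrative assumptions rather than any particular system.

```python
CONFIDENCE_THRESHOLD = 0.8  # assumed cutoff for attempting the task autonomously

def fetch_object(task, estimate_confidence, run_autonomously, ask_human_for_help):
    """Try the statistically common case autonomously; hand off to a person when unsure.

    estimate_confidence(task) -> probability in [0, 1] that autonomy will succeed
    run_autonomously(task)    -> True on success, False on failure
    ask_human_for_help(task)  -> fall back to, e.g., remote teleoperation by a person
    """
    if estimate_confidence(task) >= CONFIDENCE_THRESHOLD and run_autonomously(task):
        return "completed autonomously"
    return ask_human_for_help(task)
```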

Question: How important is AI to robotics?

Answer: Historically, AI and robotics have been closely related. The Roomba is a nice example of this, since it arguably resulted from AI research by Rod Brooks in the 1980s. In many ways, robots represent the original grand challenge for AI. AI and robotics continue to be important to one another, and this is well recognized. For instance, AI is one of 6 core areas associated with the Georgia Tech Robotics PhD program, which started in 2008.

Question: Could a robotics project lead to a major AI advance?

Answer: By seeking to develop robots that perform useful tasks in human environments, we may be placing ourselves on the road to human-like AI. I believe robots that operate in human environments are subject to the same types of evolutionary pressures that led to human intelligence. Specifically, robots in human environments will be more useful if they can interact with people better and physically manipulate the world better. I suspect that AI that enables robots to do these things well will seem very human-like.

Question: How much progress do you foresee in robotics during the next decade?

Answer: By 2020, I’m optimistic that we’ll see widespread use of commercially available mobile manipulators. Hopefully, there will be a vibrant industry with many different models to choose from, lots of software available, and strong consumer interest across many demographics. I’m not sure if we’ll see this in 10 years, but in the long term, I’m very bullish on this technology. I see it as the next car industry or personal computer industry, and imagine that there will be a mobile manipulator in every home. This type of general purpose robot could enhance people’s lives in a wide variety of ways, including health, entertainment, telecommunications, and housework. With respect to health, my hope is that there will be a variety of health-related software applications that can be run on general purpose robots priced for consumers. For example, I imagine robots providing health monitoring, helping people with medicine, and assisting with activities of daily living (ADLs), such as feeding and bathing. If all goes well, these robots will enable older adults and others to have greater independence, improved health, and a higher quality of life.
