Here is the Thomas Bewley interview by Sander Olson. Dr. Bewley heads the Coordinated Robotics Lab at the University of California San Diego. The UCSD lab’s specialty is using feedback control to impart stability to robots. Dr. Bewley is an expert on simulation, optimization, and control issues relating to robotics, and he is currently writing a textbook on this subject. Dr. Bewley’s lab is improving its robots at a rapid pace, developing updated versions every year.
Question: You lead the Coordinated Robotics Lab at the University of California, San Diego (UCSD). What makes this lab unique?
Answer: The Coordinated Robotics Lab is fundamentally a dynamics and control shop that does robotics; most other such labs are robotics shops that sometimes do control. We emphasize up front the modeling and control of dynamically unstable systems – we have found that creative, minimalist robotic designs which sacrifice open-loop stability can often achieve greatly enhanced maneuverability. We endeavor to create small, minimalist robots that can overcome large, complex obstacles via clever leveraging of advanced feedback control strategies.
Question: How do you obtain high performance?
Answer: Essentially, by supplanting open-loop stability with stability obtained via feedback control, like in modern jet aircraft.
We began our robotics research looking at hopping robots. The basic pogo stick concept, though itself relatively inefficient, is a fundamental reference problem in robotics: essentially, it is the discrete-time equivalent of the classic inverted pendulum. That project, iHop, has evolved through several generations, and has spawned a new treaded concept, Switchblade, and a new spherical concept, iceCube. The iHop concept has also been scaled down into a simple, non-hopping, ball-flinging design, iFling, which is appropriate for the toy market. iHop, Switchblade, iceCube, and iFling are the four most mature robot designs in our lab today; all four are patent pending.
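The idea of supplanting open-loop stability with feedback, as in the inverted pendulum, can be illustrated with a minimal sketch. This is a toy scalar system with hand-picked illustrative numbers, not the lab's actual controller: an open-loop-unstable plant is stabilized by moving its closed-loop pole inside the unit circle with state feedback.

```python
# Minimal illustration (not the lab's actual controller): a scalar
# discrete-time system x[k+1] = a*x[k] + b*u[k] with a > 1 is
# open-loop unstable, but the feedback u = -k*x places the
# closed-loop pole at (a - b*k), which is stable if |a - b*k| < 1.

def simulate(a, b, k, x0, steps):
    """Run the (closed-loop) system for `steps` steps; return |x| at the end."""
    x = x0
    for _ in range(steps):
        u = -k * x          # feedback control law
        x = a * x + b * u   # plant dynamics
    return abs(x)

a, b = 1.5, 1.0                                        # unstable plant: pole at 1.5
open_loop = simulate(a, b, k=0.0, x0=1.0, steps=20)    # no feedback: diverges
closed_loop = simulate(a, b, k=1.3, x0=1.0, steps=20)  # closed-loop pole at 0.2

print(f"open-loop   |x_20| = {open_loop:.1f}")    # grows like 1.5^20
print(f"closed-loop |x_20| = {closed_loop:.2e}")  # decays like 0.2^20
```

The same principle scales up to the real vector-valued dynamics of a hopping or balancing robot, where the gain is designed systematically (e.g., by pole placement or LQR) rather than picked by hand.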
Question: The “killer apps” of your lab involve enhanced perception of the physical world. How is this achieved?
Answer: Well, first, we are researching how to “map” a room by sending in vehicles, taking multiple location-tagged, distance-calibrated images, and then “sewing” these images together to form a 3D virtual reality, like in a modern first-person shooter video game. We are also looking at tracking pollution in the environment by tracking a contaminant density, and the winds that drive its evolution, in order to forecast where contaminants are headed using weather-forecasting type algorithms. Our initial demonstrations of these algorithms used unmanned ground vehicles (UGVs) in a parking lot at UCSD, but the algorithms we are developing extend directly to unmanned aerial vehicles (UAVs) and unmanned underwater vehicles (UUVs), exploring, for example, the ash in the skies over northern Europe, and the oil in the waters off the gulf coast, respectively.
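The "weather-forecasting type algorithms" mentioned above are data-assimilation methods, which alternate a model forecast with a measurement correction. A one-state Kalman filter conveys the cycle; all numbers here are made up for illustration and this is not the lab's code:

```python
# Minimal data-assimilation sketch (illustrative, not the lab's code):
# a scalar Kalman filter alternating a model forecast with a
# measurement correction, as in weather forecasting.

def kalman_step(x, P, z, a=0.9, Q=0.1, R=0.5):
    """One forecast/update cycle for a scalar state.
    x, P : state estimate and its variance
    z    : noisy sensor measurement
    a    : model dynamics (e.g., a decay/advection factor)
    Q, R : model and measurement noise variances
    """
    # Forecast step: propagate the estimate and its uncertainty through the model.
    x_pred = a * x
    P_pred = a * a * P + Q
    # Update step: blend forecast with measurement, weighted by confidence.
    K = P_pred / (P_pred + R)          # Kalman gain, between 0 and 1
    x_new = x_pred + K * (z - x_pred)  # move toward the measurement
    P_new = (1.0 - K) * P_pred         # uncertainty shrinks after measuring
    return x_new, P_new

# Track a synthetic contaminant concentration from noisy sensor readings.
x, P = 0.0, 1.0
for z in [4.1, 3.8, 3.2, 3.0, 2.6]:
    x, P = kalman_step(x, P, z)
print(round(x, 2), round(P, 3))  # estimate settles near the readings, variance shrinks
```

Operational forecasting systems apply the same predict/correct structure to state vectors with millions of entries, which is where the high-performance computing comes in.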
Question: None of your robots use legs. Why is that?
Answer: Actually, iHop does have some sort of “leg”, and we are in fact exploring some natural extensions of this concept. But you are correct, we don’t constrain our designs to mimic biological solutions. We appreciate that engineers can build wheels, motors, and engines that are incredibly efficient, whereas biological organisms can build linear actuators, or “muscles”, with maximum efficiency. As we see it, engineering labs that attempt to be “bio-mimetic” are a priori constraining their designs via biological constraints rather than the reality of practical engineering constraints. Given the obvious efficiency of rolling and rotary motions in the engineering setting, we have consciously decided to not follow a bio-mimetic design paradigm, despite the obvious popularity and visual aesthetic of such an approach. So, though we are often inspired by efficient biological systems, we don’t constrain ourselves to mimic biological designs.
Also, we like to focus on problems that other labs just aren’t looking at as much.
Question: You are now on your third-generation iHop. What could iHop be used for?
Answer: iHop leverages its two large wheels to roll around efficiently whenever possible, in horizontal roving mode or vertical, “Segway-like” roving mode. When necessary, it can also jump over any of a number of complex obstacles. Thus, an iHop-like design could be used to search burning buildings with stairs, overturned furniture, holes in the floor, burning obstacles which must be overcome quickly (by hopping) to ensure the survival of the robot, etc. It can also be used in urban combat and homeland security settings to map unexplored buildings or caves, or in mine rescue situations, etc.
Question: Does your lab collaborate with other labs at UCSD?
Answer: Yes, our work is highly interdisciplinary, and there is extensive collaboration with labs working in other related fields, including artificial intelligence (AI), vision, optics, sensor technology, and low-power communications leveraging emerging commercial off-the-shelf (COTS) components. Though our own system design expertise centers on enhanced mobility and agility via clever application of advanced 3D dynamics and estimation/control theories, we are actively working to incorporate recent advances in all of these related fields in our designs.
Question: Describe the iceCube, iFling, and Switchblade robots.
Answer: iceCube is probably the most unusual design. It is a self-propelled, self-guided, highly agile spherebot – I like to call it The Ball That Even Obama Can Bowl With. Remember the common high-school science experiment in which you sit in a swivel chair and hold a heavy spinning bicycle wheel, then you gently gimbal the axis of the spinning wheel and suddenly feel a large reaction force that turns you in your chair? Well, iceCube has four such spinning flywheels that can be gimbaled, known in this application as control moment gyros (CMGs). These CMGs are optimally configured inside the body to give it maximum agility. Imagine sitting not on a swivel chair but inside a sphere, with not one but four spinning bicycle wheels at your disposal that you can gimbal as you please in order to propel yourself. This is the iceCube problem – it is a fascinating exercise in 3D dynamics and controls, and leads to a wicked new robotic design with interesting new applications.
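The swivel-chair effect described here is gyroscopic torque: gimbaling a flywheel whose angular momentum is h at rate Ω produces a reaction torque τ = Ω × h on the body. A small sketch of one such CMG, with made-up numbers rather than iceCube's actual parameters:

```python
# Gyroscopic reaction torque of one CMG: tau = omega_gimbal x h,
# where h = I_spin * w_spin is the flywheel's angular momentum.
# All numbers below are made up for illustration; they are not
# iceCube's actual parameters.

def cross(u, v):
    """Cross product of two 3-vectors (as tuples)."""
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

I_spin = 2.0e-4                    # flywheel moment of inertia, kg*m^2
w_spin = 500.0                     # spin rate, rad/s (~4800 rpm)
h = (I_spin * w_spin, 0.0, 0.0)    # angular momentum along the x-axis

omega_gimbal = (0.0, 2.0, 0.0)     # gimbal the spin axis at 2 rad/s about y

tau = cross(omega_gimbal, h)       # reaction torque exerted on the body
print(tau)  # the torque appears about z, perpendicular to both inputs
```

Note that the torque axis is perpendicular to both the spin axis and the gimbal axis, which is why a single CMG cannot produce torque in an arbitrary direction; arranging four of them, as in iceCube, provides full three-axis authority with redundancy.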
iFling is a self-righting little Segway-like vehicle that can pick up and throw ping-pong balls. Due to the careful attention paid during its design, picking up a ball is quite easy with this vehicle: simply roll over a ball to wedge it between the body and one of the rotating wheels – the ball is then automatically lifted by the rotary motion of the wheel, and stored in a cage which can hold up to four balls. Throwing balls is also quite effective with this design, and is achieved, on command, in a precise and energetic lacrosse-like fashion.
Switchblade is a nimble reconfigurable treaded rover. The treads are designed in such a way that they can each independently pivot around the axis of the driver wheel of the tread mechanism via a sophisticated “hip joint”. The design is capable of, among other things, balancing on its front wheels, on its back wheels, and even perching on the edges of successive steps to climb stairs. It can also engage an “active suspension” mode to reject disturbances while driving on rough terrain, leveraging the pivot at the hip joint.
You really gotta see these designs to appreciate how cool they are… probably the best thing is for me to refer your readers to http://robotics.ucsd.edu to see the CAD models and prototypes.
Question: Every year, your lab appears to be coming out with brand new robotic concepts, and significant improvements on existing robotic concepts of your own design. How long can your team continue this rate of innovation?
Answer: The field of agile mobile robotics is still in its infancy – there are quite a few basic configurations yet to be fully explored. There are several new designs we are working on in our lab that are as yet unannounced. Stay tuned.
Question: What is the primary cause of the exponential growth we are seeing in this field?
Answer: Primarily, it’s because the enabling technologies are now affordable. Due to the prevalence of airbag deployment systems, video game controllers, smartphones, laptop computers, etc., accelerometers, gyros, cameras, wireless communication systems, and the like are now quite inexpensive. This has greatly opened up the field for small, cheap, mobile robots for a variety of practical uses.
Question: You are also an expert in high-performance computing. Can high-performance computing benefit robotic systems?
Answer: Absolutely. Other groups are spending considerable effort looking at swarming behaviors: that is, group motions that emerge from very simple vehicle interaction rules repeated over hundreds or thousands of vehicles. Such work is indeed of some emerging academic interest. However, another engineering reality is that the intelligence that can be put into robotic vehicles today is much more powerful than is currently being leveraged by such existing swarming studies. Further, massive centralized computational resources are now widely available, and radios are cheap. Thus, in many situations it is quite unnecessary to coordinate expensive sensor-laden vehicles with simplistic decentralized control strategies. The problem of weather forecasting, with hundreds of thousands of deployed remote sensors, is certainly not solved (or solvable) following such a simplistic paradigm.
A buzzword floating around these days to describe the fundamental interplay between sensors, robotic systems, and high-performance computing is “cyber-physical systems”. I generally abhor the use of such broad buzzwords in science, as they often are used by PIs, when communicating with contract monitors, in a manner which implies, quite unjustifiably, a “dominance” by their particular research group over a very broad field (such as the interplay between sensors, robotic systems, and high performance computing, an interplay that countless research groups have been studying for quite some time). Nonetheless, this particular buzzword is evocative of an interplay that will have a very important role in the growth of the field of robotics in the years to come.
Question: Are advanced computing resources being used to design the robots themselves?
Answer: Modern CAD software tools allow groups like ours to consider and quickly refine many iterations of a design before we ever build and test a single prototype. For example, when we designed the iFling, we went through over 25 significant design iterations in a two month period before we built the first prototype. CAD tools are thus instrumental in the development of advanced robotic designs.
Question: When can we expect commercial robots to be able to map their surroundings in real time?
Answer: Actually, we are already starting to see this capability. The well-known Roomba vacuum robot by iRobot, which seemed cool just a couple of years ago, uses very rudimentary logic – proceed until you have to stop, turn a random number of degrees, then proceed again. The next-generation robots of this class, such as the Neato XV11, actually do a sophisticated computation to develop a 2-d map of the floor of the room, then efficiently vacuum the room based on this map, just as you would mow a lawn. This enhancement in intelligence effectively makes the Roomba concept completely obsolete, even if both vacuums had the same suction capability. This is a great example of how intelligent use of robotic mapping can have an important impact in commercial applications.
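The gap between the two logics shows up clearly in a toy coverage simulation (my own illustration, not either product's actual algorithm): on the same grid "floor," a map-based boustrophedon sweep covers every cell in the minimum number of moves, while random bounce-and-turn keeps revisiting cells it has already cleaned.

```python
import random

# Toy coverage comparison (an illustration, not either vacuum's real code):
# a mapped "mow the lawn" sweep vs. random bounce-and-turn on a grid floor.

W, H = 10, 10  # a 10x10 grid of floor cells

def boustrophedon():
    """Mapped sweep: snake row by row, visiting each cell exactly once."""
    cells = []
    for y in range(H):
        xs = range(W) if y % 2 == 0 else range(W - 1, -1, -1)
        cells.extend((x, y) for x in xs)
    return cells

def random_bounce(steps, seed=0):
    """Bump-and-turn logic: drive straight until blocked, then turn randomly."""
    rng = random.Random(seed)
    x, y = 0, 0
    dx, dy = 1, 0
    visited = {(x, y)}
    for _ in range(steps):
        nx, ny = x + dx, y + dy
        if not (0 <= nx < W and 0 <= ny < H):  # hit a wall: pick a new heading
            dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            continue
        x, y = nx, ny
        visited.add((x, y))
    return visited

sweep = boustrophedon()
bounce = random_bounce(steps=len(sweep))
print(len(set(sweep)), "cells covered by the mapped sweep in", len(sweep), "moves")
print(len(bounce), "cells covered by random bounce in the same move budget")
```

With the same budget of moves, the mapped sweep achieves full coverage by construction, while the random policy leaves a substantial fraction of the floor untouched – which is exactly the practical advantage of building the map first.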
The next step is to do such mapping in 3-d, and this should be possible in the very near future. This will enable further important commercial applications, such as stock management in warehouses and supermarkets, and certainly other applications that we don’t yet envision today.
Question: Are any major corporations funding robotics development?
Answer: When talking about the field of robotics in broad terms, I think it is useful to divide the field into two broad categories: mobile robotics and robotic arms.
Robotic arms for electronics soldering, heavy industrial welding, precision surgery, and defusing improvised explosive devices (IEDs) have been commercialized for many years, and continue to evolve under heavy corporate funding. NASA has also targeted funding for robotic arms for launching and retrieving satellites from the space shuttle, sampling the Martian landscape, etc.
Advanced mobile robotic systems perhaps have a longer way yet to go. There are many promising paradigms to propel and maneuver mobile robotic systems that have yet to be explored. The PackBot by iRobot and the Talon by Foster-Miller are both evocative of designs that are at least 50 years old, but both perform the job of defusing IEDs fairly well. Smaller companies like SeaBotix are exploring novel concepts for UUVs. Japanese corporations have been funding bio-mimetic robotic systems for quite some time, and their designs have achieved a high level of sophistication. In the U.S., NASA and GM are jointly developing a “Robonaut” for routine tasks on the space station. So, there are a lot of exciting things going on in this field, but certainly a lot yet to come; I believe the funding for advanced mobile robotic systems will ramp up significantly when corporations appreciate better their potential impact.
Question: What will be the first commercial applications based on your robots?
Answer: I foresee the first commercial application to be in the toy industry. Ironically, high-end military, search and rescue, and homeland & border security applications benefit directly and heavily from technology developed for the toy industry. The unrelenting financial pressure of mass production in the toy industry spurs innovation that substantially drives down the cost of sophisticated technology. The trickle up of mass produced technology into high-end applications is a fascinating reality in today’s market, and is a trend that is expected to continue.