Abstracts
Special Seminar, April 26th, 12:30 pm
Humanoid Robotics: A Journey of Environment Manipulation, Language, and Control
Professor Mike Stilman
Humanoid Robotics Lab, Georgia Institute of Technology
Humanoid robots in human environments are now a practical and realistic challenge for robotics research. What does it mean for a humanoid robot to interact with its environment? How does it understand a complex task such as rescue, automobile assembly, or cooking dinner? How could it take advantage of its anthropomorphic structure to achieve these tasks in a way that is both natural and efficient? In this talk we will discuss novel algorithms and techniques that relate our understanding of human thought to methods that aim to give humanoid robots the capacity to assist humans in challenging and dangerous situations.
Bio: Prof. Mike Stilman is an Assistant Professor of Interactive Computing and Director of the Humanoid Robotics Lab at the Georgia Institute of Technology. He received his BS in Mathematics and MS in Computer Science from Stanford University, and his PhD in Robotics from Carnegie Mellon. Prof. Stilman's research interests focus on motion planning and control of robotic systems, particularly in high-dimensional dynamic spaces such as humanoid robots. His recent contributions include starting the field of Navigation Among Movable Obstacles, where robots autonomously interact with environment objects, and the use of whole-body motion for increased performance in dynamic tasks. Publications and other resources from his lab are available online at: http://www.golems.org/
3D Robotic Needle Steering: Ultrasound Segmentation and Teleoperated Control
Troy Adebar & Ann Majewicz
Advisor: Allison Okamura
Needle steering is a technique that enables the insertion of highly flexible needles along controlled, curved, 3D paths through tissue. Robotic systems for needle steering have been described extensively in the literature, with research groups developing controllers, kinematic models, path planners, and needle-tissue interaction models. While there are many potential benefits of robotic needle steering for percutaneous interventions, in vivo testing has been extremely limited to date, and no patient studies have been reported. Two major obstacles to the clinical adoption of robotic needle steering are the need for an intuitive physician interface and methods for real-time medical image guidance.
Direct operation of steerable needles using joint space inputs (i.e. insertion and spin) is difficult for physicians due to the nonholonomic constraints of the needle. To achieve more intuitive needle control, we have developed a Cartesian space teleoperation algorithm, allowing physician users to control the needle tip position using a haptic device with force feedback.
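A common kinematic abstraction for bevel-tip steerable needles (offered here only as background, not necessarily the model used in this work) treats the tip as a unicycle-like system: insertion drives the tip along an arc of roughly constant curvature, while spinning the needle about its axis reorients the plane of that arc. A minimal sketch, with illustrative parameter values:

```python
import numpy as np
from scipy.linalg import expm

KAPPA = 1.0 / 50.0  # assumed curvature of the bevel-tip arc [1/mm]

def hat(w):
    """Skew-symmetric matrix of a 3-vector."""
    return np.array([[0, -w[2], w[1]],
                     [w[2], 0, -w[0]],
                     [-w[1], w[0], 0]])

def step_needle_pose(T, v_insert, w_spin, dt):
    """Propagate the 4x4 tip pose T under a unicycle-like bevel-tip model:
    insertion at speed v_insert along the local z axis bends the path with
    curvature KAPPA about the local x axis; w_spin rotates the bevel about
    the local z axis."""
    w_body = np.array([KAPPA * v_insert, 0.0, w_spin])   # body angular velocity
    v_body = np.array([0.0, 0.0, v_insert])              # body linear velocity
    xi = np.zeros((4, 4))                                # body twist in se(3)
    xi[:3, :3] = hat(w_body)
    xi[:3, 3] = v_body
    return T @ expm(xi * dt)

# Example: insert at 2 mm/s while spinning at 0.5 rad/s for half a second
T = step_needle_pose(np.eye(4), v_insert=2.0, w_spin=0.5, dt=0.5)
```

Under this abstraction, insertion and spin are the only inputs, which is exactly why direct joint-space control feels unintuitive and a Cartesian teleoperation layer helps.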
Ultrasound imaging is an excellent modality for intraoperative guidance since it is inexpensive, portable, and operates in real time. However, automatic segmentation of steerable needles from ultrasound images is a difficult problem due to image artifacts and a low signal-to-noise ratio. To overcome these limitations, we have developed methods for using ultrasound Doppler imaging to visualize needles that are vibrated at high frequencies.
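Purely as an illustrative sketch (not the group's published segmentation pipeline), one simple way to turn a Doppler response into a needle estimate is to threshold the Doppler power volume and fit a line to the responding voxels; a steered needle is curved, so a straight-line fit is only a local approximation.

```python
import numpy as np

def segment_needle_from_doppler(power_volume, threshold):
    """Illustrative sketch: threshold a 3D Doppler power volume and fit a
    straight line to the responding voxels using PCA. Returns a point on
    the estimated needle axis and the axis direction."""
    pts = np.argwhere(power_volume > threshold).astype(float)  # (N, 3) voxel coords
    if len(pts) < 2:
        return None
    centroid = pts.mean(axis=0)
    # The principal direction of the voxel cloud approximates the needle axis.
    _, _, vt = np.linalg.svd(pts - centroid, full_matrices=False)
    direction = vt[0] / np.linalg.norm(vt[0])
    return centroid, direction
```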
Design, Control, and Simulation Strategies for Haptic Interfaces
Francois Conti
Advisor: Oussama Khatib
Haptics is an emerging technology that involves transmitting information through the sense of touch. This hands-on form of interaction is performed by using small actuated interfaces called haptic devices that apply forces, vibrations, and/or motions to the user. In recent years the technologies of haptics have been integrated into many new applications ranging from gaming devices in the field of computer animation to advanced interfaces for intuitively operating surgical robot systems. This talk will present recent hardware design methodologies and algorithms developed for simulating the sense of touch, and address the computational challenges associated with the real-time requirements for haptic simulation.
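As a toy illustration of the real-time requirement mentioned above (haptic loops typically run near 1 kHz), the sketch below renders a simple virtual wall with a spring-damper penalty force; the stiffness and damping values and the device read/write calls are hypothetical placeholders, not a real device API.

```python
import numpy as np

STIFFNESS = 800.0   # N/m, illustrative wall stiffness
DAMPING = 2.0       # N*s/m, illustrative damping
WALL_HEIGHT = 0.0   # wall at z = 0, pushing up along +z

def wall_force(position, velocity):
    """Spring-damper penalty force for a horizontal virtual wall.
    Only active while the device tip penetrates below the wall plane."""
    depth = WALL_HEIGHT - position[2]
    if depth <= 0.0:
        return np.zeros(3)
    fz = STIFFNESS * depth - DAMPING * velocity[2]
    return np.array([0.0, 0.0, max(fz, 0.0)])

# The servo loop itself (hypothetical device interface):
# while running:                      # executed at ~1 kHz
#     p, v = device.read_state()      # tip position [m] and velocity [m/s]
#     device.write_force(wall_force(p, v))
```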
Robosimian Hand
Paul Karplus
Advisor: Mark Cutkosky
This talk describes the design and construction of a strong, inexpensive, underactuated robotic hand for a disaster relief robot to be entered in the 2013 DARPA Robotics Challenge. There are numerous design challenges to be tackled; among the most difficult are making the hand robust to collisions, supporting "palm walking", operating human tools, and holding the 100 kg robot on a ladder. Since the hand will be used in competition, simplicity and serviceability are important. The talk will focus on design details including the electrical controller, tendon routing, sensors, software, material choice, drive components, palm camera, and packaging. For a list of the challenge requirements see: http://theroboticschallenge.org/aboutcompetition.aspx
Optimizing Locomotion Controllers Using Biologically-Based Actuators and Objectives
Jack Wang
Advisor: Scott Delp
We present a technique for automatically synthesizing walking and running controllers for physically-simulated 3D humanoid characters. The sagittal hip, knee, and ankle degrees-of-freedom are actuated using a set of eight Hill-type musculotendon models in each leg, with biologically-motivated control laws. The parameters of these control laws are set by an optimization procedure that satisfies a number of locomotion task terms while minimizing a biological model of metabolic energy expenditure. We show that the use of biologically-based actuators and objectives measurably increases the realism of gaits generated by locomotion controllers that operate without the use of motion capture data, and that metabolic energy expenditure provides a simple and unifying measurement of effort that can be used for both walking and running control optimization.
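For readers unfamiliar with Hill-type actuators, the sketch below shows the generic structure of such a model together with an objective that trades locomotion task terms against metabolic cost; the curve shapes, weights, and energy term are placeholders rather than the ones used in this work.

```python
import numpy as np

def hill_muscle_force(activation, fiber_len, fiber_vel, f_max, l_opt, v_max):
    """Generic Hill-type musculotendon force: active force scaled by
    normalized force-length and force-velocity curves, plus a passive
    elastic term. Curve shapes here are simple placeholders."""
    l_n = fiber_len / l_opt
    v_n = fiber_vel / v_max
    f_length = np.exp(-((l_n - 1.0) ** 2) / 0.45)          # bell-shaped f-l curve
    f_velocity = np.clip(1.0 - v_n, 0.0, 1.5)              # crude f-v curve
    f_passive = np.maximum(0.0, 10.0 * (l_n - 1.0) ** 2)   # passive stretch term
    return f_max * (activation * f_length * f_velocity + f_passive)

def controller_objective(task_errors, metabolic_energy, weights, w_energy):
    """Scalar cost minimized by the controller optimization: weighted
    locomotion task terms plus a metabolic-energy penalty."""
    return float(np.dot(weights, task_errors) + w_energy * metabolic_energy)
```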
Simulation-Based Tools for Evaluating Underactuated Hand Designs
Daniel Aukes
Advisor: Mark Cutkosky
This paper presents a tool aimed at the design of compliant, underactuated hands. The particular motivation is hands that will be used by an underwater robot to grasp a variety of objects, some of which may be delicate or slippery. The focus of the analysis is the problem of object acquisition. In comparison to many prior grasp analysis tools, the tool presented here models the dynamics of the hand, including actuation mechanisms, compliance, and friction, in an efficient formulation that permits one to evaluate variations in quantities such as phalange length, finger spacing, transmission ratios, and torsional joint stiffness when comparing hand designs. The analysis is demonstrated for a quasi-static object acquisition problem and leads to the computation of three-dimensional regions within which the hand will tend to center and stably grasp a compact object.
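As a rough illustration of the kind of quasi-static quantity such a tool can evaluate (not the paper's actual formulation), the sketch below computes the net joint torques of a single-tendon, two-joint underactuated finger with torsional return springs; all parameter values are illustrative.

```python
import numpy as np

def net_joint_torques(tendon_tension, pulley_radii, spring_stiffness,
                      joint_angles, contact_torques):
    """Quasi-static torque balance for a single-tendon underactuated finger:
    tendon torque at each joint (tension times pulley radius) minus the
    torsional spring return torque, plus torques applied by contacts.
    Positive net torque tends to flex the joint further."""
    tendon_torque = tendon_tension * np.asarray(pulley_radii)
    spring_torque = np.asarray(spring_stiffness) * np.asarray(joint_angles)
    return tendon_torque - spring_torque + np.asarray(contact_torques)

# Example: two-joint finger, 5 N tendon tension, illustrative parameters
tau = net_joint_torques(
    tendon_tension=5.0,
    pulley_radii=[0.010, 0.007],        # m
    spring_stiffness=[0.05, 0.03],      # N*m/rad
    joint_angles=[0.6, 0.4],            # rad
    contact_torques=[-0.02, 0.0],       # N*m from object contact
)
```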
Replacing the Office Intern: An Autonomous Coffee Run with a Mobile Manipulator
Tony Pratkanis
Advisor: Ken Salisbury
We describe our development of an autonomous robotic system that safely navigates through an unmodified campus environment to purchase and deliver a cup of coffee. To accomplish this task, the robot navigates through indoor and outdoor environments, opens heavy spring-loaded doors, calls, enters, and exits an elevator, waits in line with other customers, interacts with coffee shop employees to purchase beverages, and returns to its original location to deliver the beverages. This paper makes four contributions: a robust infrastructure for unifying multiple 2D navigation maps; a process for detecting and opening transparent, heavy spring-loaded doors; algorithms for operating elevators; and software that enables the intuitive passing of objects to and from untrained humans.
Model-less Control of Continuum Manipulators
Michael Yip
Advisor: David Camarillo
One of the most valuable capabilities of flexible robots is their ability to operate in constrained workspaces, where their bodies can conform around and interact safely with obstacles. However, because the environment imposes kinematic constraints on the robot, control becomes a problem. Traditional robot control relies on a model of the robot; for a flexible robot that bends and conforms under unknown environmental forces applied at unknown points along its body, no good model exists. One way to avoid the complications of model-based control is to use model-less control. We use convex optimization to frame the model-less control problem, and show why it is especially useful for over-actuated robots and flexible robots in unknown, constrained environments.
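A minimal sketch of the idea under illustrative assumptions: estimate a local map from actuator displacements to tip displacements using small test motions, then choose each actuator step by solving a small convex program (cvxpy and the hardware callbacks are used here only for illustration; the formulation in the work itself may differ).

```python
import numpy as np
import cvxpy as cp

def estimate_jacobian(apply_actuators, read_tip, q, delta=1e-3):
    """Numerically estimate the local map from actuator displacements to
    tip displacements by applying small test motions on each actuator.
    apply_actuators and read_tip are hypothetical hardware callbacks."""
    x0 = read_tip()
    n, m = len(q), len(x0)
    J = np.zeros((m, n))
    for i in range(n):
        dq = np.zeros(n)
        dq[i] = delta
        apply_actuators(q + dq)
        J[:, i] = (read_tip() - x0) / delta
        apply_actuators(q)          # return to the nominal configuration
    return J

def model_less_step(J, dx_desired, dq_max):
    """Choose the actuator step that best achieves the desired tip motion,
    with a small regularizer and a bound on actuator motion (a convex QP)."""
    n = J.shape[1]
    dq = cp.Variable(n)
    cost = cp.sum_squares(J @ dq - dx_desired) + 1e-3 * cp.sum_squares(dq)
    prob = cp.Problem(cp.Minimize(cost), [cp.norm(dq, "inf") <= dq_max])
    prob.solve()
    return dq.value
```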
Haptic Rendering of Medical Image Data for Surgical Rehearsal
Sonny Chan
Advisor: Ken Salisbury
Ear, nose, and throat (ENT) surgical procedures practiced today are incredibly sophisticated and carry with them a great degree of difficulty. This not only presents a challenge for practicing surgeons to master, but can also expose patients to higher risks of complication. Now imagine if a surgeon could rehearse a difficult procedure on a virtual "body double" of the patient, with an accurate replica of the patient-specific, surgically-relevant anatomy, prior to entering the operating room. It is common practice today for surgeons to examine pre-operative medical image data, including computed tomography (CT) and magnetic resonance imaging (MRI), while formulating a surgical plan. What if, in addition to visualizing the patient's anatomy through the image data, we could also allow the surgeon to touch, manipulate, and even dissect the virtual patient?
In this talk, I will briefly describe a few of the steps, in terms of haptic rendering, that our group has taken toward the vision of patient-specific surgical rehearsal. First, I will introduce the nature of pre-operative medical image data captured for typical ENT procedures and how to haptically render surfaces within these images. Then I will discuss approaches for simulating six degree-of-freedom contact between surgical instruments and the virtual anatomy, and for constructing a deformable model of the patient-specific anatomy for haptic interaction. Finally, I will show how these methods can be applied in the context of simulating sinus surgery and middle ear surgery.
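As a simplified illustration of haptic rendering directly from image data (practical renderers use proxy- or constraint-based methods for stability, and this is not the group's algorithm), the sketch below samples a CT-like volume at the tool position and pushes the tool toward lower intensity whenever it penetrates voxels denser than an isovalue; the isovalue and stiffness are placeholders.

```python
import numpy as np

ISO_VALUE = 300.0     # illustrative CT intensity treated as the "bone surface"
STIFFNESS = 0.05      # illustrative force scale

def sample(volume, p):
    """Nearest-neighbor sample of a 3D image volume at point p (voxel units).
    A real renderer would use trilinear interpolation."""
    i, j, k = np.clip(np.round(p).astype(int), 0, np.array(volume.shape) - 1)
    return float(volume[i, j, k])

def gradient(volume, p, h=1.0):
    """Central-difference intensity gradient, used as a surface normal."""
    g = np.zeros(3)
    for a in range(3):
        dp = np.zeros(3)
        dp[a] = h
        g[a] = (sample(volume, p + dp) - sample(volume, p - dp)) / (2 * h)
    return g

def penalty_force(volume, tool_pos):
    """Push the tool out of voxels denser than the isovalue, along the
    direction of decreasing intensity. A deliberately simplified sketch."""
    value = sample(volume, np.asarray(tool_pos, dtype=float))
    if value <= ISO_VALUE:
        return np.zeros(3)
    g = gradient(volume, np.asarray(tool_pos, dtype=float))
    n = -g / (np.linalg.norm(g) + 1e-9)
    return STIFFNESS * (value - ISO_VALUE) * n
```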
Muscular Strategy Shift in Human Running Across a Range of Speeds
Tim Dorn
Advisor: Scott Delp
Humans run faster by increasing a combination of stride length and stride frequency. In slow and medium-paced running, stride length is increased by exerting larger support forces during ground contact, whereas in fast running and sprinting, stride frequency is increased by swinging the legs more rapidly through the air. Many studies have investigated the mechanics of human running, yet little is known about how the individual leg muscles accelerate the joints and center-of-mass during this task. The aim of this study was to describe and explain the synergistic actions of the individual leg muscles over a wide range of running speeds, from slow running to maximal sprinting. Experimental gait data from nine subjects were combined with a detailed computer model of the musculoskeletal system to determine the forces developed by the leg muscles at different running speeds. For speeds up to 7 m/s, the ankle plantarflexors, soleus and gastrocnemius, contributed most significantly to vertical support forces and hence increases in stride length. At speeds greater than 7 m/s, these muscles shortened at relatively high velocities and had less time to generate the forces needed for support. Thus, above 7 m/s the strategy used to increase running speed shifted to the goal of increasing stride frequency. The hip muscles, primarily iliopsoas, gluteus maximus and hamstrings, achieved this goal by accelerating the hip and knee joints more vigorously during swing. These findings provide insight into the strategies used by the leg muscles to maximize running performance.
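One common way to resolve muscle redundancy at each instant of a muscle-driven simulation, shown here only as a generic example and not as this study's exact method, is static optimization: choose activations in [0, 1] that reproduce the joint moments from inverse dynamics while minimizing summed squared activation.

```python
import numpy as np
import cvxpy as cp

def static_optimization(R, f_max, tau_required):
    """Generic static optimization for muscle redundancy: choose activations
    in [0, 1] so that the muscle forces (a * f_max), acting through the
    moment-arm matrix R, reproduce the required joint moments while
    minimizing the sum of squared activations."""
    n_muscles = len(f_max)
    a = cp.Variable(n_muscles)
    constraints = [a >= 0, a <= 1,
                   R @ cp.multiply(a, f_max) == tau_required]
    cp.Problem(cp.Minimize(cp.sum_squares(a)), constraints).solve()
    return a.value
```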