Animals, including humans and apes, have demonstrated impressive agility in traversing obstacles by intelligently exploiting contact forces in their environment. Studying the mechanics and planning behind their dynamic forms of locomotion can yield a simpler, more elegant, and more agile style of robotic locomotion in challenging environments.

Sensing, tracking, and attacking prey is one of the most highly evolved and complex behaviors animals perform. We study the mechanical and neural principles underlying this behavior in two model systems: the larval zebrafish, Danio rerio, and the black ghost knifefish, Apteronotus albifrons, a weakly electric fish. Larval zebrafish are a leading vertebrate genetic model system.

Knifefish are highly maneuverable swimmers capable of navigating complex environments. The fish generate thrust by undulating an elongated ventral fin. We study the fin mechanics using motion capture of live fish, computational fluid dynamics, and bio-inspired robotics. Using these tools, we are uncovering the underlying principles of knifefish locomotion, which can then be applied to underwater robots to enhance their maneuverability.

We are seeking to develop more reliable algorithms for use with physical systems of varying dimensionality. These algorithms address computational complexity and resource management in information determination, control, and sensitivity analysis, and they remain applicable to complicated nonlinear and impulsive systems.

Because of their poor eyesight, rats rely on their whiskers (also called vibrissae) to obtain three-dimensional tactile information about the surrounding world, including the shape and texture of nearby surfaces. This process is in many ways analogous to the way humans use their fingertips to infer tactile information about nearby objects. Our lab aims to understand this process and to replicate it artificially by creating robotic whiskers.

We are developing human-in-the-loop interfaces for task-based assistance and training. Using these interfaces, we address uncertainty arising both from the human and from unknown dynamics of the environment, while allowing users to be as autonomous as possible. This work moves away from controllers that prioritize trajectory error and rely on a priori knowledge of the joint human-machine system, in favor of data-driven approaches applicable to a broad class of tasks and sensorimotor deficits.

Autonomous systems use sensory data to make decisions about how to act in their environments. Often, the system itself or the environment it operates in makes control synthesis challenging, particularly in real-time-constrained settings where the environment is changing. Embodied intelligence, where the physical design of a robot implicitly or explicitly encodes part or all of a control policy, can make control synthesis easier or unnecessary.

Nonprehensile manipulation primitives such as rolling, sliding, pushing, and throwing are commonly used by humans but are often avoided by robots. Dynamic nonprehensile manipulation raises challenges in high-speed sensing and control, as the manipulated object is not in static equilibrium throughout the process. These same dynamics, however, can be exploited to allow the robot to create and control object motions that would otherwise be impossible.
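As a concrete (and deliberately simplified) illustration of exploiting dynamics, consider a hypothetical throwing task: a point-mass object, no air drag, and a target at the same height as the release point. Under these assumptions, the ballistic range formula alone determines the release speed the robot must impart.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def release_speed(distance, theta):
    """Release speed (m/s) needed for a point-mass projectile, launched at
    angle theta (rad), to land at horizontal range `distance` (m) back at
    the release height. From the ballistic range d = v^2 * sin(2*theta) / g.
    """
    return math.sqrt(G * distance / math.sin(2.0 * theta))

# a 45-degree throw to a target 2 m away
v = release_speed(2.0, math.pi / 4)
```

A real system would also have to control the object's pose and spin at release and account for contact during acceleration, which is exactly where the high-speed sensing and control challenges arise.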

How does a robot use its physical capabilities to actively explore, sense, and learn about itself and its environment? This research develops control policies that exploit the physical motion of robotic systems for active learning and sensing. We are interested in understanding how information can drive decision-making and how control actions can be generated from principles of information theory and measures of information.
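A minimal sketch of one such information measure in action (the toy problem, grid, and sensor model below are all assumed for illustration): a robot localizing a target can score each candidate sensing location by its expected information gain, the expected reduction in the entropy of its belief, and greedily pick the most informative action.

```python
import numpy as np

grid = np.arange(10)         # candidate target positions on a 1-D grid
belief = np.full(10, 0.1)    # uniform prior over where the target is

def likelihood(z, x, m):
    """P(detection z | target at x, sensor at m): detection (z = 1) is
    more likely when the sensor is close to the target."""
    p_detect = np.exp(-0.5 * (x - m) ** 2)
    return p_detect if z == 1 else 1.0 - p_detect

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def expected_info_gain(belief, m):
    """Expected entropy reduction from taking a measurement at location m."""
    gain = 0.0
    for z in (0, 1):
        pz = sum(belief[x] * likelihood(z, x, m) for x in grid)  # P(z)
        if pz == 0:
            continue
        # Bayesian posterior belief after observing z at m
        post = np.array([belief[x] * likelihood(z, x, m) for x in grid]) / pz
        gain += pz * (entropy(belief) - entropy(post))
    return gain

# greedy information-maximizing choice of the next sensing location
best_m = max(grid, key=lambda m: expected_info_gain(belief, m))
```

Coupling this kind of objective to the robot's dynamics, so that physical motion itself is chosen to be informative, is the harder and more interesting version of the problem.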

Functional electrical stimulation (FES) systems can restore various functions to persons with impairments such as spinal cord injury, brain injury, and stroke. FES can be used to control skeletal movements in cycling, walking, grasping, and reaching. Our research focuses on using FES to restore reaching motions to persons with high spinal cord injuries who have little or no voluntary control over their upper extremities.

Surface haptic interaction design focuses on the perceptions and experiences of the user during tactile interactions. We study topics ranging from fundamental principles underlying multi-finger tactile perception to haptic display techniques that can improve safety in automobiles or user experiences with personal electronic devices.

“Surface haptics” is the creation of programmable haptic effects on physical surfaces such as touch screens. Our lab has pioneered an approach to surface haptics based on controlling the shear force at each fingertip. This enables fingertips to interact with physics-based virtual environments, much as force-feedback devices do for the whole hand. Ultrasonic vibrations and electrostatic fields are used to produce the fingertip forces.

Rats rhythmically brush and tap their whiskers (also called vibrissae) against objects to explore their world through the sense of touch. This behavior is called "whisking." Using tactile information from its whiskers, a rat is able to determine an object's size, shape, orientation, and texture. We quantify the head movements and the whisking motions that the rat uses to tactually explore its world.

Few effective technologies exist for sensing in dark or murky underwater environments. For this reason, we have been exploring a novel biologically inspired approach to non-visual sensing based on detecting perturbations to a self-generated electric field, a strategy used by many species of nocturnal neotropical freshwater fish. This approach, termed active electrosense, provides unique capabilities for sensing nearby objects.
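The physical principle can be sketched with textbook electrostatics (a simplified stand-in for any particular sensing model): a small conducting sphere sitting in a locally uniform field acquires an induced dipole moment, and the resulting perturbation of the potential is what electroreceptors distributed over the skin can pick up.

```python
import numpy as np

def perturbation_potential(E0, r_vec, a):
    """Perturbation of the electric potential at displacement r_vec from a
    small conducting sphere of radius a placed in a locally uniform field
    E0 (standard result): d_phi = a^3 * (E0 . r_vec) / |r_vec|^3, i.e. an
    induced-dipole term falling off as 1/r^2, which is why active
    electrosense is inherently a short-range modality."""
    r = np.linalg.norm(r_vec)
    return a**3 * np.dot(E0, r_vec) / r**3

# 10 cm conducting sphere in a 1 V/m local field, sensed 20 cm away along E0
dphi = perturbation_potential(np.array([1.0, 0.0, 0.0]),
                              np.array([0.2, 0.0, 0.0]), 0.1)
```

Objects more or less conductive than the surrounding water produce perturbations of opposite sign, which is one cue such a sensor can exploit to discriminate materials.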

We are developing a framework for self-organizing robot systems based on decentralized estimation and control. The goal is to "compile" desired group behaviors into local communication, estimation, and control laws running on individual robots. When these laws are properly designed, the interactions of individual robots produce the desired group behavior without any centralized control.
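The flavor of such local laws can be seen in average consensus, a canonical decentralized-control example (the ring graph, gains, and initial values below are made up for illustration, not drawn from the framework itself): each robot repeatedly nudges its value toward its neighbors' values, and every robot converges to the group average with no central coordinator.

```python
import numpy as np

# undirected ring of 4 robots: 0-1, 1-2, 2-3, 3-0
neighbors = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
x = np.array([1.0, 5.0, 3.0, 7.0])  # each robot's initial local value
eps = 0.25  # step size below 1 / (max node degree) guarantees convergence

for _ in range(100):
    # each robot i moves toward the values of its neighbors j; the update
    # is symmetric, so the sum (and hence the average) is conserved
    x = x + eps * np.array([sum(x[j] - x[i] for j in neighbors[i])
                            for i in range(4)])

# every entry of x converges to the initial average, 4.0
```

The design question the framework addresses is the reverse direction: given a desired group behavior, synthesize local rules like this one automatically.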

We study the mechanics of whiskers in order to gain insight into how the rat's brain is able to interpret mechanical signals to determine object properties.
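One quasi-static example of how mechanical signals at the base can encode object properties (a rigid-whisker toy model, assumed here purely for illustration): a single point contact at radial distance d from the base, pressing transversely with force F, produces a bending moment M = F d at the follicle, so a base that senses both F and M can recover the contact distance as d = M / F.

```python
def contact_distance(moment, force):
    """Radial distance (m) of a point contact from the whisker base,
    recovered from the bending moment (N*m) and transverse force (N)
    sensed at the base, assuming a rigid whisker and a single contact."""
    return moment / force

d = contact_distance(moment=0.02, force=0.5)  # 0.04 m from the base
```

Real whiskers are tapered and flexible, so the actual mapping from base signals to contact location is richer than this, which is precisely what makes the mechanics worth studying.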