Robot Learning from Demonstration (RLfD) has been identified as a key element for making robots useful in daily lives. A wide range of techniques has been proposed for deriving a task model from a set of demonstrations of the task. Most previous works use learning to model the kinematics of the task, and for autonomous execution the robot then relies on a stiff position controller. While many tasks can and have been learned this way, there are tasks in which controlling the position alone is insufficient to achieve the goals of the task. These are typically tasks that involve contact or require a specific response to physical perturbations. The question of how to adjust the compliance to suit the needs of the task has not yet been fully treated in Robot Learning from Demonstration. In this paper, we address this issue and present interfaces that allow a human teacher to indicate compliance variations by physically interacting with the robot during task execution. We validate our approach in two different experiments on the 7-DoF Barrett WAM and KUKA LWR robot manipulators. Furthermore, we conduct a user study to evaluate the usability of our approach from a non-roboticist's perspective.
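A minimal sketch of the kind of variable-impedance execution this implies, assuming a joint-space impedance law with a time-varying stiffness profile; the gains, the number of joints, and the segment marked compliant are illustrative, not the controllers used in the paper:

```python
import numpy as np

def impedance_torque(q, qd, q_des, qd_des, K, D):
    """Joint-space impedance law: tau = K(t) (q_des - q) + D(t) (qd_des - qd).

    K and D are diagonal stiffness/damping matrices that vary along the
    trajectory, e.g. lowered where the teacher indicated the robot should be
    compliant and raised where accurate tracking matters.
    """
    return K @ (q_des - q) + D @ (qd_des - qd)

# Illustrative stiffness schedule for a 7-DoF arm: compliant in the middle
# of the motion (e.g. around an expected contact), stiff elsewhere.
T = 200
k_profile = np.full((T, 7), 100.0)           # nominal stiffness [Nm/rad] (placeholder)
k_profile[80:120, :] = 20.0                  # segment taught to be compliant
d_profile = 2.0 * np.sqrt(k_profile)         # roughly critical damping (unit inertia)

t = 100
tau = impedance_torque(np.zeros(7), np.zeros(7),
                       np.full(7, 0.1), np.zeros(7),
                       np.diag(k_profile[t]), np.diag(d_profile[t]))
print(tau)                                   # commanded joint torques at time step t
```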
Posted on: February 20, 2015
Technological advancements have led to the development of numerous wearable robotic devices for the physical assistance and restoration of human locomotion. While many challenges remain with respect to the mechanical design of such devices, it is at least equally challenging and important to develop strategies to control them in concert with the intentions of the user. This work reviews the state-of-the-art techniques for controlling portable active lower limb prosthetic and orthotic (P/O) devices in the context of locomotive activities of daily living (ADL), and considers how these can be interfaced with the user’s sensory-motor control system. This review underscores the practical challenges and opportunities associated with P/O control, which can be used to accelerate future developments in this field. Furthermore, this work provides a classification scheme for the comparison of the various control strategies. As a novel contribution, a general framework for the control of portable gait-assistance devices is proposed. This framework accounts for the physical and informatic interactions between the controller, the user, the environment, and the mechanical device itself. Such a treatment of P/Os — not as independent devices, but as actors within an ecosystem — is suggested to be necessary to structure the next generation of intelligent and multifunctional controllers. Each element of the proposed framework is discussed with respect to the role that it plays in the assistance of locomotion, along with how its states can be sensed as inputs to the controller. The reviewed controllers are shown to fit within different levels of a hierarchical scheme, which loosely resembles the structure and functionality of the nominal human central nervous system (CNS). Active and passive safety mechanisms are considered to be central aspects underlying all of P/O design and control, and are shown to be critical for regulatory approval of such devices for real-world use. The works discussed herein provide evidence that, while we are getting ever closer, significant challenges still exist for the development of controllers for portable powered P/O devices that can seamlessly integrate with the user’s neuromusculoskeletal system and are practical for use in locomotive ADL.
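To make the hierarchical scheme concrete, the following is a highly simplified sketch of a three-layer P/O controller (intent recognition, activity-to-impedance translation, low-level impedance execution). The layer boundaries, signal names, and numbers are assumptions for illustration only and do not reproduce the framework proposed in the review:

```python
# Illustrative three-layer controller skeleton for a powered prosthesis/orthosis.

def high_level(sensors):
    """Perception/intent layer: infer the locomotion activity
    (e.g. from mechanical sensors or EMG)."""
    return "level_walking" if sensors["shank_gyro"] < 2.0 else "stair_ascent"

def mid_level(activity, gait_phase):
    """Translation layer: map activity and gait phase to a desired joint
    impedance set-point (angle [deg], stiffness [Nm/deg], damping [Nm.s/deg])."""
    table = {
        "level_walking": {"stance": (5.0, 3.5, 0.05), "swing": (60.0, 0.3, 0.02)},
        "stair_ascent":  {"stance": (20.0, 4.0, 0.06), "swing": (75.0, 0.4, 0.02)},
    }
    return table[activity][gait_phase]

def low_level(setpoint, joint_angle, joint_velocity):
    """Execution layer: impedance law producing the joint torque command."""
    angle_des, k, b = setpoint
    return k * (angle_des - joint_angle) - b * joint_velocity

sensors = {"shank_gyro": 1.2}                        # toy sensor reading
tau = low_level(mid_level(high_level(sensors), "stance"),
                joint_angle=8.0, joint_velocity=10.0)
print(tau)                                           # commanded knee torque [Nm], illustrative
```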
Posted on: January 14, 2015
The mechanical mismatch between soft neural tissues and stiff neural implants hinders the long-term performance of implantable neuroprostheses. Here, we designed and fabricated soft neural implants with the shape and elasticity of dura mater, the protective membrane of the brain and spinal cord. The electronic dura mater, which we call e-dura, embeds interconnects, electrodes, and chemotrodes that sustain millions of mechanical stretch cycles, electrical stimulation pulses, and chemical injections. These integrated modalities enable multiple neuroprosthetic applications. The soft implants extracted cortical states in freely behaving animals for brain-machine interface and delivered electrochemical spinal neuromodulation that restored locomotion after paralyzing spinal cord injury.
Posted on: January 12, 2015
In this paper, we investigate the following problem: given the image of a scene, what is the trajectory that a robot-mounted camera should follow to allow optimal dense depth estimation? The solution we propose is based on maximizing the information gain over a set of candidate trajectories. In order to estimate the information that we expect from a camera pose, we introduce a novel formulation of the measurement uncertainty that accounts for the scene appearance (i.e., texture in the reference view), the scene depth and the vehicle pose. We successfully demonstrate our approach in the case of real-time, monocular reconstruction from a micro aerial vehicle and validate the effectiveness of our solution in both synthetic and real experiments. To the best of our knowledge, this is the first work on active, monocular dense reconstruction, which chooses motion trajectories that minimize perceptual ambiguities inferred by the texture in the scene.
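The trajectory-selection idea can be sketched as follows, assuming per-pixel Gaussian depth priors and a user-supplied model that predicts measurement variance for a candidate viewpoint (in the paper this model depends on scene texture, depth, and pose). The expected information gain of a candidate is the entropy reduction from fusing one predicted measurement per pixel; everything below is a toy stand-in, not the paper's formulation:

```python
import numpy as np

def expected_info_gain(depth_var_prior, meas_var):
    """Expected information gain (nats) from fusing one Gaussian depth
    measurement per pixel: 0.5 * log(prior_var / posterior_var), summed."""
    post_var = 1.0 / (1.0 / depth_var_prior + 1.0 / meas_var)
    return 0.5 * np.sum(np.log(depth_var_prior / post_var))

def best_trajectory(candidates, depth_var_prior, predict_meas_var):
    """Pick the candidate trajectory whose viewpoint promises the largest
    expected information gain.  `predict_meas_var(c)` maps a candidate to
    per-pixel measurement variance."""
    gains = [expected_info_gain(depth_var_prior, predict_meas_var(c))
             for c in candidates]
    return candidates[int(np.argmax(gains))], gains

# Toy usage: three candidate viewpoints over a flat prior uncertainty map.
prior = np.full((48, 64), 1.0)                       # prior depth variance [m^2]
cands = [np.array([0.0, 0.0, 1.0]),
         np.array([0.5, 0.0, 1.0]),
         np.array([0.0, 0.5, 1.0])]
rng = np.random.default_rng(0)
meas_model = lambda c: 0.05 + 0.5 * rng.random(prior.shape)   # stand-in variance model
best, gains = best_trajectory(cands, prior, meas_model)
print(best, np.round(gains, 1))
```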
Posted on: December 5, 2014
This paper addresses the problem of optimal grasping of an object with a multi-fingered robotic hand for accomplishing a given task. The task is first demonstrated by a human operator and its force/torque requirements are captured through the usage of a sensorized tool. The grasp quality is computed through a task compatibility criterion. Grasp synthesis is then formulated as a single constrained optimization problem, generating grasps that are feasible for the hand’s kinematics by maximizing the corresponding task-oriented quality criterion and ensuring grasp stability. The method was validated on a human hand model and is shown to be easily adapted to different hand kinematic models.
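A minimal sketch of grasp synthesis posed as a single constrained optimization, using SciPy's SLSQP solver: the decision variables are hand joint angles, the objective is a task-compatibility quality, joint limits become bounds, and stability enters as an inequality constraint. The quality and stability functions below are stand-ins, not the criteria defined in the paper:

```python
import numpy as np
from scipy.optimize import minimize

def task_quality(q, task_wrench):
    # Stand-in quality: alignment between a (fake) grasp wrench derived from
    # the configuration and the wrench the demonstrated task requires.
    g = q[:6] / (np.linalg.norm(q[:6]) + 1e-9)
    w = task_wrench / np.linalg.norm(task_wrench)
    return float(g @ w)

def stability_margin(q):
    # Stand-in force-closure margin; must remain >= 0 for a feasible grasp.
    return 0.5 - 0.1 * np.abs(q).max()

task_wrench = np.array([0.0, 0.0, 1.0, 0.0, 0.0, 0.2])   # e.g. press down + small twist
q0 = np.full(9, 0.1)                                      # 9-DoF hand model (illustrative)
res = minimize(lambda q: -task_quality(q, task_wrench),   # maximise task compatibility
               q0,
               method="SLSQP",
               bounds=[(-1.5, 1.5)] * 9,                  # joint limits [rad]
               constraints=[{"type": "ineq", "fun": stability_margin}])
print(res.x, -res.fun)                                    # synthesized grasp and its quality
```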
Posted on: November 17, 2014
In the last few years, we have witnessed impressive demonstrations of aggressive flights and acrobatics using quadrotors. However, those robots are actually blind. They do not see by themselves, but through the “eyes” of an external motion capture system. Flight maneuvers using onboard sensors are still slow compared to those attainable with motion capture systems. At the current state, the agility of a robot is limited by the latency of its perception pipeline. To obtain more agile robots, we need to use faster sensors. In this paper, we present the first onboard perception system for 6-DOF localization during high-speed maneuvers using a Dynamic Vision Sensor (DVS). Unlike a standard CMOS camera, a DVS does not wastefully send full image frames at a fixed frame rate. Conversely, similar to the human eye, it only transmits pixel-level brightness changes at the time they occur with microsecond resolution, thus offering the possibility to create a perception pipeline whose latency is negligible compared to the dynamics of the robot. We exploit these characteristics to estimate the pose of a quadrotor with respect to a known pattern during high-speed maneuvers, such as flips, with rotational speeds up to 1,200°/s. Additionally, we provide a versatile method to capture ground-truth data using a DVS.
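The low-latency property comes from updating state per event rather than per frame. A toy sketch, assuming events arrive as (t, x, y, polarity) tuples with microsecond timestamps and using a simple exponentially decaying event image as the per-event update; the actual pose tracker in the paper is far richer than this:

```python
import numpy as np

class EventImage:
    """Exponentially decaying event accumulation, updated per event."""
    def __init__(self, height=128, width=128, tau=5e-3):
        self.img = np.zeros((height, width))
        self.tau = tau          # decay constant [s]
        self.last_t = 0.0

    def update(self, t, x, y, polarity):
        # Decay old evidence by the time elapsed since the previous event,
        # then deposit the new event; t is in seconds with microsecond resolution.
        self.img *= np.exp(-(t - self.last_t) / self.tau)
        self.img[y, x] += 1.0 if polarity else -1.0
        self.last_t = t

# Toy stream of events (t [s], x, y, polarity) from a 128x128 sensor.
stream = [(0.000001, 10, 12, 1), (0.000004, 11, 12, 1), (0.000009, 11, 13, 0)]
acc = EventImage()
for ev in stream:
    acc.update(*ev)             # state is current after every single event
print(acc.img[12, 10], acc.img[13, 11])
```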
Posted on: November 5, 2014
Modern wearable robots are not yet intelligent enough to fully satisfy the demands of end-users, as they lack the sensor fusion algorithms needed to provide optimal assistance and react quickly to perturbations or changes in user intentions. Sensor fusion applications such as intention detection have been emphasized as a major challenge for both robotic orthoses and prostheses. In order to better examine the strengths and shortcomings of the field, this paper presents a review of existing sensor fusion methods for wearable robots, both stationary ones such as rehabilitation exoskeletons and portable ones such as active prostheses and full-body exoskeletons. Fusion methods are first presented as applied to individual sensing modalities (primarily electromyography, electroencephalography and mechanical sensors), and then four approaches to combining multiple modalities are presented. The strengths and weaknesses of the different methods are compared, and recommendations are made for future sensor fusion research.
Posted on: October 22, 2014
Despite tremendous improvements in recent years, lower-limb prostheses are still inferior to their biological counterparts. Most powered knee joints use impedance control, but it is unknown which impedance profiles are needed to replicate physiological behavior. Recently, we have developed a method to quantify such profiles from conventional gait data. Based on this method, we derive stiffness requirements for knee prostheses, and we propose an actuation concept where physical actuator stiffness changes as a function of joint angle. The idea is to express stiffness and moment requirements as functions of angle, and then to combine a series elastic actuator (SEA) with an optimized nonlinear transmission and parallel springs to reproduce the profiles. By considering the angle-dependent stiffness requirement, the upper bound for the impedance in zero-force control could be reduced by a factor of two. We realize this ANGle-dependent ELAstic Actuator (ANGELAA) in a leg, with rubber cords as series elastic elements. Hysteresis in the rubber is accounted for, and knee moment is estimated with a mean error of 0.7 Nm. The nonlinear parallel elasticity creates equilibria near 0° as well as 90° knee flexion, frequent postures in daily life. Experimental evaluation in a test setup shows force control bandwidth around 5–9 Hz, and a pilot experiment with an amputee subject shows the feasibility of the approach. While weight and power consumption are not optimized in this prototype, the incorporated mechatronic principles may pave the way for cheaper and lighter actuators in artificial legs and in other applications where stiffness requirements depend on kinematic configuration.
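A minimal sketch of series-elastic moment estimation with an angle-dependent moment arm, in the spirit of the concept; the cord stiffness, transmission function, and numbers are placeholders rather than the published ANGELAA design (which additionally compensates rubber hysteresis):

```python
import numpy as np

def moment_arm(knee_angle):
    # Nonlinear transmission: effective moment arm [m] varies with knee angle [rad].
    return 0.02 + 0.01 * np.sin(knee_angle)             # placeholder shape

def cord_force(stretch, k_cord=8000.0):
    # Linear elastic cord model: tension [N] from measured stretch [m].
    return k_cord * stretch

def knee_moment(stretch, knee_angle):
    # Estimated joint moment [Nm] = cord tension * angle-dependent moment arm.
    return cord_force(stretch) * moment_arm(knee_angle)

print(knee_moment(stretch=0.004, knee_angle=np.deg2rad(30.0)))
```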
Posted on: October 22, 2014
Functional near-infrared spectroscopy (fNIRS) is a noninvasive optical method that measures cortical activity based on hemodynamics in the brain. Physiological signals (biosignals), such as blood pressure and respiration, are known to appear in cortical fNIRS recordings. Some biosignal components occupy the same frequency band as the cortical response, and respond to the subject's activity. To process an fNIRS signal in a brain-computer interface, it is desirable to know which components of the signal come from cortical response, and which come from biosignal interference. Numerous filtering methods have been proposed to this end with mixed success, possibly because they assume that the cortical and physiological signals combine linearly, or that biosignals do not correlate with subject behavior. Here, we propose an adaptive filter with a cost function based on mutual information to selectively remove information that correlates with blood pressure from the fNIRS signal. The filter was tested with real and simulated data. The real signals were measured on seven healthy subjects performing an isometric pinching task. Cross-correlation and mutual information were employed as performance measures. The filter successfully removed correlations between blood pressure and the fNIRS signal, by an equal or greater amount compared to a traditional recursive least squares adaptive filter. Blood pressure was found to be the most informative signal to classify rest and active periods using linear discriminant analysis. Any task information in the fNIRS signal was redundant to that expressed by blood pressure.
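For reference, a sketch of the conventional recursive least squares (RLS) adaptive noise canceller mentioned as the comparison baseline, with blood pressure as the reference input and the prediction error as the cleaned fNIRS signal; this is not the proposed mutual-information filter, and the signals below are synthetic:

```python
import numpy as np

def rls_cancel(fnirs, bp, order=8, lam=0.99, delta=100.0):
    """RLS adaptive noise canceller: predicts the fNIRS channel from recent
    blood-pressure samples and returns the prediction error, i.e. the fNIRS
    signal with the BP-correlated component removed."""
    w = np.zeros(order)
    P = np.eye(order) * delta
    cleaned = np.zeros_like(fnirs)
    for n in range(len(fnirs)):
        u = bp[max(0, n - order + 1):n + 1][::-1]        # most recent BP samples first
        u = np.pad(u, (0, order - len(u)))               # zero-pad at the start of the record
        k = P @ u / (lam + u @ P @ u)                    # gain vector
        e = fnirs[n] - w @ u                             # a priori prediction error
        w = w + k * e                                    # weight update
        P = (P - np.outer(k, u @ P)) / lam               # inverse-correlation update
        cleaned[n] = e
    return cleaned

# Toy demo: fNIRS = slow cortical signal + scaled, delayed blood pressure.
rng = np.random.default_rng(1)
t = np.arange(2000) / 10.0                               # 10 Hz sampling
bp = np.sin(2 * np.pi * 1.0 * t) + 0.1 * rng.standard_normal(t.size)
cortical = 0.3 * np.sin(2 * np.pi * 0.05 * t)
fnirs = cortical + 0.8 * np.roll(bp, 3)
out = rls_cancel(fnirs, bp)
print(np.corrcoef(out[50:], bp[50:])[0, 1])              # BP correlation should be near zero
```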
Posted on: October 22, 2014
In the attempt to build adaptive and intelligent machines, roboticists have looked at neuroscience for more than half a century as a source of inspiration for perception and control. More recently, neuroscientists have resorted to robots for testing hypotheses and validating models of biological nervous systems. Here, we give an overview of the work at the intersection of robotics and neuroscience and highlight the most promising approaches and areas where interactions between the two fields have generated significant new insights. We articulate the work in three sections: invertebrate, vertebrate, and primate neuroscience. We argue that robots generate valuable insight into the function of nervous systems, which is intimately linked to behaviour and embodiment, and that brain-inspired algorithms and devices give robots life-like capabilities.
Posted on: September 22, 2014