The field of Human-Robot Interaction (HRI) is evolving rapidly, blurring the lines between human and machine capabilities. Recent advances in computer science, computer vision, artificial intelligence, robotics, brain-computer interfaces, neural engineering, and cognitive science have profoundly transformed the field. As increasingly sophisticated human-machine interactions become commonplace and robots enter homes, workplaces, and even battlefields, a deeper understanding of effective, safe, and ethical human-robot interaction becomes imperative. This book explores the cutting-edge research shaping the future of HRI. Through chapters authored by leading experts, it surveys the latest developments, challenges, and opportunities in the field, bringing together the perspectives of researchers, engineers, and designers on its technical, engineering, and methodological challenges. By showcasing groundbreaking research and encouraging interdisciplinary dialogue and collaboration, the book serves as a valuable resource for researchers, engineers, and students. Whether you are a seasoned roboticist, a curious student, or simply interested in future technology, this book offers the insights needed to navigate the emerging frontiers of human-robot interaction.
Ramana Vinjamuri received his undergraduate degree in Electrical Engineering from Kakatiya University (India) in 2002. He received his MS in Electrical Engineering from Villanova University in 2004, specializing in Bioinstrumentation, and his Ph.D. in Electrical Engineering from the University of Pittsburgh in 2008, specializing in dimensionality reduction in the control and coordination of the human hand. He worked as a postdoctoral fellow (2008-2012) in the field of Brain-Machine Interfaces (BMI) for prosthesis control at the School of Medicine, University of Pittsburgh. He then served as a Research Assistant Professor in the Department of Biomedical Engineering at Johns Hopkins University (2012-2013) and as an Assistant Professor in the Department of Biomedical Engineering at Stevens Institute of Technology (2013-2020). He is the recipient of the Harvey N. Davis Distinguished Teaching Award in 2018 at Stevens, and his research there was supported by Research and Innovation grants from the New Jersey Health Foundation. He received the NSF CAREER Award in 2019. He received an NSF IUCRC Planning grant in 2020, which culminated in a successful planning meeting for the BRAIN center at UMBC in 2022 and the official launch of the center in 2024. His collaboration with Delsys, Inc., through an SBIR award from NIDILRR in 2022, evolved from his lab's research on virtual reality and hand synergies. In his role as a visiting scientist at NIDA, and with the support of an NSF I-Corps grant in 2024, he is beginning to commercialize neurotechnologies for mental health developed in his lab. His other notable research awards are from NSF I-Corps, NIDILRR, USISTEF, and the New Jersey Health Foundation. He is currently a tenured Associate Professor in the Department of Computer Science and Electrical Engineering at the University of Maryland, Baltimore County, and holds a secondary appointment as an Adjunct Professor at the Indian Institute of Technology Hyderabad, India. His research interests include brain-computer interfaces, neuroprosthetics and exoskeletons, neurotechnologies for mental health, machine learning, and signal processing.
Table of Contents
Towards Genuine Robot Teammates: Improving Human-Robot Team Performance Beyond Shared Mental Models with Proactivity
Speech-based Communication for Human-Robot Collaboration: Evaluation Studies
Value Alignment and Trust in Human-Robot Interaction: Insights from Simulation and User Study
Closing the Loop between Wearable Robots and Machine Learning: A New Paradigm for Steering Assistance Personalization Control
New Horizons in Human-Robot Interaction: Synergy, Cognition, and Emotion
High-level Framework in Wearable Exosuits involving Virtual Reality and Human-Robot Intent Interaction for Gait Intervention
Active-Passive Exoskeletons for Assistive and Resistive Interventions in Human Walking
Hydrodynamics of Microrobots: Effect of Confinement and Collisions
Development of Fuzzy Controller for a Flexible Bevel-Tip Needle in Percutaneous Interventional Procedures
Evolving Trends and Future Prospects of Transformer Models in EEG-based Motor-Imagery BCI Systems
Estimating Grasp Pose using Deep Learning Architectures for Intelligent Robotic Manipulation
Visual Affordance Recognition: A Study on Explainability and Interpretability for Human-Robot Interaction
Experimental Verification of Force-assistive Optimal Variable Admittance Control of Haptic Systems
Experimental Design Principles for Developing Machine Learning Models for Human-Robot Interaction
Electrophysiological Measures for Human-Robot Collaboration Quality Assessment
Design of a Virtual Chatbot Platform for Basic Needs Communication through Imagined Speech BCI
Human-Robot Interaction in Biopsy Procedures: A Biomimetic Dual-Sheath Needle Design Inspired by Insect Ovipositor Mechanics
Explainable AI Methods for Interpreting Emotions in Brain-Computer Interface EEG Data
Robot Control via Natural Instructions Empowered by Large Language Model
Emotion Awareness in Humanoids: Human Facial Expressions Recognition Using Deep CNN Models
Scheduling Robotic Collaboration based on Human Motion Analysis