38 results on '"Massimiliano Zecca"'
Search Results
2. Natural human–robot musical interaction: understanding the music conductor gestures by using the WB-4 inertial measurement system
- Author
-
Massimiliano Zecca, Sarah Cosentino, Zhuohua Lin, Atsuo Takanishi, Salvatore Sessa, Klaus Petersen, and Luca Bartolomeo
- Subjects
Engineering, Information interfaces and presentation (HCI), Human–robot interaction, Computer Science Applications, Human–Computer Interaction, Naturalness, Hardware and Architecture, Control and Systems Engineering, Gesture recognition, Human–computer interaction, Robot, Software, Humanoid robot, Simulation, Gesture - Abstract
This paper presents an inertial measurement unit-based human gesture recognition system that allows a robot instrument player to understand the instructions dictated by an orchestra conductor and adapt its musical performance accordingly. It extends our previous publications on natural human–robot musical interaction. With this system, the robot can follow the real-time variations in musical parameters dictated by the conductor's movements, adding expression to its performance while staying synchronized with all the other human partner musicians. The enhanced interaction ability would not only improve the overall live performance, but also allow the partner musicians, as well as the conductor, to better appreciate the joint musical performance, thanks to the complete naturalness of the interaction.
- Published
- 2014
- Full Text
- View/download PDF
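The conductor-following approach described in the abstract above — recognising tempo from inertial measurements of the conductor's movements — can be illustrated with a minimal sketch. This is not the authors' WB-4 implementation; the peak threshold, refractory period and function names below are assumptions for illustration only.

```python
# Hypothetical sketch: estimate conducting tempo from a single-axis
# acceleration trace, assuming each beat of the baton produces a
# local acceleration peak. Threshold and refractory period are
# illustrative values, not taken from the WB-4 system.

def estimate_tempo_bpm(accel, fs, threshold=1.5, refractory_s=0.25):
    """Return estimated tempo in beats per minute, or None if fewer
    than two beats are detected in the trace."""
    refractory = int(refractory_s * fs)  # minimum samples between beats
    beats, last = [], -refractory
    for i in range(1, len(accel) - 1):
        is_peak = (accel[i] > threshold
                   and accel[i] >= accel[i - 1]
                   and accel[i] > accel[i + 1])
        if is_peak and i - last >= refractory:
            beats.append(i)
            last = i
    if len(beats) < 2:
        return None
    # Mean inter-beat interval in seconds -> tempo in BPM.
    mean_ibi = (beats[-1] - beats[0]) / (len(beats) - 1) / fs
    return 60.0 / mean_ibi
```

A real conductor-following system would also have to track dynamics and articulation, as the abstract notes; tempo is only the simplest of the three.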
3. Cross-Cultural Perspectives on Emotion Expressive Humanoid Robotic Head: Recognition of Facial Expressions and Symbols
- Author
-
Kenji Hashimoto, Massimiliano Zecca, Nobutsuna Endo, Gabriele Trovato, Atsuo Takanishi, and Tatsuhiro Kishi
- Subjects
General Computer Science, Social Psychology, Human–robot interaction, Human–computer interaction, Computer vision, Electrical and Electronic Engineering, Facial expression, Social robot, Robotics, Human–Computer Interaction, Control and Systems Engineering, Robot, Artificial intelligence, Cultural divide, Psychology - Abstract
Emotion display through facial expressions is an important channel of communication. However, humans differ in the meaning they assign to facial cues, depending on their background culture. This leads to a gap in the recognition rates of expressions, a problem also present when displaying a robotic face: a robot's facial expressions are often hampered by this cultural divide, and poor recognition scores may lead to poor acceptance and interaction. It would be desirable if robots could flexibly switch their output facial configuration, adapting to different cultural backgrounds. To achieve this, we built a generation system that produces facial expressions and applied it to the 24-degrees-of-freedom head of the humanoid social robot KOBIAN-R; thanks to the work of illustrators and cartoonists, the system can generate two versions of the same expression, so as to be easily recognisable by both Japanese and Western subjects. As a tool for making recognition easier, the display of Japanese comic symbols on the robotic face was also introduced and evaluated. We conducted a cross-cultural study aimed at assessing this recognition gap and finding solutions for it; the investigation was extended to Egyptian subjects as a sample of a further culture. Results confirmed the differences in recognition rates, the effectiveness of customising expressions and the usefulness of the symbol display, suggesting that this approach may be valuable for robots that will interact in multicultural environments in the future.
- Published
- 2013
- Full Text
- View/download PDF
4. Development of lower limb rehabilitation evaluation system based on virtual reality technology
- Author
-
Yajing Shen, Haojian Lu, Chang Gao, Yong Zhao, Zhengzhi Wu, Li Weiguang, Chunbao Wang, Atsuo Takanishi, Jianjun Long, Shihui Shen, Massimiliano Zecca, Jian Qin, Qing Shi, Yulong Wang, Quanquan Liu, and Sun Tongyang
- Subjects
Rehabilitation, Medical rehabilitation, Rehabilitation evaluation, Virtual reality, Lower limb, Physical medicine and rehabilitation, Rehabilitation training, Computer science, Robot, Simulation - Abstract
As the proportion of elderly people in the world's population grows, several problems caused by population ageing are gradually coming into view. One of the biggest problems affecting the elderly is hemiplegia, which has driven strong demand for physical therapy. Traditional physical therapy, however, relies mainly on the individual skill of the therapist. To compensate for these limitations, many research groups have developed robots for lower limb rehabilitation training, but most of them provide only passive training, cannot adapt the training to each patient's individual condition, and lack an evaluation system to assess a hemiplegic patient's condition during training in real time. To address these problems, this paper proposes a lower limb rehabilitation evaluation system based on virtual reality technology. The system provides an easily observable human-computer interaction interface, with which the doctor can adjust the rehabilitation training for different patients at different rehabilitation stages. Compared with current techniques, this novel evaluation system is expected to have a significant impact in the field of medical rehabilitation robotics.
- Published
- 2016
- Full Text
- View/download PDF
5. Development of a rehabilitation robot for hand and wrist rehabilitation training
- Author
-
Zhengzhi Wu, Yajing Shen, Li Weiguang, Lihong Duan, Yulong Wang, Jianjun Wei, Chunbao Wang, Li Mengjie, Qing Shi, Lu Zhijiang, and Massimiliano Zecca
- Subjects
Musculoskeletal diseases, Engineering, Functional training, Rehabilitation, Grasp, Thumb, Wrist, Motion (physics), Physical medicine and rehabilitation, Robot, Torque sensor - Abstract
The number of hemiplegia rehabilitation devices is increasing quickly along with the number of hemiplegic patients. However, most hand rehabilitation training is limited to flexion exercises of the fingers of the affected hand, ignoring both functional training of the hand and cooperative training of the wrist during rehabilitation. In our new research, we propose a novel hand and wrist rehabilitation robot that provides grasp training for the hand (excluding the thumb) together with intorsion/extorsion and dorsiflexion/plantar flexion of the wrist, offering a new rehabilitation option for hemiplegic patients. This paper introduces the detailed design of the robot. It consists of two rehabilitation units, a wrist unit and a hand unit, which can drive the hand and wrist separately or cooperatively according to the patient's needs. In addition, a purpose-built torque sensor unit, designed in place of commercially available sensors, measures the feedback torque of each motion and makes the whole mechanical structure more compact. This novel hand and wrist rehabilitation robot therefore has a promising prospect.
- Published
- 2015
- Full Text
- View/download PDF
6. Development of an arm robot to simulate lead-pipe rigidity for medical education
- Author
-
Yulong Wang, Lu Zhijiang, Lihong Duan, Yajing Shen, Jianjun Wei, Zhengzhi Wu, Chunbao Wang, Li Mengjie, Qing Shi, Massimiliano Zecca, and Li Weiguang
- Subjects
Engineering, Medical robot, Elbow, Neurological model, Physical examination, Neurological examination, Simulated patient, Human–computer interaction, Medical training, Robot, Simulation - Abstract
Neurologic examination plays an important role in the physical examination and requires abundant knowledge as well as prominent skills. Medical staff, especially novices, are normally trained to master these skills and accumulate experience through methods such as watching videos or training with a simulated patient (SP). However, the drawbacks of these methods, such as the lack of multi-symptom reproduction and of active interaction, limit their training effect. To compensate, several kinds of medical training simulators have been developed to improve training effectiveness; most of them, however, merely mimic symptoms and cannot simulate the pathology of the underlying diseases. In this paper, we propose an elbow robot named WKE-2 (Waseda Kyotokagaku Elbow Robot No. 2) that simulates the symptoms of the motor nerve system for neurologic examination training on elbow force examination. The mechanism of the elbow robot and its physiological neurological model are described, and the reproduction of lead-pipe rigidity is introduced as an example. Using the robot, trainees can receive full training in examination skills and knowledge as well as in the understanding of disease effects. Finally, several experiments are performed to verify the proposed robot; the results suggest that the approach is worth pursuing in further research.
- Published
- 2015
- Full Text
- View/download PDF
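Clinically, the lead-pipe rigidity that entry 6's WKE-2 robot reproduces is a constant resistance felt throughout passive movement, independent of movement speed, in contrast to spasticity, which grows with velocity. A minimal sketch of such a resistance model follows; the torque values and mode names are illustrative assumptions, not parameters of the actual robot.

```python
def resistive_torque(omega, mode, rigid_torque=1.2,
                     spastic_gain=0.8, viscous=0.05):
    """Torque [N*m] opposing passive elbow movement at angular
    velocity omega [rad/s], for a given simulated condition.
    All parameter values are illustrative assumptions."""
    if omega == 0.0:
        return 0.0
    direction = -1.0 if omega > 0 else 1.0  # always opposes motion
    if mode == "lead_pipe":
        # Rigidity: constant resistance, independent of speed.
        return direction * rigid_torque
    if mode == "spastic":
        # Spasticity: resistance proportional to movement speed.
        return direction * spastic_gain * abs(omega)
    # Healthy limb: small viscous resistance only.
    return direction * viscous * abs(omega)
```

Moving the simulated elbow slowly or quickly then yields the same resistance in `lead_pipe` mode but very different resistances in `spastic` mode, which is exactly the distinction an examiner is trained to feel.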
7. Development of a novel ankle rehabilitation robot with three degrees of freedom for ankle rehabilitation training
- Author
-
Li Mengjie, Qing Shi, Lihong Duan, Atsuo Takanishi, Lu Zhijiang, Zhengzhi Wu, Chunbao Wang, Lin Wang, Massimiliano Zecca, and Li Weiguang
- Subjects
Engineering, Training, Plantar flexion, Ankle rehabilitation, Linear motion, Robot, Ankle, Physical therapist, Simulation - Abstract
Ankle rehabilitation training plays an important role in hemiplegic rehabilitation. Traditional training requires a physical therapist to treat each patient one to one; the process is repetitive, takes a long period, and its effectiveness depends heavily on the therapist's skill. To improve training effectiveness, many ankle rehabilitation robots have been proposed, but all of them focus only on providing passive training to the patient. In this paper, an ankle rehabilitation robot with three degrees of freedom is proposed. Besides passive training, the robot carries additional sensors to detect ankle movements and, in particular, supports active training by the patient. The detailed mechanical design is introduced: the structure comprises pedal parts, thigh fixing parts, cross-slider parts, a driving unit and a sensing unit. Compared with current ankle rehabilitation research, this robot uses linear motion in the horizontal and vertical planes, instead of rotary motion, to realize dorsiflexion/plantar flexion and abduction/adduction. Finally, we present an ankle motion space analysis and a workspace analysis of the robot to verify its feasibility.
- Published
- 2015
- Full Text
- View/download PDF
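Entry 7 realizes dorsiflexion/plantar flexion through linear sliders rather than rotary joints, so the ankle angle follows from the geometry of the slider displacement and the foot lever. A hypothetical version of that mapping is sketched below; the lever length and function name are assumptions, not taken from the paper.

```python
import math

def ankle_angle(slider_disp, foot_lever=0.12):
    """Ankle rotation [rad] produced by a vertical slider
    displacement [m] acting at the end of a foot lever of length
    foot_lever [m]; positive values correspond to dorsiflexion,
    negative values to plantar flexion. Lever length is an
    illustrative assumption."""
    return math.atan2(slider_disp, foot_lever)
```

The same relation, applied to a horizontal slider, would give abduction/adduction.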
8. A Novel Greeting Selection System for a Culture-Adaptive Humanoid Robot
- Author
-
Omer Terlemez, Tamim Asfour, Massimiliano Zecca, Masuko Kuramochi, Gabriele Trovato, Atsuo Takanishi, Alex Waibel, and Martin Do
- Subjects
Computer science, Artificial Intelligence, Cultural diversity, Social robot, Repertoire, Social environment, German (language), Computer Science Applications, Robot, Artificial intelligence, Software, Humanoid robot, Gesture - Abstract
Robots, especially humanoids, are expected to perform human-like actions and adapt to our ways of communication in order to facilitate their acceptance in human society. Among humans, rules of communication change depending on background culture: greetings are a part of communication in which cultural differences are strong. Robots should adapt to these specific differences in order to communicate effectively, being able to select the appropriate manner of greeting for different cultures depending on the social context. In this paper, we present the modelling of social factors that influence greeting choice, and the resulting novel culture-dependent greeting gesture and words selection system. An experiment with German participants was run using the humanoid robot ARMAR-IIIb. Thanks to this system, the robot, after interacting with Germans, can perform greeting gestures appropriate to German culture in addition to a repertoire of greetings appropriate to Japanese culture.
- Published
- 2015
- Full Text
- View/download PDF
9. Development of a human-like motor nerve model to simulate the diseases effects on muscle tension for neurologic examination training
- Author
-
Salvatore Sessa, Atsuo Takanishi, Jian Qin, Ai Nibori, Massimiliano Zecca, Di Zhang, Hiroyuki Ishii, Yusaku Miura, Chunbao Wang, Zhengzhi Wu, Li Weiguang, Qing Shi, Lihong Duan, Yurina Sugamiya, and W. Kong
- Subjects
Engineering, Involuntary action, Elbow, Motor nerve, Physical examination, Physical medicine and rehabilitation, Muscle tension, Medical training, Robot, Biceps tendon, Simulation - Abstract
Neurologic examination plays an important role in the physical examination. Several training methods have so far been used to help trainees master the necessary skills and accumulate experience, but their limitations constrain training effectiveness. With advances in technology, more and more simulators have been launched to improve medical training; however, most of them only mimic symptoms without simulating the pathology of the underlying diseases. In this paper, we propose an improved motor nerve model used in the elbow robot WKE-2 (Waseda Kyotokagaku Elbow Robot No. 2), which is designed to simulate disorders of the motor nerves and give trainees full training in elbow examination. The motor nerve model mimics the structure of the real motor nerves, and the function of each part is derived from an analysis of the real function of the corresponding organ. The robot simulates the effects of the motor nerve system and various symptoms related to the examination of elbow force, the biceps tendon reflex and involuntary action. With this robot, systematic training in both skills and understanding is provided. Finally, several experiments are performed to verify the proposed system; the results suggest that the approach is worth pursuing in further research.
- Published
- 2014
- Full Text
- View/download PDF
10. Development of a nerve model of eyeball motion nerves to simulate the disorders of eyeball movements for neurologic examination training
- Author
-
Jian Qin, Ai Nibori, Yusaku Miura, Li Weiguang, Chunbao Wang, Atsuo Takanishi, Massimiliano Zecca, Qing Shi, Lihong Duan, Yurina Sugamiya, Salvatore Sessa, W. Kong, Lin Wang, Zhengzhi Wu, and Hiroyuki Ishii
- Subjects
Robot kinematics, Computer science, Cranial nerve examination, Eyeball movements, Motor nerve, Physical examination, Motion (physics), Simulated patient, Physical medicine and rehabilitation, Robot, Simulation - Abstract
Cranial nerve examination plays an important role in the physical examination, and all medical staff need to be trained to master its skills. Several training methods have been used, including training with a simulated patient (SP); however, given the characteristics of the eyeball, involuntary eye movements cannot be reproduced by an SP. With advances in technology, more and more robot heads have been developed, but these simulators only mimic healthy human abilities rather than simulating disorders for medical training. In this paper, we propose a novel eyeball motion nerve model used in the head robot WKH-2 (Waseda Kyotokagaku Head Robot No. 2), which is designed to simulate disorders of eyeball movement and give trainees full training in the cranial nerve examination. The nerve model mimics the structure of the real eyeball nerves, and the function of each part is derived from an analysis of the real function of the corresponding organ. The robot simulates the effects of the eyeball motion nerve system and various related symptoms. With this robot, systematic training in both skills and medical knowledge is provided. Finally, several experiments are carried out to verify the proposed system; the results show that the approach is worth pursuing in further research.
- Published
- 2014
- Full Text
- View/download PDF
11. Emotional gait: Effects on humans' perception of humanoid robots
- Author
-
Tatsuhiro Kishi, Matthieu Destephe, Kenji Hashimoto, Massimiliano Zecca, Martim Brandao, and Atsuo Takanishi
- Subjects
Affect (psychology), Gait, Human–robot interaction, Perception, Robot, Computer vision, Artificial intelligence, Emotion recognition, Psychology, Humanoid robot, Cognitive psychology - Abstract
Humanoid robots have the formidable advantage of possessing a body quite similar in shape to a human's. This body grants them not only locomotion but also a medium to express emotions without even needing a face. In this paper we study the effects of the emotional gaits of our biped humanoid robot on subjects' perception of the robot (recognition rate of the emotions, reaction time, anthropomorphism, safety, likeability, etc.). We made the robot walk towards the subjects with different emotional gait patterns, assessing a positive (Happy) and a negative (Sad) pattern on 26 subjects divided into two groups according to whether or not they were familiar with robots. We found that although recognition of the different patterns does not differ between groups, reaction time does, and that emotional gait patterns affect the perception of the robot. The implications of these results for Human Robot Interaction (HRI) are discussed.
- Published
- 2014
- Full Text
- View/download PDF
12. Bipedal humanoid robot that makes humans laugh with use of the method of comedy and affects their psychological state actively
- Author
-
Tatsuhiro Kishi, Atsuo Takanishi, Massimiliano Zecca, Takuya Otani, Takashi Nozawa, Nobutsuna Endo, Kenji Hashimoto, and Sarah Cosentino
- Subjects
Cognitive science, Social robot, Comedy, Robot control, Expression, Robot, Conversation, Artificial intelligence, Psychology, Humanoid robot - Abstract
This paper describes a bipedal humanoid robot that makes humans laugh with its whole-body expression and thereby actively affects their psychological state. To realize social interaction between humans and robots, the robot has to influence the human's psychological state actively; we focused on laughter, as it can be regarded as a typical test case for researching social interaction. Examining the Japanese comedy style called "manzai", a form of comic dialogue, we identified several methods for making people laugh. We then created several skits with the advice of comedians and had the whole-body humanoid robot perform them. Experimental evaluation with these skits shows that the robot's behaviour made subjects laugh and changed their psychological state, seen as a decrease in "Depression" and "Anger".
- Published
- 2014
- Full Text
- View/download PDF
13. Towards culture-specific robot customization: a study on greeting interaction with Egyptians
- Author
-
Lorenzo Jamone, Gabriele Trovato, Jaap Ham, Massimiliano Zecca, Salvatore Sessa, Kenji Hashimoto, Atsuo Takanishi, and Human Technology Interaction
- Subjects
National culture, Context (language use), Human–robot interaction, Cultural background, Videoconferencing, Robot, Artificial intelligence, Psychology, Inclusion, Social psychology, Gesture - Abstract
A complex relationship exists between national cultural background and interaction with robots, and many earlier studies have investigated how people from different cultures perceive the inclusion of robots into society. Conversely, very few studies have investigated how robots that speak and gesture according to a certain national culture are perceived by humans of a different cultural background. The purpose of this work is to show that humans may better accept a robot that can adapt to their specific national culture. The Human-Robot Interaction experiment was performed in Egypt. Participants (native Egyptians versus Japanese living in Egypt) were shown two robots greeting them and speaking respectively in Arabic and Japanese, through a simulated video conference. The spontaneous reactions of the human subjects were measured in different ways, and participants completed a questionnaire assessing their preferences and emotional state. Results suggested that Egyptians prefer the Arabic version of the robot, while they report discomfort when interacting with the Japanese version. These findings confirm the importance of culture-specific customisation of robots in the context of Human-Robot Interaction.
- Published
- 2014
- Full Text
- View/download PDF
14. A Novel Culture-Dependent Gesture Selection System for a Humanoid Robot Performing Greeting Interaction
- Author
-
Martin Do, Massimiliano Zecca, Omer Terlemez, Masuko Kuramochi, Gabriele Trovato, Atsuo Takanishi, and Tamim Asfour
- Subjects
Social robot, Culture-dependent, German (language), Robot, Artificial intelligence, Psychology, Humanoid robot, Gesture, Selection system - Abstract
In human-robot interaction, it is important for the robots to adapt to our ways of communication. As humans, rules of non-verbal communication, including greetings, change depending on our culture. Social robots should adapt to these specific differences in order to communicate effectively, as a correct way of approaching often results into better acceptance of the robot. In this study, a novel greeting gesture selection system is presented and an experiment is run using the robot ARMAR-IIIb. The robot performs greeting gestures appropriate to Japanese culture; after interacting with German participants, the selection should become appropriate to German culture. Results show that the mapping of gesture selection evolves successfully.
- Published
- 2014
- Full Text
- View/download PDF
15. Perception of emotion and emotional intensity in humanoid robots gait
- Author
-
Kenji Hashimoto, Matthieu Destephe, Andreas Henning, Atsuo Takanishi, and Massimiliano Zecca
- Subjects
Social robot, Mobile robot, Robot control, Sadness, Perception, Robot, Computer vision, Artificial intelligence, Psychology, Humanoid robot, Cognitive psychology, Robot locomotion - Abstract
Humanoid robots progress every day towards more stable walking suitable for human environments, as researchers in humanoid robotics focus their efforts on understanding human locomotion. For social robotics researchers, however, humanoid robots may have another use, such as being our companions from birth to the nursing home. Designing social humanoid robots is a critical step if we want robots to be active in our society; yet to our knowledge, only a few studies in humanoid robotics have addressed emotion expression through robot gait. In this paper we assess different emotional gait patterns and the perceived intensity of the emotion in those patterns. Actors' emotional movements were captured and then normalized for our robot platform. Several robot simulations were shown to human observers, who completed a survey questionnaire indicating their assessment of the emotion portrayed by the simulated robot. The surveyed emotions were Sadness, Happiness, Anger and Fear, at different intensities (Intermediate, High and Exaggerated). We achieved a high recognition rate for the emotions (72.32%). Although the intensities were recognized less well (33.63%), our study indicates that intensity may help the recognition of emotional walking.
- Published
- 2013
- Full Text
- View/download PDF
16. Development of a human-like neurologic model to simulate the influences of diseases for neurologic examination training
- Author
-
Hiroyuki Ishii, Matsuoka Yusuke, Kazuyuki Hatake, Yohan Noh, Salvatore Sessa, Satoru Shoji, Mitsuhiro Tokumoto, Atsuo Takanishi, Massimiliano Zecca, Chunbao Wang, and Chihara Terunaga
- Subjects
Involuntary action, Computer science, Elbow, Neurological model, Training, Simulated patient, Physical medicine and rehabilitation, Robot, Biceps tendon, Simulation - Abstract
Neurologic examination procedures require not only abundant knowledge but also prominent skills. During medical education and training, several methods are used, such as watching videos, reading books and making use of a simulated patient (SP), to help medical staff, especially novices, master the skills and accumulate experience. To compensate for the drawbacks of these methods, such as the lack of active interaction and the limited reproduction of multiple symptoms, medical training simulators have been developed to improve training effectiveness. However, most of these simulators only mimic symptoms and cannot show the pathology of the diseases, which limits the training effect. In this paper, we propose an elbow robot named WKE-1 (Waseda Kyotokagaku Elbow Robot No. 1) for neurologic examination training, as one part of the whole-body patient robot WKP (Waseda Kyotokagaku Patient). The robot simulates various symptoms occurring during the examination of elbow force, the biceps tendon reflex and involuntary action, and incorporates a physiological neurological model that simulates the pathology of the nervous system. Taking advantage of this robot, trainees can receive systematic training in both skills and knowledge. Finally, a set of experiments verifies the proposed mechanism and system; the results suggest that the approach is worth pursuing in further research.
- Published
- 2013
- Full Text
- View/download PDF
17. Cross-cultural study on human-robot greeting interaction: acceptance and discomfort by Egyptians and Japanese
- Author
-
Massimiliano Zecca, Atsuo Takanishi, Lorenzo Jamone, Kenji Hashimoto, Gabriele Trovato, Jaap Ham, Salvatore Sessa, and Human Technology Interaction
- Subjects
Technology, Social factors, Cognitive Neuroscience, Applied psychology, Cultural differences, SDG 3 - Good Health and Well-being, Human–robot interaction, Behavioral Neuroscience, Developmental Neuroscience, Artificial Intelligence, Social robotics, Cultural diversity, Medicine, Cross-cultural, Social robot, Humanoid robots, Videoconferencing, Human–Computer Interaction, Robot, Gesture - Abstract
As witnessed in several behavioural studies, a complex relationship exists between people's cultural background and their general acceptance of robots. However, very few studies have investigated whether a robot's language and gestures, rooted in a certain culture, have an impact on people of different cultures. The purpose of this work is to provide experimental evidence supporting the idea that humans may more easily accept a robot that can adapt to their specific culture. Indeed, improving acceptance and reducing discomfort is fundamental for the future deployment of robots as assistive, health-care or companion devices in society. We conducted a Human-Robot Interaction experiment in both Egypt and Japan. Human subjects were engaged in a simulated video conference with robots that greeted them and spoke either in Arabic or in Japanese. The subjects completed a questionnaire assessing their preferences and their emotional state, while their spontaneous reactions were recorded in different ways. The results suggest that Egyptians prefer the Arabic robot and feel a sense of discomfort when interacting with the Japanese robot, while the opposite is true for the Japanese. These findings confirm the importance of localising a robot in order to improve human acceptance during social human-robot interaction.
- Published
- 2013
- Full Text
- View/download PDF
18. Music conductor gesture recognition by using inertial measurement system for human-robot musical interaction
- Author
-
Yoshihisa Sugita, Hiroyuki Ishii, Sarah Cosentino, Atsuo Takanishi, Klaus Petersen, Salvatore Sessa, Zhuohua Lin, and Massimiliano Zecca
- Subjects
Articulation (music), Engineering, Naturalness, Inertial measurement, Information interfaces and presentation (HCI), Gesture recognition, Speech recognition, Robot, Human–robot interaction, Conductor - Abstract
In this paper, we describe a human gesture recognition system developed to enable a robot instrument player to recognize the variations in tempo and in articulation dictated by a conductor's movements and accordingly adapt its performance. The enhanced interaction ability would allow the partner musicians, as well as the conductor, to better appreciate a joint musical performance, because of the complete naturalness of the interaction. In addition, the possibility for the robot to change its performance parameters according to the conductor directions, thus being synchronized with all the other human musicians, would lead to an improvement in the overall musical performance.
- Published
- 2012
- Full Text
- View/download PDF
19. Musical robots: Towards a natural joint performance
- Author
-
Atsuo Takanishi, Yoshihisa Sugita, H. Ishii, Sarah Cosentino, Massimiliano Zecca, Klaus Petersen, Salvatore Sessa, K. Saito, Zhuohua Lin, and Luca Bartolomeo
- Subjects
Personal robot ,Engineering ,Social robot ,Gesture recognition ,business.industry ,Robot ,Mobile robot ,Computer vision ,Artificial intelligence ,business ,Humanoid robot ,Human–robot interaction ,Robot control - Abstract
In this paper, we describe the mechanical design system implemented to enable a robot flute player to enhance the expressiveness of its performance, reproducing the different musical articulations with the available features of the instrument. In addition, the robot has been equipped with a Human Gesture Recognition system to recognize the real-time variations in tempo, dynamics, and articulation dictated by the conductor's movements. The possibility for the robot to change its performance parameters according to the conductor directions, thus adding expression to its performance while being synchronized with all the other human musicians, would lead to an improvement in the overall joint musical performance.
- Published
- 2012
- Full Text
- View/download PDF
20. Non visual sensor based shape perception method for gait control of flexible colonoscopy robot
- Author
-
Genya Ukawa, Shuna Doho, Jaewoo Lee, Atsuo Takanishi, H. Ishii, Massimiliano Zecca, and Zhuohua Lin
- Subjects
Kinematic chain ,Robot kinematics ,Engineering ,Medical robot ,business.industry ,Orientation (computer vision) ,Accelerometer ,Compass ,Robot ,Computer vision ,Cartesian coordinate robot ,Artificial intelligence ,business ,Simulation - Abstract
In this paper, a shape perception method for a medical robot that moves inside the colon is proposed for its gait control. In this system, the current shape information serves as the sensing input used to control the robot's gait in the colon. To estimate the robot's current shape, we constructed a sensor network composed of several electronic compass units. Each unit uses a chip that integrates a 3-axis accelerometer and a 3-axis magnetometer. From these signals, the orientation of each unit is estimated after noise filtering. Then, based on a kinematic chain model, the shape of the flexible robot is calculated from the orientation information. The resulting trajectory shows that this method can perceive the shape of the flexible robot well.
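The two steps described above, per-unit orientation from a tilt-compensated compass and shape reconstruction along a kinematic chain, can be sketched as follows. Function names, the segment length, and the assumption that each segment points along its sensor's x axis are illustrative choices, not specifics from the paper.

```python
import numpy as np

def unit_orientation(acc, mag):
    """Roll and pitch from the accelerometer (gravity direction), plus a
    tilt-compensated yaw from the magnetometer, as in a digital compass."""
    ax, ay, az = acc / np.linalg.norm(acc)
    roll = np.arctan2(ay, az)
    pitch = np.arctan2(-ax, np.hypot(ay, az))
    mx, my, mz = mag / np.linalg.norm(mag)
    # Rotate the magnetic vector into the horizontal plane before taking yaw.
    mxh = mx * np.cos(pitch) + mz * np.sin(pitch)
    myh = (mx * np.sin(roll) * np.sin(pitch) + my * np.cos(roll)
           - mz * np.sin(roll) * np.cos(pitch))
    yaw = np.arctan2(-myh, mxh)
    return roll, pitch, yaw

def chain_shape(orientations, seg_len=0.05):
    """Accumulate segment endpoints along a kinematic chain: each segment of
    length seg_len points along the x axis of its sensor frame."""
    points = [np.zeros(3)]
    for roll, pitch, yaw in orientations:
        cr, sr = np.cos(roll), np.sin(roll)
        cp, sp = np.cos(pitch), np.sin(pitch)
        cy, sy = np.cos(yaw), np.sin(yaw)
        Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
        Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
        Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
        R = Rz @ Ry @ Rx
        points.append(points[-1] + R @ np.array([seg_len, 0.0, 0.0]))
    return np.array(points)
```

With all units level and aligned with magnetic north, the reconstructed chain is a straight line along x.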
- Published
- 2011
- Full Text
- View/download PDF
21. Towards high-level, cloud-distributed robotic telepresence: Concept introduction and preliminary experiments
- Author
-
Ebihara Kazuki, Hiroyuki Ishii, Nobutsuna Endo, Zhuohua Lin, Kotaro Fukui, Klaus Petersen, Ruediger Dillmann, Atsuo Takanishi, Tamim Asfour, and Massimiliano Zecca
- Subjects
Telerobotics ,business.industry ,Human–computer interaction ,Computer science ,Robot ,Cloud computing ,business ,Human–robot interaction ,Simulation - Abstract
In this paper we propose the basic concept of a tele-presence system for two (or more) anthropomorphic robots located in remote locations. As one robot interacts with a user, it acquires knowledge about the user's behavior and transfers this knowledge to the network. The robot in the remote location accesses this knowledge and according to this information emulates the behavior of the remote user when interacting with its partner.
- Published
- 2011
- Full Text
- View/download PDF
22. Objective evaluation of laparoscopic surgical skills using Waseda bioinstrumentation system WB-3
- Author
-
Hiroyuki Ishii, Makoto Hashizume, Takeshi Odaira, Kazuko Itoh, Morimasa Tomikawa, Satoshi Ieiri, Zhuohua Lin, Salvatore Sessa, Kazuo Tanoue, Massimiliano Zecca, Munenori Uemura, Kozo Konishi, Luca Bartolomeo, and Atsuo Takanishi
- Subjects
Laparoscopic surgery ,medicine.medical_specialty ,Engineering ,business.industry ,medicine.medical_treatment ,education ,ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION ,Wearable computer ,Training methods ,Motion capture ,Motion (physics) ,medicine ,Surgical skills ,Robot ,Medical physics ,Objective evaluation ,business ,Simulation - Abstract
Performing laparoscopic surgery requires several skills which have never been required for conventional open surgery; consequently, surgeons experience difficulties in learning and mastering these techniques. Various training methods and metrics have been developed in order to assess and improve surgeons' operative abilities. While these training metrics are now widely used, skill evaluation methods are still far from objective in regular laparoscopic skill education. This study proposes a methodology for defining a processing model to objectively evaluate surgical performance and skill expertise in a routine laparoscopic training course. Our approach is based on the analysis of kinematic data describing the movements of the surgeon's upper limbs. An ultra-miniaturized wearable motion capture system (Waseda Bioinstrumentation system WB-3) was therefore developed to measure and analyze these movements. The skill evaluation model was trained on the subjects' motion features acquired from the WB-3 system and further validated by classifying the expertise levels of subjects with different laparoscopic experience. Experimental results show that the proposed methodology can be used both for quantitative assessment of surgical performance and for discrimination between expert surgeons and novices.
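A minimal sketch of this kind of kinematic skill assessment is shown below: extract a few motion features commonly used in the surgical-skill literature (path length, mean speed, jerk-based smoothness) and classify a trial by its nearest class centroid. The feature set, classifier, and names are assumptions for illustration; the paper's actual model may differ.

```python
import numpy as np

def motion_features(pos, dt=0.01):
    """Kinematic features from a (N, 3) position trace sampled every dt s:
    path length, mean speed, and an RMS-jerk smoothness index (lower = smoother)."""
    vel = np.diff(pos, axis=0) / dt
    acc = np.diff(vel, axis=0) / dt
    jerk = np.diff(acc, axis=0) / dt
    path = np.sum(np.linalg.norm(np.diff(pos, axis=0), axis=1))
    speed = np.mean(np.linalg.norm(vel, axis=1))
    smooth = np.sqrt(np.mean(np.sum(jerk ** 2, axis=1)))
    return np.array([path, speed, smooth])

def nearest_centroid(feat, centroids):
    """Classify a trial by the closest class centroid (features should be
    z-scored across trials beforehand in a real pipeline)."""
    labels = list(centroids)
    dists = [np.linalg.norm(feat - centroids[k]) for k in labels]
    return labels[int(np.argmin(dists))]
```

A perfectly smooth 1 m straight-line motion over 1 s yields features of roughly (1.0, 1.0, 0.0).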
- Published
- 2010
- Full Text
- View/download PDF
23. Modular Design of Emotion Expression Humanoid Robot KOBIAN
- Author
-
Nobutsuna Endo, Keita Endo, Massimiliano Zecca, and Atsuo Takanishi
- Subjects
Personal robot ,Activities of daily living ,business.industry ,Computer science ,Perspective (graphical) ,technology, industry, and agriculture ,Kobian ,Modular design ,Expression (mathematics) ,body regions ,surgical procedures, operative ,Human–computer interaction ,Robot ,business ,human activities ,Humanoid robot - Abstract
Personal robots and Robot Technology (RT)-based assistive devices are expected to play a substantial role in our society, which is largely populated by elders; they will play an active role in joint work and community life with humans. In particular, these robots are expected to be important for the assistance of elderly and disabled people during normal activities of daily living (ADLs). To achieve this result, personal robots should also be capable of human-like emotion expression. In this perspective we developed a whole-body bipedal humanoid robot, named KOBIAN, which is also capable of expressing human-like emotions. In this paper we present the mechanical and modular design of KOBIAN.
- Published
- 2010
- Full Text
- View/download PDF
24. Objective skill analysis and assessment of neurosurgery by using the waseda bioinstrumentation system WB-3
- Author
-
Zhuohua Lin, Tomoya Sasaki, Kazuko Itoh, Massimiliano Zecca, Hiroshi Iseki, Takashi Suzuki, Salvatore Sessa, and Atsuo Takanishi
- Subjects
Engineering ,medicine.medical_specialty ,Evaluation system ,business.industry ,media_common.quotation_subject ,Training methods ,Motion capture ,Inertial measurement unit ,medicine ,Robot ,Quality (business) ,Medical physics ,Neurosurgery ,business ,Simulation ,media_common - Abstract
In recent years there has been an ever-increasing amount of research and development of technologies and methods to improve the quality and performance of advanced surgery. In other fields, several training methods and metrics have been proposed, both to improve the surgeon's abilities and to assess her/his skills. For neurosurgery, however, the extremely small movements and sizes involved have until now prevented the development of similar methodologies and systems. In this paper we present the development of an ultra-miniaturized Inertial Measurement Unit and its application to the evaluation of performance in a simple pick-and-place scenario. This analysis is a preliminary yet fundamental step towards a better training/evaluation system for neurosurgeons, and towards objectively evaluating and understanding how neurosurgery is performed.
- Published
- 2009
- Full Text
- View/download PDF
25. Whole body emotion expressions for KOBIAN humanoid robot — preliminary experiments with different Emotional patterns
- Author
-
Kazuko Itoh, Yu. Mizoguchi, Y. Kawabata, Atsuo Takanishi, Keita Endo, Fumiya Iida, Massimiliano Zecca, and Nobutsuna Endo
- Subjects
Personal robot ,Activities of daily living ,business.industry ,Face (sociological concept) ,Kobian ,Human–computer interaction ,Robot ,Emotional expression ,Artificial intelligence ,Whole body ,business ,Psychology ,human activities ,Humanoid robot - Abstract
Personal robots and robot technology (RT)-based assistive devices are expected to play a major role in our elderly-dominated society, with an active participation in joint work and community life with humans, as partners and friends. In particular, these robots are expected to be fundamental for helping and assisting elderly and disabled people during their activities of daily living (ADLs). To achieve this result, personal robots should also be capable of human-like emotion expression. To this purpose we developed a new whole-body emotion-expressing bipedal humanoid robot, named KOBIAN. In this paper we present three different evaluations of the emotional expressiveness of KOBIAN: in particular, the analysis of the roles of the face, the body, and their combination in emotional expressions. We also compared emotional patterns created by a photographer and a cartoonist with the ones created by us. Overall, although the experimental results are not as good as we were expecting, we confirmed that the robot can clearly express its emotions, and that very high recognition ratios are possible.
- Published
- 2009
- Full Text
- View/download PDF
26. Evaluation of the effects of the shape of the artificial hand on the quality of the interaction
- Author
-
Kazuko Itoh, Yousuke Kawabata, Atsuo Takanishi, Massimiliano Zecca, Yu. Mizoguchi, Fumiya Iida, Nobutsuna Endo, and Keita Endo
- Subjects
Personal robot ,Handshake ,Computer science ,business.industry ,media_common.quotation_subject ,Human–computer interaction ,Natural (music) ,Robot ,Computer vision ,Quality (business) ,Artificial hand ,Artificial intelligence ,business ,Humanoid robot ,media_common - Abstract
Personal robots and robot technology (RT)-based assistive devices are expected to play a major role in our elderly-dominated society, by interacting with surrounding people both physically and psychologically. A fundamental role during the interaction is of course played by the hand. In this paper we present the evaluation of the effect of hand shape on the quality of the interaction, in particular during a handshake.
- Published
- 2009
- Full Text
- View/download PDF
27. Design of the humanoid robot KOBIAN - preliminary analysis of facial and whole body emotion expression capabilities
- Author
-
Koichi Itoh, Atsuo Takanishi, S. Momoki, Nobutsuna Endo, and Massimiliano Zecca
- Subjects
Personal robot ,Activities of daily living ,Social robot ,Computer science ,business.industry ,technology, industry, and agriculture ,Kobian ,Expression (mathematics) ,body regions ,surgical procedures, operative ,Human–computer interaction ,Robot ,Emotional expression ,Artificial intelligence ,business ,human activities ,Humanoid robot - Abstract
Personal robots and robot technology (RT)-based assistive devices are expected to play a major role in our elderly-dominated society, with an active participation in joint work and community life with humans, as partners and friends. In particular, these robots are expected to be fundamental for helping and assisting elderly and disabled people during their activities of daily living (ADLs). To achieve this result, personal robots should be capable of human-like emotion expression; in addition, human-like bipedal walking is the best solution for robots that should be active in the human living environment. Although several bipedal robots and several emotion expression robots have been developed in recent years, until now no robot had integrated all these functions. Therefore we developed a new bipedal walking robot, named KOBIAN, which is also capable of expressing human-like emotions. In this paper, we present the design and the preliminary evaluation of the new emotional expression head. The preliminary results showed that the emotion expressed by the head alone cannot be easily understood by users. However, the presence of a full body clearly enhances the emotion expression capability of the robot, thus proving the effectiveness of the proposed approach.
- Published
- 2008
- Full Text
- View/download PDF
28. Development of whole-body emotion expression humanoid robot
- Author
-
Atsuo Takanishi, Massimiliano Zecca, Yu. Mizoguchi, Koichi Itoh, S. Momoki, Nobutsuna Endo, and M. Saito
- Subjects
Personal robot ,Engineering ,Social robot ,business.industry ,Mobile robot ,Robot learning ,Robot control ,Human–computer interaction ,Robot ,Computer vision ,Artificial intelligence ,business ,Humanoid robot ,Robot locomotion - Abstract
Personal robots and robot technology (RT)-based assistive devices are expected to play a major role in our elderly-dominated society, with an active participation in joint work and community life with humans, as partners and friends. The authors think that the emotion expression of a robot is effective in joint activities between humans and robots. In addition, we think that bipedal walking is necessary for robots that are active in the human living environment. However, no robot has had both of these functions, and it is not clear which functions are actually effective. Therefore we developed a new bipedal walking robot capable of expressing emotions. In this paper, we present the design and the preliminary evaluation of the robot's new head, which uses only a small number of degrees of freedom for facial expression.
- Published
- 2008
- Full Text
- View/download PDF
29. Design and evaluation of the new head for the whole-body emotion expression hu-manoid robot KOBIAN
- Author
-
S. Momoki, Atsuo Takanishi, Koichi Itoh, Massimiliano Zecca, and N. Endo
- Subjects
Expression (architecture) ,Head (linguistics) ,business.industry ,Biomedical Engineering ,Robot ,Kobian ,Computer vision ,Artificial intelligence ,Geriatrics and Gerontology ,Psychology ,business ,Whole body ,Gerontology - Published
- 2008
- Full Text
- View/download PDF
30. Waseda Bioinstrumentation System WB-2 - the new Inertial Measurement Unit for the new Motion Capture System
- Author
-
Nobutsuna Endo, M. Saito, Hideaki Takanobu, Atsuo Takanishi, Massimiliano Zecca, Yu. Mizoguchi, and Koichi Itoh
- Subjects
Personal robot ,Engineering ,Inertial measurement unit ,business.industry ,Robot ,Adaptation (computer science) ,business ,Motion capture ,Human–robot interaction ,Motion (physics) ,Humanoid robot ,Simulation - Abstract
Personal robots and robot technology (RT)-based assistive devices are expected to play a major role in Japan's elderly-dominated society, both for joint activities with their human partners and for participation in community life. These new devices should be capable of smooth and natural adaptation and interaction with their human partners and the environment, should be able to communicate naturally with humans, and should never have a negative effect on their human partners, whether physical or emotional. To achieve this smooth and natural integration between humans and robots, we first need to investigate and clarify how these interactions are carried out. Therefore, we developed the portable bioinstrumentation system WB-2 (Waseda bioinstrumentation system No.2), which can measure the movements of the head, the arms, and the hands (position, velocity, and acceleration), as well as several physiological parameters (electrical activity of the heart, respiration, perspiration, pulse wave, and so on), to objectively measure and understand the physical and physiological effects of the interaction between robots and humans. In this paper we present our development of the Inertial Measurement Unit, which is at the heart of our new motion capture system and replaces the system used in the Waseda bioinstrumentation system No.1 refined (WB-1R). Some preliminary results of experiments with the unit are also presented and analyzed.
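A standard building block for this kind of wearable IMU, sketched below under stated assumptions, is a complementary filter: the gyro integral tracks fast motion while the accelerometer's tilt estimate corrects long-term drift. The gain, sample time, and function name are illustrative; the paper does not specify WB-2's actual fusion algorithm.

```python
import numpy as np

def complementary_pitch(gyro_y, acc, dt=0.01, alpha=0.98):
    """Fuse a pitch-axis gyro rate (rad/s) with accelerometer tilt.
    alpha weights the integrated gyro; (1 - alpha) weights the
    accelerometer's gravity-based pitch, which corrects drift."""
    pitch = 0.0
    out = []
    for w, a in zip(gyro_y, acc):
        ax, ay, az = a
        acc_pitch = np.arctan2(-ax, np.hypot(ay, az))
        pitch = alpha * (pitch + w * dt) + (1 - alpha) * acc_pitch
        out.append(pitch)
    return np.array(out)
```

Held statically at a 0.3 rad pitch with the gyro reading zero, the estimate converges geometrically to 0.3 rad.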
- Published
- 2007
- Full Text
- View/download PDF
31. On the development of the Bioinstrumentation System WB-1R for the evaluation of human-robot interaction - Head and Hands Motion Capture Systems
- Author
-
M. Saito, Atsuo Takanishi, Koichi Itoh, Massimiliano Zecca, Nobutsuna Endo, K. Imanishi, N. Nanba, and Hideaki Takanobu
- Subjects
Engineering ,Personal robot ,Intelligent sensor ,business.industry ,Head (linguistics) ,Community life ,Robot ,Adaptation (computer science) ,business ,Motion capture ,Simulation ,Human–robot interaction - Abstract
Personal Robots and Robot Technology (RT)-based assistive devices are expected to play a major role in Japan's elderly-dominated society, both for joint activities with their human partners and for participation in community life. These new devices should be capable of smooth and natural adaptation and interaction with their human partners and the environment, should be able to communicate naturally with humans, and should never have a negative effect on their human partners, whether physical or emotional. To achieve this smooth and natural integration between humans and robots, we first need to investigate and clarify how these interactions are carried out. Therefore, we developed the portable Bioinstrumentation System WB-1R (Waseda Bioinstrumentation system No.1 Refined), which can measure the movements of the head, the arms, and the hands (position, velocity, and acceleration), as well as several physiological parameters (electrocardiogram, respiration, perspiration, pulse wave, and so on), to objectively measure and understand the physical and physiological effects of the interaction between robots and humans. In this paper we present our development of the head and hands motion capture systems as additional modules for the Waseda Bioinstrumentation system No.1 (WB-1). The preliminary experimental results, given the inexpensiveness of the systems, are good for our purposes.
- Published
- 2007
- Full Text
- View/download PDF
32. Development of a Bioinstrumentation System in the Interaction between a Human and a Robot
- Author
-
Yuko Nukariya, Maria Chiara Carrozza, Massimiliano Zecca, Stefano Roccella, Hideaki Takanobu, Atsuo Takanishi, Paolo Dario, Kazuko Itoh, and Hiroyasu Miwa
- Subjects
Engineering ,Personal robot ,business.industry ,Work (physics) ,technology, industry, and agriculture ,Human stress ,Motion capture ,Motion (physics) ,body regions ,Community life ,Robot ,Computer vision ,Artificial intelligence ,business ,human activities ,Humanoid robot - Abstract
Personal robots, which are expected to become popular in the future, are required to be active in joint work and community life with humans. Such robots must have no adverse physical or psychological effects on humans. The psychological effect of a robot on humans has been subjectively measured using questionnaires; however, it has not yet been measured objectively. Human emotion and the direction of consciousness can be measured from physiological parameters and body motion, respectively. Therefore, the bioinstrumentation system WB-1 was developed in order to objectively measure the psychological effect of a robot on a human. It can measure physiological parameters such as respiration, heart rate, perspiration and pulse wave, as well as arm motion. By analyzing, from the electrocardiogram, the stress a human experiences while interacting with a robot, the robot could generate motion to decrease that stress.
- Published
- 2006
- Full Text
- View/download PDF
33. Behavior model of humanoid robots based on operant conditioning
- Author
-
Atsuo Takanishi, Hiroyasu Miwa, Stefano Roccella, M. Matsumoto, M.C. Carrozza, Hideaki Takanobu, Kazuko Itoh, Paolo Dario, and Massimiliano Zecca
- Subjects
Personal robot ,Social robot ,Computer science ,business.industry ,Robot ,Operant conditioning ,Mobile robot ,Motion planning ,Artificial intelligence ,Behavior-based robotics ,business ,Humanoid robot - Abstract
Personal robots, which are expected to become popular in the near future, are required to be active in work and community life alongside humans. Therefore, we have been developing new mechanisms and functions in order to realize natural bilateral interaction by expressing emotions, behaviors and personality in a human-like manner. We have proposed a mental model with emotion, mood, personality and needs. However, the robot's behavior was very simple, since it showed just a single kind of behavior in response to a given mental state. In this paper, we present a new behavior model for humanoid robots based on operant conditioning, a well-known psychological behavior model. We implemented this new behavior model in the emotion expression humanoid robot WE-4RII (Waseda Eye No.4 Refined II), developed in 2004. Through experimental evaluations, we confirmed that the robot with the new behavior model could autonomously select behavior suitable for the situation from a predefined behavior list.
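The core idea of an operant-conditioning behavior model can be sketched as reward-weighted selection from a predefined behavior list: reinforcement after a behavior is executed raises or lowers the probability of choosing it again. The class name, learning rate, and behavior names below are hypothetical, and this is a minimal illustration rather than the model implemented on WE-4RII.

```python
import random

class OperantSelector:
    """Behaviors start with equal weight; reinforcement after execution
    changes the probability of the same behavior being chosen again."""

    def __init__(self, behaviors, lr=0.2):
        self.w = {b: 1.0 for b in behaviors}
        self.lr = lr

    def select(self, rng=random):
        # Roulette-wheel selection proportional to current weights.
        total = sum(self.w.values())
        r = rng.uniform(0, total)
        for b, w in self.w.items():
            r -= w
            if r <= 0:
                return b
        return b  # numerical edge case: return the last behavior

    def reinforce(self, behavior, reward):
        # reward > 0 strengthens, reward < 0 weakens (floored so no
        # behavior is ever extinguished entirely).
        self.w[behavior] = max(0.05, self.w[behavior] + self.lr * reward)
```

After repeated positive reinforcement, a behavior's weight, and hence its selection probability, exceeds those of its alternatives.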
- Published
- 2006
- Full Text
- View/download PDF
34. New memory model for humanoid robots - introduction of co-associative memory using mutually coupled chaotic neural networks
- Author
-
Kazuko Itoh, Hideaki Takanobu, Atsuo Takanishi, H. Miwa, Massimiliano Zecca, Paolo Dario, and Yuko Nukariya
- Subjects
Personal robot ,Social robot ,Artificial neural network ,business.industry ,Computer science ,Chaotic ,Robot ,Memory model ,Artificial intelligence ,Content-addressable memory ,business ,Humanoid robot - Abstract
Personal robots, which are expected to become popular in the future, are required to be active in joint work and community life with humans. Therefore, we have been developing new mechanisms and functions for a humanoid robot that has the ability to express emotions and to communicate with humans in a human-like manner. In 2004, we introduced the "Behavior Model" and the "Consciousness Model" into the robot's mental model, so that the robot generated various kinds of behavior and the target of the robot's behavior became clear. We implemented the mental model in the emotion expression humanoid robot WE-4RII (Waseda Eye No.4 Refined II). We have also been studying, since 2002, a system of multiple harmonic oscillators (neurons) interacting via a chaotic force: each harmonic oscillator is driven by a chaotic force whose bifurcation parameter is modulated by the position of the harmonic oscillator. In this paper, we propose an associative memory model using mutually coupled chaotic neural networks for generating an optimum behavior in response to a stimulus, and we implemented this model in WE-4RII.
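The oscillator-neuron idea described above, a harmonic oscillator driven by a chaotic force whose bifurcation parameter depends on the oscillator's position, can be sketched for a single unit as follows. The logistic map as the chaotic source, the tanh modulation of its parameter, and all constants are illustrative assumptions, not the authors' equations.

```python
import numpy as np

def coupled_chaotic_oscillator(steps=2000, dt=0.01, k=10.0, eps=0.5):
    """Semi-implicit Euler simulation of x'' = -k*x + eps*(c - 0.5),
    where c follows a logistic map c <- a(x)*c*(1 - c) and the
    bifurcation parameter a is modulated by the oscillator position x."""
    x, v, c = 0.1, 0.0, 0.3
    xs = []
    for _ in range(steps):
        a = 3.6 + 0.4 * np.tanh(x)          # keeps a in (3.2, 4.0)
        c = np.clip(a * c * (1.0 - c), 0.0, 1.0)
        v += (-k * x + eps * (c - 0.5)) * dt  # bounded chaotic forcing
        x += v * dt
        xs.append(x)
    return np.array(xs)
```

The trajectory stays bounded (the forcing term is at most eps/2 in magnitude) while remaining irregular rather than settling to a fixed point.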
- Published
- 2006
- Full Text
- View/download PDF
35. Biologically-Inspired Microfabricated Force and Position Mechano-Sensors
- Author
-
Maria Chiara Carrozza, Arianna Menciassi, Paolo Dario, Cecilia Laschi, Silvestro Micera, Massimiliano Zecca, Barbara Mazzolai, and F. Vecchi
- Subjects
Microelectromechanical systems ,Position (vector) ,Computer science ,Robot ,Control engineering ,Hall effect sensor ,Mechatronics ,Tactile sensor ,Position sensor ,Humanoid robot - Abstract
The aim of this paper is to discuss an ideal design procedure for biologically-inspired mechano-sensors. The main steps of this procedure are the following: (1) analysis of force and position sensors in humans; (2) analysis of technologies available for MEMS (Micro Electro Mechanical Systems) and (3) design and implementation of biologically-inspired sensors in innovative mechatronic and biomechatronic systems (e.g., anthropomorphic robots, prostheses, and neuroprostheses).
- Published
- 2003
- Full Text
- View/download PDF
36. From the human hand to a humanoid hand: Biologically-inspired approach for the development of RoboCasa Hand #1
- Author
-
H. Miwa, Atsuo Takanishi, G. Cappiello, Kazuko Ito, M. Chiara Carrozza, Massimiliano Zecca, Stefano Roccella, Paolo Dario, and K. Imanishi
- Subjects
Prosthetic hand ,Personal robot ,Human–computer interaction ,Computer science ,Robotic hand ,Robot ,Emotional expression ,Humanoid robot ,Natural communication - Abstract
In a society getting older year by year, robot technology (RT) is expected to play an important role. In order to achieve this objective, the new generation of personal robots should be capable of natural communication with humans by expressing human-like emotion. In this sense, the hands play a fundamental role in communication, because they have grasping, sensing and emotional expression abilities. This paper presents the recent results of the collaboration between the Takanishi Lab of Waseda University, Tokyo, Japan, the Arts Lab of Scuola Superiore Sant’Anna, Pisa, Italy, and RoboCasa, in a biologically-inspired approach to the development of a new humanoid hand. In particular, the grasping and gestural capabilities of the novel anthropomorphic hand for humanoid robotics RCH-1 (RoboCasa Hand No.1) are presented.
37. Mechanical design of emotion expression humanoid robot WE-4RII
- Author
-
Stefano Roccella, Massimiliano Zecca, Kazuko Itoh, Maria Chiara Carrozza, Hiroyasu Miwa, Atsuo Takanishi, Paolo Dario, and Hideaki Takanobu
- Subjects
Personal robot ,Facial expression ,Expression (architecture) ,Human–computer interaction ,Computer science ,Mechanical design ,Robot ,Motion (physics) ,Humanoid robot ,Robot control - Abstract
Personal robots are expected to become popular in the future and are required to be active in joint work and community life with humans. Therefore, we have been developing new mechanisms and functions for a humanoid robot that can express emotions and communicate naturally with humans. In this paper, we present the mechanical design of the Emotion Expression Humanoid Robot WE-4RII, which was developed by integrating the Humanoid Robot Hands RCH-1 into the previous version WE-4R. The robot has four of the five human senses for detecting external stimuli (visual, tactile, auditory and olfactory) and 59 DOFs for expressing motion and emotions. It is capable of expressing seven basic emotional patterns.
38. On the development of the emotion expression humanoid robot WE-4RII with RCH-1
- Author
-
G. Cappiello, Paolo Dario, Atsuo Takanishi, H. Miwa, Hideaki Takanobu, Stefano Roccella, Koichi Itoh, M. Matsumoto, M.C. Carrozza, Massimiliano Zecca, and John-John Cabibihan
- Subjects
education.field_of_study ,Personal robot ,business.industry ,Computer science ,Population ,The arts ,Robot control ,Expression (architecture) ,Human–computer interaction ,Robot ,Artificial hand ,Artificial intelligence ,business ,education ,Humanoid robot - Abstract
Among social infrastructure technologies, robot technology (RT) is expected to play an important role in solving the problems of both the decreasing birth rate and the increasing number of elderly people in the 21st century, especially (but not only) in Japan, where the average age of the population is rising faster than in any other nation in the world. In order to achieve this objective, the new generation of personal robots should be capable of natural communication with humans by expressing human-like emotion. In this sense, human hands play a fundamental role in exploration, communication and interaction with objects and other persons. This paper presents the recent results of the collaboration between the Takanishi Lab of Waseda University, Tokyo, Japan, the Arts Lab of Scuola Superiore Sant'Anna, Pisa, Italy, and ROBOCASA, Tokyo, Japan. First, the integration of the artificial hand RTR-2 of the ARTS Lab with the humanoid robotic platform WE-4R during ROBODEX2003 is presented. Then, the paper shows the preliminary results of the development of a novel anthropomorphic hand for humanoid robotics, RCH-1 (ROBOCASA Hand No.1), and its integration into a new humanoid robotic platform, named WE-4RII (Waseda Eye No.4 Refined II).