
A Multimodal Dataset for Authoring and Editing Multimedia Content: The MAMEM Project

Authors:
Nikolopoulos, Spiros
Petrantonakis, Panagiotis
Georgiadis, Kostas
Kalaganis, Fotis
Lazarou, Ioulietta
Adam, Katerina
Papazoglou, Anastasios
Chatzilari, Elisavet
Oikonomou, Vangelis
Kumar, Chandan
Menges, Raphael
Staab, Steffen
Müller, Daniel
Sengupta, Korok
Zeilig, Gabi
Plotnik, Meir
Gotlieb, Amihai
Kizoni, Racheli
Fountoukidou, Sofia
Ham, Jaap
Athanasiou, Dimitris
Marakakis, Agnes
Comanducci, Dario
Sabatini, Edoardo
Nistico, Walter
Plank, Markus
Kompatsiaris, Ioannis
Publication Year:
2017
Publisher:
Zenodo, 2017.

Abstract

We present a dataset that combines multimodal biosignals and eye-tracking information gathered within a human-computer interaction framework. The dataset was developed as part of the MAMEM project, which aims to endow people with motor disabilities with the ability to edit and author multimedia content through mental commands and gaze activity. The dataset includes EEG, eye-tracking, and physiological (GSR and heart rate) signals collected from 34 individuals (18 able-bodied and 16 motor-impaired). Data were collected during interaction with a specifically designed interface for web browsing and multimedia content manipulation, as well as during imaginary movement tasks. The presented dataset will contribute to the development and evaluation of modern human-computer interaction systems that foster the reintegration of people with severe motor impairments into society.
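Because the dataset is deposited on Zenodo, its files can be enumerated programmatically through the public Zenodo REST API. The sketch below is a minimal illustration, assuming the record ID 1293933 inferred from the DOI suffix; the file names and formats it prints are whatever the deposit actually contains and are not described in this record.

```python
# Minimal sketch: list the files attached to the Zenodo record for this dataset.
# Assumption: record ID 1293933 is taken from the DOI 10.5281/zenodo.1293933.
import requests

RECORD_ID = "1293933"  # from https://doi.org/10.5281/zenodo.1293933

def list_record_files(record_id: str) -> list[dict]:
    """Fetch the Zenodo record metadata and return its file listing."""
    resp = requests.get(f"https://zenodo.org/api/records/{record_id}", timeout=30)
    resp.raise_for_status()
    record = resp.json()
    # Open-access Zenodo records expose a "files" list in their metadata.
    return record.get("files", [])

if __name__ == "__main__":
    for f in list_record_files(RECORD_ID):
        # Each entry carries a file name, size in bytes, and a download link.
        print(f["key"], f["size"], f["links"]["self"])
```

Resolving the DOI itself (https://doi.org/10.5281/zenodo.1293933) redirects to the same record page, so the REST call above is simply a scripted alternative to browsing it manually.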

Details

Database:
OpenAIRE
Accession number:
edsair.doi.dedup.....42623ed48d3ee13de1f79c8beb4c4f18
Full Text:
https://doi.org/10.5281/zenodo.1293933