Publications

by Keyword: Valence



Sánchez-Fibla, M., Forestier, S., Moulin-Frier, C., Puigbò, J. Y., Verschure, P., (2020). From motor to visually guided bimanual affordance learning. Adaptive Behavior, article first published online.

The mechanisms by which the brain orchestrates multi-limb joint action have yet to be elucidated, and few computational sensorimotor (SM) learning approaches have dealt with the problem of acquiring bimanual affordances. We propose a series of bidirectional (forward/inverse) SM maps and their associated learning processes that generalize naturally from uni- to bimanual interaction (and affordances), reinforcing the motor equivalence property. The SM maps range from fully sensorimotor to purely sensory: full body control, delta SM control (through small action changes), and delta sensory co-variation (how body-related perceptual cues co-vary with object-related ones). We make several contributions regarding how these SM maps are learned: (1) Context- and Behavior-Based Babbling, which generalizes goal babbling to the interleaving of absolute and local goals, including the guidance of reflexive behaviors; (2) Event-Based Learning, in which learning steps are driven by visual and haptic events; and (3) Affordance Gradients, the vectorial field gradients along which an object can be manipulated. Our modeling of bimanual affordances is in line with current robotic research on forward visuomotor mappings and visual servoing, enforces the motor equivalence property, and is consistent with neurophysiological findings such as the multiplicative encoding scheme.
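To make the goal-babbling idea in contribution (1) concrete, the sketch below interleaves absolute goals with local (delta) goals around the current sensory state. Note that the toy planar arm, the nearest-neighbor inverse model, and all parameter values are illustrative assumptions, not the architecture proposed in the paper.

```python
# Hypothetical sketch of goal babbling with interleaved absolute and local goals,
# on a toy 2-joint planar arm. An illustrative simplification, not the authors' model.
import math
import random

def forward(action):
    """Toy forward model: 2-joint arm with unit links; action = joint angles."""
    a1, a2 = action
    return (math.cos(a1) + math.cos(a1 + a2),
            math.sin(a1) + math.sin(a1 + a2))

def nearest_inverse(dataset, goal):
    """Naive inverse model: reuse the action whose outcome was closest to the goal."""
    return min(dataset, key=lambda pair: (pair[0][0] - goal[0]) ** 2 +
                                         (pair[0][1] - goal[1]) ** 2)[1]

def goal_babbling(n_steps=1000, p_local=0.5, delta=0.1):
    action = (random.uniform(-math.pi, math.pi), random.uniform(-math.pi, math.pi))
    outcome = forward(action)
    dataset = [(outcome, action)]  # growing set of (sensory outcome, action) pairs
    for _ in range(n_steps):
        if random.random() < p_local:
            # Local (delta) goal: a small displacement of the current sensory state,
            # cf. delta SM control through small action changes.
            goal = (outcome[0] + random.uniform(-delta, delta),
                    outcome[1] + random.uniform(-delta, delta))
        else:
            # Absolute goal: sampled anywhere in the sensory space.
            goal = (random.uniform(-2, 2), random.uniform(-2, 2))
        base = nearest_inverse(dataset, goal)
        # Perturb the retrieved action slightly to explore around it.
        action = tuple(a + random.gauss(0, 0.05) for a in base)
        outcome = forward(action)
        dataset.append((outcome, action))
    return dataset
```

Sampling local goals around the current sensory state keeps exploration in reachable regions, while absolute goals push the learner toward novel parts of the sensory space.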

Keywords: Affordances, Bimanual affordances, Goal babbling, Interlimb coordination, Motor equivalence, Sensorimotor learning


López-Carral, H., Santos-Pata, D., Zucca, R., Verschure, P., (2019). How you type is what you type: Keystroke dynamics correlate with affective content. ACII 2019, 8th International Conference on Affective Computing and Intelligent Interaction, IEEE (Cambridge, UK), 1-5.

Estimating the affective state of a user during a computer task traditionally relies on either subjective reports or analysis of physiological signals, facial expressions, and other measures. These methods have known limitations, can be intrusive, and may require specialized equipment. An alternative would be to employ a ubiquitous device of everyday use such as a standard keyboard. Here we investigate whether we can infer the emotional state of a user by analyzing their typing patterns. To test this hypothesis, we asked 400 participants to caption a set of emotionally charged images taken from a standard database with known ratings of arousal and valence. We computed different keystroke pattern dynamics, including keystroke duration (dwell time) and latency (flight time). By computing the mean value of these features for each image, we found a statistically significant negative correlation between dwell times and valence, and between flight times and arousal. These results highlight the potential of using keystroke dynamics to estimate the affective state of a user in a non-obtrusive way and without the need for specialized devices.
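The feature pipeline described above reduces to a few lines of analysis code. The sketch below is a hypothetical reconstruction (the event format, function names, and use of SciPy's pearsonr are assumptions; the paper does not publish code): it computes per-caption dwell and flight times, averages them per image, and correlates the means with each image's valence and arousal ratings.

```python
# Minimal sketch of the keystroke-dynamics analysis described above.
# Hypothetical: event format, data structures, and ratings are assumptions,
# not the authors' published code.
from scipy.stats import pearsonr

# Each keystroke event: (key, press_time_s, release_time_s), ordered by press time.
def dwell_times(events):
    """Dwell time: how long each key is held down."""
    return [release - press for _, press, release in events]

def flight_times(events):
    """Flight time: latency from one key's release to the next key's press."""
    return [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]

def mean(xs):
    return sum(xs) / len(xs)

def correlate_with_ratings(captions, ratings):
    """captions: {image_id: [events]}; ratings: {image_id: (valence, arousal)}."""
    ids = sorted(captions)
    mean_dwell = [mean(dwell_times(captions[i])) for i in ids]
    mean_flight = [mean(flight_times(captions[i])) for i in ids]
    valence = [ratings[i][0] for i in ids]
    arousal = [ratings[i][1] for i in ids]
    # The paper reports negative dwell-valence and flight-arousal correlations.
    return {
        "dwell_vs_valence": pearsonr(mean_dwell, valence),
        "flight_vs_arousal": pearsonr(mean_flight, arousal),
    }
```

Averaging features per image before correlating, as the abstract describes, pools over participants and keeps the unit of analysis at the image level, where the normative valence and arousal ratings are defined.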

Keywords: Feature extraction, Correlation, Keyboards, Task analysis, Statistical analysis, Affective computing, Standards, Keystroke, Keyboard, Typing, Arousal, Valence, Affect