Staff member

Klaudia Grechuta

Postdoctoral Researcher
Synthetic, Perceptive, Emotive and Cognitive Systems (SPECS)

Staff member publications

Grechuta, Klaudia, De La Torre, Javier, Rubio, Belén, Verschure, P., (2019). Challenging the boundaries of the physical self: purely distal cues in the environment impact body ownership bioRxiv, 672139

The unique ability to identify one’s own body and experience it as one’s own is fundamental to goal-oriented behavior and survival. However, the mechanisms underlying so-called body ownership are not yet fully understood. The plasticity of body ownership has been studied using two experimental methods or their variations: the Rubber Hand Illusion (RHI), where the tactile stimuli are externally generated, and the moving RHI, which involves self-initiated movements. Grounded in these paradigms, evidence has demonstrated that body ownership is a product of the bottom-up reception of self- and externally-generated multisensory information and the top-down comparison between predicted and actual sensory stimuli. Crucially, given the design of the current paradigms, where one of the manipulated cues always involves the processing of a proximal modality sensing the body or its surface (e.g., touch), the contribution of sensory signals which pertain to the environment remains elusive. Here we propose that, like any robust percept, body ownership depends on the integration and prediction of all available sensory stimuli, and therefore on the consistency of purely distal sensory signals pertaining to the environment. To test our hypothesis, we created an embodied goal-oriented task and manipulated the predictability of the surrounding environment by changing the congruency of purely distal multisensory cues while keeping bodily and action-driven signals entirely predictable. Our results empirically reveal that the way we represent our body is contingent upon all sensory stimuli, including purely distal and action-independent signals which pertain to the environment.

Grechuta, K., Rubio Ballester, B., Espín Munne, R., Usabiaga Bernal, T., Molina Hervás, B., Mohr, B., Pulvermüller, F., San Segundo, R., Verschure, P., (2019). Augmented dyadic therapy boosts recovery of language function in patients with nonfluent aphasia Stroke 50 (5), 1270-1274

Background and Purpose- Evidence suggests that therapy can be effective in recovery from aphasia, provided that it consists of socially embedded, intensive training of behaviorally relevant tasks. However, the resources of healthcare systems are often too limited to provide such treatment at sufficient dosage. Hence, there is a need for evidence-based, cost-effective rehabilitation methods. Here, we asked whether virtual reality-based treatment grounded in the principles of use-dependent learning, behavioral relevance, and intensity positively impacts recovery from nonfluent aphasia. Methods- Seventeen patients with chronic nonfluent aphasia underwent intensive therapy in a randomized, controlled, parallel-group trial. Participants were assigned to the control group (N=8) receiving standard treatment or to the experimental group (N=9) receiving augmented embodied therapy with the Rehabilitation Gaming System for aphasia. All Rehabilitation Gaming System for aphasia sessions were supervised by an assistant who monitored the patients but did not offer any elements of standard therapy. Both interventions were matched for intensity and materials. Results- Our results revealed that at the end of the treatment both groups significantly improved on the primary outcome measure (Boston Diagnostic Aphasia Examination: control group, P=0.04; experimental group, P=0.01) and the secondary outcome measure (lexical access-vocabulary test: control group, P=0.01; experimental group, P=0.007). However, only the Rehabilitation Gaming System for aphasia group improved on the Communicative Aphasia Log (P=0.01). The follow-up assessment (week 16) demonstrated that while both groups retained vocabulary-related changes (control group, P=0.01; experimental group, P=0.007), only the Rehabilitation Gaming System for aphasia group showed therapy-induced improvements in language (P=0.01) and communication (P=0.05).
Conclusions- Our results demonstrate the effectiveness of the Rehabilitation Gaming System for aphasia for improving language and communication in patients with chronic aphasia, suggesting that current challenges faced by the healthcare system in the treatment of stroke might be effectively addressed by augmenting traditional therapy with computer-based methods. Clinical Trial Registration- URL: . Unique identifier: NCT02928822.

Keywords: Aphasia, Embodied training, Neurological rehabilitation, Virtual reality

Herreros, Ivan, Miquel, Laia, Blithikioti, Chrysanthi, Nuño, Laura, Rubio Ballester, Belen, Grechuta, Klaudia, Gual, Antoni, Balcells-Oliveró, Mercè, Verschure, P., (2019). Motor adaptation impairment in chronic cannabis users assessed by a visuomotor rotation task Journal of Clinical Medicine 8 (7), 1049

Background—The cerebellum has been recently suggested as an important player in the addiction brain circuit. Cannabis is one of the most used drugs worldwide, and its long-term effects on the central nervous system are not fully understood. No valid clinical evaluations of cannabis impact on the brain are available today. The cerebellum is expected to be one of the brain structures that are highly affected by prolonged exposure to cannabis, due to its high density of endocannabinoid receptors. We aim to use a motor adaptation paradigm to indirectly assess cerebellar function in chronic cannabis users (CCUs). Methods—We used a visuomotor rotation (VMR) task that probes a putatively-cerebellar implicit motor adaptation process together with the learning and execution of an explicit aiming rule. We conducted a case-control study, recruiting 18 CCUs and 18 age-matched healthy controls. Our main measure was the angular aiming error. Results—Our results show that CCUs have impaired implicit motor adaptation, as they showed a smaller rate of adaptation compared with healthy controls (drift rate: 19.3 ± 6.8° vs. 27.4 ± 11.6°; t(26) = −2.1, p = 0.048, Cohen’s d = −0.8, 95% CI = (−1.7, −0.15)). Conclusions—We suggest that a visuomotor rotation task might be the first step towards developing a useful tool for the detection of alterations in implicit learning among cannabis users.

Keywords: Cerebellum, Cannabis, Implicit motor learning, Motor adaptation, Visuomotor rotation

Grechuta, Klaudia, Ulysse, Laura, Rubio Ballester, Belen, Verschure, Paul, (2019). Self beyond the body: task-relevant distal cues modulate performance and body ownership bioRxiv, 361022

The understanding of Body Ownership (BO) largely relies on the Rubber Hand Illusion (RHI), where synchronous stroking of the real and the Rubber Hand (RH) leads to an illusion of ownership of the RH, provided physical, anatomical, postural and spatial plausibility of the two body-parts. The RHI also occurs under visuomotor synchrony, in particular when the visual feedback of virtual arm movements follows the trajectory of the instantiated motor command. Hence, BO seems to result from the bottom-up integration of afferent and efferent proximal multisensory evidence and the top-down prediction of both externally and self-generated signals, which occurs when the predictions about upcoming sensory signals are accurate. In motor control, the differential processing of predicted and actual sensory consequences of self-generated actions is addressed by the so-called Forward Model (FM). Based on an efference copy or corollary discharge, the FM issues predictions about the sensory consequences of motor commands and compares them with the actual outcome. The discrepancies (Sensory Prediction Errors, SPEs) are used to correct the action on the subsequent trial and to provide new estimates of the current state of the body and the environment. Here, we propose that BO might be computed by FMs and might therefore depend on their consistency, specifically in contexts where the sensory feedback is self-generated. Crucially, to reduce SPEs, FMs integrate both proximal (proprioceptive) and distal (visual, auditory) sensory cues relevant to the task. Thus, if BO depends on the consistency of FMs, it would be compromised by the incongruency of not only proximal but also distal cues. To test our hypothesis, we devised an embodied VR-based task where action outcomes were signaled by distinct auditory cues. By manipulating the cues with respect to their spatiotemporal congruency and valence, we show that distal feedback which violates predictions about action outcomes compromises both BO and performance.
These results demonstrate that BO is influenced not only by efferent and afferent cues which pertain to the body itself but also by those arising outside of the body, and suggest that in goal-oriented tasks BO might result from the computation of the FM.

Vouloutsi, V., Grechuta, K., Verschure, P., (2019). Evaluation of the facial expressions of a humanoid robot Biomimetic and Biohybrid Systems 8th International Conference, Living Machines 2019 (Lecture Notes in Computer Science), Springer International Publishing (Nara, Japan) 11556, 365-368

Facial expressions are salient social features that are crucial in communication, and humans are adept at reading the messages faces convey and the emotions they display. Robots that interact with humans will need to employ similar communication channels for successful interactions. Here, we focus on the readability of the facial expressions of a humanoid robot. We conducted an online survey in which participants evaluated emotional stimuli and assessed the robot’s expressions. Results suggest that the robot’s facial expressions are correctly recognised and that the appraisal of the robot’s expressive elements is consistent with the literature.

Keywords: Emotion recognition, Facial expressions, Human-robot interaction