
Hi there! I'm Juan Nino

I'm a Senior XR Developer and HCI researcher.
By combining design, AI, XR, and HCI, I craft immersive experiences that reimagine how we engage online with one another and with information.


About me

After completing my Engineering degree, I pursued a Master's in Arts, during which I developed a sound-controlled multiplayer XR video game. Since then, I have used game design to create embodied XR experiences for multidisciplinary research-creation projects in fields such as music, literature, performing and martial arts, neuroscience, and rehabilitation. My PhD research aims to make navigating the internet easier and more natural for visually impaired users through the co-design, development, and user evaluation of an open-source tactile human-computer interface. As part of this work, I am launching LibreTactile.org, a platform that connects researchers, developers, users, and enthusiasts of open-source tactile technologies for human-computer interaction.
Personally, I'm always eager to learn new things, and I value kindness, respect, and openness. I'm passionate about photography, music, and travel.

Research and Development

Tomat Navigator

Leveraging AI and tactile technologies to co-design assistive technology with visually impaired internet users.

LibreTactile.org

Orchestrating interdisciplinary collaboration for open-source touch-based assistive technologies.

XR Babel Library

Reimagining the experience of digital cultural heritage through procedurally generated books powered by semantic web, generative AI, and XR technologies.

Participatory Opera

Blending live and virtual elements to create collective immersive experiences that invite audience participation in stories about tolerance, disability, and mental health.

LabVivant

Engaging online participants in live martial arts events through embodied interactions (voice, breathing, and movement) using an AI-powered interface.

MagnaQuest

Developing a sound-controlled video game that inspires musical expression, teaches music concepts, and sparks creative collaboration among young learners.

Let's create something extraordinary together!

Connect with me on LinkedIn and GitHub
