

Perceptual Engineering: How can we engineer human perception? - fika-to-fika in Lund


When: 27 April, 10:00 to 15:00 CEST

Location: Teknodromen, M-building LTH, Klas Anshelms väg 6, Lund

Registration: It is free to participate, but please sign up at ai.lu.se

Programme (prel.)

10:00-10:15 Fika

10:15-10:20 Welcome 
Emma Söderberg or Amir Aminifar

10:20-11:00 Invited talk: Expanding Human & Computer Senses through Perceptual Engineering

Speaker: Jas Brooks, Assistant Professor of Electrical Engineering and Computer Science at MIT

Abstract: What if sensory experience were as adjustable as a phone setting: dialing down sweetness to eat healthier, modulating perceived temperature for comfort, or extending smell to detect invisible hazards? Today's interfaces fall short of this vision: integrating intimate senses like taste, touch, temperature, and smell remains limited by power inefficiency, miniaturization constraints, and poor sensory specificity.

I argue that realizing this future requires perceptual engineering: designing interfaces that precisely target the body's sensory mechanisms to alter perception in controlled, reproducible ways. My work pursues this through two main threads. The first is chemical interfaces: wearable systems that act directly on the body's chemical pathways rather than relying on mechanical stimulation or sensory substitution. This approach yields power-efficient thermal feedback (CHI'20 Best Paper), rich haptic variety from a single miniaturized actuator (UIST'21), and selective taste modification such as sweetness suppression to support healthier diets (UIST'23). The second thread demonstrates that perceptual engineering generalizes across modalities: electrical stimulation of the septum evokes smell-like sensations for gas detection and localization (CHI'21), and thermal modulation of the nose reshapes perceived airflow to support anxiety management and mask compliance (UIST'24).

Together, these systems show that by targeting sensory mechanisms directly, we can move beyond the limitations of conventional interfaces toward a world where people have genuine, programmable agency over their own perception to help with their wellbeing.

Bio: Jas Brooks is an Assistant Professor of Electrical Engineering and Computer Science at MIT, where they lead the Perceptual Engineering Lab within the Computer Science and Artificial Intelligence Laboratory (CSAIL). They received a Ph.D. (2025) and a B.S. (2016) in Computer Science from the University of Chicago. Their work has appeared at top HCI venues, received Best Paper Awards, and attracted media coverage from outlets such as WIRED and Fast Company. During their doctorate, Jas was a Rising Star in EECS, a Siebel Scholar, and an NSF Graduate Research Fellow.


11:00 Trustworthy Perceptual AI for Human Augmentation [publication]
Speaker: Fatemeh Akbarian, Secure and Networked Systems, EIT, Lund University

11:30 Digital perception of scent with BRIAN, an electronic nose
Speaker: David Kadish, School of Arts and Communication, Malmö University

Abstract: In a few key perceptual fields, namely vision and hearing, there is a rich digital environment for capturing and describing sensory information, along with a relatively advanced understanding of how these digital representations map to human experiences of the same stimuli. For the sense of smell, we have barely begun to scratch the surface.

Electronic noses have existed since at least the 1980s but, for a range of reasons, have never achieved the ubiquity of cameras and microphones. This has rendered us digitally anosmic, unable to engage with the olfactory world through computing, and has left a rich sensory world broadly unexplored.

In this talk, I will discuss some of the factors that make digital olfaction challenging. I will present an e-nose that we have been developing for the last couple of years, as well as some preliminary testing that we have done.
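
To make the general idea concrete, here is a minimal sketch of the standard e-nose pipeline. It is an illustration only and assumes a generic metal-oxide (MOX) sensor array; it does not describe BRIAN's actual hardware or software. Sensor resistances are baseline-normalized into an odour "fingerprint" vector, and fingerprints are compared by cosine similarity:

# Illustrative sketch only; BRIAN's actual design is not described here.
# Generic e-nose pipeline: a metal-oxide (MOX) sensor array responds to
# volatiles, readings are baseline-normalized into a "fingerprint"
# vector, and fingerprints are compared by similarity.
import numpy as np

def fingerprint(resistances: np.ndarray, baseline: np.ndarray) -> np.ndarray:
    """Relative resistance change per channel (hypothetical sensor model)."""
    response = (baseline - resistances) / baseline  # MOX resistance drops on exposure
    norm = np.linalg.norm(response)
    return response / norm if norm > 0 else response

def similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two unit-length odour fingerprints."""
    return float(np.dot(a, b))

# Fabricated example: an 8-channel array sampled in clean air vs. two samples.
baseline = np.array([220.0, 180.0, 310.0, 150.0, 275.0, 190.0, 240.0, 205.0])
coffee   = np.array([140.0, 120.0, 260.0,  90.0, 230.0, 110.0, 200.0, 150.0])
citrus   = np.array([200.0, 100.0, 300.0, 140.0, 150.0, 180.0, 120.0, 190.0])

fp_coffee = fingerprint(coffee, baseline)
fp_citrus = fingerprint(citrus, baseline)
print(f"coffee vs citrus similarity: {similarity(fp_coffee, fp_citrus):.2f}")

In practice, much of the difficulty the talk addresses lies upstream and downstream of such a pipeline: sensor drift, environmental confounds, and mapping these vectors onto human descriptions of smell.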

Bio: David Kadish is a researcher and interaction designer whose work explores non-visual senses. His research spans ecoacoustics (developing novel visualizations and tools for analyzing environmental sound) and the emerging field of digital olfaction, including electronic nose (e-nose) technology for the exploration of smells and smellscapes. He is based at Malmö University, where he also teaches interaction design.


12:00-13:00 Lunch


13:00 The Future of In-Motion, Attention-Aware Interaction

Speaker: Morten Fjeld, professor of Human-Computer Interaction (HCI) at the University of Bergen (Norway) and Chalmers University of Technology (Sweden)

Abstract: Human–Computer Interaction (HCI) has evolved from static, desk-bound systems to mobile, touch-based technologies. In today’s attention-demanding environment, many systems are designed to capture user attention, often increasing distraction and cognitive load while reducing engagement with the physical world. We argue for a new design paradigm that better balances digital interaction with real-world engagement, especially during movement.
Although devices like laptops, tablets, and smartphones are portable, they are typically used while stationary, as in-motion use can compromise safety. While prior HCI research has explored multitasking, attentive interfaces, and intent prediction, limited work has addressed attention-preserving interaction in dynamic contexts such as walking, cycling, or driving.
Guided by cognitive principles, our project will develop a framework that couples user attention and intent to support in-motion interaction. It will integrate modalities such as eye tracking, wearable sensing, motion capture, voice input, VR/AR, and real-time scene imaging to enable novel interfaces. Through extensive indoor and outdoor studies in ecologically valid settings, the project aims to establish design principles for future in-motion HCI systems. Methods will include user studies, intent and distraction modelling, controlled VR experiments, and in-the-wild AR evaluations, building on prior work in mobile and wearable interfaces, gaze-based systems, and redirected walking.
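
As a concrete, entirely hypothetical illustration of attention-preserving interaction, the toy heuristic below defers non-urgent notifications when sensed motion and gaze dispersion suggest attention is committed to the physical world. The weights and thresholds are assumptions for illustration, not the project's actual model:

# Toy heuristic (our assumption, not the project's framework): defer
# non-urgent notifications when sensed motion and scattered gaze suggest
# the user's attention is committed to the physical environment.
from dataclasses import dataclass

@dataclass
class SensedState:
    walking_speed_mps: float   # e.g. from wearable IMU / step detection
    gaze_dispersion: float     # 0 = steady fixation, 1 = highly scattered

def attention_budget(state: SensedState) -> float:
    """Crude score in [0, 1]: how much attention is free for the device."""
    motion_load = min(state.walking_speed_mps / 2.0, 1.0)  # ~2 m/s = brisk walk
    scan_load = state.gaze_dispersion                       # scanning the scene
    return max(0.0, 1.0 - 0.6 * motion_load - 0.4 * scan_load)

def should_deliver(urgency: float, state: SensedState) -> bool:
    """Deliver now only if urgency outweighs the cost of interrupting."""
    return urgency > 1.0 - attention_budget(state)

crossing_street = SensedState(walking_speed_mps=1.8, gaze_dispersion=0.9)
print(should_deliver(urgency=0.3, state=crossing_street))   # False: defer
print(should_deliver(urgency=0.95, state=crossing_street))  # True: interrupt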

Bio: Morten Fjeld is a professor of Human-Computer Interaction at the University of Bergen and Chalmers University of Technology. His research focuses on tangible and tabletop interaction, exploring new ways people engage with digital systems. He founded the t2i Interaction Lab at Chalmers in 2005. He holds dual MSc degrees from NTNU and ENSIMAG, and a PhD from ETH Zurich, where he received the ETH Medal in 2002 for his thesis “Designing for Tangible Interaction.” He has been a visiting professor at NUS Singapore, Tohoku University, and ETH Zurich, and has broad industry experience in fluid mechanics, simulators, and user interface design across research and applied settings.

Full bio: https://www4.uib.no/finn-ansatte/morten.fjeld
https://www.chalmers.se/en/persons/fjeld/


13:35 GazePrinter: Visualizing Expert Gaze to Guide Novices in a New Codebase 
Speaker: Peng Kuang, Software Development and Environments, Lund University

Abstract: Program comprehension is an essential activity in software engineering. Not only does it often challenge professionals, but it can also hinder novices from advancing their programming skills. Gaze, an emerging modality in developer tools, has so far primarily been utilized to improve our understanding of programmers' visual attention and as a means to reason about programmers' cognitive processes. There has been limited exploration of integrating gaze-based assistance into development environments to support programmers, despite the tight links between attention and gaze. We also know that joint attention is important in collaboration, further suggesting that there is value in exploring collective gaze.

In this paper, we investigate the effect of visualizing gaze patterns gathered from experts for novice programmers, to assist them with program comprehension in a new codebase. To this end, we present GazePrinter, designed to provide gaze-orienting visual cues informed by experts to aid novices with program comprehension. We present the results of a mixed-methods study conducted with 40 novices to study the effects of using GazePrinter for program comprehension tasks. The study included a survey, a controlled experiment, and interviews. We found that visualizing expert gaze can have a significant effect on novice programmers' behavior in terms of which path they take through the codebase: with GazePrinter, novices took a path closer to the one taken by experts. We also found indications of reduced time and cognitive load among novices using GazePrinter.
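
For readers unfamiliar with gaze visualization, the sketch below shows the kind of aggregation such a tool might perform: summing expert fixation durations per source line and normalizing them into highlight intensities. The data format and processing are hypothetical and are not GazePrinter's actual implementation:

# Hypothetical sketch of the core idea behind expert-gaze visualization:
# aggregate expert fixation durations per source line, then expose the
# lines experts dwelt on as visual cues for a novice.
from collections import defaultdict

# (file, line, fixation duration in ms) - fabricated example fixations.
expert_fixations = [
    ("parser.py", 12, 850), ("parser.py", 12, 640),
    ("parser.py", 30, 420), ("lexer.py", 5, 980),
]

heat = defaultdict(float)
for path, line, dur_ms in expert_fixations:
    heat[(path, line)] += dur_ms

# Normalize to [0, 1] so cues can map to highlight intensity in an IDE.
peak = max(heat.values())
cues = {loc: dur / peak for loc, dur in heat.items()}

for (path, line), intensity in sorted(cues.items(), key=lambda kv: -kv[1]):
    print(f"{path}:{line}  gaze intensity {intensity:.2f}")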

13:55 Efficient AI for wearables (prel.)
Speaker: Baichuan Huang, Secure and Networked Systems, EIT, Lund University

14:15-14:45 Panel discussion


14:45-15:00 Fika



Contact
emma [dot] soderberg [at] cs [dot] lth [dot] se