
Brain–Computer Interface-driven artistic expression: real-time cognitive visualization in the pangolin scales animatronic dress and screen dress
In an innovative study that combines neuroscience, fashion design, and interactive technology, researchers from Johannes Kepler University Linz and g.tec medical engineering have developed two experimental garments that respond to brain activity in real time. These creations — the Screen Dress and the Pangolin Scales Animatronic Dress — demonstrate how brain-computer interfaces (BCIs) can be used not just for clinical or research purposes, but for artistic and expressive applications as well.
Reimagining the Brain–Computer Interface
Brain-computer interfaces typically allow users to control external systems using neural signals, often in medical contexts such as communication tools for individuals with paralysis. In contrast, this study shifts the focus from control to expression. The garments don’t ask the brain to command devices—they instead reflect what’s happening in the brain, turning the wearer’s mental state into a visual and kinetic experience.
This new direction proposes a more emotionally resonant use of BCI: as a tool for self-awareness, performance, and public engagement with neuroscience.
The Screen Dress: Showing Focus Through Digital Eyes
The first prototype, called the Screen Dress, uses a 4-channel dry EEG headband to measure the wearer’s level of engagement — a mental state often associated with attention or concentration. This EEG data is processed in real time and controls animated eyes displayed on small screens embedded in the dress.

Figure 1. Unicorn BCI Core-4: 4-channel EEG headband device utilizing dry electrode technology.
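The article does not go into implementation detail, but the processing chain is straightforward to sketch. Below is a minimal, hypothetical Python version of the real-time loop: `read_window` stands in for the headband’s streaming API, and the classic beta/(alpha + theta) ratio serves as an illustrative engagement proxy, not the trained classifier described below.

```python
import numpy as np
from scipy.signal import welch

FS = 250        # assumed sampling rate (Hz); not stated in the article
WIN_S = 2       # length of the sliding analysis window, in seconds
CHANNELS = 4    # the Screen Dress headband has 4 dry electrodes

def read_window():
    """Hypothetical stand-in for the headband's streaming API.
    Returns a (CHANNELS, FS * WIN_S) array; synthetic noise here."""
    return np.random.randn(CHANNELS, FS * WIN_S)

def band_power(window, lo, hi):
    """Mean Welch PSD power across channels in the [lo, hi) Hz band."""
    freqs, psd = welch(window, fs=FS, nperseg=FS)
    return psd[:, (freqs >= lo) & (freqs < hi)].mean()

def engagement_proxy(window):
    """Classic beta / (theta + alpha) ratio -- an illustrative proxy,
    not the trained classifier the researchers used."""
    theta = band_power(window, 4, 8)
    alpha = band_power(window, 8, 12)
    beta = band_power(window, 13, 30)
    return beta / (theta + alpha + 1e-12)

# Exponential smoothing keeps the animated eyes from jittering.
score = 0.0
for _ in range(5):  # stand-in for the continuous real-time loop
    score = 0.8 * score + 0.2 * engagement_proxy(read_window())
```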
As the wearer becomes more focused, the eyes open wider or shift gaze. When attention drops, the eyes may droop, blink more slowly, or look away. The entire system is wearable, wireless, and mobile — capable of being used in everyday settings or public demonstrations.
The engagement detection is powered by machine learning algorithms trained on attention-focused tasks (e.g., the d2 concentration test). By mapping the classification output to visual behavior, the researchers created a feedback loop where internal mental shifts become part of an outward display.
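One plausible way to turn such a classifier’s output into eye behavior is a simple parameter mapping. The thresholds and ranges below are invented for illustration:

```python
def eye_state(score, lo=0.2, hi=1.5):
    """Map a smoothed engagement score onto eye-animation parameters.
    All thresholds and ranges here are invented for illustration."""
    t = min(max((score - lo) / (hi - lo), 0.0), 1.0)  # clamp to [0, 1]
    return {
        "openness": 0.3 + 0.7 * t,          # eyes widen as focus rises
        "blink_interval_s": 6.0 - 4.0 * t,  # focused eyes blink less often
        "gaze_centered": t > 0.5,           # gaze drifts when attention drops
    }
```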
The Screen Dress highlights how even a relatively low-resolution EEG setup can be meaningfully applied when paired with strong visual metaphors and real-time feedback design.
The Pangolin Scales Animatronic Dress: A Brain-Controlled Kinetic Sculpture
The second, more complex prototype is the Pangolin Scales Animatronic Dress. This dress uses a high-density 1,024-channel EEG system to capture detailed brainwave patterns across different regions of the scalp. It then maps that brain activity to a series of 36 animatronic “scales” that move and light up.
The design of the dress is inspired by pangolin skin, with overlapping mechanical panels embedded with LEDs and servo motors. Each panel responds to the brainwave frequency band assigned to it:
- Theta waves (4–8 Hz) trigger slow, fluid movements and soft purple lighting, representing calm or meditative states.
- Alpha waves (8–12 Hz) generate gentle blue waves of motion, often linked to relaxed wakefulness.
- Beta waves (13–30 Hz) produce sharp, flickering white movements and light, reflecting cognitive effort or alertness.

Figure 11. The three dress states with the dress mounted on a mannequin. (A) Theta–meditation, creativity (purple), (B) Alpha–relaxed, awake (blue), (C) Beta–alertness, stress (white).
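In code, this band-to-state logic might look like the following sketch, which picks the dominant band in each EEG window and returns the matching color. The sampling rate, spectral-estimation settings, and exact RGB values are assumptions:

```python
import numpy as np
from scipy.signal import welch

FS = 500  # assumed sampling rate of the high-density EEG system

# Band limits and LED colours as described in the list above; the motion
# style (fluid / wavelike / flickering) would follow the same label.
BANDS = {
    "theta": ((4, 8),   (128, 0, 255)),   # calm: slow motion, purple light
    "alpha": ((8, 12),  (0, 64, 255)),    # relaxed: gentle waves, blue light
    "beta":  ((13, 30), (255, 255, 255)), # alert: sharp flicker, white light
}

def dress_state(window):
    """Return (band_label, rgb) for the band with the highest mean power.
    `window` is a (channels, samples) EEG array."""
    freqs, psd = welch(window, fs=FS, nperseg=FS)
    power = {
        name: psd[:, (freqs >= lo) & (freqs < hi)].mean()
        for name, ((lo, hi), _) in BANDS.items()
    }
    best = max(power, key=power.get)
    return best, BANDS[best][1]
```

A controller loop would then push the returned color, along with a matching servo pattern, out to the scales.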
Each group of brainwave frequencies is mapped to different regions of the dress based on corresponding brain regions. For instance, activity in the frontal cortex might control movement in the front scales, while occipital activity affects the back.
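A hypothetical version of that spatial routing, reducing the 1,024 channels to 36 per-scale drive signals (the study’s actual electrode-to-scale assignment is not reproduced here):

```python
import numpy as np

N_SCALES = 36
N_CHANNELS = 1024

# Hypothetical electrode-to-scale assignment: split the 1,024 channels into
# 36 contiguous groups. In the real dress, groups would follow scalp anatomy
# (frontal electrodes -> front scales, occipital electrodes -> back scales).
scale_groups = np.array_split(np.arange(N_CHANNELS), N_SCALES)

def scale_drive_signals(window):
    """Average each scale's assigned channels into one drive signal.
    `window` is a (N_CHANNELS, samples) EEG array."""
    return np.stack([window[idx].mean(axis=0) for idx in scale_groups])
```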
This dress functions as a living, moving data sculpture — a visual and tactile representation of a person’s cognitive rhythms.

Figure 3. (A) Project 1: screen dress (©Anouk Wipprecht), (B) Project 2: pangolin scales dress (©Yanni de Melo).
From Research to Expression
Both garments demonstrate a shift in how brain-computer interfaces can be perceived and used. Rather than requiring users to “do” something with their minds (such as control a cursor or robotic arm), these garments simply show what the mind is already doing.
According to the authors, this form of neuroexpression opens up new possibilities in design, performance art, education, and interactive exhibitions. Because the dresses are portable and can operate in real-world settings, they also offer a compelling way to engage the public in understanding their own mental states and brainwave activity.
This approach could inspire new applications, such as:
- Interactive learning tools that show students how their focus changes during tasks.
- Therapeutic feedback devices that help users visualize stress or mindfulness.
- Art installations and performances where thought and emotion are made visible.
Conclusion: Wearable Neuroscience as Art
The Screen Dress and Pangolin Scales Dress represent a novel approach to wearable technology—one that moves beyond pure utility toward emotional and cognitive storytelling. By giving form to thought, they help us consider how much of our inner lives we could—or should—externalize.
These brain-responsive garments aren’t just high-tech clothing. They are a new kind of interface between mind and world — one that invites us to observe and reflect on the invisible processes that shape our everyday experiences.