Above is my illustration featuring a woman caressing the face of a hologram of her husband standing in her futuristic living room, in a condominium in a metropolitan city.
The classic trope of science fiction — reaching out to touch a glowing, floating 3D object — has long remained just that: fiction. Yet recent breakthroughs are beginning to blur that boundary. Following the earlier article “Holographic Technologies” from October 2025, seven months ago, it’s time to explore the frontier: holograms that can be physically felt, grasped, poked, and manipulated. This is not mere visual spectacle, but tactile interactivity — the hologram you can touch.
The Recent Breakthroughs: Elastic Diffusers and Volumetric Touch
In April 2025, researchers from the Public University of Navarra (UPNA) announced a volumetric display that allows direct hand interaction with mid-air holographic objects. By replacing the traditional rigid diffuser in volumetric projectors with a soft, elastic diffuser, the team enabled the user to insert a finger or hand into the 3D volume without damaging the hardware.
A volumetric display normally works by rapidly oscillating a sheet (the diffuser) in depth, projecting successive 2D slices at different heights. These slices are perceived by the human eye as a 3D volume (i.e., persistence of vision). The challenge is: the diffuser is typically rigid, brittle, and moves at high speed; contacting it would break it or potentially injure the user. The UPNA team’s insight was to use elastic bands that can deform when touched, coupled with real-time image correction to compensate for distortion.
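The relationship between the projector’s frame rate, the number of depth slices, and the perceived volume refresh rate can be sketched with simple arithmetic. The 2,880 images-per-second figure is reported for the UPNA system; the 24-slice count here is an illustrative assumption, not a published specification.

```python
# Sketch: relating projector frame rate, slice count, and volume refresh
# rate for a swept-diffuser volumetric display. The slice count below is
# an illustrative assumption, not the UPNA team's published figure.

def volume_refresh_rate(projector_fps: float, num_slices: int) -> float:
    """Each full sweep of the diffuser shows every depth slice once, so
    the perceived 3D volume refreshes at projector_fps / num_slices."""
    return projector_fps / num_slices

# Example: 2,880 projected images per second split across 24 depth slices
# yields a 120 Hz volume refresh, well above the flicker-fusion threshold
# that makes persistence of vision work.
rate = volume_refresh_rate(2880, 24)
print(rate)  # 120.0
```

The trade-off is visible directly in the formula: adding depth slices improves spatial resolution along the depth axis but divides the refresh rate, which is one reason volumetric displays demand such fast projectors.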
In effect, the display becomes semi-compliant: you can push and pull virtual objects inside it. In experiments, users could “grab” a floating cube (between index finger and thumb) and rotate or slide it — or even simulate a walking motion with two fingers. The team describes this as the “come-and-interact” paradigm: a user approaches the display and naturally begins to manipulate volumetric graphics.
Media outlets have widely heralded the result. Popular Mechanics ran “Scientists Built Holograms You Can Manipulate with Your Hands” in April 2025. Futurism described how “elastic bands” enable three-dimensional manipulation. LiveScience covered the breakthrough under “In a First, Breakthrough 3D Holograms Can Be Touched, Grabbed and Poked.” Cosmos Magazine also took note in “Holograms you can touch are here.”
Earlier attempts toward tactile holograms also exist. As far back as 2015, researchers in Japan demonstrated so-called plasma voxels (“Fairy Lights”) created by femtosecond lasers, which generate shock waves perceptible by human touch. WIRED covered this in “You can actually touch these 3D holograms.” Those volumetric plasma voxels are extremely small (around 1 cm³) and far from practical scalability, but their underlying idea — creating physical sensation through light-induced pressure perturbation — remains influential.
Earlier still, mid-air haptics via ultrasound have allowed users to feel “phantom” shapes in space — though not full holograms — by focusing acoustic pressure waves. WIRED ran “Invisible 3D objects can be felt thanks to haptic holograms.” And in 2021, researchers at the University of Glasgow closed the gap further, using hand-tracking plus air pressure to simulate resistance during hologram interaction. ScienceFocus discussed this in “Power up the holodeck.”
These prior techniques are complementary and instructive, though their limitations (small scale, weak force, limited shapes) underscore the promise of the recent elastic diffuser technique.
How Does Touchable Holography Work?
To understand why the UPNA breakthrough is meaningful — and what limitations remain — we need to unpack more precisely how tactile holography might work.
1. Volumetric Rendering and Diffusers
Volumetric displays operate by layering 2D slices (image “planes”) stacked in depth. This can be done by moving a diffuser or scattering medium through the depth dimension. When synchronized with high-speed projection, each slice is illuminated in turn; the human visual system integrates them into a continuous volume. The key is that the diffuser is typically a physical medium — so previously it needed to be rigid to maintain optical fidelity.
In the new work, the diffuser is constructed from elastic bands that can flex when touched. However, deformation distorts the optical path, so the research team applies an on-the-fly compensation algorithm to correct for the deformation, preserving image integrity.
The system runs at extremely high speed, reportedly projecting 2,880 images per second across the diffuser’s depth trajectory.
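The core idea of the compensation step can be sketched as a pre-warp: if a touch displaces part of the diffuser along the optical axis, the projector shifts each pixel by the inverse of the measured displacement so the slice still lands at its intended position. This is a hypothetical one-dimensional illustration of the principle, not the team’s published algorithm.

```python
import numpy as np

# Sketch of on-the-fly deformation compensation for an elastic diffuser
# (hypothetical illustration, not the published UPNA algorithm). If a touch
# displaces the diffuser by `shift` pixels, pre-warping the image by the
# opposite amount cancels the physical displacement.

def compensate(scanline: np.ndarray, shift: np.ndarray) -> np.ndarray:
    """Resample a 1D scanline so that, after the diffuser deforms by
    `shift` pixels, the image lands back on its intended positions."""
    x = np.arange(scanline.size)
    # The viewer sees position x displaced to x + shift, so we sample the
    # source at x + shift: pre-shifting cancels the physical shift.
    return np.interp(x + shift, x, scanline)

scanline = np.array([0.0, 0.0, 1.0, 0.0, 0.0])  # one bright pixel at index 2
shift = np.full(5, 1.0)                          # uniform 1-pixel sag
pre_warped = compensate(scanline, shift)
print(pre_warped)  # bright pixel pre-shifted one position left, to index 1
```

A real system would measure a full 2D displacement field per frame and apply the warp fast enough to be imperceptible, which is why the latency requirements discussed below are so demanding.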
2. Sensing and User Feedback
For a hologram to respond to touch, the system must detect where and how the user’s hand penetrates or contacts the volume. In the UPNA setup, cameras or depth sensors likely track finger poses relative to the projected volume, coordinating the selection and transformation of virtual objects. (The public announcement does not fully detail the sensor architecture.)
When the user physically displaces a virtual object, the system updates the rendering accordingly: the object moves, rotates, and interacts with other virtual elements. The user sees this in real time, and, due to the elastic diffuser mechanism, feels a subtle resistance or response to their gesture.
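Since the announcement does not detail the sensor pipeline, the tracking logic can only be illustrated generically. The sketch below assumes a tracker that reports 3D fingertip positions and implements the “grab a cube between index finger and thumb” interaction described above; all thresholds and coordinates are invented for illustration.

```python
import math

# Generic sketch of pinch-to-grab logic for a mid-air volumetric display,
# assuming a hand tracker that reports 3D fingertip positions in cm.
# Thresholds and geometry are illustrative assumptions.

PINCH_THRESHOLD = 2.0  # cm between thumb and index tips to count as a pinch

def update_object(obj_pos, thumb, index):
    """If the thumb and index fingertips pinch around the object, move
    the object to the pinch midpoint; otherwise leave it in place."""
    midpoint = tuple((t + i) / 2 for t, i in zip(thumb, index))
    pinching = math.dist(thumb, index) < PINCH_THRESHOLD
    grabbing = pinching and math.dist(midpoint, obj_pos) < PINCH_THRESHOLD
    return midpoint if grabbing else obj_pos

cube = (0.0, 0.0, 10.0)
# Fingers close around the cube: the cube snaps to the pinch midpoint.
cube = update_object(cube, thumb=(-0.5, 0.0, 10.0), index=(0.5, 0.0, 10.0))
print(cube)  # (0.0, 0.0, 10.0)
```

Each tracker update would feed a call like this, and the renderer would redraw the volume with the object at its new pose, closing the see-and-feel loop.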
3. Haptic Feedback and Force Simulation
One of the limitations is that the elastic diffuser itself offers only a weak mechanical “feel.” It cannot provide strong force feedback or simulate rigidity. To augment the tactile experience, future systems may incorporate ultrasound arrays, electrostatic actuators, or even microfluidic jets to push back against users’ fingertips.
This approach is reminiscent of mid-air haptics systems (e.g. Ultrahaptics) that deploy phased ultrasonic transducer arrays to create pressure nodes in mid-air, yielding tactile sensations without contact. Some mixed-reality systems already layer visual and haptic data: in one prototype, a user can see and feel a holographic beating heart, synchronized via ultrasound and biosensing.
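The focusing principle behind such phased arrays is straightforward: each transducer fires with a phase delay proportional to its distance from the focal point, so all wavefronts arrive there in phase and their pressure adds constructively. A minimal sketch, with an illustrative 5-element array geometry:

```python
import math

# Sketch of phased-array focusing, the principle behind mid-air haptics
# systems such as Ultrahaptics. Array geometry is an illustrative assumption.

SPEED_OF_SOUND = 343.0   # m/s in air
FREQ = 40_000.0          # Hz, a typical mid-air haptics transducer frequency
WAVELENGTH = SPEED_OF_SOUND / FREQ

def focus_phases(transducers, focal_point):
    """Emission phase (radians, modulo 2*pi) for each transducer so that
    all waves arrive at the focal point in phase."""
    phases = []
    for t in transducers:
        d = math.dist(t, focal_point)
        # A wave travelling distance d accumulates 2*pi*d/lambda of phase;
        # emitting with the negative of that makes the arrival phase zero.
        phases.append((-2 * math.pi * d / WAVELENGTH) % (2 * math.pi))
    return phases

array = [(x / 100, 0.0, 0.0) for x in range(-2, 3)]   # 5 elements, 1 cm pitch
phases = focus_phases(array, focal_point=(0.0, 0.0, 0.15))  # 15 cm above
```

By recomputing these phases on the fly, the pressure point can be swept across the fingertip to trace out shapes, which is how “phantom” surfaces are rendered without any contact.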
Thus, a full “hologram you can touch” is really a hybrid of volumetric optical display + detection/tracking + haptic stimulus. The recent UPNA work is a milestone in the volumetric + tracking domain; haptic intensity is still evolving.
Challenges on the Path
Despite the excitement, there remain substantial challenges before touchable holograms become practical for consumer or industrial use.
Optical & Mechanical Complexity
- Resolution / brightness / contrast: Volumetric systems are inherently limited by the trade-off between slice count, brightness, and refresh rate. High resolution across depth is demanding.
- Material durability: Elastic diffusers must be optically transparent, have low hysteresis, and long mechanical lifetime. Repeated deformation may cause fatigue or scattering artifacts.
- Deformation correction latency: The system must detect deformations and correct imagery in real time, without perceptible lag. High-speed computation is needed.
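A back-of-the-envelope calculation shows how tight the latency budget is. If correction is applied per projected image at the reported rate of 2,880 images per second, the entire sense-compute-correct loop must fit in one frame slot:

```python
# Latency budget per projected slice at the reported 2,880 images/s.
# Assumes correction is applied per image, which is an illustrative
# simplification of the real pipeline.

IMAGES_PER_SECOND = 2880
frame_budget_us = 1_000_000 / IMAGES_PER_SECOND
print(round(frame_budget_us, 1))  # 347.2 microseconds per projected slice
```

Roughly a third of a millisecond per image leaves little room for camera readout, deformation estimation, and image warping, motivating the GPU/FPGA pipelines mentioned in the roadmap.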
Haptics Limitations
- Force magnitude: Ultrasonic-based systems produce gentle tactile pressure, inadequate for simulating heavy or rigid objects.
- Directional cues and texture: Replicating complex surface textures or directional shear forces is nontrivial.
- Safety and ergonomics: Haptic arrays must be safe for skin contact or proximity and avoid fatigue.
Scalability & Field Size
- Current prototype volumes are modest. Scaling to larger display sizes or spaces (e.g. room-scale holograms) would multiply optical, mechanical, and computational demands.
Robustness & Cost
- For consumer or industrial adoption, cost, maintenance, calibration stability, and component reliability must improve significantly.
Integration with AR / VR / Mixed Reality
- Many holographic use cases are intertwined with AR or MR systems (e.g., HoloLens, Holoportation). Bridging immersive headsets with truly free-space interactive holograms is nontrivial.
Potential Applications & Use Cases
When tactile holography matures, it could transform numerous domains:
- Design and engineering: Architects, product designers, and engineers could manipulate 3D models physically in shared space, adjusting form and structure in real time.
- Medical / surgical planning: Surgeons might interact with anatomical holograms, exploring virtual organs or planning interventions with both vision and touch.
- Education & museums: Students could hold and examine holographic models of molecules, fossils, or mechanical systems, rather than just viewing 2D projections.
- Remote collaboration / telepresence: Imagine shaking the holographic hand of a remote colleague or physically manipulating shared remote models. One proposed system, HoloBots, combines mobile micro-robots with holographic avatars to allow physical interaction across distance. Another, HoloTouch, allows interaction with mixed reality visualizations through smartphone proxies.
- Consumer entertainment & games: Holographic interfaces could allow natural gesture-based gameplay — catching, throwing, molding virtual objects mid-air.
- Public displays / kiosks: Touchless interactive volumetric displays could respond to user gestures without requiring physical contact — or infection risk.
- Industrial / robotics control: Operators might manipulate virtual control panels or spatial schematics in mid-air linked to machinery.
Speculative Use-Case: Touchable Holography and Project Blue Beam
While few take it seriously, the conspiracy theory known as Project Blue Beam envisions governments deploying huge, immersive holograms to simulate supernatural or alien events in the sky. Historically, that has been seen as fantasy, like Star Wars. But with recent advances in tactile holography — holograms you can literally reach out and touch — the possibility is no longer purely science fiction.
Although no credible evidence exists that any government has built such a system publicly, one must admit: the possibility of a secret deployment cannot be dismissed outright. Advanced holographic technologies are still in early stages, and real-world governmental budgets, classified laboratories, and military R&D efforts are often shielded from public scrutiny. The same breakthroughs in elastic volumetric displays or mid-air haptics that we see in university labs could, in theory, be scaled or adapted under closed, classified programs.
Thus, while the public record shows zero proof of Blue Beam-style systems today, we should entertain the cautious stance that highly resourced actors might already be experimenting with, or even deploying, holographic illusions covertly. That possibility — however remote — underscores the importance of transparency, oversight, and public awareness as these technologies advance.
Roadmap & Future Prospects
The current UPNA prototype marks a landmark — but it is early. What lies ahead?
- Stronger haptics integration: To simulate rigid objects, more advanced actuators (ultrasound, electrostatic fields, magnetic feedback, or microfluidics) must be blended into the volumetric display architecture.
- Material and optics optimization: The search for durable, optically neutral, low-friction elastic materials is critical. Nanocomposite elastomers or metamaterials may play a role.
- Larger volumes / modular displays: Scaling these systems to room-sized or ambient displays will require modular tiling, synchronization, and optical stitching.
- Lower latency & faster computation: Real-time deformation sensing and correction demand ultra-low-latency computation pipelines and GPU/FPGA optimization.
- Integration with AR/VR ecosystems: Holographic-touch systems may coexist or interoperate with headset-based AR, bridging head-mounted and free-space modalities.
- User interface frameworks: New interaction metaphors — “grab,” “pinch,” “mold” — must be formalized for designers and developers.
- Commercial viability: As costs shrink and reliability improves, early adopters in enterprise, design, medical imaging, and entertainment may drive adoption.
In 2026–2030, we may see semi-commercial “holographic touch displays” in premium design studios, museums, or collaborative labs. Over the following decade, as haptic fidelity improves, a consumer-grade “touchable hologram screen” or table might enter households.
Implications & Speculative Impacts
A hologram you can touch would shift the paradigm of human–machine interaction. Rather than mediating gestures through controllers or headsets, we may operate directly in 3D space. The boundary between “virtual” and “physical” blurs: one could sculpt digital clay, repair virtual machinery, or co-create in shared holographic space. In remote work contexts, tactile holograms might one day allow a designer in Bangkok to hand-manipulate a model being viewed by a collaborator in New York — feeling the same geometry. This is not far from the “holodeck” concept that has driven popular imagination.
Of course, we must guard against overhype. Many early demonstrations will remain confined to laboratory settings. The transition from prototype to product is long, and user experience will make or break these systems. Moreover, ergonomic, computational, and economic constraints may delay adoption beyond initial hype cycles.
Conclusion
The notion of a hologram you can touch — long a staple of science fiction — is inching toward reality. The UPNA group’s elastic volumetric display is a compelling hardware-first advance, enabling basic touch-like interactions with floating 3D objects. While the tactile experience is still subtle, and many hurdles remain, this development pushes the field from “illusion” toward interaction.
Over the coming years, success will depend on marrying advanced haptics, scalable optics, real-time computation, and user interface design. If those pieces fall into place, we may soon live in a world where light is not just visual content, but a tactile medium — a hologram you can truly feel.