The augmented reality (AR) glasses market is emerging as the next frontier in personal computing, presenting a massive opportunity. Think about it: we've witnessed an incredible evolution from room-sized computers to laptops, then to smartphones that fit in our pockets, and now we're standing at the threshold of having AI directly integrated into our field of vision. This isn't just an incremental upgrade; it's a quantum leap in human-computer interaction that could redefine how we engage with digital information.
Picture this: an AI assistant like Gemini seamlessly helping you navigate unfamiliar cities, instantly translating foreign languages, or providing real-time information overlays — all without ever needing to reach for your phone. This is the compelling vision of truly hands-free, immersive computing that AR glasses manufacturers are racing to deliver.
However, here's the challenge that keeps product managers up at night: How do you create intuitive controls for a device where users can't see what they're touching?
Current AR glasses face a fundamental UX problem that's hampering adoption. Most rely on capacitive touch surfaces embedded in the temples. Sleek and minimal, yes, but they create a frustrating "blind interaction" scenario. Users are left questioning every input: Did my tap actually register? Am I touching the right spot? Should I press harder or softer? Was that a successful gesture, or did I just accidentally trigger something?
This uncertainty destroys user confidence and breaks the immersive experience that AR promises to deliver. When users start second-guessing basic interactions, they lose trust in the entire system. Some even resort to pulling out their phones to use companion apps, completely defeating the purpose of hands-free AR interaction.
The stakes are high. Consumer expectations have been shaped by the iPhone's intuitive touch interface and the seamless interactions we've come to expect from our devices. AR glasses need to meet or exceed these expectations to achieve mainstream adoption.
Enter solid-state piezoelectric button technology, a solution that could solve AR's interaction problem while unlocking new possibilities for user experience design. Boreas Technologies has developed what might be the missing piece of the AR puzzle: buttons that provide instant, localized haptic feedback without any moving parts.
Think of solid-state buttons (SSBs) as the next evolution of touch interfaces. By combining force sensors with advanced haptic technology, they transform simple touch surfaces into sophisticated, multi-functional controls that users can operate with confidence, even when they can't see them.
Every interaction delivers immediate, localized feedback that confirms your action was registered. No more guessing, no more uncertainty. This haptic confirmation is private and instantaneous, unlike voice commands that can disturb others or gesture controls that drain battery life. When you can't see what you're touching, this tactile feedback becomes your primary navigation system.
Consumer trust is everything in emerging tech markets. SSBs eliminate the "did that work?" moment that kills user confidence. When interactions feel reliable and predictable, users tend to embrace the technology instead of fighting against it. This level of trust is crucial for transitioning AR glasses from early adopter gadgets to mainstream consumer products.
Here's where it gets interesting: one SSB can function as multiple controls. A single touch area can act as a button, a slider, or even a multi-directional input device. Imagine controlling camera zoom with a slide gesture, adjusting volume with pressure sensitivity, or navigating menus with intuitive swipes, all with distinct haptic feedback patterns that your fingers instantly recognize.
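To make the multi-function idea concrete, here is a minimal sketch of how firmware might interpret one touch surface as several controls. The sample format (position in millimeters, force in newtons) and all thresholds are illustrative assumptions, not Boreas specifications.

```python
# Illustrative sketch: one SSB surface acting as slider and button.
# Sample format (position_mm, force_n) and thresholds are assumptions.

def interpret_gesture(samples: list[tuple[float, float]]) -> str:
    """Classify a touch from (position_mm, force_n) samples:
    a slide if the finger travels, a press if force spikes in place."""
    positions = [p for p, _ in samples]
    peak_force = max(f for _, f in samples)
    travel = max(positions) - min(positions)
    if travel > 5.0:  # finger moved along the surface: slider behavior
        return "slide_right" if positions[-1] > positions[0] else "slide_left"
    if peak_force > 1.5:  # stationary, deliberate pressure: button behavior
        return "press"
    return "none"
```

A real implementation would also consider timing and velocity, but the core idea is the same: one sensor stream, several logical controls, each confirmed with its own haptic pattern.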
Unlike traditional linear resonant actuators (LRAs) that can make entire glasses frames vibrate uncomfortably, piezoelectric haptic technology delivers feedback precisely where you touch. This enables comfortable extended wear, crucial for all-day usage scenarios. The technology offers exceptional clarity and responsiveness, with a wider frequency bandwidth and lightning-fast response times, enabling complex and customizable haptic effects.
SSBs distinguish between intentional presses and accidental contact through intelligent force detection. Light touches, such as adjusting your glasses or casual finger placement, won't trigger actions; only deliberate pressure above a configurable threshold activates controls. This eliminates the frustrating false inputs that plague basic capacitive systems, creating a more intentional and controlled user experience.
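The force-filtering logic described above can be sketched in a few lines. The threshold values here are hypothetical placeholders; in practice they would be tuned per product and exposed as configurable parameters.

```python
# Hypothetical force-threshold filtering for an SSB.
# Threshold values are illustrative, not Boreas specifications.

ACTIVATION_THRESHOLD_N = 1.5  # deliberate press (newtons, assumed)
NOISE_FLOOR_N = 0.3           # casual contact, e.g. adjusting the frames

def classify_touch(force_n: float) -> str:
    """Classify a force-sensor reading as a deliberate press,
    incidental contact, or no touch at all."""
    if force_n >= ACTIVATION_THRESHOLD_N:
        return "press"        # only this level triggers an action
    if force_n >= NOISE_FLOOR_N:
        return "incidental"   # ignored: resting finger, frame adjustment
    return "none"
```

Because only readings above the activation threshold trigger actions, light contact from handling the glasses passes through silently.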
SSBs enable sophisticated interactions that feel familiar to users. Think two-stage camera controls: half-press to focus, full-press to capture, just like a traditional camera. Or pressure-sensitive volume controls that respond to how firmly you press. These advanced features don't require learning new behaviors; they leverage existing user mental models while adding precision and feedback.
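The two-stage camera control maps naturally onto a small state machine. The sketch below is an assumed implementation with illustrative force levels; the focus-first fallthrough for a fast full press is a design choice, not a documented behavior.

```python
# Hypothetical two-stage shutter: half-press focuses, full-press captures.
# Force thresholds and state handling are illustrative assumptions.

HALF_PRESS_N = 0.8  # focus threshold (newtons, assumed)
FULL_PRESS_N = 2.0  # capture threshold (newtons, assumed)

def shutter_action(force_n: float, focused: bool) -> tuple[str, bool]:
    """Return (action, new_focused_state) for one force reading."""
    if force_n >= FULL_PRESS_N and focused:
        return "capture", False  # shot taken, focus lock released
    if force_n >= HALF_PRESS_N:
        return "focus", True     # half-press engages focus lock first
    return "idle", focused
```

Each stage would pair with a distinct haptic effect, so the finger feels the focus lock and the capture as two different clicks, just as on a physical camera shutter.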
When actively engaged, our solid-state piezo buttons are significantly more power-efficient than voice control or over-the-air gesture systems. Voice recognition requires substantial processing power to interpret commands, and gesture control demands continuous camera processing and complex algorithms to track hand movements; our CapDrive® technology, by contrast, delivers tactile feedback with minimal energy consumption. This efficiency during actual use translates to extended battery life and more reliable all-day performance for AR glasses users who interact with their device frequently throughout the day.
The numbers tell a compelling story: According to Grand View Research, the global smart glasses market size was estimated at USD 1.93 billion in 2024 and is projected to reach USD 8.26 billion by 2030, growing at a CAGR of 27.3% from 2025 to 2030. This represents one of the most exciting opportunities in consumer technology, with the market expected to more than quadruple in just six years.
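As a quick sanity check, compounding the 2024 base at the cited 27.3% CAGR for the six years from 2025 to 2030 roughly reproduces the projected figure:

```python
# Sanity check on the Grand View Research projection:
# USD 1.93B compounded at 27.3% annually over six years (2025-2030).
base_2024 = 1.93  # USD billions
cagr = 0.273
projection_2030 = base_2024 * (1 + cagr) ** 6
print(round(projection_2030, 2))  # prints 8.21, consistent with the cited USD 8.26B
```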
The AR glasses market is heating up rapidly, with tech giants such as Meta, Google, Samsung, and Apple making substantial investments to capture their share of this explosive growth. The winners in this space won't be determined by who has the most features, but by who creates the most trustworthy, intuitive user experience.
Solid-state piezo buttons represent a significant opportunity for manufacturers to differentiate their products and set new standards for AR interaction. This isn't just about solving a technical problem; it's about creating a competitive advantage that could define market leadership.
We have already proven this technology in the field, with successful deployments in Dell Latitude and XPS laptops after rigorous testing with major OEMs. Our comprehensive engineering expertise across mechanical, software, and electrical domains means faster integration and shorter time-to-market for AR manufacturers.
The opportunity is clear: make AR interaction so intuitive and responsive that users forget they're wearing a device. This is how AR glasses manufacturers can leapfrog from prototype to indispensable daily tool by creating an interface that can be felt but not seen.
For AR glasses to achieve their full potential and reach mainstream adoption, they must address the fundamental challenge of invisible interaction. Our solid-state piezo buttons could be the key technology that transforms AR from a fascinating demo into a must-have consumer product.
The question isn't whether AR glasses will succeed; it's which manufacturers will create the user experience that defines the category. And that experience might just come down to something as fundamental as how it feels to press a button.