I have been testing and writing about smart wearables for years, but a recent leak regarding Apple’s secret hardware labs genuinely made me sit up and pay attention. We all know Apple’s classic playbook: they rarely invent a new product category, but when they finally enter one, they usually flip the entire industry upside down.
According to recent reports from Bloomberg’s Mark Gurman, Apple is preparing to aggressively expand its wearable lineup. We are not just talking about another Apple Watch update. The tech giant is reportedly developing smart glasses, an AI-powered necklace, and even AirPods equipped with cameras. Let’s dive into what these devices actually are, why Apple is ditching the screen, and how this could fundamentally change how we interact with the world around us.
The Screenless Smart Glasses: Audio and AI First

When I first heard that Apple’s upcoming smart glasses would not feature a built-in display, I was honestly a bit skeptical. We are so used to the idea of Augmented Reality (AR) projecting holographic screens into our field of view. But after digging into the details, this “screenless” approach is actually brilliant.
Instead of overwhelming your eyes with notifications and pop-ups, Apple is focusing heavily on Visual Intelligence. The glasses will reportedly be equipped with a high-resolution camera, speakers, microphones, and a second, specialized camera sensor dedicated to AI processing.
Here is what you will actually be able to do with them:
Contextual Navigation: Instead of looking down at a map, Siri can look through your glasses and say, “Turn left after the blue coffee shop.”
Real-Time Object Recognition: You can look at a plate of food and ask Siri for the recipe or caloric content, or look at a monument and instantly hear its history.
Seamless Communication: Make phone calls, listen to Apple Music, and dictate messages without ever pulling your iPhone out of your pocket.
Taking on Meta: The In-House Strategy
If this sounds familiar, it is because Meta has already been carving out this exact niche with their highly successful Ray-Ban smart glasses. I have used the Meta glasses, and they are incredibly fun. But Apple is taking a radically different approach to design.
Instead of partnering with legacy eyewear brands like Oakley or Ray-Ban, Apple is designing and building these frames entirely in-house. Early prototypes reportedly relied on a messy external battery pack and a cable connected to an iPhone. However, sources indicate that Apple’s engineering team has successfully integrated all the necessary premium components—including the battery—directly into a sleek, standalone frame. They are aiming for top-tier hardware quality and advanced camera sensors to separate themselves from the plastic feel of current competitors.
Beyond Glasses: AirPods with Cameras and AI Necklaces

The glasses are just one piece of the puzzle. Apple seems to be exploring multiple ways to give Siri “eyes.”
Camera-Equipped AirPods: This sounds like pure science fiction, but Apple is exploring putting tiny, low-resolution infrared cameras into AirPods. These wouldn’t be for taking selfies; they would read your environment and track your head movements to improve spatial audio and provide Siri with environmental context. Head tracking, incidentally, is something developers can already experiment with today; see the sketch after this list.
The AI Necklace: Similar to the Humane AI Pin or the Rabbit R1, Apple is experimenting with a wearable necklace that acts as a passive, always-listening, always-looking AI assistant.
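If you are curious what that head tracking actually looks like, Apple already exposes IMU-based head motion from compatible AirPods through CoreMotion. Here is a minimal Swift sketch using the existing CMHeadphoneMotionManager API. To be clear, this reads today’s accelerometer and gyroscope data, not the rumored camera input, and a shipping app would also need a motion-usage description in its Info.plist.

```swift
import Foundation
import CoreMotion

// Minimal head-tracking reader for compatible AirPods (iOS 14+).
// Uses today's shipping IMU-based API, not the rumored camera hardware.
final class HeadTracker {
    private let manager = CMHeadphoneMotionManager()

    func start() {
        guard manager.isDeviceMotionAvailable else {
            print("Headphone motion data is not available on this device.")
            return
        }
        manager.startDeviceMotionUpdates(to: .main) { motion, error in
            guard let motion = motion else {
                if let error = error { print("Motion error: \(error)") }
                return
            }
            // Attitude is the wearer's head orientation in radians; spatial
            // audio uses exactly this signal to keep sound "anchored" in space.
            let a = motion.attitude
            print(String(format: "pitch %.2f  roll %.2f  yaw %.2f",
                         a.pitch, a.roll, a.yaw))
        }
    }

    func stop() {
        manager.stopDeviceMotionUpdates()
    }
}
```

The rumored infrared cameras would presumably layer visual context on top of this same motion stream, which is what would let Siri understand what you are looking at rather than just which way your head is turned.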
All three of these devices will rely heavily on a wireless connection to your iPhone, which will act as the “brain” processing the heavy AI workloads.
The Road Ahead: When Can We Buy Them?

Apple is reportedly planning to start mass production of components for the smart glasses by the end of this year, aiming for a highly anticipated retail launch in 2027.
While that seems like a long wait, it tells me that Apple is taking the time to get the “Apple Intelligence” software completely right before pushing the hardware onto our faces. They don’t want to release a gimmick; they want to release a tool you use every single day.
What Do You Think?
As I think about a future where my glasses, earbuds, and maybe even a necklace are constantly analyzing my surroundings, I feel a mix of excitement and privacy concerns. The convenience of having an AI assistant that can literally see what I see is undeniable, but it is a massive leap in how much data we share with our tech.
I would love to hear your perspective on this. Would you wear screenless Apple smart glasses that constantly analyze your environment, or do you prefer keeping your camera strictly on your smartphone? Drop your thoughts in the comments below!