Apple is developing three AI-powered wearables (smart glasses, a wearable necklace, and camera-equipped AirPods) designed to connect to the iPhone and give Siri a second set of eyes, according to a Bloomberg report citing sources close to the plans.
Unlike Meta, Apple will manufacture its own glasses.
The most advanced of the three is the glasses, code-named N50. They carry two cameras: one for high-resolution photos and video, and another for computer vision, akin to LiDAR on iPhones. Since the glasses lack a display, all functions, including phone calls, music, navigation, and Siri queries, are handled by voice. Apple plans to begin production in late 2026 and debut the glasses in 2027.
Rather than partnering with an eyewear company, as Meta has done with Ray-Ban, Apple plans to produce its own frames using premium materials such as acrylic components. The glasses will come in a variety of colors and sizes.
Apple’s Siri necklace is an iPhone accessory inspired by the Humane AI Pin.
Of the three, the pendant is probably the most unusual. It is the size of an AirTag, clips to clothing or hangs on a necklace, and effectively serves as an always-on camera that feeds Siri visual context. Unlike the ill-fated Humane AI Pin, it relies mostly on the iPhone for processing and lacks a projector. According to Bloomberg, it is still undecided whether the device will include a speaker for two-way Siri conversations.
As the earliest-stage product of the three, the pendant could still be canceled. If it continues, a 2027 launch is a possibility.
Camera-equipped AirPods are the closest to release.
The most advanced AirPods will be those with cameras, which might be released this year. The cameras are low-resolution and designed for AI awareness rather than photography, with the goal of helping Siri understand its surroundings rather than take pictures.
The push comes as Apple tries to catch up to OpenAI, which is developing wearables in partnership with former Apple design chief Jony Ive, and to Meta, whose Ray-Ban glasses have proven a genuine market hit.
FAQs
What AI wearables is Apple developing?
Apple is reportedly working on three devices:
- AI-powered smart glasses
- A wearable Siri pendant (necklace-style device)
- Camera-equipped AirPods
All are designed to integrate closely with the iPhone.
What is special about Apple’s smart glasses?
- Codename: N50
- Dual cameras (one standard, one for computer vision)
- No display; voice-controlled via Siri
- Designed and manufactured in-house
- Expected production: late 2026
- Likely launch: 2027
Will the smart glasses have a screen?
No. Reports suggest there will be no built-in display. All functions, such as calls, navigation, music, and Siri queries, will work through voice interaction.
What is the Apple Siri pendant?
The pendant is a small wearable device:
- Similar in size to an AirTag
- Clips to clothing or worn as a necklace
- Always-on camera for AI context
- Relies on the iPhone for processing
- May launch in 2027 (still early-stage and could be canceled)
What about AirPods with cameras?
These may launch first among the three:
- Low-resolution cameras
- Designed for AI awareness, not photography
- Help Siri understand surroundings
How is Apple different from competitors?
Unlike Meta Platforms, which partnered with Ray-Ban, Apple plans to design its own frames.
Meanwhile, OpenAI is working on AI wearables with former Apple designer Jony Ive.
Will privacy be a concern?
Yes. Apple is expected to emphasize:
- On-device processing
- Clear camera usage indicators
- Minimal cloud storage
When are these devices expected to launch?
- AirPods with cameras: possibly 2026
- Smart glasses: 2027
- Siri pendant: 2027 (uncertain)

