The company logo for eyeglass retailer Warby Parker is displayed on a screen during the company's direct listing at the New York Stock Exchange (NYSE) in New York City, U.S., September 29, 2021. REUTERS/Brendan McDermid
FILE PHOTO: The Google logo is seen on the Google house at CES 2024, an annual consumer electronics trade show, in Las Vegas, Nevada, U.S. January 10, 2024. REUTERS/Steve Marcus/File Photo

Dec 8 (Reuters) - Warby Parker said on Monday it is collaborating with Alphabet's Google to develop lightweight AI-powered glasses, with the first product expected to launch in 2026.

The announcement, made during The Android Show | XR Edition, marks the first time the companies have set a public timeline for the release since unveiling the partnership earlier this year.

Google has been making a renewed push into augmented reality and wearable technology, a sector where Meta Platforms and Apple have taken early leads.

Meta has poured resources into its Quest mixed-reality headsets and Ray-Ban smart glasses, while Apple entered the market with its Vision Pro headset earlier this year, positioning it as a premium spatial computing device.

Google, which shelved its consumer-focused Glass product nearly a decade ago, is now betting on AI integration and strategic partnerships to make smart eyewear more mainstream.

The collaboration with Warby Parker will leverage Google's Android XR platform and Gemini AI model to deliver multimodal intelligence in everyday eyewear, aiming to make the glasses suitable for all-day wear.

Warby Parker described its upcoming glasses as "lightweight and AI-enabled" but did not provide details on pricing or distribution plans.

In a blog post, Google said it is working with Samsung, Gentle Monster and Warby Parker to create stylish, lightweight glasses.

The initiative includes two types of devices: AI glasses for screen-free assistance, equipped with speakers, microphones and cameras for natural interaction with Gemini; and display AI glasses, which feature an in-lens display for private access to information such as navigation directions or translation captions.

(Reporting by Kritika Lamba in Bengaluru; Editing by Krishna Chandra Eluri)