RainfallSonata ,

Stop trying to make AR glasses happen.

just_another_person ,

They already exist in much larger packaging.

RainfallSonata ,

Of course they exist.

just_another_person ,

So they already "happened".

RainfallSonata ,

You keep using that word. I do not think it means what you think it means.

just_another_person ,

Seems you used that word. I'm just responding to your comments.

technocrit ,

I think most people understand the difference between "existing" and "happening".

That was the whole point of the original comment.

woelkchen ,

They already exist in much larger packaging.

Nobody uses that stuff, though.

just_another_person ,

Sure they do. HoloLens is used by a large swath of major engineering firms; I've seen people use the Quests fairly extensively for AR, and Apple somehow still sold out their pretty awful AR product as well.

TimeSquirrel ,

Why? I'd use the shit out of them at work. I work on construction sites. It'd be awesome to have an app to superimpose the finished plans on top of what I'm seeing so I don't have to constantly refer back to the paper prints. No more measuring shit five times, just install it exactly as you see it.

LostXOR ,

That's a pretty cool idea, though I think it would be a challenge to align the plans perfectly with the actual construction site.
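For anyone curious what that alignment step might involve under the hood: one common approach is to survey a few control points that exist in both the plans and the real site, then solve for the rigid transform between the two coordinate frames. Below is a minimal Python sketch of that idea (the Kabsch least-squares fit), assuming the headset can report measured 3D positions for those points; all point values and names here are invented for illustration, not taken from any real AR toolkit.

```python
import numpy as np

def rigid_align(plan_pts, world_pts):
    """Least-squares rigid transform (rotation R, translation t) that maps
    plan coordinates onto headset/world coordinates (Kabsch algorithm)."""
    pc, wc = plan_pts.mean(0), world_pts.mean(0)
    H = (plan_pts - pc).T @ (world_pts - wc)      # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = wc - R @ pc
    return R, t

# Hypothetical example: four surveyed control points (say, column centers)
# known in plan coordinates and re-measured by the headset's tracking system.
plan = np.array([[0.0, 0.0, 0.0], [5.0, 0.0, 0.0], [0.0, 8.0, 0.0], [5.0, 8.0, 3.0]])
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
world = plan @ R_true.T + np.array([12.0, -3.0, 0.5])   # simulated measurements

R, t = rigid_align(plan, world)
overlay_pts = plan @ R.T + t   # plan geometry expressed in the headset's frame
```

Real AR stacks layer marker detection or SLAM anchors on top of something like this, and drift, occlusion, and construction tolerances are what make "perfectly" hard, which is probably the point being made above.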

Ashelyn ,

Any time a news headline asks a question, the answer is almost always "no".

MossyFeathers ,

After reading the article, this might be an exception.

ringwraithfish ,

Agreed. The form factor is right. AR technology will only have a real shot at mass adoption once it can fit in/on the existing eyeglasses form factor.

autotldr Bot ,

This is the best summary I could come up with:


But Stanford’s Computational Imaging Lab has an entire page with visual aid after visual aid that suggests it could be onto something special: a thinner stack of holographic components that could nearly fit into standard glasses frames, and be trained to project realistic, full-color, moving 3D images that appear at varying depths.

Like other AR eyeglasses, they use waveguides, components that guide light through the glasses and into the wearer's eyes.

But researchers say they’ve developed a unique “nanophotonic metasurface waveguide” that can “eliminate the need for bulky collimation optics,” and a “learned physical waveguide model” that uses AI algorithms to drastically improve image quality.

Although the Stanford tech is currently just a prototype, with working models that appear to be attached to a bench and 3D-printed frames, the researchers are looking to disrupt the current spatial computing market that also includes bulky passthrough mixed reality headsets like Apple’s Vision Pro, Meta’s Quest 3, and others.

Postdoctoral researcher Gun-Yeal Lee, who helped write the paper published in Nature, says there’s no other AR system that compares both in capability and compactness.

Companies like Meta have spent billions buying and building AR glasses technology, in the hopes of eventually producing a “holy grail” product the size and shape of normal glasses.


The original article contains 319 words, the summary contains 212 words. Saved 34%. I'm a bot and I'm open source!
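As a rough aside on the "learned physical waveguide model" mentioned in the summary: the general pattern in that line of research is to pair an idealized wave-propagation model with a small learned correction that is calibrated against camera captures, then optimize the display's phase pattern against the combined model. The sketch below is only a guess at that structure, not the paper's method; every class, network shape, and parameter value here is invented for illustration.

```python
# Illustrative sketch only: ideal angular-spectrum propagation plus a small
# learned correction, used to optimize a phase pattern for a target image.
import math
import torch
import torch.nn as nn

def angular_spectrum(field, wavelength, distance, pitch):
    """Ideal free-space propagation via the angular spectrum method."""
    n, m = field.shape[-2:]
    fx = torch.fft.fftfreq(m, d=pitch)
    fy = torch.fft.fftfreq(n, d=pitch)
    fyy, fxx = torch.meshgrid(fy, fx, indexing="ij")
    k = 2 * math.pi / wavelength
    kz = torch.sqrt(torch.clamp(k**2 - (2*math.pi*fxx)**2 - (2*math.pi*fyy)**2, min=0.0))
    H = torch.exp(torch.complex(torch.zeros_like(kz), kz * distance))  # transfer function
    return torch.fft.ifft2(torch.fft.fft2(field) * H)

class LearnedWaveguide(nn.Module):
    """Ideal propagation plus a small CNN meant to absorb the real optics' deviations."""
    def __init__(self):
        super().__init__()
        self.correction = nn.Sequential(           # operates on real/imag channels
            nn.Conv2d(2, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 2, 3, padding=1),
        )

    def forward(self, phase, wavelength=520e-9, distance=1e-3, pitch=8e-6):
        field = torch.complex(torch.cos(phase), torch.sin(phase))  # unit-amplitude field
        out = angular_spectrum(field, wavelength, distance, pitch)
        ri = torch.stack([out.real, out.imag], dim=0).unsqueeze(0)
        corr = self.correction(ri).squeeze(0)
        out = torch.complex(corr[0], corr[1])
        return out.abs() ** 2                       # intensity seen by the eye/camera

# In the real pipeline the correction would first be fitted to camera captures
# (calibration, omitted here); afterwards the phase pattern is optimized
# against the learned model rather than the idealized one.
model = LearnedWaveguide()
target = torch.rand(128, 128)                       # stand-in for a target image
phase = torch.zeros(128, 128, requires_grad=True)
opt = torch.optim.Adam([phase], lr=0.05)
for _ in range(50):
    opt.zero_grad()
    loss = torch.nn.functional.mse_loss(model(phase), target)
    loss.backward()
    opt.step()
```

The point of the "learned" part is that the optimization then compensates for how the physical waveguide actually behaves rather than how an idealized model says it should, which is where the claimed image-quality gains come from.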
