Luma raises $4.3M to make 3D models as easy as waving a phone around – TechCrunch

By TECHCRUNCH.COM
October 30, 2021


When shopping online, you've probably come across product photos that spin so you can see an item from all angles. These are typically made by photographing the product from many positions and then playing the shots back like an animation. Luma, founded by engineers who left Apple's AR and computer vision group, wants to shake all of that up. The company has developed a neural rendering technology that can take a small number of photos and generate, shade and render a photo-realistic 3D model of a product. The hope is to drastically speed up the capture of product photography for high-end e-commerce, and to improve the experience of examining products from every angle. Best of all, because the capture is a true 3D interpretation of the scene, it can be rendered from any viewpoint, including as a stereo pair from two slightly offset viewports. In other words, you can inspect a 3D image of the product you're considering in a VR headset.
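To make that stereo claim concrete, here is a minimal sketch, in Python, of how a learned 3D scene could be rendered as a left/right eye pair by offsetting the camera along its horizontal axis. The render_view function is a purely hypothetical stand-in for a neural renderer, not Luma's actual API.

```python
import numpy as np

def stereo_pair(render_view, scene, camera_pose, ipd=0.064):
    """Render a left/right eye pair of the same scene from one camera pose.

    render_view(scene, pose) -> HxWx3 image is a hypothetical neural renderer;
    camera_pose is a 4x4 camera-to-world matrix; ipd is the interpupillary
    distance in metres (roughly 64 mm for an average adult).
    """
    right_axis = camera_pose[:3, 0]            # camera's local x-axis in world space
    offset = (ipd / 2.0) * right_axis

    left_pose = camera_pose.copy()
    right_pose = camera_pose.copy()
    left_pose[:3, 3] -= offset                 # move the two eye positions apart
    right_pose[:3, 3] += offset

    return render_view(scene, left_pose), render_view(scene, right_pose)
```

Because the underlying representation is a full 3D scene rather than a fixed set of photos, both viewpoints can be synthesized even though neither was ever photographed.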

Anyone who has followed this space for a while has seen startups try to build 3D representations using consumer-grade cameras and rudimentary photogrammetry. Spoiler alert: the results have never looked particularly great. But with new technologies come new opportunities, and that's where Luma comes in.

A demo of Luma’s technology working on a real-life example. Image Credits: Luma

"What is different now, and why we are doing this now, is the rise of these ideas of neural rendering. What used to happen, and what people are doing with photogrammetry, is that you take some images, you run some long processing on them, you get point clouds, and then you try to reconstruct 3D out of it. You end up with a mesh, but to get a good-quality 3D image, you need to be able to construct high-quality meshes from noisy, real-world data. Even today, that remains a fundamentally unsolved problem," explains Luma AI founder Amit Jain, making the point that "inverse rendering," as this is known in the industry, is still an open problem. The company decided to approach the issue from another angle.
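For contrast, a minimal sketch of the classical photogrammetry step Jain describes, turning a noisy point cloud into a mesh, might look like the following (using Open3D's Poisson reconstruction as a generic stand-in; the file names are illustrative, and this is not Luma's pipeline). The result depends heavily on how clean the input points and normals are, which is exactly the weakness he points to.

```python
import open3d as o3d

# Point cloud produced by a structure-from-motion / multi-view stereo tool
pcd = o3d.io.read_point_cloud("product_scan.ply")   # illustrative file name

# Real captures are noisy, so filter outliers and estimate normals first
pcd, _ = pcd.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)
pcd.estimate_normals()

# Poisson surface reconstruction: point cloud -> triangle mesh
mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=9)
o3d.io.write_triangle_mesh("product_mesh.ply", mesh)
```

Neural rendering approaches sidestep this mesh-reconstruction step entirely.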

"We decided to assume that we can't get an accurate mesh from a point cloud, and instead we're taking a different approach. If you have perfect data about the shape of an object, i.e. if you have the rendering equation, you can do physically based rendering (PBR). But the issue is that because we are starting from photographs, we don't have enough data to do that type of rendering. So we came up with a new way of doing things: we take 30 photos of a car, then show 20 of them to the neural network," explains Jain. The final 10 photos are used as a "checksum", the answer to the equation. If the neural network can use the 20 training images to predict what the last 10 would have looked like, the algorithm has built a pretty good 3D representation of the item being captured.
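A minimal sketch of that held-out "checksum" idea, assuming hypothetical fit_scene and render_view placeholders for the neural scene representation and its renderer, could look like this:

```python
import numpy as np

def psnr(rendered, photo, max_val=1.0):
    """Peak signal-to-noise ratio between a rendered view and the real photo."""
    mse = np.mean((rendered.astype(np.float64) - photo.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(max_val ** 2 / mse)

def validate_capture(fit_scene, render_view, images, poses, n_train=20):
    """Fit a scene model on the first n_train photos, then score it on the rest.

    images: list of HxWx3 arrays in [0, 1]; poses: matching camera poses.
    fit_scene and render_view are hypothetical placeholders, not Luma's API.
    """
    scene = fit_scene(images[:n_train], poses[:n_train])      # learn from 20 views
    held_out = zip(images[n_train:], poses[n_train:])         # the remaining 10
    scores = [psnr(render_view(scene, pose), photo) for photo, pose in held_out]
    return float(np.mean(scores))  # high PSNR: unseen angles are predicted well
```

If the average score over the held-out photos is high, the learned representation has genuinely captured the object's shape and appearance, which is the "checksum" role Jain describes.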

It's all very geeky photography stuff, but it has some profound real-world applications. If the company gets its way, the way you browse physical goods in e-commerce stores will never be the same. Beyond spinning a product on its axis, shoppers will be able to zoom in and move virtually around it, viewing it from angles that were never photographed.

The top two images are photographs, which formed the basis of the Luma-rendered 3D model below. Image Credits: Luma

"Everyone wants to show their products in 3D, but the problem is that you need 3D artists to come in and make adjustments to scanned objects. That increases the cost a lot," says Jain, who argues this means 3D renders are only viable for high-end, premium products. Luma's tech promises to change that, reducing the cost of capturing and displaying 3D assets to tens of dollars per product, rather than hundreds or thousands of dollars per 3D representation.

Luma’s co-founders, Amit Jain (CEO) and Alberto Taiuti (CTO). Image Credits: Luma

The company is planning to build a YouTube-like embeddable player for its products, to make it easy for retailers to embed the three-dimensional images in product pages.

Matrix Partners, South Park Commons, Amplify Partners, RFC's Andreas Klinger, Context Ventures and a gaggle of angel investors believe in the vision and backed the company to the tune of $4.3 million. Matrix Partners led the round.

“Everyone who doesn’t live under a rock knows the next great computing paradigm will be underpinned by 3D,” said Antonio Rodriguez, general partner at Matrix, “but few people outside of Luma understand that labor-intensive and bespoke ways of populating the coming 3D environments will not scale. It needs to be as easy to get my stuff into 3D as it is to take a picture and hit send!”

The company also shared a demo video showing what its tech can do.



Source: TECHCRUNCH.COM

TechCrunch is an American online newspaper focusing on high tech and startup companies, reporting on the business of technology, startups, venture capital funding, and Silicon Valley.

Disclaimer: All aggregated content on this site is owned by the original authors, not The Fast Observer.

© 2021 The Fast Observer (www.fastobserver.com), a product of ALFA MEDIA SMC LTD.