New mobile phone app shows how well shoes fit based on the 3D shape of the wearer’s foot

Computer vision for virtual shoe fitting and shoe size suggestion

Credit: University of Cambridge

Snapfeet is a new mobile phone app that shows how well shoes fit based on the 3D shape of the wearer’s foot. It also offers a simple augmented reality (AR) visualization of how the shoes will look on the feet.

The app’s technology is designed for online shoe retailers to offer their customers an accurate fit for different shoe styles and the chance to see how the shoes will look on their feet. This should lead to fewer shoes being returned. Returns carry a huge cost, both monetary and environmental: many shoe retailers earn very little from online sales due to the high rate of returns, and this app aims to change that.

Professor Roberto Cipolla and his team, Dr. James Charles and Ph.D. student Ollie Boyne, from the Machine Intelligence group have created the app in collaboration with Giorgio Raccanelli and the team at Snapfeet.

The Snapfeet app lets customers try on shoes virtually on their phone using augmented reality (AR) and find their perfect shoe in a few moments.

Snapfeet creates an accurate 3D copy of the user’s feet in real time: in a few seconds, a 3D model of both feet can be made simply by taking a few photos with the mobile phone from different viewpoints.

By comparing the shape of the user’s foot with the geometry of the shoe, Snapfeet can recommend the correct size for each type of shoe, telling the user the degree of comfort to expect in the different parts of the foot: toe, instep, heel and sole.

Giorgio Raccanelli says: “Download the Snapfeet app, sign in, take a few photos around the foot, and a 3D model of the foot will appear, allowing you to start shopping right away. The app automatically compares the three-dimensional image of the foot with the chosen shoe style, showing you how it will fit, or directly suggesting a style that best suits the shape of your foot.”

Snapfeet has its first big clients in Hugo Boss and Golden Goose.

Snapfeet’s parent company, Trya, began licensing novel photogrammetry software from Professor Cipolla’s group in 2011 through Cambridge Enterprise.

The original photogrammetry technology used photos with a calibration pattern. The photos were uploaded to a server, where a multi-view stereo algorithm developed in Cambridge found multiple point correspondences, located the cameras in world space, and generated a 3D model consistent with all the different viewpoints. This was the state of the art for reconstruction accuracy in 2011.

Since 2019, Professor Cipolla’s team has been working with Snapfeet to turn the original photogrammetry technology into a mobile phone app that reconstructs the shape of the foot in 3D live on the phone, without the need for any calibration pattern, and uses the result for sizing and for correctly displaying the shoes in AR.

The original photogrammetry software was accurate to 1 mm, but it was slow and the processing was cumbersome. The accuracy was there, but the ease of use was not. Nor did it exploit any prior knowledge of the object it was trying to reconstruct.

The team looked at how to make the system faster and much easier to use, and the idea was born to do it all on a mobile phone, with no calibration pattern and no processing on a server. They were able to exploit exciting new developments in machine learning and the powerful processors in modern mobile phones.

A video of the app in action: building a 3D copy of the foot, suggesting sizes using machine learning, and visualizing the suggested size on the feet in real-time AR. Credit: University of Cambridge

“We were able to take advantage of new developments in machine learning (deep learning) to recognize 3D objects and the advanced sensors and powerful processors in modern mobile phones to run the real-time reconstruction algorithms on the phone. In short, we can combine a parameterized foot model and new deep learning algorithms to recognize curves and surfaces that allow us to run the 3D reconstruction algorithm in real time on the device,” said Professor Cipolla.

They used a parameterized foot model learned from many 3D foot scans captured with the original photogrammetry technology. The 3D foot model the app creates can be rendered in any graphics engine to visualize what it looks like. The shape of the foot is controlled by 10 different parameters, learned with machine learning, and the objective is to find the parameter values that produce the 3D foot that best matches the user. The “master” foot model is called the “prior,” short for prior knowledge of what feet look like.

The app user still takes multiple images around the foot, but instead of creating point clouds (as in photogrammetry), the app uses machine learning to predict higher-level features that control the shape of the foot. The benefits are that the user needs to take fewer photos, the returned foot model has fewer artifacts, and the process is more robust to errors during a scan. The model is also much faster to produce thanks to the real-time deep learning element of the app.
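As a rough sketch of what such a parameterized model looks like (a minimal illustration, not Snapfeet’s actual code: the class name, array shapes, and placeholder data are assumptions; only the idea of a mean shape controlled by 10 learned parameters comes from the description above):

```python
import numpy as np

class FootPrior:
    """Toy parameterized foot model: a mean mesh plus learned deformation bases.

    A hypothetical stand-in for the learned "prior" described above:
      mean_verts -- (V, 3) mean foot mesh, learned from many 3D scans
      bases      -- (10, V, 3) deformation directions, one per shape parameter
    """

    def __init__(self, mean_verts: np.ndarray, bases: np.ndarray):
        self.mean_verts = mean_verts
        self.bases = bases

    def reconstruct(self, params: np.ndarray) -> np.ndarray:
        """Return a (V, 3) foot mesh for a 10-vector of shape parameters."""
        # Linear blend: mean shape + weighted sum of the deformation bases.
        return self.mean_verts + np.tensordot(params, self.bases, axes=1)

# Placeholder data stands in for a model that would be learned from real scans.
rng = np.random.default_rng(0)
prior = FootPrior(mean_verts=rng.normal(size=(500, 3)),
                  bases=rng.normal(scale=0.05, size=(10, 500, 3)))
mean_foot = prior.reconstruct(np.zeros(10))  # zero parameters -> the mean foot
```

A low-dimensional model of this kind is what makes the problem tractable on a phone: instead of estimating thousands of free vertex positions, the app only has to estimate 10 numbers.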

The team has just released the new version of the app that can do everything on the mobile device. The server is no longer needed.

Speaking of the app, James Charles says: “I’ve always had a hard time getting the right size shoes. I don’t like the in-store fitting process, and the environmental impact of ordering a lot of shoes online was a big concern for me. However, before this app there was really no other option. So I’m very motivated to solve this problem and I think we already have a pretty good solution.”

When the user first opens the app, there is a calibration phase in which the camera starts being tracked using the latest AR features on mobile phones: ARKit on iOS and ARCore on Android. These are the same routines an interior design app would use to map a room and represent the physical space graphically.

During the calibration phase, the phone’s camera is tracked. The app relies on this AR technology to track the camera and calculate how far it has moved; it also detects the foot and the ground, giving a good estimate of the world space. The app knows where the phone is to an accuracy of 2 mm, all within a few seconds of loading the app.

As the phone moves, key points of interest on the foot are detected to help determine the length and width of the foot. A 3D mesh is then created from these measurements and overlaid on the user’s foot in AR, so that they can check whether it is correct.
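A minimal sketch of this fitting step, under stated assumptions (the bounding-box “measurements,” the regularizer weight, and the explicit least-squares solve are all illustrative; in the app itself, deep learning predicts the parameters directly from the images):

```python
import numpy as np
from scipy.optimize import least_squares

def measurements(verts: np.ndarray) -> np.ndarray:
    """Crude foot length and width from the mesh bounding box (illustration only)."""
    extent = verts.max(axis=0) - verts.min(axis=0)
    return extent[:2]  # assume x = length, y = width

def fit_params(reconstruct, observed_mm: np.ndarray) -> np.ndarray:
    """Recover the 10 shape parameters whose mesh best reproduces the observed
    measurements, with a small penalty keeping the shape near the mean foot."""
    def residual(params):
        return np.concatenate([measurements(reconstruct(params)) - observed_mm,
                               0.1 * params])  # regularize toward the prior mean
    return least_squares(residual, x0=np.zeros(10)).x

# e.g. with the FootPrior sketch above and a measured 255 mm x 98 mm foot:
# params = fit_params(prior.reconstruct, np.array([255.0, 98.0]))
```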

This is another key step that sets the app apart from the competition. There are other apps on the market that let the user validate the reconstruction in this way, but they do not allow the model to be actively adjusted. Snapfeet allows you to adjust the model in real time and then immediately get the 3D model of your foot on the phone itself, with no need for a server.

There are three machine learning algorithms in play. The first builds the parameterized foot model; the second retrieves the model parameters from multi-view images as the phone moves around the foot. Finally, a third algorithm within the app compares the 3D foot model with all the shoe shapes, or “lasts,” the customer is interested in, and returns the size of each shoe that best fits the user’s foot. This is the virtual try-on.

When manufacturers build a shoe, they first build a last: a solid model of the inside of the shoe, around which the shoe is designed. The last, together with the material used to make the shoe, determines the size and the level of comfort someone will experience when they put their foot into that shoe.

The algorithm takes the foot model and digitally places it inside every shoe you are interested in, returning a comfort score for each. You can then render a virtual shoe on your feet using AR. The app also detects where the legs and trousers are, using machine learning for foot tracking, so that the occlusion effect is rendered correctly.
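As a toy picture of that try-on step (the region measurements, the 4 mm “ideal slack,” and the bell-shaped scoring below are invented for illustration; the article says only that the foot model is placed inside each last and scored for comfort per region):

```python
import numpy as np

REGIONS = ["toe", "instep", "heel", "sole"]

def comfort_scores(foot_mm: dict, last_mm: dict) -> dict:
    """Score each foot region 0-1 from the slack the last leaves around the foot."""
    scores = {}
    for region in REGIONS:
        slack = last_mm[region] - foot_mm[region]  # positive = spare room
        # Bell-shaped score peaking at an assumed ideal of ~4 mm of slack.
        scores[region] = float(np.exp(-((slack - 4.0) / 3.0) ** 2))
    return scores

def best_size(foot_mm: dict, lasts_by_size: dict) -> str:
    """Pick the size whose last gives the highest average comfort score."""
    return max(lasts_by_size, key=lambda size: np.mean(
        list(comfort_scores(foot_mm, lasts_by_size[size]).values())))

# Example: one foot against two sizes of the same style (all numbers invented).
foot = {"toe": 98.0, "instep": 62.0, "heel": 64.0, "sole": 250.0}
lasts = {"EU 42": {"toe": 100.0, "instep": 65.0, "heel": 67.0, "sole": 253.0},
         "EU 43": {"toe": 104.0, "instep": 69.0, "heel": 71.0, "sole": 260.0}}
print(best_size(foot, lasts))  # EU 42: a few mm of slack in every region
```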

Once the app has reconstructed the foot, it also uses AR to give the user a sense of how trying on the shoe would feel. The AR element of the app lets the user see how the shoes will look on their foot and whether they go well with a particular outfit.

Snapfeet has generously funded a Ph.D. studentship enabling Ollie Boyne to pursue further research into foot modeling from photographs. The app is now available on the App Store and is being used and tested by many shoe vendors to help reduce their returns from online sales. Download the app and try it on your own feet.




Citation: New mobile phone app shows how well shoes will fit based on the 3D shape of the wearer’s foot (May 5, 2022) Retrieved May 6, 2022 from https://techxplore.com/news/2022-05-mobile-app-based-3d-user.html

This document is subject to copyright. Other than any fair dealing for private study or research purposes, no part may be reproduced without written permission. The content is provided for informational purposes only.
