
New mobile phone app shows how well shoes will fit based on user’s 3D foot shape

By Cindy J. Daddario

May 5, 2022

Credit: University of Cambridge

Snapfeet is a new mobile phone app that shows how well shoes will fit based on the user’s 3D foot shape. It also offers a simple augmented reality (AR) visualization of what the shoes will look like on the feet.

The app’s technology is designed to let online shoe retailers offer their customers a precise fit for different shoe styles, along with the ability to see how the shoes will look on the shopper’s feet. This should lead to fewer shoe returns, which carry a huge cost, both monetary and environmental. Many shoe retailers earn very little from online sales due to the high rate of returns; the purpose of this app is to change that.

Professor Roberto Cipolla and his team, Dr. James Charles and Ph.D. student Ollie Boyne of the Machine Intelligence group, created the application in collaboration with Giorgio Raccanelli and the Snapfeet team.

The Snapfeet app allows the customer to try shoes on virtually via their phone using AR and find their perfect shoe in moments.

Snapfeet creates an accurate 3D copy of the user’s feet in real time: in a few seconds it builds a 3D model of both feet, simply from a few photos taken with a mobile phone from different points of view.

By comparing the shape of the user’s foot to the geometry of the shoe, Snapfeet is then able to recommend the correct size for each type of shoe, telling the user how comfortable it will be in the different parts of the foot: toe, instep, heel and sole.

Giorgio Raccanelli says: “You download the Snapfeet app, register, take a few pictures all around the foot, and a 3D model of the foot appears, allowing you to start shopping immediately. The app automatically compares the three-dimensional image of the foot with the chosen shoe style, showing you how it will fit, or directly suggesting the style best suited to your foot shape.”

Snapfeet has its first big customers in Hugo Boss and Golden Goose.

Snapfeet’s parent company, Trya, first licensed new photogrammetry software from Professor Cipolla’s group in 2011 through Cambridge Enterprise.

The original photogrammetry technology used photos taken with a calibration pattern. These photos were uploaded to a server, where a Cambridge-developed multi-view stereo algorithm found point matches across the images, generated a 3D model that explained all the different viewpoints, and located the cameras in a global space. It was state of the art for reconstruction accuracy in 2011.
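
For readers curious about what such a pipeline involves, below is a minimal two-view sketch in Python using OpenCV. The image file names and camera intrinsics are placeholders, and the real system used many views, a calibration pattern and server-side processing; this is an illustration of the general technique, not Snapfeet’s code.

```python
import cv2
import numpy as np

# Two placeholder views of the same object; real systems use many more.
img1 = cv2.imread("view1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("view2.jpg", cv2.IMREAD_GRAYSCALE)
K = np.array([[1500.0, 0, 960], [0, 1500.0, 540], [0, 0, 1]])  # assumed intrinsics

# 1. Find point matches across views (SIFT features + ratio test).
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)
matches = cv2.BFMatcher().knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.7 * n.distance]
pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

# 2. Locate the cameras: essential matrix -> relative rotation/translation.
E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
_, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

# 3. Triangulate the matched points into a 3D point cloud that explains
#    both viewpoints.
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
cloud = (pts4d[:3] / pts4d[3]).T  # N x 3 reconstructed points
```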

Since 2019, Professor Cipolla’s team has been working with Snapfeet to evolve the original photogrammetry technology into a mobile phone application that reconstructs the 3D shape of the foot live on the phone, without the need for any calibration pattern, and that correctly sizes and visualizes shoes in AR.

The original photogrammetry software was accurate to 1 mm, but it was slow and cumbersome to run. The precision was there, but the usability was not. It also didn’t exploit any prior knowledge of the object it was trying to reconstruct.

The team looked at how to make the process faster and much more user-friendly, and the idea was born to do everything on a mobile phone, without a calibration pattern and without processing on a server. They were able to exploit exciting new developments in machine learning and the powerful processors in modern mobile phones.

A video of the app in action: creating a 3D copy of the foot, suggesting sizes using machine learning, and visualizing the suggested size on the feet in real-time AR. Credit: University of Cambridge

“We were able to exploit new developments in machine learning (deep learning) for 3D object recognition, and the advanced sensors and powerful processors of modern mobile phones, to run the reconstruction algorithms in real time on the phone. In summary, we can combine a foot model with new deep learning algorithms for curve and surface recognition, allowing us to run the 3D reconstruction algorithm in real time on the device,” said Professor Cipolla.

They used a parametric foot model learned from numerous 3D foot scans captured with the original photogrammetry technology. The 3D foot model that the app builds can be rendered in any graphics engine to visualize how it looks. The shape of the foot is controlled by 10 different parameters, which are learned with machine learning; the goal is to find the values of these parameters that produce a 3D foot best matching the user’s. The “master” foot model is called the “prior,” short for prior knowledge of how feet look. The app user still takes multiple images around the foot, but instead of creating point clouds (as in photogrammetry), the app uses machine learning to predict the higher-level features that control the shape of the foot. The benefits are that the user has to take fewer photos, the returned foot model has fewer artifacts, and the process is more robust to errors during a scan. The model is also much faster to produce thanks to the app’s real-time deep learning element.
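
As an illustration of how a parametric “prior” of this kind works, here is a toy sketch in Python: a mean foot shape plus 10 learned PCA directions. The plain-PCA construction, array sizes and function names are assumptions made for the example, not details of Snapfeet’s actual model.

```python
import numpy as np

# Toy parametric shape model ("prior"): mean shape plus 10 learned directions.
# `scans` stands in for an (num_scans, num_vertices * 3) array of registered
# 3D foot scans, e.g. captured with the original photogrammetry pipeline.
rng = np.random.default_rng(0)
scans = rng.normal(size=(200, 3000))            # placeholder training data

mean_foot = scans.mean(axis=0)
# PCA via SVD of the centred scans: rows of Vt are shape directions.
_, _, Vt = np.linalg.svd(scans - mean_foot, full_matrices=False)
basis = Vt[:10]                                  # 10 shape parameters

def foot_from_params(params):
    """Decode 10 shape parameters into a full 3D foot mesh (N x 3)."""
    return (mean_foot + params @ basis).reshape(-1, 3)

def params_from_foot(foot):
    """Project a scanned foot onto the model; in the app, a network
    predicts these parameters directly from photos instead."""
    return basis @ (foot.reshape(-1) - mean_foot)

average_foot = foot_from_params(np.zeros(10))    # all-zero params = mean foot
```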

The team has just released the new version of the app which can do everything on the mobile device. The server is no longer needed.

Speaking of the app, James Charles says: “I’ve always had trouble getting the right size shoes. I don’t like the process of trying shoes on in stores, and the environmental impact of ordering lots of shoes online was a big concern for me. However, before this app there really was no other option, so I’m very motivated to solve this problem and I think we already have a pretty good solution.”

When the user first opens the app, there is a calibration phase in which camera tracking starts using the latest AR frameworks on mobile phones: ARKit on iOS and ARCore on Android. These are the same routines an interior design app would use to map a room and represent the physical space graphically.

During the calibration phase, the phone’s camera is tracked. The app relies on the AR framework to track the camera and calculate how far it has moved; it also detects the foot and the ground, which gives a good idea of the world space. The app knows where the phone is to within 2 mm, and everything is done within seconds of loading the app.

As the phone moves around, certain key points of interest on the foot are detected to help determine the length and width of the foot. A 3D mesh is then created from these measurements, and the model is overlaid onto the user’s foot in AR so that they can visually check whether it is correct.
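
A small sketch of how detected keypoints might be turned into sizing measurements (the keypoint names, coordinates and coordinate frame are hypothetical, not Snapfeet’s):

```python
import numpy as np

# Hypothetical 3D keypoints (metres) in a frame where x runs heel -> toe
# and y runs across the foot; real keypoints come from the AR session.
keypoints = {
    "heel":       np.array([0.000, 0.040, 0.0]),
    "big_toe":    np.array([0.262, 0.055, 0.0]),
    "inner_ball": np.array([0.180, 0.095, 0.0]),
    "outer_ball": np.array([0.165, 0.000, 0.0]),
}

foot_length = np.linalg.norm(keypoints["big_toe"] - keypoints["heel"])
foot_width = np.linalg.norm(keypoints["inner_ball"] - keypoints["outer_ball"])
print(f"length {foot_length * 1000:.0f} mm, width {foot_width * 1000:.0f} mm")
```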

This is another key step, and one that sets Snapfeet apart from the competition. There are apps on the market that can validate the reconstruction in this way, but they don’t allow you to actively adjust the model. Snapfeet lets you adjust the model in real time and then immediately get the 3D model of your foot on the phone itself, without the need for a server.

There are three machine learning algorithms in play. The first builds the parameterized foot model. The second retrieves the model parameters from multi-view images as the user moves the mobile phone. Finally, a third algorithm in the app compares the 3D foot model to the shapes, or “lasts,” of all the shoes the customer is interested in, and returns the size of each shoe that will best fit the user’s foot. This is the virtual try-on.

When manufacturers build a shoe, they first build a last, a solid model of the inside of the shoe, and create the design of the shoe around it. The shape of the last, as well as the material used to create the shoe, determines the size and the level of comfort someone will have when putting their foot in that shoe.

The algorithm takes the foot model and digitally places it inside every shoe the user is interested in, giving each a comfort score. The user can then render a virtual shoe on their feet using AR. The app also detects where the legs and trousers are to achieve the correct occlusion effect, using machine learning for foot tracking.
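
The article does not disclose the scoring function, but a toy version along these lines shows the idea: compare per-region clearances between the foot model and each last’s interior, and pick the size that scores best (all names and numbers are illustrative):

```python
import numpy as np

# Toy per-region comfort score comparing foot measurements against the
# interior dimensions of each shoe last (all values illustrative, in mm).
REGIONS = ("toe", "instep", "heel", "sole")

def comfort_score(foot, last, slack=2.0):
    """Score 0-100: roughly `slack` mm of clearance in every region is
    ideal; a foot larger than the last is penalised hardest."""
    score = 100.0
    for region in REGIONS:
        clearance = last[region] - foot[region]
        if clearance < 0:                       # too tight in this region
            score -= 25.0 * min(1.0, -clearance / slack)
        elif clearance > slack:                 # looser than ideal
            score -= 10.0 * min(1.0, (clearance - slack) / (3 * slack))
    return max(score, 0.0)

foot = {"toe": 98.0, "instep": 62.0, "heel": 64.0, "sole": 262.0}
lasts = {"EU 42": {"toe": 99.0, "instep": 64.0, "heel": 66.0, "sole": 266.0},
         "EU 43": {"toe": 103.0, "instep": 67.0, "heel": 69.0, "sole": 272.0}}
best = max(lasts, key=lambda size: comfort_score(foot, lasts[size]))
print(best, {s: round(comfort_score(foot, lasts[s])) for s in lasts})
```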

Once the foot shape has been captured, the app’s AR element lets the user see how the shoes will look on their feet, get a sense of how they are likely to feel, and judge how well they go with a particular outfit.

Snapfeet generously funded a Ph.D. scholarship for Ollie Boyne to expand research into foot modeling from photographs. The app is now available on the App Store and is used and tested by many shoe sellers to help reduce their returns on online sales. Download the app and try it on your own feet.


Provided by the University of Cambridge


Citation: New mobile phone app shows how well shoes will fit based on user’s 3D foot shape (2022, May 5). Retrieved May 7, 2022 from https://techxplore.com/news/2022-05-mobile-app-based-3d-user.html

This document is subject to copyright. Except for fair use for purposes of private study or research, no part may be reproduced without written permission. The content is provided for information only.