Saving Face

Saving Face is a COVID-19-related mobile application, designed by researchers and students at the MIT Media Lab, that uses earbuds to detect and warn the user when they are about to touch their face.

Overview

The project consisted of a mobile application that works like sonar: the earbuds emit a high-frequency signal, and the microphone records its reflection. We then applied signal processing techniques and machine learning algorithms to predict whether the user’s hand was approaching their face.
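As a rough illustration of the sonar idea, the round-trip of the emitted tone can be estimated by cross-correlating the recording with the probe signal. The tone frequency, sampling rate, and cross-correlation approach below are assumptions for the sketch, not details taken from the app:

```python
import numpy as np

FS = 44_100       # assumed sampling rate (Hz)
F_TONE = 20_000   # assumed near-ultrasonic probe frequency (Hz)

def probe_tone(duration_s=0.01, fs=FS, f=F_TONE):
    """Generate the high-frequency tone the earbuds would emit."""
    t = np.arange(int(duration_s * fs)) / fs
    return np.sin(2 * np.pi * f * t)

def echo_delay_samples(emitted, recorded):
    """Estimate the delay of the strongest reflection via cross-correlation."""
    corr = np.correlate(recorded, emitted, mode="valid")
    return int(np.argmax(np.abs(corr)))

# Simulate a recording: the tone comes back attenuated after a 50-sample delay.
tone = probe_tone()
recording = np.zeros(len(tone) + 200)
recording[50:50 + len(tone)] += 0.3 * tone
print(echo_delay_samples(tone, recording))  # → 50
```

A shrinking delay between emission and reflection would indicate an approaching surface, such as a hand moving toward the face.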

Contributions

I started working on the Saving Face project in April of 2020.

During the first few months, I supported the implementation, design, and testing of the project’s signal processing methods.

My work consisted of coding the bandpass and lowpass filters used to isolate the high-frequency signals. I also implemented the convolutional filters, the feature extraction, and the real-time evaluation of the pre-trained machine learning model. Moreover, I led the effort to implement all of this work in the Android version of the app.
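A minimal sketch of that filtering stage, assuming SciPy Butterworth filters and hypothetical cutoff frequencies (the app’s actual filter designs and cutoffs are not stated here):

```python
import numpy as np
from scipy.signal import butter, sosfilt

FS = 44_100  # assumed sampling rate (Hz)

def bandpass(signal, low_hz=18_000, high_hz=21_000, fs=FS, order=4):
    """Keep only an assumed near-ultrasonic band around the probe tone."""
    sos = butter(order, [low_hz, high_hz], btype="bandpass", fs=fs, output="sos")
    return sosfilt(sos, signal)

def lowpass(signal, cutoff_hz=200, fs=FS, order=4):
    """Smooth a demodulated envelope before feature extraction."""
    sos = butter(order, cutoff_hz, btype="lowpass", fs=fs, output="sos")
    return sosfilt(sos, signal)

# Example: isolate a 20 kHz tone mixed with low-frequency interference.
t = np.arange(FS) / FS
mixed = np.sin(2 * np.pi * 20_000 * t) + np.sin(2 * np.pi * 100 * t)
clean = bandpass(mixed)
```

Second-order sections (`output="sos"`) are the numerically stable choice for higher-order IIR filters and map naturally onto the streaming, real-time setting described above.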

After a few months, I started working with the iOS team to integrate the latest advances in the signals used to predict the hand-to-face gesture and the improvements to the artificial intelligence model.

Since Saving Face is a project meant to be used in daily life, it was necessary to keep improving the signal processing and the machine learning model. Part of my responsibilities was to find better convolutional filters and to examine different combinations of features to improve the AI model.

We designed a dataset with 9 tasks covering situations where people might use the app, including walking, typing on a computer, and carrying a bag of groceries. I not only helped record the videos and audio for this dataset, but also designed part of the protocol for recording some of the tasks so that the dataset would be uniform.

In addition, it was necessary to run the app on different iPhone models. Differences in hardware changed the maximum and minimum values of the recorded reflected signal, so we needed to normalize the signal and the extracted features. Another of my contributions to this project was implementing these normalization algorithms.
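As a hedged illustration of that normalization step (the code below uses a generic min-max rescaling, not necessarily the scheme used in the app):

```python
import numpy as np

def minmax_normalize(features, eps=1e-8):
    """Rescale each feature column to [0, 1] so hardware-dependent amplitude
    ranges (e.g. across different iPhone models) do not shift the model's
    inputs. `eps` guards against division by zero for constant features."""
    features = np.asarray(features, dtype=float)
    lo = features.min(axis=0)
    hi = features.max(axis=0)
    return (features - lo) / (hi - lo + eps)

# Two hypothetical devices recording the same gesture at different gains:
device_a = np.array([[0.1], [0.5], [0.9]])
device_b = np.array([[10.0], [50.0], [90.0]])
# After normalization, both map to (approximately) the same values.
print(minmax_normalize(device_a).ravel())
print(minmax_normalize(device_b).ravel())
```

After this step, features extracted on different hardware occupy the same numeric range, so a single pre-trained model can serve all devices.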

My experience on this project also included co-directing the team and assigning tasks to all participants. We had students from different parts of the world, including China, India, Costa Rica, and the United States. Communication was essential in this process, and dividing the tasks well was necessary to maximize results; for this reason, strong teamwork was an integral part of the process.

Results

Some of the results of the system are shown in our paper “A Scalable Solution for Signaling Face Touches to Reduce the Spread of Surface-based Pathogens”.

One of my favorite parts of this project was applying my theoretical knowledge of filters, Fourier transforms, the Nyquist criterion, and machine learning in an app that can be used in everyday situations. In addition, this project gave me the opportunity to write, edit, and work on the journal paper that was published in the ACM journal “Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies.”

For more information, you can visit the official webpage of the project.
