MIT makes an AI smart carpet for monitoring people without cameras

(Source: zdnet.com)  

Can MIT's AI 'magic carpet' do a better job than the Apple Watch at monitoring exercises and falls?


By Liam Tung | June 21, 2021


Researchers at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) have come up with a way to use carpets to monitor humans without using privacy-invading cameras.

The so-called intelligent carpet could have applications in personalized healthcare, smart homes, and gaming. It also might offer a more privacy-friendly way of delivering healthcare to people who need to be remotely monitored by healthcare professionals.

As MIT CSAIL notes, other research in this field has relied on devices like wearable cameras and webcams.

MIT's system only uses cameras to create a dataset that was used to train the AI model. The neural network uses sensors in the carpet to determine if the person is doing sit-ups, stretching, or other actions.

"You can imagine leveraging this model to enable a seamless health-monitoring system for high-risk individuals, for fall detection, rehab monitoring, mobility, and more," says Yiyue Luo, a lead author on a paper about the carpet.

MIT's focus is on 3D human pose estimation using pressure maps recorded by a tactile-sensing carpet.

"We build a low-cost, high-density, large-scale intelligent carpet, which enables the real-time recordings of human-floor tactile interactions in a seamless manner," the researchers note in a new paper.

The researchers' intelligent carpet measures 36 square feet and includes an integrated tactile-sensing array consisting of over 9,000 pressure sensors that can be embedded on the floor. It also includes readout circuits to allow real-time recordings of humans interacting with the carpet.
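The paper does not detail the readout electronics, but the idea of scanning a dense row-column sensor array into a pressure frame can be sketched as follows. Everything here is an assumption for illustration: the grid dimensions approximate the reported sensor count, and `read_sensor` is a hypothetical stand-in for the actual readout circuit.

```python
# Hedged sketch: reading one frame from a row-column pressure-sensor grid.
# N_ROWS x N_COLS and read_sensor() are illustrative assumptions, not the
# carpet's actual driver code.

N_ROWS, N_COLS = 96, 96  # 9,216 sensors, close to the "over 9,000" reported

def read_sensor(r, c):
    """Hypothetical ADC read for the sensor at (r, c); returns pressure counts."""
    return 0  # placeholder: a real driver would query the readout circuit

def read_frame():
    """Scan every row/column intersection into one 2D pressure frame."""
    return [[read_sensor(r, c) for c in range(N_COLS)] for r in range(N_ROWS)]

frame = read_frame()
print(len(frame), len(frame[0]))  # 96 96
```

Streaming such frames at video rate is what makes "real-time recordings of human-floor tactile interactions" possible.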

The researchers collected over 1.8 million synchronized tactile and visual frames from 10 people performing diverse activities, such as lying down, walking, and exercising.

The sensors in the carpet convert the pressure of physical contact between a person's feet, limbs, or torso and the carpet into an electrical signal, according to MIT CSAIL.

The researchers trained the system using paired tactile and visual data, such as a video and a corresponding pressure heatmap of someone doing a push-up. The AI model treats the visual data as ground truth and learns to infer 3D human poses from the pressure the person exerts on the carpet, so it can reconstruct an image or video of someone performing an action without a camera ever recording it.
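The cross-modal supervision described above can be outlined in a few lines. This is a toy sketch under stated assumptions: `camera_to_keypoints`, `model_predict`, and the 21-keypoint pose format are hypothetical placeholders, not MIT's actual pipeline; only the loss function is real, runnable code.

```python
# Hedged sketch of cross-modal training: the camera-derived 3D pose is the
# ground truth, and the model learns to predict it from the tactile frame
# alone. camera_to_keypoints and model_predict are illustrative placeholders.

def camera_to_keypoints(video_frame):
    """Hypothetical: extract 3D joint positions from the synchronized video."""
    return [(0.0, 0.0, 0.0)] * 21  # e.g. 21 body keypoints

def model_predict(pressure_frame, weights):
    """Hypothetical tactile model: maps a pressure map to 3D keypoints."""
    return [(0.0, 0.0, 0.0)] * 21

def keypoint_loss(pred, truth):
    """Mean squared error over all keypoint coordinates."""
    return sum((p - t) ** 2 for kp_p, kp_t in zip(pred, truth)
               for p, t in zip(kp_p, kp_t)) / (3 * len(pred))

# Training pairs are (pressure_frame, video_frame) recorded simultaneously;
# at inference time only the pressure frame is needed -- no camera.
pose_truth = camera_to_keypoints("video_frame_0")
pose_pred = model_predict("pressure_frame_0", weights=None)
print(keypoint_loss(pose_pred, pose_truth))  # 0.0 for these placeholder outputs
```

The key design point is that the camera is only needed while building the training set; the deployed system runs on tactile input alone, which is where the privacy benefit comes from.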

"You may envision using the carpet for workout purposes. Based solely on tactile information, it can recognize the activity, count the number of reps, and calculate the amount of burned calories," says Yunzhu Li, one of the paper's authors.
