In the setup, a participant wore a backpack containing a laptop and a data acquisition device connected by a wire to a conductive pad on the back of the participant's neck. The pad measured the voltages picked up by participants as they performed specific gestures around light switches. Software on the laptop generated positioning instructions, and at each switch the gesture order was randomized to eliminate bias.
The experiments showed that electromagnetic noise is so predictable that it can be used to figure out where a person is standing, what the person is doing, and even where a hand is placed on a wall. The team used a simple sensor that was essentially just a piece of metal, but Morris said that ultimately a sensor could be placed in the user's hand or anywhere else that the radio signals picked up by the body can be gathered.
“Our bodies, it turns out, are actually really good and relatively colorful antennas,” Morris said. The team presented their results earlier this week in Vancouver at the ACM CHI Conference on Human Factors in Computing Systems.
The researchers learned that in a typical house, the electromagnetic noise changes noticeably from room to room because of the various appliances in them. Then they applied artificial intelligence to the data.
“The noise is different enough in those different environments that the computer can actually use machine learning to tell the difference,” Morris said.
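The idea can be sketched in a few lines of code. Everything here is an illustrative assumption, not the team's actual pipeline: the feature vectors (amplitudes of EM noise in a few frequency bands), the room labels, and the nearest-centroid classifier are all made up to show how distinct noise signatures let a simple learner tell rooms apart.

```python
# Hypothetical sketch: distinguishing rooms from ambient EM noise.
# Feature vectors, rooms, and the classifier are illustrative assumptions.

def centroid(vectors):
    """Mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def nearest_room(sample, centroids):
    """Return the room whose average noise signature is closest (Euclidean)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda room: dist2(sample, centroids[room]))

# Made-up training data: EM noise amplitudes in three frequency bands,
# recorded in each room (each dominated by different appliances).
training = {
    "kitchen":     [[0.9, 0.1, 0.4], [0.8, 0.2, 0.5]],
    "living_room": [[0.2, 0.7, 0.1], [0.3, 0.8, 0.2]],
}
centroids = {room: centroid(samples) for room, samples in training.items()}

# A new reading that resembles the kitchen's noise profile:
print(nearest_room([0.85, 0.15, 0.45], centroids))  # -> kitchen
```

A production system would use richer features and a trained classifier, but the principle is the same: because each room's appliance mix produces a distinct noise signature, even a very simple learner can separate them.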