Smart home knows just how you like your breakfast


16:02 02 September 2009 by MacGregor Campbell 

Humans are creatures of habit, as a sensor-stuffed apartment at Washington State University in Pullman knows. The smart home can learn the ways of its inhabitants simply by observing how they walk around and use different appliances.

The technology could be used in houses to help people with cognitive difficulties or dementia manage their daily living needs, or to make things easier for healthy people.

The apartment can, for example, recognise when a person is performing actions associated with making breakfast. If the person absent-mindedly leaves a stove burner on, the system can spot the anomaly and prompt them with audio and video signals to return to the hob.
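How might such a check work? As a rough illustration only, not the published Casas code, the logic boils down to comparing how long the burner has been on against a duration learned from past breakfasts; the event format, the 15-minute threshold and the prompt function below are assumptions made for the sketch.

```python
# Illustrative sketch: flag a stove burner left on longer than a learned
# typical duration and prompt the resident. All names and the threshold
# are assumptions, not details from the Casas system.
from datetime import datetime, timedelta

TYPICAL_BURNER_MINUTES = 15  # assumed limit learned from past breakfasts

def prompt_resident(message):
    """Stand-in for the apartment's audio and video prompts."""
    print(message)

def check_burner(events, now):
    """events: (timestamp, sensor_id, state) tuples in time order."""
    last_on = None
    for ts, sensor, state in events:
        if sensor == "stove_burner":
            last_on = ts if state == "ON" else None
    if last_on and now - last_on > timedelta(minutes=TYPICAL_BURNER_MINUTES):
        prompt_resident("The stove burner is still on.")

check_burner([(datetime(2009, 9, 2, 6, 5), "stove_burner", "ON")],
             now=datetime(2009, 9, 2, 6, 30))
```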

Quick learner?

Diane Cook and colleagues developed Casas, the computer system that analyses the sensors’ output.

Graduate student Parisa Rashidi has improved Casas so it can learn a person’s habits without prior assumptions about what events or patterns to expect. Previous smart homes have required key activities to be pre-defined before they could be recognised.

Rashidi and Cook have successfully tested their system in a specially outfitted campus apartment with a single resident. It took around a month of training to accurately tease out the resident’s habits from the sea of sensor data, says Rashidi.

Once trained, Casas can identify patterns as complex as “at 6 am the kitchen light comes on, the coffee maker turns on, and the toaster turns on” without any prior knowledge of what to expect.
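In spirit, discovering such a pattern amounts to counting which sensor events recur at roughly the same time of day across many days, with no activity labels supplied in advance. The Python sketch below is a toy illustration under that assumption; the data, the support threshold and the hour-level granularity are invented for the example, not taken from Casas.

```python
# Toy unsupervised pattern discovery: report (hour, sensor) events seen on
# most days. Data and the support threshold are illustrative assumptions.
from collections import defaultdict

events = [  # hypothetical (day, hour, sensor) log
    (1, 6, "kitchen_light"), (1, 6, "coffee_maker"), (1, 6, "toaster"),
    (2, 6, "kitchen_light"), (2, 6, "coffee_maker"), (2, 6, "toaster"),
    (3, 6, "kitchen_light"), (3, 6, "coffee_maker"),
]

support = defaultdict(set)        # (hour, sensor) -> days on which it occurred
for day, hour, sensor in events:
    support[(hour, sensor)].add(day)

num_days = len({day for day, _, _ in events})
MIN_SUPPORT = 0.6                 # assumed: must appear on 60% of days
patterns = [k for k, d in support.items() if len(d) / num_days >= MIN_SUPPORT]
print(patterns)  # [(6, 'kitchen_light'), (6, 'coffee_maker'), (6, 'toaster')]
```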

No big brother

To maintain a resident’s sense of privacy, Casas works without cameras, RFID chips or microphones. Instead it relies on less “invasive” sensors that detect motion, temperature, light, humidity, water and door contact, and the use of key items, such as opening a bottle of medication or switching on the toaster. “We don’t want to give residents the feeling that Big Brother is watching them,” says Rashidi.
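The raw material, then, is a stream of simple, camera-free event records. The sketch below shows one plausible way to represent such an event; the field names and sensor identifiers are assumptions for illustration, not the format Casas actually uses.

```python
# Hypothetical record for a single sensor event in a camera-free smart home.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class SensorEvent:
    timestamp: datetime
    sensor_id: str    # e.g. "M012" for a motion sensor, "I007" for an item sensor
    sensor_type: str  # "motion", "temperature", "light", "humidity",
                      # "water", "door" or "item"
    value: str        # "ON"/"OFF", a reading as text, or "PRESENT"/"ABSENT"

# the medication bottle has just been picked up
event = SensorEvent(datetime(2009, 9, 2, 8, 0), "I007", "item", "ABSENT")
print(event)
```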

The passive sensing approach may appease the Orwellian-minded, but it makes activity recognition more difficult. A camera, for example, can easily show where a person is in a room, whereas data from motion sensors reveals this less readily. Rashidi developed a number of data-mining algorithms to help make sense of the sensor output.

One algorithm uses a grid of motion sensors to map out how a person walks around the home, looking for daily “trajectories”, or routes through the house.
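A minimal sketch of that idea, assuming an invented mapping from motion sensors to areas of the home, is to collapse each day’s ordered firings into a route and count how often each route recurs (this is an illustration, not Rashidi’s algorithm):

```python
# Turn motion-sensor firings into daily "trajectories" and count recurring
# routes. The sensor-to-area mapping and the example data are assumptions.
from collections import Counter

SENSOR_GRID = {"M01": "bedroom", "M02": "hallway", "M03": "kitchen"}

def route_of(firings):
    """Collapse a day's ordered motion firings into a route of distinct areas."""
    route = []
    for sensor in firings:
        area = SENSOR_GRID[sensor]
        if not route or route[-1] != area:
            route.append(area)
    return tuple(route)

days = [["M01", "M02", "M03"], ["M01", "M02", "M03"], ["M01", "M02", "M02", "M03"]]
print(Counter(route_of(day) for day in days))
# Counter({('bedroom', 'hallway', 'kitchen'): 3}) -- a recurring morning trajectory
```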

A second algorithm finds patterns in a sequence of events, such as learning to expect the resident to turn on a tap after turning on the oven. A third algorithm correlates the events it detects with the time of day, identifying, for example, when the person usually eats dinner.
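Both ideas can be illustrated with a few lines of counting, again as a toy sketch rather than the published method; the event log, the one-hour window and the sensor names are assumptions.

```python
# (1) Count how often one event follows another within an hour.
# (2) Correlate an event with the hour of day it usually occurs.
from collections import Counter

days = [  # hypothetical (hour, sensor) events, one list per evening
    [(18, "oven_on"), (18, "tap_on"), (19, "dinner_area_motion")],
    [(18, "oven_on"), (18, "tap_on"), (19, "dinner_area_motion")],
]

follows = Counter()
for day in days:
    for (h1, a), (h2, b) in zip(day, day[1:]):
        if h2 - h1 <= 1:                  # assumed one-hour window
            follows[(a, b)] += 1
print(follows)  # oven_on -> tap_on, and tap_on -> dinner_area_motion, twice each

dinner_hours = Counter(h for day in days for h, s in day if s == "dinner_area_motion")
print(dinner_hours)  # Counter({19: 2}): dinner usually around 7 pm
```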

Measure of man

Rashidi and Cook are now working on upgrades that allow the apartment to decipher the actions of multiple inhabitants and recognise subtle variations in commonly repeated tasks. “You don’t do the same task every day in the same manner,” says Rashidi.

Sumi Helal, who leads the University of Florida’s Gator-Tech Smart House project in Gainesville, says that having a home detect and learn behaviour without explicit pre-training is “a difficult problem”.

Most systems require such pre-training, Helal says. “The training is lengthy and unreliable, so if you can skip that step without having to give the system hints, that would be novel,” he says.

Journal reference: IEEE Transactions on Systems, Man, and Cybernetics, Part A (DOI: 10.1109/TSMCA.2009.2025137)