
In this article we describe an unlabelled dataset of measurements collected from multiple environmental sensors placed in a smart home to capture human activities of daily living. With the increasing need to enable people to age in place independently, the availability of such data is key to the development of home monitoring solutions. Time series data acquired from sensors deployed in smart homes provide valuable information for intelligent systems to learn the activity patterns of occupants. Furthermore, by applying artificial intelligence (AI) algorithms to such data collected over long periods, it is possible to extract patterns that reveal the user’s habits and to detect changes in those habits. This can help in detecting deviations and providing timely interventions for patients, e.g., people with dementia. Millions of raw sensor data samples were collected continuously at a frequency of 1 Hz over a period of six months, between 26 February 2020 and 26 August 2020. The sensors record the user’s interactions with the environment, such as indoor movements, pressure applied to the bed, or current consumption when using electrical appliances. Various sensors were used, including passive infrared sensors, force-sensing resistors, reed switches, mini photocell light sensors, temperature and humidity sensors, and smart plugs. The dataset can be useful for the analysis of different methods, including data-driven algorithms for activity or habit recognition. In particular, the research community may be interested in investigating the performance of algorithms applied to unlabelled rather than annotated datasets.
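As a rough illustration of how such unlabelled, long-term recordings might be explored, the sketch below aggregates raw 1 Hz readings into hourly activation counts and clusters the resulting daily profiles to surface recurring routines; the file name, column names, and number of clusters are assumptions made for illustration and are not part of the dataset description.

```python
# Minimal sketch: clustering daily activity profiles from unlabelled 1 Hz
# smart-home readings. The file name and the "timestamp"/"value" columns are
# illustrative assumptions, not the dataset's actual schema.
import pandas as pd
from sklearn.cluster import KMeans

# Load raw readings: one row per sensor sample at roughly 1 Hz.
df = pd.read_csv("smart_home_raw.csv", parse_dates=["timestamp"])

# Treat any non-zero reading as an "activation" and count activations per hour.
df["active"] = (df["value"] > 0).astype(int)
hourly = df.set_index("timestamp")["active"].resample("1h").sum()

# Arrange the counts as one 24-dimensional vector per day (hour-of-day profile).
profiles = (
    hourly.to_frame("activations")
          .assign(day=lambda x: x.index.date, hour=lambda x: x.index.hour)
          .pivot_table(index="day", columns="hour",
                       values="activations", fill_value=0)
)

# Group days into a handful of recurring routines; k=3 is an arbitrary choice.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(profiles)
print(pd.Series(labels, index=profiles.index).head())
```

Days assigned to the same cluster share a similar hour-by-hour activity profile, which is one simple way to start looking for habits, and for deviations from them, without labels.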

Different technologies can acquire data for gait analysis, such as optical systems and inertial measurement units (IMUs). Each technology has its drawbacks and advantages and fits best to particular applications. The presented multi-sensor human gait dataset comprises synchronized inertial and optical motion data from 25 subjects free of lower-limb injuries, aged between 18 and 47 years. A smartphone and a custom micro-controlled device with an IMU were attached to one of the subject’s legs to capture accelerometer data, and 42 reflexive markers were taped over the whole body to record three-dimensional trajectories. The trajectories and accelerations were simultaneously recorded and synchronized. Participants were instructed to walk on a straight, level walkway at their normal pace. Ten trials for each participant were recorded and pre-processed in each of two sessions, performed on different days. This dataset supports the comparison of gait parameters and properties of inertial and optical capture systems, and it allows the study of gait characteristics specific to each system.
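To make the comparison between the two capture systems concrete, the following sketch estimates step timing independently from a leg-mounted accelerometer signal and from a heel-marker trajectory; the array names, sampling rates, and peak-detection thresholds are assumptions for illustration rather than properties of the dataset.

```python
# Minimal sketch: deriving step timing from the two capture systems so that
# gait parameters can be compared. Array names, sampling rates, and the
# peak-detection thresholds are assumptions for illustration only.
import numpy as np
from scipy.signal import find_peaks

fs_imu = 100.0      # assumed accelerometer sampling rate (Hz)
fs_opt = 100.0      # assumed optical capture rate (Hz)

# Hypothetical inputs: vertical acceleration from the leg-mounted IMU and the
# vertical coordinate of a heel marker, both already time-synchronized.
acc_z = np.load("imu_acc_z.npy")        # shape (n_imu,)
heel_z = np.load("heel_marker_z.npy")   # shape (n_opt,)

# Step events from the IMU: peaks in vertical acceleration.
imu_peaks, _ = find_peaks(acc_z, height=np.mean(acc_z) + np.std(acc_z),
                          distance=int(0.4 * fs_imu))

# Step events from the optical system: heel strikes approximated as local
# minima of the heel-marker height (peaks of the negated signal).
opt_peaks, _ = find_peaks(-heel_z, distance=int(0.4 * fs_opt))

# Compare the mean step time estimated independently by each system.
step_time_imu = np.diff(imu_peaks) / fs_imu
step_time_opt = np.diff(opt_peaks) / fs_opt
print(f"IMU mean step time:     {step_time_imu.mean():.3f} s")
print(f"Optical mean step time: {step_time_opt.mean():.3f} s")
```

Agreement between the two step-time estimates is exactly the kind of system-to-system comparison the synchronized recordings are intended to support.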

M2DGR (Jie Yin, Ang Li, Tao Li, Wenxian Yu, Danping Zou, “M2DGR: A Multi-sensor and Multi-scenario SLAM Dataset for Ground Robots”) is a novel large-scale dataset collected by a ground robot with a full sensor suite including six fish-eye and one sky-pointing RGB cameras, an infrared camera, an event camera, and a Visual-Inertial Sensor (VI-Sensor), among other sensors.
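As a hedged example of how a SLAM dataset of this kind is commonly used, the sketch below compares an estimated trajectory against ground truth via an unaligned translational RMSE; the TUM-style file format and the file names are assumptions and not necessarily how M2DGR distributes its ground truth.

```python
# Minimal sketch: translational RMSE between an estimated trajectory and a
# ground-truth trajectory, assuming TUM-style text files with rows
# "timestamp tx ty tz qx qy qz qw". File names and format are assumptions,
# and no SE(3) alignment is performed here.
import numpy as np

def load_tum(path):
    # Columns: timestamp, tx, ty, tz, qx, qy, qz, qw
    data = np.loadtxt(path)
    return data[:, 0], data[:, 1:4]

t_gt, p_gt = load_tum("groundtruth.txt")
t_est, p_est = load_tum("estimate.txt")

# Associate each estimated pose with the nearest ground-truth timestamp.
idx = np.searchsorted(t_gt, t_est)
idx = np.clip(idx, 1, len(t_gt) - 1)
prev_closer = np.abs(t_gt[idx - 1] - t_est) < np.abs(t_gt[idx] - t_est)
idx[prev_closer] -= 1

# Translational RMSE over the matched pairs.
err = np.linalg.norm(p_est - p_gt[idx], axis=1)
rmse = np.sqrt(np.mean(err ** 2))
print(f"Trajectory error (translation RMSE, unaligned): {rmse:.3f} m")
```

A full evaluation would normally add an SE(3) or Sim(3) alignment step before computing the error, but the nearest-timestamp association above is the core of the bookkeeping.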
