Ambient assisted living, and Smart Homes in particular, has emerged as a viable means of providing technology-driven assistive living for the elderly and disabled. The approach involves a multi-step process: (1) monitor an inhabitant's behaviour using multiple multi-modal sensors, (2) collect and fuse the sensor data in a form suitable for interpretation, (3) recognise the activities the inhabitant is performing, (4) identify the inhabitant's needs or anomalies, and (5) finally provide personalised, context-aware assistance. With substantial progress having been made in sensor networks, data communication and data collection, current research concentrates on activity recognition. Given the diversity of activities of daily living, the different ways in which individuals perform them, the wide range of sensors used and the difficulty of acquiring personal data (e.g., privacy and ethical issues), activity recognition has become a bottleneck to the realistic development and deployment of assistive systems.
The rationale behind this project is that, rather than recognising activities from sensor data, we can define goals for an inhabitant. Each goal is realised by performing an activity. Goals are either specified explicitly and manually, e.g., a care provider defines the goals an inhabitant should achieve during a day, or learnt from domain context. Activities are pre-defined in a flexible way and linked to specific goals. As such, once a goal is specified or identified, an assistive system can instruct or remind the inhabitant to perform the corresponding activity. The inhabitant retains the freedom to decide how to perform the activity, i.e., not as a rigid single sequence of actions but in an arbitrary order reflecting his/her preferences and the location of the required items. The assistive system will monitor the unfolding of the activity, i.e., whether it is performed and whether it is performed correctly. Based on these observations the system will provide timely guidance to help the inhabitant complete the activity.
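The goal-to-activity model described above can be illustrated with a minimal sketch. This is not the project's actual design; the class names, the "make tea" activity and its action names are hypothetical. The key idea it shows is that an activity is a set, not a sequence, of required actions, so the inhabitant may perform them in any order, and guidance is derived from whatever actions remain unobserved.

```python
from dataclasses import dataclass

@dataclass
class Activity:
    """An activity defined as a flexible set of required actions.

    The inhabitant may perform the actions in any order; the activity
    is complete once every required action has been observed."""
    name: str
    required_actions: frozenset

@dataclass
class Goal:
    """A goal (e.g., set by a care provider) realised by one activity."""
    description: str
    activity: Activity

class ActivityMonitor:
    """Tracks the unfolding of an activity and derives guidance."""

    def __init__(self, goal: Goal):
        self.goal = goal
        self.observed = set()

    def observe(self, action: str):
        """Record an action reported by the sensing layer."""
        if action in self.goal.activity.required_actions:
            self.observed.add(action)

    def remaining(self):
        """Actions still outstanding -- the basis for timely guidance."""
        return self.goal.activity.required_actions - self.observed

    def is_complete(self):
        return not self.remaining()

# Example: a hypothetical 'make tea' activity, performed out of the
# 'canonical' order without being flagged as incorrect.
make_tea = Activity("make tea",
                    frozenset({"boil water", "add teabag", "pour water"}))
goal = Goal("have a hot drink in the morning", make_tea)
monitor = ActivityMonitor(goal)
monitor.observe("add teabag")   # order differs from a fixed recipe
monitor.observe("boil water")
print(sorted(monitor.remaining()))  # -> ['pour water']
```

A richer model would attach partial ordering constraints to some action pairs (e.g., water must be boiled before it is poured), which is one design question the project's activity-specification work would address.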
The proposed project aims to develop an intelligent goal-driven assistive agent to instruct and guide the performance of activities. An intelligent agent is a software system that operates (or is situated) in a dynamic, complex environment. It senses changes in the environment, interacts with entities within it, and takes actions in response to various types of events and changes. An intelligent agent can have human-level attributes such as intentions, goals and motivations, and supports a number of key properties, including autonomy, reactivity, adaptation and pro-activeness.
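The sense-decide-act cycle underlying such an agent, and the contrast between reactivity (responding to events) and pro-activeness (pursuing open goals unprompted), can be sketched as follows. The event names and decision rules are purely illustrative assumptions, not the agent architecture the project will develop.

```python
def decide(percept, goals):
    """One decision step of a goal-driven assistive agent.

    Reactive behaviour: respond immediately to a salient event.
    Pro-active behaviour: with no event pending, prompt the
    inhabitant towards an unachieved goal."""
    if percept.get("event") == "stove_left_on":   # hypothetical event
        return "remind inhabitant to switch off the stove"
    for goal in goals:
        if not goal["achieved"]:
            return f"prompt activity for goal: {goal['name']}"
    return "idle"

# Pro-active case: no event, but an open goal remains.
goals = [{"name": "take medication", "achieved": False}]
print(decide({"event": None}, goals))
# -> prompt activity for goal: take medication

# Reactive case: a safety event pre-empts goal pursuit.
print(decide({"event": "stove_left_on"}, goals))
# -> remind inhabitant to switch off the stove
```

In a deployed system this step would run inside a continuous loop fed by the sensing layer, with the decision logic replaced by the project's knowledge-driven reasoning mechanisms.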
Specifically, the project will conceive a generic agent architecture for the proposed approach, develop techniques for modelling goals and activities, and devise decision-support methods for guiding activity performance. It will take a broader system view of data fusion and knowledge modelling and consider the interrelationships between sensing, decision making and acting in such systems. Special emphasis will be placed on goal modelling and representation, activity specification and decision-making mechanisms that enable the agent to operate effectively in uncertain and dynamic environments and that are robust, scalable and flexible in operation.
The research will be conducted in the context of the intelligent environments within the Smart Environment Research Group of the School. These comprise four dedicated Smart Labs: Kitchen, Lounge, Meeting Room and Autonomic-Robotic Lab. Application scenarios from assistive living could be used for testing and evaluation of the proposed research outcomes.
Chen L., Nugent C.D., Wang H., A Knowledge-Driven Approach to Activity Recognition in Smart Homes, IEEE Transactions on Knowledge and Data Engineering, IEEE Computer Society, doi:10.1109/TKDE.2011.51, 2011.
Chen L., Nugent C.D., Mulvenna M.D., Finlay D.D., Hong X., Poland M., A Logical Framework for Behaviour Reasoning and Assistance in a Smart Home, International Journal of Assistive Robotics and Mechatronics, Vol. 9, No. 4, pp. 20-34, Dec 2008.
Chen L., Bechkoum K. and Clapworthy G., Reconciling Autonomy with Narratives in the Event Calculus, AAAI 2001 Spring Symposium Series, Stanford University, USA, AAAI Technical Report SS-01-02, pp. 20-24, March 2001.
First Supervisor: Chen, L Dr
Second Supervisor: Nugent, CD Professor
Collaboration: This project does not involve collaboration with another establishment