Background
Sensor-based Human Activity Recognition (HAR) uses data from wearable and smartphone sensors, such as accelerometers and gyroscopes, and is a mature yet rapidly evolving field with wide-ranging applications in healthcare monitoring, sports analytics, fitness tracking, and ubiquitous human-computer interaction. Deep learning models built from convolutional, recurrent, and transformer components have become the de facto standard for modeling these multivariate time series. They achieve strong performance on benchmark datasets but often struggle to generalize across users due to inter-subject variability, a form of domain shift caused by differences in physiology, behavior, sensor placement, and movement style.
Meta-learning, and specifically Model-Agnostic Meta-Learning (MAML) [1], has been applied to HAR to learn user-adaptive models capable of quick personalization to new subjects with minimal labeled samples [2]. However, standard MAML relies on second-order gradients and full inner-loop backpropagation, rendering it computationally expensive and impractical for large-scale or edge-based scenarios. To address these limitations, several efficient MAML variants have emerged, such as Reptile [3], a first-order method avoiding explicit meta-gradients, and iMAML [4], which utilizes implicit differentiation. These lightweight approaches have shown promise in vision settings but remain largely unassessed for inter-subject domain shift in sensor-based HAR.
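To make the contrast with second-order MAML concrete, the following is a minimal sketch of the Reptile meta-update on a toy scalar regression problem, where each "task" (standing in for a subject) has its own slope. The toy task, names, and hyperparameters are illustrative choices, not part of the thesis description:

```python
import numpy as np

rng = np.random.default_rng(0)

def task_loss_grad(theta, a):
    # Gradient of mean squared error for predicting y = a*x with y_hat = theta*x.
    x = rng.uniform(0.0, 1.0, size=32)
    grad = np.mean(2.0 * (theta * x - a * x) * x)
    return grad

def reptile(meta_iters=200, inner_steps=5, inner_lr=0.5, meta_lr=0.1):
    theta = 0.0  # meta-parameters (a single scalar here)
    for _ in range(meta_iters):
        a = rng.uniform(1.0, 3.0)     # sample a task (a "subject")
        phi = theta
        for _ in range(inner_steps):  # inner loop: plain first-order SGD
            phi -= inner_lr * task_loss_grad(phi, a)
        # Reptile meta-update: move theta toward the adapted weights.
        # No gradient flows through the inner loop, so no second-order terms.
        theta += meta_lr * (phi - theta)
    return theta

print(reptile())  # converges near the mean task slope (~2.0)
```

The key point is the last line of the loop: Reptile replaces MAML's backpropagation through the inner optimization with a simple interpolation toward the task-adapted parameters, which is what makes it cheap in memory and compute.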
This thesis therefore proposes to systematically investigate these lightweight meta-learning architectures for sensor-based HAR, focusing on their ability to (i) achieve personalization performance competitive with or superior to standard MAML, (ii) remain robust against inter-subject domain shift under few-shot adaptation, and (iii) operate efficiently in terms of memory footprint, computational cost, and adaptation speed. The models will be evaluated on standard HAR benchmarks to provide practical guidance for real-world deployment.
References
- [1] Finn, Chelsea, Pieter Abbeel, and Sergey Levine. "Model-agnostic meta-learning for fast adaptation of deep networks." International Conference on Machine Learning. PMLR, 2017.
- [2] Wijekoon, Anjana, and Nirmalie Wiratunga. "Personalised meta-learning for human activity recognition with few-data." International Conference on Innovative Techniques and Applications of Artificial Intelligence. Cham: Springer International Publishing, 2020.
- [3] Nichol, Alex, Joshua Achiam, and John Schulman. "On first-order meta-learning algorithms." arXiv preprint arXiv:1803.02999 (2018).
- [4] Rajeswaran, Aravind, et al. "Meta-learning with implicit gradients." Advances in Neural Information Processing Systems 32 (2019).
Your Tasks
- Review the theory and practice of meta-learning algorithms (MAML, Reptile, iMAML) and their respective optimization strategies.
- Design and implement efficient meta-learning architectures for sensor-based HAR using established backbones.
- Investigate few-shot adaptation regimes under subject-wise evaluation protocols (e.g., leave-one-subject-out cross-validation) to assess robustness against inter-subject variability.
- Train the proposed models on established HAR datasets and evaluate them in terms of classification performance, computational efficiency, memory footprint, and adaptation speed.
- Benchmark against strong generalized baselines and standard MAML.
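As an orientation for the evaluation protocol mentioned above, the following is a minimal sketch of leave-one-subject-out splitting with a held-back few-shot adaptation set. The subject IDs, window counts, and the `loso_splits` helper are illustrative assumptions, not a prescribed implementation:

```python
def loso_splits(samples, k_shot=5):
    """Yield (held_out, meta_train, adapt, test) per held-out subject.

    samples: list of (subject_id, window) pairs. The first k_shot windows
    of the held-out subject form the few-shot adaptation set; the rest
    form the test set.
    """
    subjects = sorted({s for s, _ in samples})
    for held_out in subjects:
        meta_train = [(s, w) for s, w in samples if s != held_out]
        target = [w for s, w in samples if s == held_out]
        yield held_out, meta_train, target[:k_shot], target[k_shot:]

# Toy usage: 3 subjects with 8 sensor windows each.
data = [(s, (s, i)) for s in ("S1", "S2", "S3") for i in range(8)]
for held_out, meta_train, adapt, test in loso_splits(data, k_shot=2):
    print(held_out, len(meta_train), len(adapt), len(test))
```

The design choice worth noting is that the adaptation and test windows come from the same unseen subject, so the metric directly measures few-shot personalization under inter-subject domain shift rather than within-subject accuracy.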
Requirements
- Solid understanding of deep learning concepts.
- Proficient programming skills in Python and PyTorch.
- Experience working with time series or sensor data is a plus but can be acquired during the thesis.
- Interest in domain adaptation, few-shot learning, meta-learning, and efficient model deployment is a plus.
Application Documents
- A paragraph explaining your motivation.
- Your study program (Bachelor/Master), current semester, and field of study.
- A transcript of records (courses and grades).
- Your programming experience.
- Any areas of interest relevant to the topic.
- Your CV (if available).


