Manually transferring patients is a critically important activity: it is a frequent part of nurses’ daily work and is conducted across health systems globally.

Nurses frequently transfer patients as part of their daily work. However, manual patient transfers pose a major risk to nurses’ health. Although the Kinaesthetics care conception can help address this issue, existing support for learning the concept is limited. We present KiTT, a tablet-based system that promotes the learning of ergonomic patient transfers based on the Kinaesthetics care conception. KiTT supports the training of Kinaesthetics-based patient transfers by two nurses. The nurses are guided through three phases: (i) interactive instructions, (ii) training of transfer conduct, and (iii) feedback and reflection. We evaluated KiTT with 26 nursing-care students in a nursing-care school. Our results indicate that KiTT provides good subjective support for the learning of Kinaesthetics. They also suggest that KiTT can promote the ergonomically correct conduct of patient transfers while providing a good user experience adequate to the nursing-school context, and they reveal how KiTT can extend existing practices.

Time Frame

Research Assistant Work

Maximilian Dürr, Carla Gröschel, Ulrike Pfeil, Jens Müller, Harald Reiterer

Movement Learning Systems, Patient Transfer, Nursing-care Education, Kinaesthetics


Maximilian Dürr, Marcel Borowski, Carla Gröschel, Ulrike Pfeil, Jens Müller, and Harald Reiterer. 2021. KiTT - The Kinaesthetics Transfer Teacher: Design and Evaluation of a Tablet-based System to Promote the Learning of Ergonomic Patient Transfers. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (CHI '21). DOI: 10.1145/3411764.3445496

Maximilian Dürr, Ulrike Pfeil, Jens Müller, Marcel Borowski, Carla Gröschel, and Harald Reiterer. 2019. Learning Patient Transfers with Technology: A Qualitative Investigation of the Design Space. In Proceedings of Mensch und Computer 2019 (MuC '19). DOI: 10.1145/3340764.3340784


Accompanying video for the CHI 2021 paper: