Article title

Sensor fusion and motion analysis for medical rehabilitation - the Fraunhofer FIRST Motion Technology Validation Lab

Abstract
It is a well-known fact that our activities of daily life play an important role in staying healthy. In order to validate different motion analysis techniques for clinical and therapeutic purposes, Fraunhofer FIRST has set up a sensor-controlled testbed. The technologies investigated include time-of-flight cameras, bio-inspired optical stereo sensors, laminar pressure and proximity sensors, as well as body-worn devices including inertial sensors. This paper presents these technologies and sketches the sensor fusion and motion analysis process currently under implementation in the Fraunhofer Motion Technology Validation Lab.
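The abstract does not detail the lab's fusion pipeline. Purely as an illustration of the body-worn inertial side it mentions, the sketch below shows a minimal complementary filter, a standard technique for fusing gyroscope and accelerometer readings into a drift-corrected tilt estimate; the function name, sample layout, and `alpha` weight are assumptions for this example, not the paper's method.

```python
import math

def complementary_filter(samples, dt, alpha=0.98):
    """Fuse gyroscope and accelerometer readings into a pitch estimate.

    samples: iterable of (gyro_rate_deg_s, accel_x_g, accel_z_g) tuples.
    dt:      sampling interval in seconds.
    alpha:   weight given to the integrated gyroscope angle.
    """
    pitch = 0.0
    estimates = []
    for gyro_rate, ax, az in samples:
        # Short term: integrate the gyroscope's angular rate.
        gyro_angle = pitch + gyro_rate * dt
        # Long term: derive the tilt angle from the gravity vector.
        accel_angle = math.degrees(math.atan2(ax, az))
        # Blend: the gyro tracks fast motion, the accelerometer
        # slowly pulls the estimate back and cancels gyro drift.
        pitch = alpha * gyro_angle + (1.0 - alpha) * accel_angle
        estimates.append(pitch)
    return estimates
```

With `alpha = 0.98` and a 100 Hz sampling rate, a stationary sensor tilted to 90° converges to the accelerometer-derived angle within a few seconds, while short bursts of motion are tracked almost entirely by the gyroscope term.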
Physical description
Bibliography: 34 items, figures
  • Fraunhofer Institute for Computer Architecture and Software Technology, Kekulestr. 7, D 12489 Berlin, Germany