

Article title

Modern intercom system in the COVID-19 period

Title variants
PL
Nowoczesny system domofonowy w okresie pandemii COVID-19
Publication languages
EN
Abstracts
EN
The COVID-19 pandemic has forced many restrictions in public space. This is also reflected in the Human-Computer Interaction (HCI) solutions adopted in consumer devices. We present the design of an advanced intercom system that takes into account both modern HCI capabilities and social aspects. A dedicated distributed control and measurement system was created, in which "tailor-made" solutions allowed the project to be implemented correctly. A usability assessment of the operating intercom system confirmed that it met the social expectations of its users.
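The usability assessment mentioned in the abstract draws on the System Usability Scale (SUS) cited in the bibliography ([23], [25]). As a side note, the standard SUS scoring procedure can be sketched as follows; the function names are illustrative, not taken from the paper:

```python
def sus_score(responses):
    """Compute the SUS score for one respondent.

    `responses` holds the ten SUS item ratings on a 1-5 Likert scale,
    in questionnaire order (odd-numbered items are positively worded,
    even-numbered items negatively worded).
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten ratings in the range 1-5")
    contributions = [
        # odd-numbered items contribute (rating - 1), even-numbered (5 - rating)
        (r - 1) if i % 2 == 0 else (5 - r)
        for i, r in enumerate(responses)
    ]
    # raw range 0-40 is scaled to 0-100
    return sum(contributions) * 2.5


def mean_sus(all_responses):
    """Average SUS score over all respondents, the usual study-level summary."""
    scores = [sus_score(r) for r in all_responses]
    return sum(scores) / len(scores)
```

Per Bangor et al. [25], mean SUS scores above roughly 70 are generally considered acceptable, which is the kind of threshold such an assessment would be compared against.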
Pages
80–86
Physical description
Bibliography: 26 items, figures, tables
Contributors
  • Warsaw University of Technology, Institute of Theory of Electrical Engineering, Measurement and Information Systems, Koszykowa 75, 00-662 Warsaw, Poland
  • CODI Technical Group, Renesansowa 7C, 01-905 Warsaw, Poland
  • VoiceLab.AI, Grunwaldzka 135A, 80-264 Gdańsk, Poland
  • VoiceLab.AI, Grunwaldzka 135A, 80-264 Gdańsk, Poland
author
  • VoiceLab.AI, Grunwaldzka 135A, 80-264 Gdańsk, Poland
Bibliography
  • [1] Oviatt, S., Cohen, P.R., (2015). The Paradigm Shift to Multimodality in Contemporary Computer Interfaces. Morgan & Claypool.
  • [2] Krupka, E., et al. (2017). Toward Realistic Hands Gesture Interface: Keeping it Simple for Developers and Machines. In: Proc. 2017 CHI Conference on Human Factors in Computing Systems. Denver USA. pp. 1887-1898. DOI: https://doi.org/10.1145/3025453.3025508
  • [3] Wachs, J. et al., (2006). A Real-Time Hand Gesture Interface for Medical Visualization Applications. In: Tiwari A., Roy R., Knowles J., Avineri E., Dahal K. (eds) Applications of Soft Computing. Advances in Intelligent and Soft Computing, vol 36. Springer, Berlin, Heidelberg. DOI: https://doi.org/10.1007/978-3-540-36266-1_15
  • [4] Duchowski, A., (2007). Eye Tracking Methodology: Theory and Practice. 2nd ed. London: Springer.
  • [5] Singh, H., Singh, J. (2012). Human eye tracking and related issues: a review. International Journal of Scientific and Research Publications. 2(9), 1-9.
  • [6] Kwiatkowska, A., Sawicki, D., (2018). Eye Tracking as a Method of Controlling Applications on Mobile Devices. In: Proc. of the 15th International Joint Conference on e-Business and Telecommunications (ICETE 2018) – Volume 1: DCNET, ICE-B, OPTICS, SIGMAP and WINSYS, Porto, Portugal. pp. 373-380. DOI: http://dx.doi.org/10.5220/0006837003730380.
  • [7] Majaranta, P., Bulling, A., (2014). Eye Tracking and Eye-Based Human–Computer Interaction. In: Fairclough, S., Gilleade, K. (eds) Advances in Physiological Computing. Human–Computer Interaction Series. pp. 39-65. Springer. DOI: http://dx.doi.org/10.1007/978-1-4471-6392-3_3
  • [8] Singh, H., Singh, J., (2018). Real-time eye blink and wink detection for object selection in HCI systems. Journal on Multimodal User Interfaces, 12(1), 55-65. DOI: http://dx.doi.org/10.1007/s12193-018-0261-7
  • [9] Kowalczyk, P., Sawicki, D., (2019). Blink and wink detection as a control tool in multimodal interaction. Multimedia Tools and Applications. 78(10), 13749–13765. DOI: http://dx.doi.org/10.1007/s11042-018-6554-8.
  • [10] Evans, D.G., Drew, R., Blenkhorn, P., (2000). Controlling Mouse Pointer Position Using an Infrared Head-Operated Joystick. IEEE Transactions on Rehabilitation Engineering, 8(1), 107–117.
  • [11] Kim H., Ryu D., (2006). Computer control by tracking head movements for the disabled. In: Proc. of the ICCHP ’06. In: Lecture Notes in Computer Science, 4061, pp.709–715, Springer.
  • [12] Sawicki, D., Kowalczyk, P., (2018). Head Movement Based Interaction in Mobility. International Journal of Human–Computer Interaction. 34(7), 653-665. DOI: http://dx.doi.org/10.1080/10447318.2017.1392078.
  • [13] Strumiłło, P., Pajor, T., (2012). A vision-based head movement tracking system for human-computer interfacing. In: Proc. New trends in audio and video/signal processing algorithms, architectures, arrangements and applications (NTAV/SPA). Łódź, Poland. pp. 143-147.
  • [14] Szczepaniak, O., Sawicki, D., (2017). Gesture controlled human–computer interface for the disabled. Medycyna Pracy. 68(1), 11-21. DOI: http://dx.doi.org/10.13075/mp.5893.00529
  • [15] Guzsvinecz, T., Szucs, V., Sik-Lanyi, C., (2019). Suitability of the Kinect Sensor and Leap Motion Controller – A Literature Review. Sensors (Basel). 19(5), 1072. DOI: http://dx.doi.org/10.3390/s19051072.
  • [16] Tandel, N.H., Prajapati, H.B., and Dabhi, V.K., (2020). Voice Recognition and Voice Comparison using Machine Learning Techniques: A Survey. In: Proc. 6th International Conference on Advanced Computing and Communication Systems (ICACCS), Coimbatore, India, 2020, pp. 459-465, DOI: 10.1109/ICACCS48705.2020.9074184.
  • [17] Singh, A.P., Nath, R., and Kumar, S., (2018). A Survey: Speech Recognition Approaches and Techniques. In: Proc. 5th IEEE Uttar Pradesh Section International Conference on Electrical, Electronics and Computer Engineering (UPCON), Gorakhpur, 2018, pp. 1-4, DOI: 10.1109/UPCON.2018.8596954.
  • [18] The Smart Audio Report, (2020). National Public Media, USA, Report, April 2020. Available at: https://www.nationalpublicmedia.com/insights/reports/smart-audio-report/ Last accessed 14 May 2022.
  • [19] Intent Recognition Use Cases: Use VoiceLab to improve your business service quality in any industry. Available at: https://voicelab.ai/#use-cases/ Last accessed 10 January 2022.
  • [20] Ali, A., Renals, A., (2018). Word Error Rate Estimation for Speech Recognition: e-WER. In: Proc. 56th Annual Meeting of the Association for Computational Linguistics. July 2018, Melbourne, Australia. Volume 2: pp. 20-24. DOI: http://dx.doi.org/10.18653/v1/P18-2004
  • [21] Panayotov, V., Chen, G., Povey, D., Khudanpur S., (2015). Librispeech: An ASR corpus based on public domain audio books. In: Proc. IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). 16-24 April 2015, Brisbane, QLD, Australia, pp. 5206–5210. DOI: http://dx.doi.org/10.1109/ICASSP.2015.7178964
  • [22] Open SLR Resources, (2023). Available at: http://www.openslr.org/resources.php/ last accessed 20 October 2023
  • [23] Brooke, J., (1996). SUS: A 'Quick and Dirty' Usability Scale. In: Usability Evaluation in Industry, Jordan, P.W., Thomas, B., McClelland, I.L., Weerdmeester, B., (Eds.), London, Taylor and Francis, pp. 189–194.
  • [24] Likert, R., (1932). A technique for the measurement of attitudes. Archives of Psychology, 140, 5-55.
  • [25] Bangor, A., Kortum, P.T., Miller, J.T., (2008). An Empirical Evaluation of the System Usability Scale. International Journal of Human-Computer Interaction, 24, 574–594.
YADDA identifier
bwmeta1.element.baztech-280f94df-d1b7-4cd0-af12-3094270b3530