Sensors Applied in Healthcare Environments

DOI: 10.4236/jcc.2016.45015

Abstract

In this paper, recent advances in sensors applied in healthcare environments are presented. Based on the function and operation of modern health and wellness measurement and monitoring and mHealth solutions, a variety of sensors are described with their features and applications. Further improvements and future trends are pointed out and discussed.

Share and Cite:

Shi, W. (2016) Sensors Applied in Healthcare Environments. Journal of Computer and Communications, 4, 99-106. doi: 10.4236/jcc.2016.45015.

1. Introduction

The world’s population aged 65 and older is growing at an unprecedented rate. An aging population with a longer life expectancy results in a larger population of frail elderly people, the chronically ill, and those requiring rehabilitation [1]. In 2012, 6.9% of the world population was over 65, and this share is estimated to reach around 20% by 2050 [2]. This growth has created a need for innovative approaches to delivering care services to older adults. The significant increase in the elderly population over the next two decades will in turn create a much greater need for effective assistive devices [3].

Governments around the world are investigating methods of keeping elderly and disabled people socially and economically engaged while reducing the number who require institutionalization. Improved sensing and communication technologies, such as motion tracking, have made assistive living environments possible [4]. Such environments provide many healthcare solutions, including in-home care for the elderly and the disabled.

Efficient sensors combined with assistive devices could enable early disease detection, remote diagnosis, and independent living for elderly people and chronically ill patients [5] [6]. Recent developments in ambient assistive living technologies have demonstrated the feasibility of using ambient sensors to support independent living [7]. Unlike wearable sensors, ambient sensors tend to have greater battery and processing capacities, but they are of limited use for capturing physiological information or in multiple-occupancy dwellings. Effort is being made to build sensors into mobile device platforms, so that sensor data can be captured, processed, and sent to a central location to improve existing healthcare services [8]-[10].

Based on the growing research and development in this area, there is a strong indication that bio-sensors will become an integral part of ordinary cell phone and smartphone platforms [8] [10]. Sensors embedded in mobile phones benefit from better computing power, speed, and memory compared with standalone sensors [10]. GPS, location information, and time tags can make mobile devices smarter and more useful for healthcare services through bio-sensor integration [8] [11].

The purpose of this work is to summarize and present recent advances in sensors applied in healthcare environments, so that researchers and practitioners become aware of the challenges in this area. The paper is organized as follows: Section 2 presents a variety of sensors applied in modern health and wellness measurement, assistive technologies, and mHealth; Section 3 concludes the paper.

2. Sensors Used for Healthcare

A sensor, in this context, is a device for detecting an analyte that combines a biological component with a physicochemical detector component. It normally consists of three parts [12]:

1) the sensitive biological element (biological material, a biologically derived material, or a biomimic);

2) the transducer or detector element, which works in a physicochemical way to transform the signal resulting from the interaction of the analyte with the biological element into another signal that can be more easily measured and quantified;

3) the associated electronics or signal processors, primarily responsible for displaying the results in a user-friendly way. This is sometimes the most expensive part of a sensor device.

Sensors have four main components: sensing, processing, communication, and energy/power units. Body sensors fall into two main categories: implantable and wearable. The former measure parameters inside the body and mostly operate as interfaces to relatively small software components attached to or implanted into human bodies; they provide bidirectional communication between a person and a remote information system that delivers healthcare services, diagnosis, or upgrades [2]. Wearable sensors, although not as invasive as their implantable counterparts, must nevertheless withstand the human body’s normal movements while infringing on them as little as possible [5]. Wearable sensors may be categorized by their functional aim: iLife fall detection sensors [13]-[15] recognize and react to falls, Health Buddy measures and records vital signs, the PROACT glove [16] [17] monitors contact with everyday objects, and SenseCam [18] [19] improves retrospective memory. Body sensors also divide into those that detect only body activity, reacting solely to movement, and those that measure some consequence of physiological change during exercise or other conditions (QT interval, respiration, temperature, and venous oxygen saturation).
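As a rough illustration of how these four units interact, the following Python sketch models a hypothetical wearable node; the class and field names are invented for this example and are not drawn from any cited system.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch of the four main units a body sensor node combines:
# sensing, processing, communication, and energy/power.
@dataclass
class SensorNode:
    kind: str                    # e.g. "wearable" or "implantable"
    battery_mah: float           # energy unit: remaining capacity
    readings: List[float] = field(default_factory=list)

    def sense(self, value: float) -> None:
        """Sensing unit: capture one raw measurement."""
        self.readings.append(value)

    def process(self) -> float:
        """Processing unit: reduce raw data, here to a simple mean."""
        return sum(self.readings) / len(self.readings)

    def transmit(self, cost_mah: float = 0.05) -> float:
        """Communication unit: send the processed value, spending energy."""
        self.battery_mah -= cost_mah
        return self.process()

node = SensorNode(kind="wearable", battery_mah=200.0)
for v in (72.0, 74.0, 73.0):    # e.g. heart-rate samples in bpm
    node.sense(v)
print(node.transmit())           # → 73.0, with battery now at 199.95 mAh
```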

2.1. Sensors Used in Wearable Devices for Fall Detection

Recent research estimates that each year nearly 30% of elderly people in the U.S. experience falls, and the likelihood of falling increases substantially with age. Falls may directly result in traumas, fractures, permanent disability, or even death. Falling is therefore a major concern for older people due to the higher risk of fractures that results from the lower bone density associated with aging [20]. It is imperative that a person’s support team be alerted in the event of a fall to minimize the distress associated with injury. Some devices seek to monitor an assisted person’s status or activities, while others provide active support.

The iLife fall detection sensor, worn by the assisted person, may be used in conjunction with the Independent LifeStyle Assistant system [13], which integrates individual devices and augments them with reasoning capabilities that give the assisted person greater independence. The iLife sensor not only triggers when abnormal body movements or extended periods of inactivity are detected, but can also be activated manually by pressing a distress button.
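A minimal sketch of the impact-then-inactivity logic such a trigger might use is shown below; the thresholds are purely illustrative and are not the iLife sensor's actual, unpublished parameters.

```python
import math

def detect_fall(samples, impact_g=2.5, inactivity_g=0.2, inactivity_len=5):
    """Flag a fall when a high-magnitude impact is followed by near-stillness.

    samples: list of (ax, ay, az) accelerations in g. All thresholds here
    are illustrative, not any commercial sensor's real parameters.
    """
    mags = [math.sqrt(ax*ax + ay*ay + az*az) for ax, ay, az in samples]
    for i, m in enumerate(mags):
        if m >= impact_g:
            after = mags[i + 1:i + 1 + inactivity_len]
            # deviation from 1 g (gravity alone) stays tiny => lying still
            if len(after) == inactivity_len and all(
                    abs(a - 1.0) < inactivity_g for a in after):
                return True
    return False

quiet = [(0.0, 0.0, 1.0)] * 8                               # sitting still
fall = [(0.0, 0.0, 1.0), (2.0, 1.5, 1.8)] + [(0.0, 0.0, 1.02)] * 5
print(detect_fall(quiet), detect_fall(fall))                # False True
```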

In [21], the authors describe two fall sensors, an embedded video-based one and a wearable accelerometer-based one, which can be managed within a data-fusion-oriented framework implementing policies aimed at maximizing system reliability and minimizing false alarms. The two sensors can be integrated into a modular architecture to compensate for each other’s limits, favoring the development of a harmonious, modular, and easily extensible system able to manage different areas of the environment.

The video fall detector is based on a digital camera and an FPGA, able to process the images locally and transmit to a server only aggregated information on the “state of alert”, with obvious advantages for end users’ privacy. The wearable accelerometer-based sensor is based on a powerful new soft-computing paradigm that makes it possible to extend its task to detecting a whole set of situations, and therefore to make the whole architecture more flexible.

The visual sensor sends the central supervision system only aggregated information, not the whole video stream: its output consists solely of signals that account for the “state of alert” on the potential occurrence of a fall. Used in conjunction with other sensors (audio, wearable, etc.), such a compact, economical, and minimally invasive device can provide a description of the environment being monitored [21]. In [22], the authors describe plans to use additional sensors, such as RFID for object localization, floor-mounted vibration sensors for fall detection in privacy-sensitive areas, and infrared cameras, in conjunction with their smart cameras for additional tracking and health-monitoring capabilities.
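One simple way such a data-fusion policy could combine a video detector and a wearable detector is a weighted confidence score; the weights and threshold below are invented for illustration and are far simpler than the policies of [21].

```python
def fuse_alerts(video_score, wearable_score, w_video=0.6, w_wear=0.4,
                threshold=0.5):
    """Weighted fusion of two fall-detector confidences in [0, 1].

    Weights and threshold are illustrative placeholders; a real policy
    would be tuned to balance reliability against false alarms.
    """
    combined = w_video * video_score + w_wear * wearable_score
    return combined >= threshold, round(combined, 3)

# A weak cue from one modality alone does not raise an alarm...
print(fuse_alerts(0.4, 0.2))   # (False, 0.32)
# ...but agreement between both modalities does.
print(fuse_alerts(0.7, 0.8))   # (True, 0.74)
```

Requiring corroboration between modalities is what lets the fused system keep sensitivity high while suppressing the false alarms either sensor would produce on its own.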

2.2. On-Body Sensors

Concerning kinematics, different sensors are used to capture musculoskeletal motion. Laboratories synchronously measure in-body data from several sensors; the most commonly used include: 1) optokinetic cameras, for more precise data capture; 2) electromyography; 3) gyroscopes, accelerometers, or altimeters. For capturing dynamics, surface electromyography (EMG) is commonly used. EMG measures the electrical activation of the muscle, and surface EMG is a non-invasive way to acquire data from the muscles [23].
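As an illustration of a typical first processing step for surface EMG, the sketch below rectifies a raw trace and smooths it with a moving average to obtain an activation envelope; the window length and sample values are arbitrary.

```python
def emg_envelope(signal, window=4):
    """Full-wave rectify a raw surface-EMG trace and smooth it with a
    trailing moving average, a common first step before detecting muscle
    activation. The window length here is illustrative."""
    rect = [abs(s) for s in signal]          # full-wave rectification
    out = []
    for i in range(len(rect)):
        lo = max(0, i - window + 1)          # trailing window, clipped at 0
        seg = rect[lo:i + 1]
        out.append(sum(seg) / len(seg))
    return out

raw = [0.1, -0.8, 0.9, -0.7, 0.1, 0.0]       # made-up raw EMG samples
print([round(v, 3) for v in emg_envelope(raw)])
# [0.1, 0.45, 0.6, 0.625, 0.625, 0.425]
```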

Additionally, recent improvements in the field of movement recognition mean that gyroscopes, accelerometers, and altimeters now allow new kinds of movement studies [14]. As these technologies do not require patients to remain in a camera’s field of view, patients can leave the laboratory, making it possible to continue measurements over a longer period of time [1]. Progress in wireless technology over the past few years has made such sensors less obtrusive for patients and easier for clinicians to use. Networks of these sensors are called body sensor networks (BSNs) [6].

A wireless sensor network consists of a collection of sensor nodes connected through wireless channels, which can be used to build distributed systems for data collection and processing. These sensors have found many applications: sampling, processing, and communicating one or more vital signs (heart rate, blood pressure, oxygen saturation, activity) or environmental parameters (location, temperature, humidity, and light) [2]. The structures of BSNs are mostly application-dependent. BSNs thus face security issues such as privacy, integrity, and authentication.

Secure communication is no exception in healthcare systems, even though real-time healthcare monitoring requires that necessary information be provided quickly. If a healthcare system lacks the necessary protection in communication, it can expose a patient’s medical data to malicious intruders or eavesdroppers. If unauthorized parties can easily access a patient’s private data or medical records, false information can be injected into the data stream by a prohibited node [12] [13]. An efficient security scheme for a patient monitoring system using a wireless sensor network was proposed by Haque et al. [14]; it consists of three main components: the patient, the healthcare system, and a secure base station. To establish secure communication between the healthcare system and the base station or patient, a pseudo-inverse matrix is used to derive the pair-wise shared key, and a bilateral key handshaking method is also used [24]. The techniques used in BSNs for security and privacy are summarized in Table 1 [25].
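The pseudo-inverse key-derivation scheme itself is not reproduced here, but its end goal, readings that a base station can check for integrity using a pair-wise shared key, can be sketched with a generic HMAC; the key and message format below are invented for this example and are not part of the cited scheme.

```python
import hashlib
import hmac

# Illustrative only: a generic shared-key MAC standing in for whatever
# pair-wise key the real scheme derives. The key here is a placeholder.
SHARED_KEY = b"example-pairwise-key"

def sign_reading(reading: str) -> str:
    """Sensor side: tag a reading so the base station can verify it."""
    return hmac.new(SHARED_KEY, reading.encode(), hashlib.sha256).hexdigest()

def verify_reading(reading: str, tag: str) -> bool:
    """Base-station side: reject injected or tampered data."""
    return hmac.compare_digest(sign_reading(reading), tag)

msg = "patient=42;hr=73;spo2=97"   # invented message format
tag = sign_reading(msg)
print(verify_reading(msg, tag))                          # True
print(verify_reading("patient=42;hr=180;spo2=97", tag))  # False: tampered
```

A prohibited node that alters the heart-rate field cannot produce a valid tag without the shared key, which is exactly the injection attack the paragraph above describes.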

2.3. Sensor Network Used in Assistive Technologies

As sensor technology advances, assistive devices have become able to detect various kinds of physiological variables. For people with speech and hearing impairments, non-verbal communication is very important, so an assistive technology that provides a more convenient and less time-consuming means of communication is required. An assistive device for speech- and hearing-impaired users has been developed based on BSN technology [6]. In the system, real-time recognition of American Sign Language (ASL) fingerspelling gestures is performed based on input signals acquired from a wireless sensor glove. The recognized gestures are then mapped into corresponding sounds using a speech synthesizer.

In [26], a framework for constructing the fingerspelling gesture recognition model based on data acquired from a wireless BSN sensor glove is proposed. The glove consists of five flex sensors and a 3D accelerometer, providing measures of finger bending as well as hand motion and orientation, together with a recognition module and a speech synthesizer, as shown in Figure 1 [26]. The flex sensors placed along the five fingers detect finger bending, and the 3D accelerometer placed on the back of the hand detects hand orientation and motion. Data are transmitted to the computer via a BSN node placed on the wrist.

Table 1. Techniques used in BSN for security and privacy.
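A toy version of the recognition step might match the five flex readings against per-letter templates; the template values below are invented for illustration, whereas the system in [26] trains its model from real glove data.

```python
# Hypothetical templates: five flex-sensor bend values (0 = straight,
# 1 = fully bent) per ASL letter. Real systems learn these from data.
TEMPLATES = {
    "A": [0.9, 0.9, 0.9, 0.9, 0.2],   # four fingers curled, thumb out
    "B": [0.1, 0.1, 0.1, 0.1, 0.9],   # fingers straight, thumb tucked
    "S": [0.9, 0.9, 0.9, 0.9, 0.9],   # full fist
}

def classify(flex):
    """Nearest-template match on the five flex readings
    (minimum squared distance)."""
    def dist(letter):
        return sum((a - b) ** 2 for a, b in zip(flex, TEMPLATES[letter]))
    return min(TEMPLATES, key=dist)

print(classify([0.85, 0.95, 0.9, 0.88, 0.15]))  # "A"
```

In the full system, the winning letter would then be voiced by the speech synthesizer, with the accelerometer disambiguating letters that differ mainly in hand orientation.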

In [27], the authors developed a prototype rehabilitation system intended to operate safely and accurately in the home of an elderly or recovering patient. A shoe insole and sensor are inserted into the user’s footwear, as shown in Figure 2 [27]. In the insole, FlexiForce sensors measure the force on the ball and the heel of the foot. Sensors implanted in the legs of a walker collect data that help assess the patient’s prognosis during recovery, so recovering patients can be monitored as they walk and use their walker in daily life. The FlexiForce sensors, used in both the walker and the shoe insole, determine the force distribution over the toe of the insole and the legs of the walker. A Tiny Bee tri-axis accelerometer, made by EVBPlus, is used in the shoe insole to sense the degree of pronation and supination of the foot. It outputs three analog voltages, each proportional to the gravitational force experienced along one of three axes.
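Converting such an analog axis voltage to acceleration is a simple linear mapping; the zero-g offset and sensitivity used below are generic illustrative values for a 3.3 V analog accelerometer, not the Tiny Bee's datasheet figures.

```python
def volts_to_g(v, zero_g_v=1.65, sensitivity_v_per_g=0.3):
    """Convert one analog-axis voltage to acceleration in g.

    zero_g_v and sensitivity_v_per_g are illustrative values for a generic
    3.3 V analog accelerometer, not any specific part's datasheet figures.
    """
    return (v - zero_g_v) / sensitivity_v_per_g

# Three axis voltages with the insole flat: X and Y near 0 g, Z near 1 g
# (gravity). A sustained tilt of the X/Y axes would indicate pronation
# or supination of the foot.
ax, ay, az = (volts_to_g(v) for v in (1.65, 1.68, 1.95))
print(round(ax, 2), round(ay, 2), round(az, 2))   # 0.0 0.1 1.0
```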

In recent years, great advances in prosthetic technology have been aided by advances in materials science and sensor technology. Buckley et al. [28] introduced a sensor suite framework for the partial automation of prosthetic arm control, allowing high-level control while reducing the cognitive burden placed on the user. The framework uses interchangeable sensors to emulate the low-level hand-eye coordination of a healthy individual. A shoulder-mounted depth sensor obtains environment information for locating the target object relative to the prosthesis. Sensors applied in the system include laser range finders, stereo cameras, and structured light. Combined, these sensors mimic the natural way a user coordinates arm movement well enough that low-level control can be taken over by the system, freeing the user for a more high-level role.

2.4. Sensors Incorporated in Assistive Robots

Figure 1. Fingerspelling-based speech synthesizing sensor gloves.

Figure 2. (a) Schematic of the insole design. (b) A photograph of the insole.

Robotic technologies for supporting human activities, such as intelligent wheelchairs [29] [30], prosthetic limbs [31] [32], and wearable robots [33], have been studied for practical use in various aspects of daily life. As mobility declines, it becomes increasingly difficult to maintain independence. Assistive devices for mobility are increasingly being proposed; examples include the robot suit, “designed to help the elderly and enfeebled to walk and carry heavy objects”, which supports bodyweight, reduces stress on the knees, and helps people climb steps and stay in crouching positions.

In [34]-[36], three infrared (IR) sensors are mounted on the robot hand, as shown in Figure 3, in order to control the hand to grasp objects using information from the sensor readings and the interface component. The IR sensors proximally sense the object distance and provide a corrective signal for the hand to close in on the object. In addition, tactile sensors are employed in the system for contact-based sensing, and optical infrared sensors are introduced for pre-touch during final grasp adjustments. The method in [37] detects the orientation of an object’s surface using IR sensors that fit inside the fingers; however, it can only adjust the fingers and is limited to one dimension of the end-effector.

The first capability is that the hand can autonomously move to a suitable pre-grasp position if at least one of the three sensors detects the object; from the pre-grasp position, the robot can directly close the hand to grasp the object. The second functionality is collision avoidance: the hand keeps a distance from any object, and the more sensors detect the object, the larger the distance. Third, the system is robust: the end-effector can even track a mobile object. A KINECT is employed to collect human joint position data, so the robot can mimic the human after mapping between their joint coordinates. The human hand gesture can be recognized as open or closed, providing the signal for the robot to open or close its hand.
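A single step of such pre-grasp behavior could be sketched as a proportional controller driven by the closest IR reading; the target distance and gain below are invented for illustration and are not the parameters of the cited controllers.

```python
def pre_grasp_step(distances_cm, target_cm=3.0, gain=0.2):
    """One proportional control step toward a pre-grasp position.

    distances_cm: readings from the three IR sensors on the hand
    (None = no detection). Returns a closing command in [-1, 1];
    target distance and gain are illustrative placeholders.
    """
    seen = [d for d in distances_cm if d is not None]
    if not seen:
        return 0.0                       # nothing detected: hold position
    error = min(seen) - target_cm        # closest reading drives the hand
    cmd = max(-1.0, min(1.0, gain * error))
    return round(cmd, 3)

print(pre_grasp_step([None, None, None]))   # 0.0  (no object sensed)
print(pre_grasp_step([8.0, None, 10.0]))    # 1.0  (far away: close in)
print(pre_grasp_step([2.0, 3.5, None]))     # -0.2 (too close: back off)
```

The negative command in the last case is the collision-avoidance behavior described above: the hand backs away rather than contacting the object prematurely.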

Figure 3. Hardware of IR sensors.

2.5. Bio-Sensors in m-Health

Bio-sensor enabled mobile phones have the potential to improve health (m-health) information gathering for both ambulatory and continuous chronic disease monitoring applications. There is a strong indication that existing healthcare systems can benefit from mobile phone and sensor integrated devices. GPS, location information, and time tags can make mobile devices smarter and more useful for healthcare services through bio-sensor integration [8] [10] [11]. Sensor-enabled mobile phones are going to revolutionize personal and social-network-based sensor data collection, leading to the next generation of innovative and secure mHealth services [8]-[10]. The basic technology used incorporates:

・ Noninvasive Bio-Sensor;

・ Mobile phone processing of sensor signal and Secure Radio Transceiver design;

・ Web server and database for data storage, management and analysis.
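As a sketch of how these pieces fit together, the snippet below packages one bio-sensor reading with a location and time tag into JSON, as a phone might before uploading it to the web server; the field names are an invented schema, not a standard one.

```python
import json
import time

def package_reading(patient_id, sensor, value, unit):
    """Bundle one bio-sensor reading with the phone's location and a time
    tag, ready to send to a (hypothetical) mHealth web server.
    All field names here are illustrative, not a standard schema."""
    return json.dumps({
        "patient_id": patient_id,
        "sensor": sensor,
        "value": value,
        "unit": unit,
        "gps": {"lat": 40.74, "lon": -74.18},  # would come from phone GPS
        "timestamp": int(time.time()),         # time tag for the sample
    })

payload = package_reading("p-042", "spo2", 97, "%")
print(json.loads(payload)["sensor"])   # spo2
```

In a deployed system this payload would travel over the secure radio transceiver to the server and database for storage, management, and analysis.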

In bio-sensor based mHealth systems, security will be a major challenge. Researchers have proposed various security solutions for standalone sensor networks, but little research has addressed the security issues of mHealth systems. A number of medical research efforts have used mHealth to monitor and study the deployment of sensor-based technologies in patient diagnosis and treatment, including: aged-population health monitoring, calorie-intake monitoring, treating patients with diabetes, blood-pressure monitoring, treating cardiac patients, and blood oxygen level monitoring. However, the realization of mHealth systems and infrastructures has its own set of challenges [35]:

1) Security: Identification of the weakest point in an end-to-end sense.

2) Semantic interoperability.

3) Scalability in linking healthcare providers to end users.

4) Unified agreements among healthcare providers.

5) Unified mHealth education.

3. Conclusion

This paper has presented a variety of sensors applied in healthcare environments, along with their features and applications. In introducing the different types of sensors, it is important to point out that sensors used for health monitoring must consider aspects such as safety, scalability, autonomy, privacy, and their impact on the care of users. Advances in sensors enable assistive technologies, m-Health solutions, and health and wellness measurement devices capable of powerful and very rapid movements through a large operational space, so careful thought needs to be given to hazard assessment when designing healthcare devices using these sensors. Future research should also consider the interaction between humans and sensor-based healthcare devices, so as to maintain users’ physical and cognitive functional capabilities.

Conflicts of Interest

The authors declare no conflicts of interest.

References

[1] Kothiyal, K. and Tettey, S. (2001) Anthropometry for Design for the Elderly. International Journal of Occupational Safety and Ergonomics: JOSE, 7, 15-34. http://dx.doi.org/10.1080/10803548.2001.11076474
[2] Organization for Economic Co-Operation and Development (2012) World Population Aging: 1959-2050.
[3] Hoffman, A.H. (2009) Design of Robotic Devices to Assist Persons with Disabilities. IEEE International Conference on Technologies for Practical Robot Applications, Woburn, 1-4. http://dx.doi.org/10.1109/tepra.2009.5339653
[4] Marshal, M. (1997) State of the Art in Dementia Care. Centre for Policy on Ageing, UK.
[5] Shi, W.V. and Zhou, M.C. (2011) Recent Advances of Sensors for Pacemakers. IEEE International Conference on Networking, Sensing and Control, 520-525. http://dx.doi.org/10.1109/icnsc.2011.5874939
[6] Yang, G.-Z. (2014) Body Sensor Networks. 2nd Edition, Springer-Verlag, London. http://dx.doi.org/10.1007/978-1-4471-6374-9
[7] Tamura, T., Kawarada, A., Nambu, M., Tsukada, A., Sasaki, K. and Yamakoshi, K. (2007) E-Healthcare at an Experimental Welfare Techno House in Japan. Open Medical Informatics Journal, 1, 1-7. http://dx.doi.org/10.2174/1874431100701010001
[8] Lane, D.L., et al. (2010) A Survey of Mobile Phone Sensing. IEEE Communications Magazine, 140-146. http://dx.doi.org/10.1109/MCOM.2010.5560598
[9] Nkosi, M. and Mekuria, F. (2010) Cloud Computing for Enhanced Mobile Health Applications. Proceedings of the IEEE Cloud Computing and Technology Conference, 2706-2715. http://dx.doi.org/10.1109/cloudcom.2010.31
[10] Nkosi, M.T., Mekuria, F. and Gejibo, S.H. (2011) Challenges in Mobile Bio-Sensor Based mHealth Development. 13th IEEE International Conference on e-Health Networking Applications and Services (Healthcom), 21-27. http://dx.doi.org/10.1109/health.2011.6026750
[11] Mekuria, F., et al. (2010) Intelligent Mobile Sensing & Analysis Systems. Proceedings of 3rd CSIR Biennial Conference, 2873-2882.
[12] Loos, K. (2012) e-Study Guide for Biocatalysis in Polymer Chemistry. Content Technologies, Incorporated.
[13] Vichitvanichphone, S., Talaei-Khoei, A., Kerr, D. and Ghapanchi, A.H. (2014) Adoption of Assistive Technologies for Aged Care: A Realist Review of Recent Studies. 47th Hawaii International Conference on System Sciences, 2706-2715. http://dx.doi.org/10.1109/hicss.2014.341
[14] Haigh, K.Z. and Kiff, L.M. (2004) The Independent Life Style Assistant: AI Lessons Learned. 16th Innovative Applications of Artificial Intelligence Conference, California.
[15] Haigh, K.Z. and Kiff, L.M. (2004) The Independent Life Style Assistant (I.L.S.A.): AI Lessons Learned. 16th Innovative Applications of Artificial Intelligence Conference (IAAI), San Jose.
[16] Pollack, M.E. (2005) Intelligent Technology for an Aging Population: The Use of AI to Assist Elders with Cognitive Impairment. AI Magazine, 26.
[17] Yang, Q. and Shen, Z. (2015) Active Aging in the Workplace and the Role of Intelligent Technologies. 2015 IEEE/WIC/ACM International Conference on Web Intelligence and Intelligent Agent Technology, 22, 391-394.
[18] Hodges, S., Williams, L., Berry, E., Izadi, S., Srinivasan, J., Butler, A., et al. (2006) SenseCam: A Retrospective Memory Aid. UbiComp 2006: Ubiquitous Computing, 177-193. http://dx.doi.org/10.1007/11853565_11
[19] Hodges, S., Williams, L., Berry, E., et al. (2006) SenseCam: A Retrospective Memory Aid. UbiComp 2006: Ubiquitous Computing, California, 177-193. http://dx.doi.org/10.1007/11853565_11
[20] O’Brien, A. and Mac Ruairi, R. (2009) Survey of Assistive Technology Devices and Applications for Aging in Place. Second International Conference on Advances in Human-Oriented and Personalized Mechanisms, Technologies, and Services, Porto, 7-12. http://dx.doi.org/10.1109/CENTRIC.2009.9
[21] Cagnoni, S., Matrella, G., Mordonini, M., Sassi, F. and Ascari, L. (2009) Sensors Fusion-Oriented Fall Detection for Assistive Technologies Applications. 9th International Conference on Intelligent Systems Design and Applications, 673-678. http://dx.doi.org/10.1109/isda.2009.203
[22] Williams, A., Xie, D., Ou, S., Grupen, R., Hanson, A. and Riseman, E. Distributed Smart Cameras for Aging in Place.
[23] Hernandez, S., Raison, M., Torres, A., Gaudet, G. and Achiche, S. (2014) From On-Body Sensors to In-Body Data for Health Monitoring and Medical Robotics: A Survey. Global Information Infrastructure and Networking Symposium, 1-5. http://dx.doi.org/10.1109/giis.2014.6934279
[24] Sain, M., Kumar, P. and Hoon, J.L. (2011) A Survey of Middleware and Security Approaches for Wireless Sensor Networks. 6th International Conference on Computer Sciences and Convergence Information Technology, 64-69.
[25] Sain, M., Kumar, P. and Hoon, J.L. (2015) A Survey on Wireless Body Area Network: Security Technology and Its Design Methodology Issue. International Conference on Innovations in Information, Embedded and Communication Systems, 1-5.
[26] Vutinuntakasame, S., Jaijongrak, V.R. and Thiemjarus, S. (2011) An Assistive Body Sensor Network Glove for Speech- and Hearing-Impaired Disabilities. 2011 International Conference on Body Sensor Networks (BSN), 7-12. http://dx.doi.org/10.1109/BSN.2011.13
[27] Mellodge, P. and Vendetti, C. (2011) Remotely Monitoring a Patient’s Mobility: A Digital Health Application. IEEE Potentials, 33-38. http://dx.doi.org/10.1109/MPOT.2010.939453
[28] Buckley, M., Vaidyanathan, R. and Mayol-Cuevas, W. (2011) Sensors Suites for Assistive Arm Prosthetics. 24th International Symposium on Computer-Based Medical Systems, 1-6. http://dx.doi.org/10.1109/cbms.2011.5999153
[29] Cooper, R.A., Boninger, M.L., Cooper, R., Dobson, A.R., Kessler, J., Schmeler, M. and Fitzgerald, S.G. (2003) Use of the Independence 3000 IBOT Transporter at Home and in the Community. Journal of Spinal Cord Medicine, 26, 79-85.
[30] Simpson, R.C. (2005) Smart Wheelchairs: A Literature Review. Journal of Rehabilitation Research & Development, 42, 423-436. http://dx.doi.org/10.1682/JRRD.2004.08.0101
[31] Miller, L.A., Stubblefield, K.A., Lipschutz, R.D., Lock, B.A. and Kuiken, T.A. (2008) Improved Myoelectric Prosthesis Control Using Targeted Reinnervation Surgery: A Case Series. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 16, 46-50. http://dx.doi.org/10.1109/TNSRE.2007.911817
[32] Muzumdar, A. (2004) Powered Upper Limb Prostheses: Control, Implementation & Clinical Application. Springer-Verlag. http://dx.doi.org/10.1007/978-3-642-18812-1
[33] Guizzo, E. and Goldstein, H. (2005) The Rise of the Body Bots. IEEE Spectrum, 42, 42-48. http://dx.doi.org/10.1109/MSPEC.2005.1413730
[34] Chen, N., Tee, K.P. and Chew, C.M. (2013) Assistive Grasping in Teleoperation Using Infra-Red Proximity Sensors. 2013 IEEE RO-MAN, 232-237. http://dx.doi.org/10.1109/ROMAN.2013.6628451
[35] Smith, J., Garcia, E., Wistort, R. and Krishnamoorthy, G. (2007) Electric Field Imaging Pretouch for Robotic Graspers. IEEE/RSJ International Conference on Intelligent Robots and Systems, 676-683. http://dx.doi.org/10.1109/iros.2007.4399609
[36] Wistort, R. and Smith, J. (2008) Electric Field Servoing for Robotic Manipulation. IEEE/RSJ International Conference on Intelligent Robots and Systems, 494-499. http://dx.doi.org/10.1109/iros.2008.4650721
[37] Hsiao, K., Nangeroni, P., Huber, M., Saxena, A. and Ng, A.Y. (2009) Reactive Grasping Using Optical Proximity Sensors. IEEE International Conference on Robotics and Automation, 2098-2105. http://dx.doi.org/10.1109/robot.2009.5152849

  

Copyright © 2020 by authors and Scientific Research Publishing Inc.


This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.