Holographic Raman Tweezers Controlled by Hand Gestures and Voice Commands

PP. 331-336
DOI: 10.4236/opj.2013.32B076

ABSTRACT

Several attempts have appeared recently to control optical trapping systems via touch tablets and cameras instead of a mouse and joystick. Our approach is based on modern low-cost hardware combined with fingertip and speech recognition software. The positions of the operator's hands or fingertips control the positions of the trapping beams in holographic optical tweezers, which provide optical manipulation of microobjects. We tested and adapted two systems for hand position detection and gesture recognition, the Creative Interactive Gesture Camera and the Leap Motion. We further enhanced the holographic Raman tweezers (HRT) system with voice commands controlling the micropositioning stage and the acquisition of Raman spectra. The interface communicates with the HRT either directly, which requires adaptation of the HRT firmware, or indirectly, by simulating mouse and keyboard messages. In real experiments, its use sped up the operator's communication with the system roughly two-fold in comparison with traditional control by mouse and keyboard.
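To illustrate the gesture-control idea, the following is a minimal sketch of mapping a tracked fingertip to a trap position in the sample plane. The function name, the calibration values, and the sensor coordinate convention are all hypothetical (the paper does not publish its code); we assume the sensor reports fingertip coordinates in millimetres and that a planar affine calibration, estimated from a few point correspondences (cf. the multiple-view-geometry methods of ref. [10]), maps them to trap-plane micrometres.

```python
# Hypothetical sketch: fingertip position -> holographic trap position.
# The affine matrix values below are illustrative, not from the paper.
import numpy as np

# 2x3 affine calibration matrix (scale, rotation/skew, offset),
# e.g. fitted from calibration point pairs collected beforehand.
A = np.array([[0.12, 0.00, 34.5],
              [0.00, -0.12, 27.1]])  # y flipped: camera y axis points down

def fingertip_to_trap(tip_xy_mm):
    """Map a fingertip (x, y) in sensor coordinates [mm] to
    trap-plane coordinates [um] via the affine calibration."""
    x, y = tip_xy_mm
    return A @ np.array([x, y, 1.0])

# Example: fingertip 10 mm right, 5 mm below the sensor origin
print("trap position [um]:", fingertip_to_trap((10.0, -5.0)))
```

In a live system this mapping would run on every tracking frame, with the resulting coordinates fed to the hologram-computation stage that steers the trapping beams.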
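The "indirect" control path mentioned in the abstract can be sketched with standard Win32 input-synthesis calls (SetCursorPos, mouse_event, keybd_event), which are reachable from Python through ctypes. The screen coordinates, the space-bar binding for spectrum acquisition, and the command table are assumptions for illustration only, not taken from the paper.

```python
# Sketch of driving an existing GUI by synthesized mouse/keyboard input.
# Windows only; the specific bindings below are illustrative assumptions.
import ctypes

user32 = ctypes.windll.user32

MOUSEEVENTF_LEFTDOWN = 0x0002
MOUSEEVENTF_LEFTUP   = 0x0004
KEYEVENTF_KEYUP      = 0x0002
VK_SPACE             = 0x20

def click_at(x, y):
    """Move the cursor to screen pixel (x, y) and left-click."""
    user32.SetCursorPos(x, y)
    user32.mouse_event(MOUSEEVENTF_LEFTDOWN, 0, 0, 0, 0)
    user32.mouse_event(MOUSEEVENTF_LEFTUP, 0, 0, 0, 0)

def press_space():
    """Send a space-bar press, assuming the target software
    binds spectrum acquisition to that key."""
    user32.keybd_event(VK_SPACE, 0, 0, 0)
    user32.keybd_event(VK_SPACE, 0, KEYEVENTF_KEYUP, 0)

# Example: dispatch a recognized voice command to a GUI action
commands = {"acquire spectrum": press_space,
            "select trap": lambda: click_at(640, 480)}
commands["acquire spectrum"]()
```

The appeal of this route is that the controlled software needs no modification; the trade-off is fragility, since any change in the GUI layout breaks the hard-coded coordinates, which is why the direct firmware interface is also supported.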

Cite this paper

Z. Tomori, M. Antalik, P. Kesa, J. Kanka, P. Jakl, M. Sery, S. Bernatova and P. Zemanek, "Holographic Raman Tweezers Controlled by Hand Gestures and Voice Commands," Optics and Photonics Journal, Vol. 3, No. 2B, 2013, pp. 331-336. doi: 10.4236/opj.2013.32B076.

References

[1] K. C. Neuman and S. M. Block, “Optical Trapping,” Review of Scientific Instruments, Vol. 75, No. 9, 2004, pp. 2787-2809. doi:10.1063/1.1785844
[2] R. Bowman, D. Preece, G. Gibson and M. Padgett, “Stereoscopic Particle Tracking for 3D Touch, Vision and Closed-loop Control in Optical Tweezers,” Journal of Optics, Vol. 13, No. 4, 2011, p. 044003. doi:10.1088/2040-8978/13/4/044003
[3] J. E. Curtis, B. A. Koss and D. G. Grier, “Dynamic Holographic Optical Tweezers,” Optics Communications, Vol. 207, No. 1-6, 2002, pp. 169-175. doi:10.1016/S0030-4018(02)01524-9
[4] G. Whyte, G. Gibson, J. Leach, M. Padgett, D. Robert and M. Miles, “An Optical Trapped Microhand for Manipulating Micron-sized Objects,” Optics Express, Vol. 14, No. 25, 2006, pp. 12497-12502. doi:10.1364/OE.14.012497
[5] J. A. Grieve, A. Ulcinas, S. Subramanian, G. M. Gibson, M. J. Padgett, D. M. Carberry and M. J. Miles, “Hands-on with Optical Tweezers: A Multitouch Interface for Holographic Optical Trapping,” Optics Express, Vol. 17, No. 5, 2009, pp. 3595-3602. doi:10.1364/OE.17.003595
[6] R. W. Bowman, et al., “iTweezers: Optical Micromanipulation Controlled by an Apple iPad,” Journal of Optics, Vol. 13, No. 4, 2011, p. 044002. doi:10.1088/2040-8978/13/4/044002
[7] C. McDonald, M. McPherson, C. McDougall and D. McGloin, “HoloHands: Kinect Control of Optical Tweezers,” arXiv:1211.0220v1 [physics.pop-ph], 2012.
[8] Intel Perceptual Computing SDK 2013 Beta. http://software.intel.com/en-us/vcsource/tools/perceptual-computing-sdk
[9] “Leap Motion”, 2013, https://www.leapmotion.com
[10] R. Hartley and A. Zisserman, “Multiple View Geometry in Computer Vision,” 2nd Edition, Cambridge University Press, Cambridge, 2004.
[11] “Open Sound Control Specification,” http://opensoundcontrol.org
[12] F. Vollmer and S. Arnold, “Whispering-gallery-mode Biosensing: Label-free Detection Down to Single Molecules,” Nature Methods, Vol. 5, No. 7, 2008, pp. 591-596. doi:10.1038/nmeth.1221

  

Copyright © 2017 by authors and Scientific Research Publishing Inc.


This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.