Measuring Light and Geometry Data of Roadway Environments with a Camera

Abstract

Evaluating the conspicuity of roadway environments and their impact on driving performance is vital for roadway safety. Existing meters and tools for roadway measurements cannot record light and geometry data simultaneously at high resolution. This study introduced a new method that adopted recently developed high dynamic range (HDR) photogrammetry to measure the luminance and XYZ coordinates of millions of points across a road scene with a single device, a camera, together with MATLAB code for data treatment and visualization. To validate the method, the roadway environments of a straight, flat section of Jayhawk Boulevard (482.8 m long) in Lawrence, KS, and a roundabout (15.3 m in diameter) at its end were measured under clear and cloudy skies, in the daytime and at nighttime, and with dry and wet pavements. Eight HDR images of the roadway environments under these viewing conditions were generated with the HDR photogrammetric techniques and calibrated. From each HDR image, synchronous light and geometry data were extracted in Radiance and further analyzed to identify potential roadway environmental hazards using the MATLAB code (http://people.ku.edu/~h717c996/research.html). With the current equipment, the margin of error of the HDR photogrammetric geometry measurement varied with the measuring distance, averaging 23.1% - 27.5% for Jayhawk Boulevard and 9.3% - 16.2% for the roundabout. The accuracy of the luminance measurement has been reported in the literature as 1.5% - 10.1% on average. The camera-aided measurement is fast, non-contact, non-destructive, and performed off the road; it is therefore more efficient and safer than conventional measurements with handheld meters and tools. The HDR photogrammetric techniques with the current equipment still need improvement in accuracy and in the speed of data treatment.
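As an illustration of the luminance-extraction step described above, the following MATLAB sketch (not the authors' published code) converts a Radiance HDR image into a per-pixel luminance map using Radiance's standard luminous efficacy convention of 179 lm/W. The file name 'scene.hdr' and the calibration factor k are hypothetical placeholders.

% Minimal sketch, assuming a Radiance .hdr image whose pixels store
% linear RGB radiance values (as produced by HDR photogrammetry tools).
hdr = hdrread('scene.hdr');     % m-by-n-by-3 single-precision RGB radiance
% Radiance convention: luminance (cd/m^2) = 179*(0.265*R + 0.670*G + 0.065*B)
L = 179 * (0.265*hdr(:,:,1) + 0.670*hdr(:,:,2) + 0.065*hdr(:,:,3));
k = 1.0;                        % placeholder scene calibration factor
L = k * L;                      % calibrated luminance map, cd/m^2
imagesc(log10(max(L, 1e-3))); axis image; colorbar;   % log scale handles the wide dynamic range
title('Log_{10} luminance (cd/m^2)');

In practice, the calibration factor would be derived by comparing the HDR-derived luminance of a reference patch against a spot luminance meter reading, as is common in the HDR photography literature.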

Share and Cite:

H. Cai and L. Li, "Measuring Light and Geometry Data of Roadway Environments with a Camera," Journal of Transportation Technologies, Vol. 4, No. 1, 2014, pp. 44-62. doi: 10.4236/jtts.2014.41005.

Conflicts of Interest

The authors declare no conflicts of interest.

Copyright © 2024 by authors and Scientific Research Publishing Inc.

Creative Commons License

This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.