Occupational eye protection using Augmented Reality: a proof of concept
Radioprotection (publication ahead of print)
DOI: https://doi.org/10.1051/radiopro/2022005
Published online 3 February 2022

© SFRP, 2022

1 Introduction

Workers' eyes may be exposed to small and very bright light sources, which creates potential eye hazards.

As a first case, Light Emitting Diodes (LEDs) have become widespread in lighting systems, and in some cases operators have to watch them in operation (see example in Fig. 1a). Indeed, they have several advantages over traditional light sources. Firstly, these light sources do not emit ultraviolet or infrared radiation, nor can they explode, unlike high-pressure lamps. Secondly, their luminous output is instantly available and dimmable, and their lifetime is independent of their usage cycle. Thirdly, lighting system designers can assemble these small components in small spaces. Finally, LED luminous efficacy has increased over time and now exceeds that of high-pressure lamps. These advantages make LEDs the light source of choice when considering power consumption.

LEDs are electronic components emitting a monochromatic blue light that makes a fluorescent layer re-emit a continuous spectrum of white light. Two characteristics of LED light emission may be harmful for workers: high luminance and richness in blue light.

Firstly, LEDs are far brighter than the tubes used in interior lighting: up to 10⁶ cd.m−2, compared to 20 000 cd.m−2 for T5 fluorescent tubes (ANSES, 2019). Considering that the luminance of walls and objects is about 30 to 100 cd.m−2, contrasts with LEDs may be as high as 30 000:1. Recommended contrast ratios range from 5:1 in the working field of view to 50:1 between any lighting system and the ceiling around it (Commission de normalisation AFNOR X35A – Ergonomie et Commission de normalisation AFNOR X90X – Lumière et éclairage, 2013; Comité Technique CEN/TC 169 « Lumière et éclairage », 2021). Apart from making workers uncomfortable, these contrasts create a veiling luminance in the worker's eyes, preventing them from perceiving the parts around the LEDs in operation, which may result in hazards (e.g., electric shock).

In addition, LED luminance is high enough to create persistent glare in the worker's field of view, and such impaired sight can cause accidents.

Secondly, the high-radiance light emitted by LEDs contains a variable proportion of blue light. This has been a growing concern in both public and occupational health (SCHEER, 2018). Selling LED lights and lamps requires CE marking, which in turn requires the light sources to be classified into four risk groups according to the IEC 62471 standard (IEC, 2008). Regarding the blue-light hazard (ICNIRP, 2013), i.e., photochemical damage to the retina caused by blue light, LED lighting systems classified as GR0 or GR1 are recommended. With optics and filters removed for maintenance, workers' eyes may be directly exposed to LED chips potentially classified as GR2, and eye exposure to blue light may then exceed the daily limit within 0.25 to 100 s. In such cases, it is necessary to protect workers' eyes. Density filters such as sunglasses are not suitable, as they prevent workers from correctly perceiving the working environment.

A second case of bright sources is welding arcs, which have been proven harmful to eyes and skin (Guénel et al., 2001; Schwass et al., 2011; Piernick et al., 2019; Boiano et al., 2020). Their light is composed of UV, visible and infrared radiation, all of which have adverse effects on different parts of the eye. For instance, a simulation with CatRayon (INRS, 2018) (Barlier-Salsi and Salsi, 2007) of exposure to a 160 A electrode-welding arc on steel shows that daily exposure thresholds are exceeded in 11 s for keratoconjunctivitis, 20 s for the blue-light hazard and 20 min for UV-induced cataract.

In addition, welders tend to ignite arcs without screening their eyes, because this operation requires visual precision while passive screens complying with the NF EN 169 standard hide objects (AFNOR, 2003). It has been demonstrated that igniting a welding arc poses a significant risk to unprotected eyes and skin from the emitted UV radiation (Rybczyński et al., 2018), as the natural aversion time must be considered too long to protect the eyes: the 30 J/m² exposure limit (ICNIRP, 2004) can be reached in 0.1 to 0.5 s, depending on the welding method.

Nowadays, helmets are commonly equipped with LCD screens that darken as soon as the welding process begins. As shown in Figure 1b, helmet screens limit the operator's perception to the arc itself, hiding the environment and the parts to be welded.

A third case of bright sources is lasers, especially during experiments (see example in Fig. 1c). Depending on the laser wavelength, effects range from keratoconjunctivitis or cataract to partial or total blindness. Lasers are classified according to EN 60825-1 (AFNOR, 2008), where only class 1 lasers are safe. When working with higher-rated lasers, wearing coloured protective glasses prevents eye hazards but inconveniently impairs seeing the laser spot at the same time. In addition, some laser wavelengths are not visible, requiring viewing cards to detect the spots.

The system described below aims to solve these issues when working on such intense light sources, while keeping the work situation in sight. The method described fits LEDs in operation and welding arcs, and work with lasers can easily benefit from an adaptation of it.

Fig. 1. Examples of occupational eye exposure to bright light sources. (a) Operators directly watch LEDs in operation while checking the quality of medical (scialytic) lamps. (b) Viewing through the LCD screen of a welding helmet while welding (here a 140 A TIG) hides the surroundings and the welded parts. (c) Working with lasers exposes the operator to dangerous reflected light beams.

Fig. 2. The proposed eye protective equipment when working on LEDs in operation (a) and welding (b). Note that the welder's skin was exposed briefly enough – about 3 s – to prevent sunburn.

2 Material and methods

The proposed protection when working on bright sources consists of a virtual reality (VR) helmet holding a smartphone that runs a dedicated application. As illustrated in Figure 2, the helmet's front cap is removed so that the smartphone camera can observe the working task.

2.1 Application scheme

The operator's eyes do not directly see the light sources. Instead, an Android application captures pictures from the camera, then processes and displays them on the smartphone screen, as illustrated in Figure 3.

Firstly, successive pairs of pictures are captured continuously. As illustrated in Figure 4, each pair comprises an under-exposed and an over-exposed picture. Typically, the low exposure requires less than 1/8000th of a second to capture the shape of the LED in operation or of the welding arc, while the high exposure requires 1/250th to 1/100th of a second to capture the low-luminance surroundings and the background area.
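As an illustration of this bracketed capture, a minimal Android Camera2 sketch is given below; the function name and the fixed ISO value are assumptions, and the exposure times are the ones quoted above.

```kotlin
import android.hardware.camera2.CameraDevice
import android.hardware.camera2.CaptureRequest
import android.view.Surface

// Hypothetical helper: builds one under-exposed and one over-exposed request
// with manual exposure, to be captured repeatedly as a burst.
fun buildBracketedBurst(device: CameraDevice, target: Surface): List<CaptureRequest> {
    fun request(exposureNs: Long): CaptureRequest =
        device.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW).apply {
            addTarget(target)
            // Disable auto-exposure so SENSOR_EXPOSURE_TIME is honoured.
            set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_OFF)
            set(CaptureRequest.SENSOR_EXPOSURE_TIME, exposureNs)
            set(CaptureRequest.SENSOR_SENSITIVITY, 100) // low ISO (assumed)
        }.build()

    return listOf(
        request(125_000L),    // ~1/8000 s: shape of the LED or welding arc
        request(10_000_000L)  // ~1/100 s: low-luminance surroundings
    )
}

// Usage on an open capture session:
// session.setRepeatingBurst(buildBracketedBurst(device, previewSurface), null, null)
```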

These pictures go directly from the camera to the smartphone's graphics card, so there is no main-memory access bottleneck and no CPU processing. This is necessary to achieve real-time rendering of the workplace (about 30 frames per second).
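One plausible way to obtain this zero-copy path on Android is sketched below, assuming the application binds the camera output to a SurfaceTexture backed by an external OpenGL texture (the helper name is hypothetical):

```kotlin
import android.graphics.SurfaceTexture
import android.opengl.GLES11Ext
import android.opengl.GLES20

// Creates a GL_TEXTURE_EXTERNAL_OES texture and a SurfaceTexture wrapping it.
// Camera frames delivered to this SurfaceTexture stay in video memory and
// never pass through main memory or the CPU.
fun createCameraTexture(): Pair<Int, SurfaceTexture> {
    val ids = IntArray(1)
    GLES20.glGenTextures(1, ids, 0)
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, ids[0])
    GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
        GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR)
    GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
        GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR)
    return ids[0] to SurfaceTexture(ids[0])
}

// On the GL thread, each rendered frame calls surfaceTexture.updateTexImage()
// to latch the newest camera frame into the texture.
```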

Secondly, these pictures are stored in video memory as OpenGL (Khronos Group, 2011) textures, which act as digital wallpapers applied to the surfaces to be displayed. Via General-Purpose computing on Graphics Processing Units (GPGPU) (Wikipedia, 2020), each pair of textures feeds a tone-mapping algorithm that composes a high-dynamic-range image, mixing low- and high-luminance elements. The tone-mapping algorithm used in this paper is described in Section 2.2. The composed picture is also stored in video memory, again avoiding any access bottleneck.

Thirdly, the composed picture is adapted and displayed in front of each of the operator's eyes with a luminance below 1000 cd.m−2. As stated by the European health authorities (SCHEER, 2018), such a luminance viewed through a virtual reality device is safe.

The display stage takes advantage of the Google VR library (Google, 2020) to split the smartphone screen in two, one half for each eye. As the camera is monocular, this Augmented and Virtual Reality library applies geometrical transformations so that each half of the screen corresponds to the point of view of one eye. The operator thus receives a pseudo-3D display through the helmet lenses.

Fig. 3. Application scheme of image capture, processing and display using the VR helmet.

Fig. 4. (a) Under-exposed picture of the LED in operation. (b) Corresponding over-exposed picture capturing the low-luminance elements in the field of view.

2.2 Image processing

Rendering the captured field of view too slowly would be uncomfortable and dangerous for the operator. Therefore, it is necessary to achieve real-time rendering (at least 30 frames per second).

To achieve this frame rate, the pairs of captured images (under- and over-exposed) are stored directly in video memory, so that the smartphone's graphics card can access them as OpenGL textures (Wright, 2011). These textures are 2D arrays of colour values, mapped onto 3D geometry in order to render it realistically.

The graphics card executes a program called a shader to combine the two captured textures into an output texture of the same size. This tone-mapping shader runs once per output texture element, and since hundreds of these elements are computed in parallel, the application achieves real-time rendering. The resulting texture is then displayed on the smartphone screen using the Google VR library (Google, 2020).

In the tone-mapping shader, the following should be noted:

  • The tone mapping weighs the over- and under-exposed pictures before mixing them. These weights rely on the texture elements' green channel, as its spectral sensitivity is the closest to the eye's photopic sensitivity;

  • C_U is the green component of the under-exposed texture element T_U;

  • C_O is the green component of the over-exposed texture element T_O;

  • n_U is a constant weighting parameter for the under-exposed texture elements;

  • n_O is a constant weighting parameter for the over-exposed texture elements.

The tone-mapping result is a normalized sum of the under-exposed and over-exposed texture elements, weighted respectively by W_U and W_O as defined in equations (1) and (2) and illustrated in Figure 5: (1) (2)

Figure 6 shows the values of W_U and W_O for the example of Figure 4: for ease of reading, n_U = 2 and n_O = 6 here, although n_U = 0.05 and n_O = 3 avoid dark halos around the brightest areas in the picture.

Lastly, T_U and T_O are combined into the displayed colour texture element T, as in equation (3):

(3) $T = \dfrac{W_U \, T_U + W_O \, T_O}{W_U + W_O}$
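As a minimal sketch of such a shader, embedded as a Kotlin string in the Android application, the fragment program below applies the normalized weighted sum of equation (3). The power-law weight expressions are illustrative assumptions only; the exact weight functions are those of equations (1) and (2).

```kotlin
// GLSL ES fragment shader: one invocation per output texture element.
const val TONE_MAPPING_FRAGMENT_SHADER = """
    precision mediump float;
    uniform sampler2D uTexUnder;  // under-exposed frame T_U
    uniform sampler2D uTexOver;   // over-exposed frame T_O
    uniform float uNU;            // n_U (e.g. 0.05)
    uniform float uNO;            // n_O (e.g. 3.0)
    varying vec2 vTexCoord;

    void main() {
        vec4 tU = texture2D(uTexUnder, vTexCoord);
        vec4 tO = texture2D(uTexOver,  vTexCoord);
        // Illustrative weights from the green channels C_U and C_O (assumed):
        // favour the under-exposed pixel where the over-exposed one saturates.
        float wU = pow(tO.g, uNO);
        float wO = pow(1.0 - tU.g, uNU);
        // Equation (3): normalized weighted sum of the two texture elements.
        gl_FragColor = (wU * tU + wO * tO) / max(wU + wO, 0.0001);
    }
"""
```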

Fig. 5. Weights of colour components in under- and over-exposed pictures, following equations (1) and (2), before gamma correction. In green, the level after mixing the pictures, following equation (3).

Fig. 6. From (a) to (b), grey levels represent the weights W_U and W_O in the under- and over-exposed pictures, respectively, of the LED lamp in Figure 4, with n_U = 2 and n_O = 6. Note that the perspective differs from Figure 4, as it is adapted to binocular viewing through the VR lenses.

3 Results

The VR system delivers 30 frames per second on a mid-range smartphone, making it comfortable to operate on LEDs and to weld. Exposure time was set at 1/32 000th s for under-exposed pictures and 1/100th s for over-exposed pictures, to correspond to in situ lighting conditions. Gamma correction was set to 2.6, and the tone-mapping parameters to n_U = 0.05 and n_O = 3.

As illustrated in Figure 7, the remarkably simple tone-mapping technique presented in equations (1)–(3) is sufficient to operate on LEDs and to weld while perceiving the surroundings.

Fig. 7. Binocular rendering of the situations in Figure 2, with perspective adapted to the VR lenses; n_U = 0.05 and n_O = 3. (a) TIG welding; (b) dedicated LED bench.

3.1 Are VR goggles really safe to use?

Notwithstanding the European health authorities' statement (SCHEER, 2018), we checked the VR goggles' safety against the blue-light hazard (NF EN 62471, 2008) and chronobiological disturbance, as conveniently summarized by Prayag et al. (2019a).

We turned the VR helmet, with the smartphone (a Xiaomi Redmi 7) screen intensity set at its maximum, toward and close to the bottom-middle set of LED chips seen in Figure 3.

In place of a user's eye inside the helmet, we measured spectral irradiance using a stray-light-corrected (Barlier-Salsi, 2014) CCD spectroradiometer, and a luminance map using a video luminance meter with a macro lens (see Fig. 8).

Melatonin suppression is best predicted using eye melanopic illuminance (Prayag et al., 2019b). Using Lucas et al.'s tool (Lucas, n.d.), eye irradiance was converted into 29 melanopic lux. Prayag et al. modelled the effect of melanopic illuminance on melatonin suppression (Prayag et al., 2019b): this model predicts around 15% melatonin suppression, to be weighed against the moment and duration of exposure as well as the user's light history. A melatonin phase shift may also occur at this melanopic illuminance level, in line with the general advice against using computer and smartphone screens in the evening.

The luminance map in Figure 8 reveals luminances below 150 cd/m². Eye irradiance was converted into a blue-light-weighted radiance L_B = 0.15 W/m²/sr for a 150 cd/m² source. This is well below the lowest exposure threshold (100 W/m²/sr for 10 000 s): the system is safe for the retina, even with brighter smartphone screens.

Fig. 8. Luminance map of a set of LED chips, as perceived inside the VR helmet.

4 Discussion

This work is a simple proof that Augmented Reality is useful in protecting operators from the glare and hazards of bright sources. It shows that it is possible to clearly observe both the LEDs or welding arcs, which are very luminous, and the surrounding area, whose luminance is several orders of magnitude lower. The system is easily available and low-cost.

For the moment, the proposed prototype is basic and requires improvement for use in occupational conditions, as discussed below.

4.1 Tone-mapping techniques

Tone mapping as presented in equations (1)–(3) is simplistic, with several flaws such as non-conservation of colours (e.g., colours tend to lose saturation, as illustrated in Fig. 7). There is an obvious need to preserve colours and avoid visual artefacts. It should be noted that any tone-mapping algorithm would still have to work on only two images to preserve the frame rate.

In addition, tones are mapped in one pass. A preliminary treatment of the over-exposed pictures could further improve contrast by accounting for each pixel's neighbourhood. This would help operators perceive shapes, marks and writing.

4.2 Ergonomics

Quality ergonomics is necessary to protect against bodily injuries (for example, falling from a stepladder or dropping something), as well as to protect the operator's eyes.

Firstly, the rendering perspective needs to match human vision: the perceived scale of objects must be as close as possible to reality.

Secondly, viewing Augmented or Virtual Reality can be uncomfortable if the binocular rendering does not fit the perspective of the operator's eyes. The application needs to be adjusted to the operator's own eye parameters (distance from nose to eye pupil, pupil height, etc.) in order to be comfortable and avoid headaches or other disorders.

Thirdly, the application must not lose focus and hunt for it, as this would impair the operator's sight. As pairs of images are captured continuously, it is possible to keep track of the successive focusing distances, just like a camera in continuous autofocus mode. In addition, it would be better if the application could predict the next focus point, for instance by exploiting the motion sensors usually available on smartphones to anticipate where the operator is going to look.
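A minimal sketch of this focus tracking is shown below; the FocusTracker class and its median-based prediction are hypothetical, and Camera2's manual focus control is one way to apply the predicted distance.

```kotlin
// Keeps recent focus distances (in diopters) and predicts the next one, so
// the application can pre-set the lens instead of re-scanning for focus.
class FocusTracker(private val capacity: Int = 8) {
    private val recent = ArrayDeque<Float>()

    fun record(diopters: Float) {
        if (recent.size == capacity) recent.removeFirst()
        recent.addLast(diopters)
    }

    // Median of recent focus distances, used as the predicted next focus.
    fun predicted(): Float? =
        recent.sorted().let { if (it.isEmpty()) null else it[it.size / 2] }
}

// Applying the prediction with Camera2 manual focus:
// builder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_OFF)
// builder.set(CaptureRequest.LENS_FOCUS_DISTANCE, tracker.predicted() ?: 0f)
```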

Lastly, the application cannot be safe unless it prevents smartphone activities such as notifications, screensavers and incoming calls from appearing in the field of view.

4.3 Adaptive exposure for over-exposed pictures

We used fixed exposure times for both under- and over-exposed pictures. Adaptive exposure is necessary, as luminance in the field of view may vary strongly when the operator moves his/her head.

In under-exposed pictures, light sources like LEDs in operation and welding arcs will always saturate the camera. Therefore, under-exposed pictures will simply use the lowest exposure time permitted by the smartphone.

On the other hand, over-exposed pictures show the least luminous parts in the field of view. Therefore, the application needs to compute the necessary exposure time without being fooled by the most luminous parts in the field of view (LEDs and other artificial or natural light sources and their reflections on surfaces). This computation could, for instance, use the under-exposed picture in the previously captured pair, to ignore the most luminous elements in sight.
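A sketch of this computation is given below, assuming the green channels of the previous pair are available as arrays in [0, 1]; the function name, the source-detection threshold and the exposure bounds are assumptions.

```kotlin
// Estimates the next over-exposure time from the previous pair, ignoring
// pixels that are already bright in the under-exposed frame (light sources
// and their reflections).
fun nextOverExposureNs(
    underGreen: FloatArray,  // green channel of the under-exposed frame
    overGreen: FloatArray,   // green channel of the over-exposed frame
    currentNs: Long,
    targetLevel: Float = 0.5f
): Long {
    var sum = 0.0
    var count = 0
    for (i in overGreen.indices) {
        if (underGreen[i] < 0.05f) {  // not a light source: keep this pixel
            sum += overGreen[i]
            count++
        }
    }
    if (count == 0) return currentNs
    val mean = (sum / count).coerceAtLeast(1e-3)
    // Scale the exposure time toward the target mean level, clamped to a
    // range compatible with real-time capture.
    return (currentNs * (targetLevel / mean)).toLong().coerceIn(1_000_000L, 30_000_000L)
}
```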

4.4 Reducing contrasts with flash

Lighting the scene with the smartphone's camera LED reduces the contrast between the most and least luminous parts of the operator's field of view. It is likely to reduce the over-exposure time and thus improve the quality of the over-exposed pictures: a shorter exposure time reduces motion blur and sensor noise. In addition, it may improve the frame rate, as images become quicker to capture.

4.5 Perceiving NIR lasers (and diodes)

Eyes perceive light in the 380 to 780 nm range, while CCD sensors perceive light up to at least 950 nm. Working with NIR lasers normally requires laser-viewing cards, which is not always convenient. Figure 9 shows how a camera perceives an 808 nm laser spot on a metal plate after reflection off a mirror.

As a general scheme, since users know their laser's wavelength, a dedicated application could render such lasers efficiently.

Fig. 9. An 808 nm laser beam reflected by a mirror toward a metal plate, as seen by human eyes (a) and by a CCD camera (b). In (b), the laser impact on the metal plate is visible after reflection off the mirror. A little smoke was added to show the camera's perception of the beam's trajectory toward the mirror (no reflected beam appeared).

4.6 LED flicker

Even when it is invisible, LED flicker occurs because drivers are unable to deliver a continuous current; this is especially the case when the lighting system is dimmed. Flickering LEDs can make people uncomfortable and even endanger them (IEA 4E SSL, 2011): the lower the flicker frequency, the lower the flicker magnitude required to risk adverse effects (Miller and Lehman, 2015).

We found that filming LEDs can result in displayed pictures that blink at low frequency, which can be uncomfortable and even unbearable for operators. A temporal filter should therefore be applied to successive renderings to avoid this situation.

On the positive side, this suggests that smartphones could be investigated as a cheap way to characterize LED flicker. As it is possible to choose both the interval between two camera shots and the exposure time, it seems possible to carry out non-uniform time sampling of captured images and compute flicker indices. Such a camera-based tool would be far more widespread and lower-cost than current flicker meters.
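As a sketch of such an analysis, the two standard flicker metrics can be computed from time-sampled luminance values; the functions below assume uniform sampling, so sums stand in for the time integrals of the usual definitions.

```kotlin
// Percent flicker: modulation depth of the luminance waveform.
fun percentFlicker(samples: DoubleArray): Double {
    val max = samples.maxOrNull() ?: return 0.0
    val min = samples.minOrNull() ?: return 0.0
    return 100.0 * (max - min) / (max + min)
}

// Flicker index: area above the average level divided by the total area.
fun flickerIndex(samples: DoubleArray): Double {
    if (samples.isEmpty()) return 0.0
    val avg = samples.average()
    val areaAboveAverage = samples.sumOf { maxOf(it - avg, 0.0) }
    return areaAboveAverage / samples.sum()
}
```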

Lastly, instead of counterbalancing high contrasts with HDR rendering, a smartphone application could be used to evaluate contrasts and related risks.

5 Conclusion

Working on bright sources like LEDs in operation, welding arcs and lasers can be harmful for the eyes as well as uncomfortable for the operator, and the glare can cause accidents. An Augmented Reality system composed of a virtual reality helmet and a smartphone is potentially useful as protective equipment. The operator looks at the light source through a tone-mapped rendering of his/her filmed field of view.

This low-cost, readily available prototype protects the eyes while allowing the operator to work on LEDs, weld, handle lasers, perceive the surroundings and even see NIR laser spots. To be usable, it will be necessary to improve the tone-mapping rendering, ensure high-quality ergonomics and image capture, and avoid flicker effects. To prevent bodily injuries, the system must also keep smartphone notifications out of the user's sight.

Lastly, this study is a first step towards risk-assessment smartphone applications, such as LED flicker characterization and visual comfort assessment.

Conflict of interest

The authors declare that they have no conflict of interest.

Funding

This research did not receive any specific funding.

Statement of informed consent

This article does not contain any studies involving human subjects.

Ethical approval

Ethical approval was not required.

Authors' contributions

J.-M. Deniel: Conceptualization, methodology, supervision and writing original draft. S. Thommet: Conception, design and prototyping.

References

Cite this article as: Deniel J-M, Thommet S. 2022. Occupational eye protection using Augmented Reality: a proof of concept. Radioprotection, https://doi.org/10.1051/radiopro/2022005.
