
Health effects of artificial light

4. What effects on health have been observed?

  • 4.1 Thermal and chemical effects
  • 4.2 Effects on the eyes
  • 4.3 Effects on sleep, mood and the circadian rhythm

4.1 Thermal and chemical effects

The SCENIHR opinion states:

3.5. Adverse health effects in the general population

Besides the short-term local effects presented in section 3.4.3, optical radiation (or lack of it) can cause systemic or long-term adverse health effects.

3.5.1. Photothermal effects

Acute thermal damage (burns) to the skin is usually prevented or minimized by aversion responses. Such burns may be evoked by extremely intense sources of optical radiation, such as lasers or high-power flash lamps. Under extreme conditions (environmental and/or physical), high levels of (solar) visible light and/or IR (as in IR saunas) may heat the skin and thus contribute to a breakdown of the body’s thermoregulation through the skin, resulting in heatstroke. If not treated properly, such systemic hyperthermia may be fatal. However, such extreme heat loads are not to be expected from artificial optical sources intended for lighting purposes.

Regular localized heating of the skin (stoves under the feet or a hot water bottle on the stomach), not necessarily causing burns, can cause a skin condition dubbed “erythema ab igne”, a reddish to brown colouration (Edwards et al. 1999), which has anecdotally and in clinical case reports been associated with the development of skin cancer, “turf fire cancer” or “cangri cancer” (ICNIRP 2006a). In mouse experiments, IR and higher room temperatures were found to enhance skin tumour formation from chronic UV irradiation (van der Leun and de Gruijl 2002); epidemiologic data on skin cancer incidence in different geographic locations indicate that this may also be true in humans (van der Leun et al. 2008).

3.5.2. Photochemical effects

3.5.2.1. Vitamin D status

UV deprivation can lead to adverse health effects. As described in 3.4.3.2 A2, vitamin D3 is produced naturally in the skin from exposure to UVB radiation in sunlight. As the nutrient vitamin D is contained in inadequate quantities in the modern Western diet, vitamin D status shows considerable seasonal variation in temperate climates because of ineffective solar UV exposure in wintertime. Vitamin D needs to be metabolised in order to become active as a hormone: it is hydroxylated to 25-hydroxyvitamin D3 and 1,25-dihydroxyvitamin D3 in the liver and kidney, respectively. The latter metabolite binds to the “vitamin D receptor” in gut epithelial and bone cells, which is known to regulate calcium absorption and mobilisation in the gut and bones. Over the last few decades it has, however, become clear that 1,25-dihydroxyvitamin D can be formed outside the kidneys (extrarenally) in various tissues and immune cells, and the vitamin D receptor is present in a plethora of different cell types, triggering various responses. Thus, vitamin D may potentially have a broad impact on health, but the evidence for this is generally inconclusive. Although many observational epidemiological studies have reported decreases in the risk of various diseases and conditions (e.g. schizophrenia, autism, multiple sclerosis, diabetes, respiratory tract infections, influenza and certain types of cancer) associated with elevated levels of vitamin D or increased sun exposure, the available data in humans are mostly either too sparse or inconsistent, and generally inadequate to assert any causal relation (Norval et al. 2011, Zhang and Naughton 2010). The evidence that vitamin D deficiency contributes to colorectal cancer is mounting and was found to be “persuasive” but “limited” by the International Agency for Research on Cancer (IARC 2008), although some experts have recently qualified it as “sufficient” (Dutch Cancer Society 2010).

Furthermore, UV deprivation in wintertime causes a loss of skin photoadaptation, which in some individuals suffering from photosensitivity disorders, such as polymorphic light eruption, may predispose to provocation of their condition upon renewed UV exposure in springtime or at higher levels of exposure during summer holidays (see section 3.6.1.1).

3.5.2.2. Assessment of effects on healthy skin

Pain sensation from thermal effects occurs at lower skin temperatures than those at which burns occur. The thresholds for “effective” irradiances of “thermal radiation” are given in DIN 33403 (DIN 2001), and amount to 1 kW/m2 for exposure times over 5 minutes (where “effective” refers to the difference between the incoming and emitted radiant flux density at the skin surface; the latter is about 460 W/m2 at 30°C skin surface temperature and emissivity of 0.97, while about 410 W/m2 would be incoming in a room with 18°C wall temperature and emissivity of 1). With shorter exposure times (t), this threshold goes up (approximately in direct proportion to t^(-1/2)); the International Commission on Non-Ionizing Radiation Protection (ICNIRP) has formulated limits for exposure times up to 10 s (irradiance <20,000 t^(-3/4) W/m2; t in s), which could be extended to longer exposure times to give very conservative limits (ICNIRP 2006b). No limits were given by ICNIRP for these longer exposure times because effects strongly depend on thermal environmental conditions. Natural avoidance behaviour will restrict exposure times and prevent thermal injury. In practice, thermal pain sensation from an illuminator is quite exceptional and would only occur in very close proximity to very high intensity sources.
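The two thresholds quoted above can be checked numerically. This is an illustrative sketch of the ICNIRP short-exposure formula and the DIN 33403 long-exposure figure, not part of the SCENIHR opinion:

```python
# ICNIRP short-term thermal-pain limit for exposure times t <= 10 s:
#   E_limit = 20,000 * t^(-3/4)  [W/m^2, t in seconds]
# For exposures over 5 minutes, DIN 33403 gives an "effective" threshold
# of 1 kW/m^2 (a fixed figure, not derived from the ICNIRP formula).

DIN_LONG_EXPOSURE_W_M2 = 1000.0  # 1 kW/m^2, exposure > 5 min

def icnirp_thermal_limit(t_s: float) -> float:
    """Irradiance limit in W/m^2 for an exposure of t_s seconds (t_s <= 10)."""
    if t_s <= 0:
        raise ValueError("exposure time must be positive")
    return 20000.0 * t_s ** -0.75

# Even at the 10 s end of the ICNIRP range the limit (~3557 W/m^2) is far
# above the 1 kW/m^2 DIN threshold for prolonged exposure, consistent with
# extending the formula giving "very conservative" long-exposure limits.
print(round(icnirp_thermal_limit(1)))   # 20000
print(round(icnirp_thermal_limit(10)))  # 3557
```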

Other thermal damage, such as erythema ab igne, would require protracted substantial heating of the skin, which is not likely to occur from lighting sources. There are indications that elevated ambient temperatures and/or IR radiation may increase skin cancer formation from sun (UV) exposure (van der Leun et al. 2002, van der Leun et al. 2008) and that IRA may enhance skin ageing, but the evidence in humans is very weak or lacking, and the effects cannot be quantified in any reliable way. Overexposure to UVB and UVA radiation causes well-known sunburn reactions. The first feature is skin reddening (“erythema”) after a couple of hours, at higher doses severe discomfort and, after a couple of days, skin peeling. UV radiation at these dose levels has a clear toxic impact which evokes an inflammatory reaction causing dilation of superficial capillary blood vessels (increased redness and skin temperature), leakage of serum through vessel walls (causing oedema) and trafficking of immune cells from the blood vessels into the skin and from the skin into draining lymph vessels. The effective UV dose for sunburn is generally assessed by an action spectrum standardized by the Commission Internationale de l’Éclairage (the erythemal action spectrum; CIE 103/3 Reference Action Spectra for Ultraviolet Induced Erythema and Pigmentation of Different Human Skin Types). The skin can adapt to gradually increasing levels of UV exposure (occurring from spring to summer). This decrease in sensitivity is accompanied by a tanning reaction, but people who do not tan are nevertheless capable of acclimation to increasing UV levels.

The threshold UV dose for a minimal reddening of the skin occurring some 4 to 8 hours after exposure (the minimal erythemal dose, MED) for a fair-skinned Caucasian (skin phototype II) is typically 200-250 J/m2 when the exposure is spectrally weighted according to the CIE erythemal action spectrum, which equals 2-2.5 SED (SED stands for standard erythemal dose = 100 J/m2 of CIE-erythemally weighted UV). The solar midday exposure rate is given as the UV index in weather reports; 1 hour of exposure at a UV index of 7, as in summer in Northern Europe, amounts to 6.3 SED, and at a UV index of 10, as in the Mediterranean region, to 9 SED, i.e. about 2.5 to 4.5 times the typical MED of a previously unexposed, unadapted fair-skinned Caucasian. The actual MED of unexposed fair skin varies, with 95% within a factor of 2.5 below and above the median of about 2.2 SED (based on a median of 34 mJ/cm2 and 95% within 14-84 mJ/cm2 at 300 nm; Diffey 1991). MEDs also vary over the body. In acclimation of the skin to UV radiation the MED is raised several-fold, preventing sunburns from occurring as the UV index increases from spring to summer. Skin peeling occurs after severe overexposure, >4 MEDs.
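The figures above follow from the standard definition of the UV index: one index unit corresponds to 25 mW/m2 of erythemally weighted irradiance, and 1 SED = 100 J/m2. A short sketch of that arithmetic (the 2.5 SED and 2.0 SED MED values in the last lines are the typical range quoted in the text, used here for illustration):

```python
# 1 UV-index unit = 25 mW/m^2 erythemally weighted irradiance.
# 1 SED = 100 J/m^2, so one hour of exposure delivers UVI * 0.9 SED.

SED_J_PER_M2 = 100.0
W_PER_M2_PER_UVI = 0.025

def seds_per_hour(uv_index: float) -> float:
    """Erythemal dose in SED accumulated in one hour at a given UV index."""
    return uv_index * W_PER_M2_PER_UVI * 3600 / SED_J_PER_M2

print(round(seds_per_hour(7), 1))   # 6.3 SED (Northern Europe summer)
print(round(seds_per_hour(10), 1))  # 9.0 SED (Mediterranean midday)

# Expressed in multiples of a typical unadapted MED of 2-2.5 SED:
print(round(seds_per_hour(7) / 2.5, 1),
      round(seds_per_hour(10) / 2.0, 1))  # roughly 2.5 to 4.5 MEDs
```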

As 1 MED of whole-body irradiation has been estimated to produce roughly up to 20,000 IU of vitamin D (Holick 2004), and because most fair-skinned people maintain adequate levels of vitamin D over summer (>50 nmol/l 25-hydroxyvitamin D) (Frost et al. 2010, Hyppönen and Power 2007, Webb et al. 2010), it has been asserted that regular brief exposures (15-30 minutes) in summer clothing to midday summer sun are adequate for vitamin D production, which is supported by recent experimental evidence (Rhodes et al. 2010b). With low UV indices it is not possible to produce adequate levels of vitamin D in wintertime, and the majority of people in Northern and Northwestern Europe do not maintain an adequate vitamin D status (Hyppönen and Power 2007, Webb et al. 2010). It was shown that 3.9 SED/week of simulated summer sun administered to a group of volunteers in summer clothing was sufficient to maintain an adequate vitamin D status in wintertime (Rhodes et al. 2010b). UV exposure from indoor lighting will commonly fall well below this level; even fluorescent lamps at the top end of CEI/IEC Risk Group 0 for UV emission will not come close in an office setting. Hence, lamps for indoor lighting would most likely need to overstep the current UV emission norm to be effective in vitamin D production. However, a shortage of vitamin D from sunlight in winter can easily be compensated for by oral vitamin D supplements, or by a diet rich in fatty fish. Overexposure to UV radiation, but to a lesser degree also sub-acute doses (<1 MED), can suppress adaptive cellular immunity (i.e. acquired immunity against a pathogenic agent or substance, effected by direct cell-to-cell contact), which in animal experiments was proven to contribute to skin cancer formation and aggravate bacterial and viral infections (Norval 2006b). Solar overexposure is thus known to cause cold sores in humans, a flare-up of an infection with Herpes simplex viruses (Norval 2006a, Sayre et al. 2007). 
On the other hand, UV irradiation is known to boost innate immunity (inborn defences against infectious agents; UV exposure increases levels of anti-bacterial proteins in the skin) (Gläser et al. 2009). The immunosuppression apparently serves to prevent adverse immune (allergic) reactions to the UV-exposed skin (putatively against photochemically altered molecules) while the boosted innate immunity increases acute defences against exogenous infectious agents.

Episodes of severe sunburn have been found to be associated with increased risk of skin cancer, specifically of malignant melanomas (Gandini et al. 2005), but also of basal cell carcinomas (Kütting and Drexler 2010). UV exposure in childhood is linked to increased risk of melanoma later in life (Armstrong and Kricker 2001). The action spectrum for the UV induction of squamous cell carcinomas has been determined in (hairless) mice and resembles that of sunburn (de Gruijl and van der Leun 1994). Hence, UV indices given in weather reports, based on sunburn effectiveness, also reflect the carcinogenic effectiveness of sun exposure. The action spectrum for photo-ageing is not well defined and could range from UV to IRA. In contrast to sunburn, no threshold dose is known for photo-ageing, nor for the induction of skin cancers.

Cancer is the result of a probabilistic process in which increasing or more excessive UV doses increase the chance of occurrence. This complicates defining an “acceptable UV dose” because it requires a choice regarding what is an “acceptable risk”. Moreover, the data on skin cancer and related sun (UV) exposures are generally not detailed enough for adequate risk assessments of personal UV exposure. As the experimentally determined action spectrum for the induction of skin carcinomas roughly resembles the sunburn action spectrum (de Gruijl and van der Leun 1994), relating annual ambient erythemal exposures and typical personal exposures to the actual skin cancer incidence in a population is informative. Skin cancer incidences are quite substantial in Northwestern Europe even though ambient UV levels are considered low. For Denmark the following was recently reported: “Between 1978 and 2007, the age-adjusted incidence of basal cell carcinoma (BCC) increased from 27.1 to 96.6 cases per 100,000 person-years for women and from 34.2 to 91.2 cases for men. The incidence of squamous cell carcinoma (SCC) increased from 4.6 to 12.0 cases per 100,000 person-years for women and from 9.7 to 19.1 cases for men” (Birch-Johansen et al. 2010). These increases are most likely attributable to increases in sun exposure decades earlier. The median annual exposure among 164 Danish volunteers (age 4-67 years) was found to equal 166 SED (95% within 37-551 SED; Thieden et al. 2004). A quarter of lifetime exposure was received before the age of 20 years. A recent Danish study also reported that sunburning was quite common, with 35% of 3,499 people (age 15-59) reported to have been sunburned in the preceding 12 months (Køster et al. 2010). The incidence of melanoma in Denmark in 2008 was 20.2 and 26.6 per 100,000/year for men and women, respectively; the latter was the highest among women in Europe (mortality 4.3 and 2.5 per 100,000/year, respectively). 
Based on Norwegian data and on US skin cancer surveys among non-Hispanic white Caucasians, SCC and BCC incidences increased by 2.3 ± 0.5 and 1.7 ± 0.3%, respectively, per 1% increase in ambient annual erythemal dose (NRPB 2002). Incidences of melanoma are reported to increase by about 0.6 ± 0.4% (Eide and Weinstock 2005, Slaper et al. 1996) per 1% increase in ambient annual erythemal dose. However, although episodes of severe sunburn are linked to an increase in melanoma risk and sunscreen use by adults has been proven to protect against melanoma (Green et al. 2011), whether the erythemal action spectrum is appropriate for melanoma is still debated.
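The dose-response figures quoted above can be used for rough first-order estimates. This is an illustrative sketch of that linear reading (valid for small dose changes only; the uncertainty bands quoted in the text are ignored here):

```python
# NRPB (2002) / Eide & Weinstock-style dose-response figures quoted in the
# text: % increase in incidence per 1% increase in annual erythemal dose.
PCT_PER_PCT = {"SCC": 2.3, "BCC": 1.7, "melanoma": 0.6}

def incidence_increase_pct(dose_increase_pct: float, cancer: str) -> float:
    """First-order (linear) estimate of the % increase in incidence
    for a given % increase in annual erythemal dose."""
    return dose_increase_pct * PCT_PER_PCT[cancer]

# A hypothetical 10% rise in a population's annual erythemal dose:
print(round(incidence_increase_pct(10, "SCC"), 1))       # 23.0 (% more SCC)
print(round(incidence_increase_pct(10, "BCC"), 1))       # 17.0 (% more BCC)
print(round(incidence_increase_pct(10, "melanoma"), 1))  # 6.0
```

For large dose changes a power-law extrapolation is sometimes preferred over this linear form, which is one reason population-scale estimates (such as those in section 3.7) require worst case scenario modelling rather than simple scaling.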

With a large spread, the annual erythemal dose of outdoor workers is about twice that of indoor workers in Northwestern Europe (medians of 6.7 vs 3.1% of ambient dose, excluding holidays, i.e. 300 vs 138 SED/year, 95% interval about 3-fold below and above the medians; Schothorst et al. 1985). Outdoor professions are commonly associated with increased risk of SCC and BCC. Thus, it was found that in German males, outdoor occupations were associated with a relative risk (RR) of 2.9 (95% CI, 2.2-3.9) for BCC and 2.5 (95% CI, 1.4-4.7) for SCC (Radespiel-Tröger et al. 2009). A recent meta-analysis of occupational UV exposure found that 16 out of 18 studies reported an increased risk of SCC in outdoor workers (in 12 of the studies the increase was significant) and an overall odds ratio of 1.8 (95% CI, 1.4-2.2) (Schmitt et al. 2011). This pattern is not entirely consistent (Green et al. 1996, Håkansson et al. 2001), and a recent Danish study even found a lower risk of SCC in outdoor workers, an odds ratio of 0.83 (95% CI, 0.77-0.88) (Kenborg et al. 2010). The steady increase in SCC in the general population over decades, most likely owing to increased sun exposure in leisure time, probably gradually evens out the differences with outdoor workers.

No increase in the risk of malignant melanoma has been observed among outdoor workers (Radespiel-Tröger et al. 2009), except for melanomas of the face, hands (Beral and Robinson 1981) and eye (Håkansson et al. 2001). On the contrary, some studies show a significant reduction in risk (Rivers 2004), whereas office workers showed an increased risk on trunk and limbs (Beral and Robinson 1981); this is plausibly attributable to fewer sunburns in the acclimated skin of outdoor workers, in contrast to more frequent sunburns among indoor workers because of intermittent overexposure of their un-acclimated skin.

An estimate of a 3.9% (1.6-12%) increase in lifelong SCC risk for office workers from exposure to fluorescent daylight lamps was previously calculated (Lytle et al. 1992-1993). The effect is small but not entirely negligible. The study combined measurements of UV spectra of commercially available fluorescent daylight lamps (four different types) on the US market with the above-mentioned percentage increase in SCC per percent increase in ambient UV, and estimates of personal UV exposures from sunlight and unfiltered fluorescent lighting in schools and offices. Filtering through acrylic prismatic diffusers, instead of louvers, in the fixtures was found to reduce the effective UV exposure by more than 100-fold, i.e. virtually eliminating any contribution to the SCC risk. More recently, a survey of indoor lighting sources in the USA revealed UV levels at wavelengths around 300 nm in the order of 10^-5 to 10^-4 W/m2/nm at a distance of 20 cm, comparable with outdoor summer levels of solar radiation, and even higher levels than in solar radiation at shorter wavelengths (Sayre et al. 2004). In the UVA range the levels were generally orders of magnitude lower than those in summer sun. No erythemally effective dosages were given.

Based on a small sample of CFL spectra (kindly provided by Ms. M. Lukovnikova of the Belgian Federal Public Service of Health) we found that the emission of erythemally effective UV from compact fluorescent lamps varied enormously from one type to the next: from undetectable levels (<0.05 mW/m2) at 500 lx to substantial levels (a few mW/m2, i.e. around 100 SED/year with persistent exposure at this level during office hours) at office lighting levels of up to 1,000 lx, which should be compared to the levels in a living room, which would typically be considerably lower. Such UV emissions are easily prevented by producing lamps with UV-absorbing glass envelopes. In actual practice the indoor exposures of most indoor workers are likely to fall well below the occupational limits (limits given in Directive 2006/25/EC for indoor workers). Studies performed by the Dermatology Department of the Bispebjerg Hospital in Copenhagen (threshold stated at 0.1 SED/h) may not have adequately included indoor exposures (being more directed at solar exposures).
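The "a few mW/m2 is around 100 SED/year" order-of-magnitude claim can be reproduced with simple arithmetic. This sketch assumes roughly 2,000 office hours per year, an assumption not stated in the text:

```python
# Annual erythemal dose from a constant lamp irradiance during office hours.
# Assumption: ~2,000 office hours/year (not a figure from the SCENIHR text).

def annual_sed(erythemal_irradiance_mw_m2: float,
               hours_per_year: float = 2000) -> float:
    """Annual erythemal dose in SED (1 SED = 100 J/m^2)."""
    joules_per_m2 = erythemal_irradiance_mw_m2 / 1000 * hours_per_year * 3600
    return joules_per_m2 / 100.0

print(round(annual_sed(1.5)))      # 108 -> "around 100 SED/year" at 1.5 mW/m^2
print(round(annual_sed(0.05), 1))  # 3.6 -> upper bound for "undetectable" lamps
```

Even the 0.05 mW/m2 detection limit would correspond to only a few SED per year under these assumptions, i.e. a small fraction of the ~140 SED/year median personal dose of indoor workers quoted earlier.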

For indoor workers the EU (Directive 2006/25/EC; following the American Conference of Governmental Industrial Hygienists, ACGIH, and ICNIRP) has chosen a daily limit of 30 J/m2 of “actinic UV” to avoid short-term damage (“sunburn”) to skin and eyes. This actinic UV dose is spectrally weighted according to an action spectrum largely based on photokeratitis of the eye. This daily limit is generally exceeded by outdoor workers, and by people who expose themselves excessively during sunny summer holidays (e.g. with some 10-20 SED/day). Thus, this UV limit for indoor workers would ensure that the risk stays well below that of outdoor workers, and only in exceptional cases (e.g. welders) might this limit be reached, i.e. on average the personal risk will be small. However, one should be careful with applying this exposure limit to the indoor UV exposure of the general population: a small increase in personal UV exposure for an entire population may then result in a substantial additional number of skin cancer cases per year. Thus, a life-long increase of 7% in erythemal UV in Northwestern Europe (caused by ozone depletion) was estimated to increase BCC and SCC incidences by 14 and 25%, respectively (Madronich and de Gruijl 1993), which would have amounted to about 2,000 and 750 additional cases of BCC and SCC in the Netherlands in the year 2000. As an illustration, we present worst case scenario studies of population exposures to fluorescent lamps currently on the EU market to assess the potential impact on the incidence of SCCs; see section 3.7.

Conclusions on effects on healthy skin

Thermal effects from visible or IR radiation emitted by lighting sources are unlikely to cause any serious health effects in healthy skin; problems may arise only with excessively intense sources at exposures in close proximity to the source (adherence to DIN 33403 for pain thresholds, or a more conservative extension of the ICNIRP limit to exposure times over 10 s).

Considering the data on UV effects, the sunburn reaction would appear to be the practical key for proper control of UV exposure levels on the skin, both for short- and long-term health effects. Minimizing sunburn reactions is advisable to prevent acute discomfort from inflammatory skin reactions, and to minimize possible adverse effects from immune modulation. In the long term it is likely to lower the risk of the most fatal skin cancer, melanoma. Limiting the daily erythemal dose will further limit the life-long accumulated dose which is likely to lower the risk of the skin carcinomas, SCCs and BCCs.

Although a risk assessment of SCC risk from indoor UV exposure appears feasible, SCENIHR had to resort to simplified worst case scenarios (see section 3.7) as adequate data on personal exposures were not available. The worst case scenarios presented in section 3.7 suggest that lamps that comply with current limits on UV emission (preventing acute/short-term adverse effects) may have a substantial impact on SCC incidences when a population is subjected to extensive and large scale exposure to these lamps.

A comprehensive database of emission spectra (including UV) of lamps on the European market, with routine checks and updates, together with data on actual personal exposures, would be very useful for public health monitoring. Future measurements of actual personal indoor UV exposures may indicate whether more vigilance is warranted and whether current regulation of indoor UV exposures of the general public is appropriate. The current categorization of UV emissions from lighting lamps into risk groups is primarily based on a UV exposure limit for indoor workers. This exposure limit has been translated into an emission limit of 2 mW actinic UV per klm. Lamps below this emission limit are in the first and lowest risk group, which is considered “safe” and exempt from any liability. Although personal risks may be low under these exposure and emission limits, adopting these limits for the general population can, nevertheless, conceivably result in a substantial number of additional cases of skin carcinomas each year (section 3.7). Any acceptable limit on population-wide risk should be translated into UV exposure limits for the general population and corresponding limits on UV emissions from lamps for lighting purposes. This retracing of a risk limit to an emission limit would require reliable data on personal (UV) exposures from lamps and luminaires in actual practice (with known spectral output in UV and VIS, and known UV radiant power over luminous flux ratios [W/lm] and illuminances [lx]). As such detailed data are currently lacking, a UV emission limit can now only be based on worst case scenarios like those presented in section 3.7.
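The link between the 2 mW/klm emission limit and the 30 J/m2 daily exposure limit can be illustrated numerically. The 500 lx office illuminance and 8-hour workday in this sketch are assumptions chosen for illustration, not figures asserted by the text:

```python
# A lamp emitting r watts of actinic UV per lumen produces, at a surface lit
# to E_v lux (lm/m^2), an actinic irradiance of E_v * r W/m^2.

DAILY_ACTINIC_LIMIT_J_M2 = 30.0  # Directive 2006/25/EC daily limit

def daily_actinic_dose(illuminance_lx: float, mw_per_klm: float = 2.0,
                       hours: float = 8.0) -> float:
    """Daily actinic UV dose in J/m^2 at the worst-case emission ratio."""
    watts_per_lumen = mw_per_klm / 1000 / 1000     # mW/klm -> W/lm
    irradiance = illuminance_lx * watts_per_lumen  # W/m^2
    return irradiance * hours * 3600               # J/m^2

# An 8-hour day under 500 lx at exactly 2 mW/klm:
print(round(daily_actinic_dose(500), 1))  # 28.8 J/m^2, just under the limit
```

Under these assumptions the worst-case daily dose lands just below 30 J/m2, which is consistent with the text's statement that the emission limit was derived from the indoor-worker exposure limit.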

With any of these potential health effects from artificial lighting sources, it is always advisable to take sun exposure (however variable it may be) as a reference. Designing light sources to include UV and stimulate vitamin D (such as “Full Spectrum Fluorescent Lighting”, Hughes and Neer 1981, as referred to by McColl and Veitch 2001) and related health effects may introduce unwarranted long-term risks to eyes (cataracts) and skin (carcinomas). In contrast to persons deliberately exposing themselves to sunbeds for cosmetic or presumed health effects, persons staying indoors do not expect to be exposed to UV radiation from the lighting, and lighting lamps should therefore evidently be adequately low in UV output.

Source & ©: SCENIHR, Health effects of artificial light, 19 March 2012,
 3.5.1 Photothermal effects and 3.5.2 Photochemical effects, pp. 38-45.

4.2 Effects on the eyes

The SCENIHR opinion states:

3.5.2.3. Assessment of effects on the healthy eye

There are no studies available on the possible effects of domestic artificial light on the human eye. However, studies have been performed on the hazards of specific artificial lights, mostly ophthalmologic instruments, to human eyes. A considerable number of studies are also available on the effects of artificial light on the retina of laboratory animals (e.g. mice, rats, monkeys, dogs). These studies have been the basis of the understanding of light-induced toxicity to the retina and have defined the wavelengths responsible for photochemical damage to the retina.

A. Light and cornea and lens pathology

UV and IR radiation can cause corneal and lens lesions. However, medical and/or surgical treatments are available for these pathogenic events, and permanent vision decline would result from UV- or IR-induced permanent damage only in exceptional cases. Lesions differ depending on whether they result from acute or chronic exposure. In contrast, light-induced retinal pathology is in the majority of cases non-reversible and not treatable.

i) Corneal and conjunctival lesions induced by UV exposure

The ocular UV environment is a function of both the direct as well as the diffuse UVR. The diffuse component of UVR has a strong effect on the eye since it is incident from all directions. Compared to visible light, UVR is strongly scattered, as the amount of scatter increases greatly with decreasing wavelength. On average 40% of the total global UVB dose is diffuse radiation. This fact and the natural aversion of the eye from direct bright radiation mean that the majority of UVR arriving at the cornea is from diffuse scatter and not from direct sunlight (Parisi et al. 2001, Sliney 1999). For acute UV exposure (normally limited to wavelengths below 315 nm in the UVB-UVC bands) the only effect on the normal eye is photokeratitis (Bergmanson and Sheldon 1997) (see also Figure 6). In severe cases of this condition, an anterior stromal edema can be observed. Rarely, endothelial lesions can occur as a result of UV keratitis (Dolin and Johnson 1994). Lamps used for normal lighting purposes that belong to RG0 or RG1 would not be expected to cause photokeratitis because toxic thresholds will not be reached (0.03-0.06 J/cm2). Chronic exposure to UV and other environmental factors such as sand, dust, wind, and dry conditions induces climatic droplet keratopathy. Its occurrence is more frequent in people with previous solar keratitis and the process progresses as the individuals continue their light exposure. Climatic droplet keratopathy is a degenerative process characterized by golden-brown translucent material found in the anterior corneal stroma, Bowman’s layer, and subepithelium. Initially, the deposits are found near the limbus of the cornea within the interpalpebral zone. They may progress as large nodules up to the central cornea resulting in decreased vision. Deposits may also infiltrate the epithelium and the conjunctiva and become painful (Cullen 2002). 
UV exposure can also induce conjunctival lesions such as pingueculae and pterygium. The former are elevated masses of conjunctival tissue, almost always occurring within the interpalpebral zone at the 3 and/or 9 o’clock limbal area. The mass consists of basophilic subepithelial tissue. The lesions are usually bilateral but tend to be more frequent nasally due to increased UV radiation exposure from reflection off the nose. Occasionally, the pingueculae may become inflamed and cause ocular discomfort. A pterygium is typically triangular in shape with the apex extending onto the cornea. The tissue is fibrous and tends to be highly vascularized. Bowman’s layer below the tissue is destroyed as it crosses the cornea. There is strong evidence that pterygia are caused by UV. Pterygia are found with more frequency nasally. Pterygium can cause decreased vision as it progresses on the visual axis. It also causes inflammation and discomfort and astigmatism. Outdoor work is a recognized factor for pterygium development (Luthra et al. 2001, Shiroma et al. 2009). These studies showed that pterygium was almost twice as frequent among persons who worked outdoors, but was only one fifth as likely among those who always used sunglasses outdoors. The Blue Mountains Eye Study examined 3,654 residents aged 49+ years during 1992 to 1994 and then re-examined 2,335 (75.1% of survivors) after 5 years to assess the relationship between baseline pterygium and pingueculae and the 5-year incidence of age-related maculopathy (ARM) (Pham et al. 2005). The study found that pterygium was associated with a 2- to 3-fold increased risk of incident late and early ARM. There are, however, no studies available that have specifically investigated the effect of indoor lighting on these conditions. Furthermore, the available scientific literature has focused on acute effects of UV and not evaluated effects of chronic exposures.

ii) Cataracts

A cataract is defined as a decreased transparency of the ocular crystalline lens or its capsule. Cataracts are divided into three subtypes: subcapsular cataract occurs just behind the anterior capsule of the lens or in front of the posterior capsule, cortical cataract affects the cortex of the lens, and nuclear cataract is an opacification of the lens nucleus that in addition causes refractive myopic changes. These subtypes can occur concurrently in any combination. Cortical cataracts have been associated with chronic, but not acute, UV exposure (Taylor et al. 1988). Indeed, in a case-control study within the Nambour (Australia) Trial of Skin Cancer Prevention conducted between 1992 and 1996, 195 cases with a nuclear opacity of grade 2.0 or greater were compared with 159 controls (Neale et al. 2003). Structured questionnaires were used to ascertain lifetime sun exposure history, eyeglasses and sunglasses use, and potentially confounding variables such as education and smoking. There was a strong positive association of occupational sun exposure between the ages of 20 and 29 years with nuclear cataract (odds ratio = 5.9; 95% confidence interval = 2.1-17.1). Exposure later in life resulted in weaker associations. Wearing sunglasses, particularly during these early years, afforded some protective effect. When the pupil is dilated (i.e. when wearing sunglasses), the incident sunlight reaches the germinal epithelial lens cells. Most cortical opacities are located in the lower nasal quadrant, most likely owing to the convergence of sideways incident solar rays (the “Coroneo effect”, Coroneo et al. 1991).

iii) Conclusions on light and cornea and lens pathology

The only effect of acute exposure to UV, particularly UVB and UVC below 300 nm, is photokeratitis of the cornea and conjunctiva, a condition which would not be expected to be caused by lamps used for normal lighting purposes and belonging to RG0 or RG1. Chronic UV exposure from sunlight may cause corneal lesions (climatic droplet keratopathy) as well as cortical and nuclear cataracts of the lens.

B. Light and retinal pathology

i) Sunlight and retinal pathology

Acute exposure: Solar retinitis

The visual disturbances caused by a few minutes of looking directly into the sun or at a solar eclipse have been known for many years (Young 1988). Eyes from patients who volunteered to stare at the sun prior to enucleation had various degrees of injury in the retinal pigment epithelium (RPE) cells 38-48 hours later, and only minor changes of the outer and inner segments of the photoreceptors. This explains the good vision shortly after exposure. The damage to the RPE was very similar to the photochemical damage observed in the RPE of the monkey 48 hours after exposure to blue light (Ham et al. 1978). While the RPE and the blood-retinal barrier recover rapidly, permanent degeneration of the photoreceptors was observed some time after the exposure (Tso and La Piana 1975), inducing various degrees of visual disturbance and central scotoma. It is now well recognized that sunlight-induced retinal lesions result from photochemical damage similar to that observed after blue light exposure, and not from thermal injury. At noon in the summer, sunlight may reach 100,000 cd/m2, but the overhead protection of the cornea by the upper lid and the natural avoidance reflex protect from direct exposure. However, in some specific situations, increased retinal exposure may occur from ground reflections, even during decreased luminance conditions, which may enhance lid opening. Indeed, ground surface reflection is the most important environmental exposure factor (Sliney 2005b). For example, prolonged exposure to a hazy sky over fresh snow may induce overexposure and lesions. Unprotected soldiers exposed to reflection off sand for several months in the desert have shown macular lesions similar to solar eclipse retinitis. Even if nobody stares directly at the sun, cumulative chronic low-intensity exposure to sunlight (particularly with ground reflection) may induce similar lesions (Gladstone and Tasman 1978).

Glare

The eye continuously adapts to light, which allows humans to see over about 10 orders of magnitude of illuminance, from almost total darkness to highly luminous environments. Nevertheless, at any given time, vision is possible and comfortable only within a two to three order of magnitude range. Glare occurs when there is too much light. It is empirically divided into two types (see Marshall and Sliney (1997) for a comprehensive review). Discomfort glare does not impair visibility but causes an uncomfortable sensation that makes the observer look away from the glaring source. It increases when the light source faces the observer. Disability glare is due to light scattering within the ocular media, which creates a veil that lowers contrast and renders viewing impossible. High-luminance light sources generate a veiling glare whose luminance decreases as the inverse of the angle between the direction of the point source and the direction of gaze.
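As a rough quantitative illustration of disability glare (not part of the SCENIHR text), the classical Stiles-Holladay approximation estimates the equivalent veiling luminance produced by a glare source as L_veil ≈ 10·E/θ², where E is the illuminance at the eye in lux and θ the angle in degrees (valid roughly for 1°-30°) between the glare source and the direction of gaze:

```python
# Illustrative sketch, assuming the classical Stiles-Holladay approximation
# for disability glare (a steeper, inverse-square falloff with angle).
# The numeric inputs below are made up for illustration.

def veiling_luminance(eye_illuminance_lx: float, angle_deg: float) -> float:
    """Equivalent veiling luminance (cd/m^2) from a glare source."""
    return 10.0 * eye_illuminance_lx / angle_deg ** 2

# The veil falls off quickly as the source moves away from the line of sight:
assert veiling_luminance(100.0, 2.0) > veiling_luminance(100.0, 10.0)
```

The rapid falloff with angle is why a luminaire close to the line of sight is far more disabling than the same luminaire seen well off-axis.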

The luminance of the sky is rather stable at about 5,000 cd/m2. This value can be exceeded on bright surfaces on clear days, when luminance can reach several tens of thousands of cd/m2. The sun is never viewed directly, except at sunrise or sunset, when its luminance is about the same as that of the sky and its colour temperature is low or moderate. It is when both the luminance and the colour temperature of the light are high that the blue light hazard increases. The spectral sensitivity to glare at night has its maximum around 507 nm, which is the most efficient wavelength for the rods. However, the mechanisms of glare are not fully understood, and the role of the recently discovered intrinsically photosensitive retinal ganglion cells (ipRGCs), which are active during daytime, is not clear. Nevertheless, light with a relatively high blue content is liable to generate glare both during daytime and at night.

Whatever the type of glare and its source, glare is not in itself a health effect, but an inconvenience that can substantially affect vision.

Blue light hazard

As detailed above in section 3.4.3.2 B1, the interaction of blue light with molecules constituting the retina or accumulating in the retina with age or in pathological conditions can induce damage to RPE cells, photoreceptor cells, and to ganglion cells. This “blue light hazard” was identified more than 40 years ago (Noell et al. 1965, Noell et al. 1966). Subsequent studies have shown that the shortest wavelengths in the visible spectrum are the most dangerous ones for the retina (e.g. Gorgels and Van Norren 1998, Ham et al. 1976, Ham et al. 1979) and the mechanisms of light-induced damage have been reviewed previously by others (Organisciak and Vaughan 2010, Wu et al. 2006). A recent review (van Norren and Gorgels 2011) has furthermore analyzed the relevant studies on action spectra of photochemical damage to the retina. Data from four different species were included, and the outcome of the analysis is that most studies agree that retinal damage is higher at shorter wavelengths and decreases with increasing wavelength. The review furthermore stresses that there are significant knowledge gaps in several areas related to retinal damage.

The potential phototoxic retinal damage is expected to occur with wavelengths in the blue light spectrum between 400 and 460 nm (Algvere et al. 2006, Ham 1983, van Norren and Schellekens 1990). The first evidence for retinal light toxicity of blue light came from the observation of Noell in 1965, who accidentally discovered that the retina of albino rats can be damaged irreversibly by continuous exposure for several hours or days to environmental light within the intensity range of natural light. The intensity of light that damages the retina is several orders of magnitude below the threshold of thermal injury in pigmented animals. The same damage as in albinos of different strains is produced in pigmented rats when the pupils are dilated (Noell et al. 1966). Other investigators have by now characterized non-thermal retinal damage in several species (see Organisciak and Vaughan (2010) for a recent review).

The wavelength is considered to be one of the factors that enhance the susceptibility to light damage in animal studies. Thus, for both Class I and Class II photochemical damage, the action spectrum peaks in the short wavelength region, providing the basis for the concept of blue light hazard.

Laboratory studies have suggested that photochemical damage involves oxidative events. Grimm et al. (2001) have shown that blue light causes photopigment-mediated severe retinal damage under experimental conditions (Class I damage). Indeed, when light hits a photoreceptor, the cell bleaches and becomes unresponsive until it recovers through a metabolic process called the "visual cycle". Absorption of blue light, however, has been shown to cause a reversal of the process in rodent models. The cell becomes unbleached and responsive to light again before it has fully recovered, which increases the potential for photochemical damage and leads to a buildup of lipofuscin in the retinal pigment epithelium (RPE) layer.

Further evidence that light damage is mediated by photopigments was provided by studies in which monkeys were exposed to different wavelengths to bleach predominantly the pigment in one particular class of photoreceptors. The damage to the blue cones proved permanent, and reactive oxygen species production was observed in the retina after 30 min of blue light exposure; longer exposures were accompanied by photoreceptor cell damage. Apart from the wavelength dependency, it has been suggested that photochemical damage depends on the total dose received. This implies that the light intensity and the duration required to cause a certain level of damage are reciprocal. Studies of cumulative retinal damage support this: a single 5-minute exposure does not produce a significant effect, whereas three and four exposures, each of 5 minutes' duration and each followed by a 1-hour dark interval, lead to significant damage. However, the cumulative effect does not take place if the retina recovers sufficiently from subliminal damage before the next exposure is applied. The cumulative nature of light damage has been observed in several subsequent investigations, which have also indicated that intermittent exposure causes more photoreceptor damage than continuous exposure, and that it exacerbates Class I photochemical damage in rats.
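The dose-reciprocity idea mentioned above (damage governed by the total dose received, i.e. intensity multiplied by duration) can be sketched in a few lines of illustrative code; the numbers are invented and carry no physiological meaning:

```python
# Illustrative sketch (not from the SCENIHR opinion): under reciprocity,
# photochemical damage tracks the total radiant exposure H = E * t, so
# halving the irradiance doubles the time needed to reach the same dose.
# All values below are made up for illustration.

def radiant_exposure(irradiance_w_m2: float, duration_s: float) -> float:
    """Total dose H (J/m^2) delivered during an exposure."""
    return irradiance_w_m2 * duration_s

# Two exposures with different intensity/duration deliver the same dose:
assert radiant_exposure(10.0, 60.0) == radiant_exposure(5.0, 120.0)
```

Note that, as the text stresses, reciprocity only holds while damage accumulates faster than the retina can recover; with sufficient dark recovery between exposures the doses no longer simply add.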

Susceptibility to light damage increases with age in a process that is distinct from age-related degenerative changes. O'Steen et al. (1974) exposed rats of different ages to light of the same duration and intensity: photoreceptor destruction was limited in young animals, whereas severe damage affecting most photoreceptors occurred in adult retinas (16-24 weeks).

Importantly, all animal experimental studies analyzing retinal light toxicity have been performed using artificial light and not sunlight exposure. However, because the retina of animal models, mostly rodents, differs from the human retina, and because monkey studies are performed on anesthetized animals directly exposed to light with dilated pupils, extrapolation of doses to human exposure is not possible. In summary, studies indicating a blue light hazard to the retina within the intensity range of natural light are based on animal experiments. They have shown that Class II damage is strictly mediated by blue light illumination, while Class I damage can be mediated by photopigments with different wavelength sensitivities, although only blue cone damage was seen to be permanent. Therefore, for both classes of damage, blue light seems to be more dangerous than other components of white light. The relevance of these experimental data for human pathological conditions is not totally clear, although the studies are suggestive of retinal damage due to blue light in humans as well. Epidemiologic studies have provided conflicting results regarding the relationship between sun exposure and retinal pathologies, mostly because dosimetry is difficult to evaluate during long-term exposure and is highly dependent on geometric factors. High-quality epidemiologic studies are needed to evaluate the real impact of light on retinal diseases (age-related macular changes, age-related macular degeneration, and also other retinal and macular pathologies).

ii) Artificial light and retinal damage

In humans, the only direct evidence for acute light toxicity due to artificial light exposure has been observed after acute accidental exposures to ophthalmologic instruments and to sunlight. No epidemiologic studies have evaluated the potential hazards of artificial light exposure for the eye.

Ophthalmologic instruments

Exposure to ophthalmologic instruments has caused accidental overexposures and subsequent retinal lesions, which has led to threshold exposure limits and guidelines. The risks of retinal damage to patients in the operating room were recognized about 20 years ago. Operating microscopes can induce paramacular lesions very similar to those induced experimentally by intense blue light exposure in animals. Moreover, filtration of blue light has been seen to significantly reduce the risks, although not to eliminate them entirely. Increased duration of illumination of the retina through dilated pupils increases the risk of retinal damage. In 1983, in a series of 133 patients, it was shown that at 6 months post-surgery, visual acuity was significantly higher in patients operated on with a fiberoptic light attenuated in the blue range than with a high-intensity tungsten filament microscope (Berler and Peyser 1983). Since then, several reports have identified blue light output as the major risk for the retina, compared to red and UV wavelengths (Cowan 1992).

Welder exposure

Arc welding exposes workers to UV and to blue light. Radiation in the UV range is absorbed mostly by the cornea and lens if welders are unprotected, and gives rise to "arc-eye" or "welder's flash" (keratoconjunctivitis), well known as an occupational hazard for welders. Although very painful, this condition is not expected to induce any permanent ocular damage. On the other hand, visible light, particularly in the blue range, may expose welders to retinal photochemical damage. Okuno and co-workers (2002) evaluated the blue-light hazard for various light sources and found that arc welding was among the most hazardous sources. The blue-light hazard effective irradiance has a mean value of 18.4 W/m2 (300-700 nm) at 100 cm, with a tmax (allowed exposure time) of 5.45 s. Exposure times of 0.6-40 s are typical, which may be very hazardous to the retina (Okuno et al. 2002). Several case reports have been published stressing that welding should be performed in good background lighting and with permanent adequate protection, since pupillary constriction in response to striking the arc is too slow to block the initial surge of radiation.
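The tmax figure quoted from Okuno et al. (2002) is consistent with dividing a blue-light-hazard dose limit of 100 J/m² (the ICNIRP/ACGIH small-source exposure limit, an assumption not stated in the text above) by the measured effective irradiance; a minimal sketch:

```python
# Sketch under the assumption of a 100 J/m^2 blue-light effective radiant
# exposure limit (E_B * t <= 100 J/m^2), as used in blue-light hazard
# assessments; not a definitive safety calculation.

BLUE_LIGHT_DOSE_LIMIT = 100.0  # J/m^2, assumed effective dose limit

def t_max(effective_irradiance_w_m2: float) -> float:
    """Allowed exposure time (s) before the assumed dose limit is reached."""
    return BLUE_LIGHT_DOSE_LIMIT / effective_irradiance_w_m2

# Arc-welding value from the text: E_B = 18.4 W/m^2 at 100 cm
print(round(t_max(18.4), 2))  # -> 5.43 s
```

The result, 5.43 s, matches the reported tmax of 5.45 s to within rounding of the mean irradiance value.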

iii) Chronic exposure to sunlight and Age-related Macular Degeneration (AMD)

Oxidative stress and sub-clinical local inflammation have been suggested to be associated with aging processes in the retina (Chen et al. 2010), and benzo(a)pyrene toxicity through smoking has been shown to contribute to the development of AMD (Fujihara et al. 2008, Sharma et al. 2008, Wang et al. 2009a). The involvement of photochemical damage of the retina in AMD progression is also suggested by the observed protective effects of macular pigments and vitamins (Desmettre et al. 2004). However, due to a lack of support from epidemiological studies, there is no consensus regarding an association between sunlight exposure, which also generates oxidative stress, and AMD (see Mainster and Turner 2010). One of the exceptions is the Beaver Dam Eye Study, where a correlation between sunlight and the 5-year incidence of early AMD was observed. The study showed that leisure time spent outdoors while persons were teenagers (aged 13-19 years) and in their 30s (aged 30-39 years) was significantly associated with the risk of early AMD. People with red or blond hair were slightly more likely to develop early AMD than people with darker hair (Cruickshanks et al. 2001). A population-based cohort study with a 10-year follow-up confirmed the finding that, controlled for age and sex, exposure to the summer sun for more than 5 hours a day during the teens, the 30s, and at the baseline examination led to a higher risk of developing increased retinal pigment damage and early AMD signs as compared to exposure for less than 2 hours during the same period (Tomany et al. 2004).

The Beaver Dam Eye Study and the Blue Mountains Study, respectively, provided data on a total of 11,393 eyes from 6,019 subjects undergoing cataract surgery (Cruickshanks et al. 2001, Tomany et al. 2003, Tomany et al. 2004). Of these patients, 7% developed AMD in the 5 years following cataract surgery, compared to 0.7% in the phakic population (with the natural crystalline lens present); the cataractous lens is a strong blue-light filter. However, more recent studies, such as the large prospective AREDS study in 2009 (Chew et al. 2009), did not confirm this finding. To better control for environmental light conditions, the effect of sun exposure on AMD was evaluated in 838 watermen on the Chesapeake Bay (Taylor et al. 1992). In this specific population, it was possible to estimate the relative exposure to blue light and UV. Compared with age-matched controls, patients with advanced age-related macular degeneration (geographic atrophy or disciform scarring) had significantly higher exposure (estimated at 48% higher) to blue or visible light over the preceding 20 years, but did not differ with respect to exposure to UVA or UVB. This suggests that blue light exposure could indeed be related to the development of AMD, particularly at more advanced ages. However, these associations were not found in other studies, such as the French POLA study (Delcourt et al. 2001).

In the light of newly discovered genetic susceptibility factors for AMD, associations between sunlight exposure and genetic markers are relevant to study. Since polymorphisms in genes encoding proteins involved in the control of inflammation in the choroid/retina are strongly associated with the risk of developing AMD, the effect of light on these populations should allow better analysis of the risk of sunlight on AMD.

iv) Blue light and glaucoma or other optic neuropathy

Osborne et al. (2008) showed that mitochondrial enzymes such as cytochromes and flavin oxidases absorb light and generate ROS. Because retinal ganglion cells are unprotected from visible light, they are directly exposed to such photo-oxidative stimuli. In vitro, ganglion cells have been seen to undergo a caspase-independent form of apoptotic death due to light exposure. Studies of the effects of broad-band light exposure (400-700 nm) in rats have shown that only blue light exposure induced signs of ganglion cell suffering (Osborne et al. 2008). Moreover, because melanopsin-containing ganglion cells participate in the light-induced pupil response, patients with ganglion cell dysfunction owing to anterior ischemic optic neuropathy demonstrated a global loss of pupil responses to red and blue light in the affected eye, suggesting that in those patients retinal illumination could be enhanced, increasing the blue light hazard (Kardon et al. 2009). However, to date no epidemiologic study has evaluated the correlation between sunlight, or blue light, exposure and the occurrence or progression of glaucoma or other optic neuropathies.

v) Conclusions on light and retinal pathology

There is strong evidence from animal and in vitro experiments that blue light induces photochemical retinal damage upon acute exposure, and some evidence that cumulative blue light exposure below the levels causing acute effects can induce photochemical retinal damage. In humans, there is direct evidence of acute light-induced damage to the retina from accidental high-intensity artificial-light or sunlight exposure. Regarding long-term exposure at sub-acute levels, there is no consistent evidence for a link between exposure to sunlight (specifically blue light) and photochemical damage to the retina, particularly to the retinal pigment epithelium. Taking into account that AMD primarily affects the choroid and the retinal pigment epithelial cells, future epidemiologic studies should focus on the impact of light on retinal diseases (age-related macular changes, age-related macular degeneration, and also other retinal and macular pathologies), in particular after long-term light exposure at lower intensities. On the basis of the action spectrum of blue light in animal studies, blue light exposure may be considered a risk factor for long-term effects, which should be investigated further in dedicated case-control and cohort studies. There is no consistent evidence that sunlight exposure early in life contributes to retinal damage that can lead to AMD later in life. Available epidemiologic studies are also not consistent regarding aggravation of AMD. Whether exposure to artificial light could induce similar lesions remains to be demonstrated. There is no clinical or epidemiological evidence that blue light causes glaucoma or other optic neuropathies.

C. Conclusions on effects on the healthy eye

Under specific circumstances, exposure to sunlight or artificial light can cause acute as well as chronic effects and damage to various structures of the eye. Acute UV exposure, particularly UVB and UVC below 300 nm, may cause photokeratitis of the cornea and the conjunctiva. Experimental studies have also shown that acute photochemical damage to the retina can occur due to blue light exposure. Acute damage to the human retina can occur due to accidental high-intensity artificial light or sunlight exposure.

Chronic UV exposure from sunlight may cause damage to the cornea (climatic droplet keratopathy) and the lens (cataracts). There is no consistent evidence that long-term exposure to sunlight (especially blue light) may be involved in retinal lesions that can develop into AMD.

There is no evidence that artificial light from lamps belonging to RG0 or RG1 would cause any acute damage to the human eye. It is unlikely that chronic exposures to artificial light during normal lighting conditions could induce damage to the cornea, conjunctiva or lens. Studies dedicated to investigating whether retinal lesions can be induced by artificial light during normal lighting conditions are not available.

Source & ©: SCENIHR, Health effects of artificial light, 19 March 2012,
 3.5.2.3 Assessment of effects on the healthy eye, pp. 45-52.

4.3 Effects on sleep, mood and the circadian rhythm

The SCENIHR opinion states:

3.5.3. Circadian rhythms, circadian rhythm disruptions, sleep and mood

3.5.3.1. Circadian rhythms

From an evolutionary perspective, exposure to artificial light is very new (Stevens 1987, Stevens and Rea 2001). Thus, life on earth has for billions of years been organized around the 24-hour day with a normal period of approximately 12 hours of light and 12 hours of dark at the equator, which varies with latitude and seasonal changes throughout the year (Stevens et al. 2007). Hence, almost all organisms on earth show 24-hour circadian and biological rhythms in adaptation of their biochemical systems to the rotation of the earth around its axis. This fundamental component of our biology, with the main function of coordinating biological rhythms, is controlled by endogenous biological clocks, and this periodicity has a profound impact on biochemical, physiological, and behavioural processes in almost all living organisms (Reddy and O’Neill 2010, Reppert and Weaver 2002).

In mammals, these rhythms are primarily generated by the master circadian pacemaker located in the suprachiasmatic nucleus (SCN) of the hypothalamus in the brain. The SCN clock can function autonomously, without any external input, with a period close to 24 h in all species studied (Dunlap et al. 2004). The clock is, however, not independent from the environment, as it is synchronized to the 24 h day through daily resetting by environmental cues (“Zeitgebers” = time givers), in particular light in mammals. Thus, the SCN receives input from both internal and external stimuli and its period may be entrained by these time cues. Information on light, by far the most potent synchronizer, reaches the SCN exclusively via the retinohypothalamic tract in the eyes in mammals, including humans. The visual rod and cone photoreceptor systems, necessary for normal vision, seem only to have a minimal role in circadian photosensitivity (Brainard et al. 2001a, Brainard et al. 2001b, Thapan et al. 2001). Circadian photoreception is primarily mediated by intrinsically photosensitive melanopsin (a vitamin-A photopigment) containing retinal ganglion cells (ipRGCs) distributed in a network across the inner retina (Berson et al. 2002, Brainard et al. 2001a, Brainard et al. 2008, Hattar et al. 2002, Hankins et al. 2007, Guler et al. 2008). In the absence of these two systems (classical photoreceptors and ipRGCs), the circadian timing system is free-running, expressing its own endogenous rhythmicity (Hattar et al. 2003).

Melanopsin contained in ipRGCs is a rhabdomeric photopigment, and possesses response properties of the invertebrate opsins. A unique property of rhabdomeric photopigments is the dual function as sensory photopigments and photoisomerases (Koyanagi et al. 2005). The photopigment chromophore is regenerated by light (conversion of all-trans back to 11-cis retinal). Due to this property, melanopsin is resistant to “light-bleaching”, and retains its ability to respond to light at high levels of irradiance and for long duration exposures. ipRGCs project mainly to the SCN, but also to other structures involved in non-visual responses, including, but not limited to, the pretectum (the pupillary reflex), the VLPO (sleep), the amygdala and the hippocampus (mood, memory). These photopigments require high irradiances, display a high degree of inertia in their responses, and show a peak of sensitivity between 460 and 484 nm in all vertebrates studied so far, including humans.

Melatonin, N-acetyl-5-methoxytryptamine, is a ubiquitous hormone found in all groups of organisms. In vertebrates, including humans, it is primarily synthesized in the pineal gland and immediately secreted into the blood. Its 24-h rhythm is directly driven by the circadian clock through a polysynaptic sympathetic output pathway from the SCN to the pineal gland. Thereby, in normally entrained individuals, pineal melatonin is synthesized during the night (normal peak 1-3 a.m.), whereas during the day production is virtually nil. The primary role of melatonin is considered to be providing an internal biological signal ("the third eye") for the length of the night (Wehr 1991), and a signal of dusk and dawn (Arendt 2006, Arendt and Rajaratnam 2008, Brzezinski 1997). In addition, melatonin has been shown in in vitro studies to have antioxidant properties (including scavenging of free radicals), direct antiproliferative effects, immune-enhancing effects, and possibly a role as an epigenetic regulator, which may influence certain metabolic diseases (Brzezinski 1997, Korkmaz et al. 2009, Reiter et al. 2010). In a recent study, exposure to room light (<200 lx) in the evening before bedtime was shown to have a profound effect, not only suppressing melatonin levels but also shortening the duration of melatonin production by about 90 minutes, thus inducing a shortened internal biological night (Gooley et al. 2011).

Melatonin has been suggested to function as a protective agent against "wear and tear" in several tissues. It has been shown that in normal retinas melatonin exerts protection against free radical damage. Moreover, MT1-type melatonin receptors were found in photoreceptor cells, and MT1 knock-out mice demonstrated a loss of photoreceptors at 12 and 18 months of age, suggesting that a lack of melatonin may be involved in retinal degeneration (Baba et al. 2009). Melatonin is also suggested to have protective functions in the brain, including protection against oxidative damage (Kwon et al. 2010) and inhibition of the intrinsic apoptotic pathway (as reviewed by Wang 2009b). In addition to acutely inhibiting melatonin synthesis at night via the SCN sympathetic output pathway, light resets the phase of the circadian timing system (advances and delays the 24-h rhythms of temperature, melatonin, cortisol, etc.). The response of the circadian system to light, generally quantified by the degree of melatonin phase shift and suppression, depends on the timing, duration, intensity and spectral composition of the light exposure (Gronfier et al. 2004, Gronfier et al. 2007, Lockley et al. 2003, Rimmer et al. 2000, Thapan et al. 2001). Short-wavelength blue light (460-480 nm) has been shown to exert a stronger effect on light-induced melatonin suppression at equal photon density than green light (555 nm; Figueiro and Rea 2010, Lockley et al. 2003). West et al. (2011) have recently shown that narrow-bandwidth blue light (469 nm, 20 μW/cm2) is also significantly more efficient in night-time melatonin suppression in humans than polychromatic white light (4,000 K).

However, both light intensity and other spectral components seem to influence nocturnal melatonin suppression in studies on human volunteers (see for example, Duffy and Czeisler 2009, Gooley et al. 2010, Revell and Skene 2007) suggesting that although melanopsin is the primary circadian photopigment, it is not alone in regulating melatonin production levels and the circadian phase. The important role of melanopsin is nevertheless suggested in many studies since a greater effect of monochromatic blue (460 nm) light compared to green (560 nm) light has been frequently documented. This includes phase shifting of the melatonin rhythm (Lockley et al. 2003), enhancing alertness, temperature, and heart rate (Cajochen et al. 2005), activating PER2 gene expression (Cajochen et al. 2006), phase shifting PER3 gene expression (Ackermann et al. 2009), enhancing psychomotor performances, and activating waking EEG (Lockley et al. 2006). Blue light also affects sleep structure (Münch et al. 2006), and activates brain structures, including the hippocampus and the amygdala that are involved in cognition, memory and mood (Vandewalle et al. 2007a, Vandewalle et al. 2007b, Vandewalle et al. 2009, Vandewalle et al. 2010).

Recently, ten core circadian clock genes (CLOCK, CSNK1E, CRY1, CRY2, PER1, PER2, PER3, NPAS2, BMAL1, TIMELESS) have been discovered (Cermakian and Boivin 2009, Fu and Lee 2003) with direct control of at least 10% of the genome (Bellet and Sassone-Corsi 2010, Storch et al. 2002). Their main function is to generate the rhythmic oscillations at the cellular level. They also seem to play critical roles in many disease-related biological pathways, including the cell cycle, DNA repair and apoptosis (Fu and Lee 2003). The SCN orchestrates the temporal alignment of physiology by transmitting daily signals to multiple, mainly self-sustained, clocks in peripheral tissues (Panda et al. 2002). During ill-timed light exposure (late evening, night or early morning), e.g. during fast transmeridian travel or night-shift work, the central oscillator in the SCN tends to shift more rapidly than the peripheral oscillators, resulting in transient uncoupling of the peripheral oscillators from the central oscillator and leading to internal desynchronisation among circadian periodic physiologic variables within the body (Haus and Smolensky 2006, Wood et al. 2009).

3.5.3.2. Circadian rhythm disruptions

Appropriate exposure to electrical light during periods of the day with local environmental darkness has, over the last 100 years, become a mainstay of modern life. "Ill-timed light exposure" (i.e. late evening, night, early morning), in addition to light exposure during the day, may, however, result in attenuation of melatonin production and disruption of normal circadian rhythms (Czeisler et al. 1990), depending on the duration, wavelength and intensity of the light exposure (Stevens et al. 2011). Circadian disruption is mainly characterized by desynchronization between internal time (circadian rhythms) and external time (environmental clock time), including desynchrony of the master pacemaker (SCN) with the sleep cycle and with the peripheral oscillators in tissues throughout the body (Dibner et al. 2010). A desynchronization of the SCN with peripheral oscillators will persist for a variable period of time depending on the exposure pattern and the characteristics of the individual, e.g. age and chronotype (i.e. morning or evening preference) (Davidson et al. 2009). Light exposure induces phase advances and phase delays at different points in the circadian cycle, i.e. depending on the time at which the light exposure occurs (on average in humans, light between about 5 a.m. and 5 p.m. advances the clock, and light between about 5 p.m. and 5 a.m. delays it (Khalsa et al. 2003)). Thus, consecutive ill-timed light exposures may induce inappropriate phase shifts of the circadian system, not allowing its complete synchronization to the actual light conditions, and leading to circadian disruption.
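The average timing rule quoted above (light in the 5 a.m.-5 p.m. window advances the clock; light in the 5 p.m.-5 a.m. window delays it, per Khalsa et al. 2003) can be caricatured as a simple function. Real human phase-response curves are continuous and vary between individuals, so this binary version is only illustrative:

```python
# Toy sketch of the average phase-response rule stated in the text
# (Khalsa et al. 2003). Real phase-response curves are continuous and
# individual; this binary classification is illustrative only.

def phase_shift_direction(hour: int) -> str:
    """Direction of circadian phase shift for light at the given local hour (0-23)."""
    return "advance" if 5 <= hour < 17 else "delay"

assert phase_shift_direction(9) == "advance"   # morning light advances the clock
assert phase_shift_direction(23) == "delay"    # late-evening light delays it
```

This is the sense in which consecutive ill-timed exposures push the clock in the wrong direction: late-evening light keeps delaying a clock that morning light would otherwise advance back into alignment.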

Recent studies indicate that ill-timed exposures to even low levels of light in household settings may be sufficient to cause circadian disruptions in humans. A comparison between the effects of living room light (less than 200 lx) and dim light (<3 lx) before bedtime showed that exposure to room light suppressed melatonin levels and shortened the duration of melatonin production in healthy volunteers (18-30 years) (Gooley et al. 2011). Cajochen et al. (2011) compared the effects on male volunteers of a white LED-backlit screen, emitting more than twice as much blue light (462 nm), with those of a non-LED screen. Exposure to the LED screen significantly lowered evening melatonin levels and suppressed sleepiness. In another study from the same group (Chellappa et al. 2011), 16 healthy male volunteers were exposed to cold white CFLs (40 lx at 6,500 K) and incandescent lamps (40 lx at 3,000 K) for two hours in the evening. Melatonin suppression was significantly greater after exposure to the 6,500 K light, suggesting that our circadian system is especially sensitive to blue light, even at low light levels (40 lx). However, no study has investigated whether the impact of warm white CFLs and LEDs (2,700-3,000 K) on melatonin suppression differs in any way from that of incandescent lamps.

Disruptions of fundamental circadian rhythms, including communication between different cell types (Cermakian and Boivin 2009), may have the potential to significantly affect human health. Circadian disruptions, including decreased melatonin levels, have been suggested to play an important role in the development of chronic diseases and conditions such as cancer (breast, prostate, endometrial, ovarian, colorectal, skin and melanomas, non-Hodgkin's lymphomas), cardiovascular diseases, reproductive problems, endometriosis, gastrointestinal and digestive problems, diabetes, obesity, depression, sleep deprivation, and cognitive impairment (Bass and Takahashi 2010, Boyce and Barriball 2010, Frost et al. 2009, Haus and Smolensky 2006, IARC 2010, Kvaskoff and Weinstein 2010, Mahoney 2010, Poole et al. 2011, Rana and Mahmood 2010, Stevens et al. 2007). It is, however, difficult to study directly the effects of ill-timed light exposures and their long-term health consequences, especially because virtually all humans are, to varying degrees, exposed to artificial light in the period between dusk and dawn. Therefore, epidemiologic studies can mainly provide indirect support for the theory. Thus, regarding breast cancer, it has been observed in four out of five prospective cohort studies that women with the lowest concentration of the main melatonin metabolite, sulfatoxymelatonin, have the highest risk (Travis et al. 2004, Schernhammer & Hankinson 2005, Schernhammer et al. 2008, Schernhammer et al. 2009, Schernhammer et al. 2010). Further, some relatively consistent epidemiological support has been found from other, very different, aspects of light exposure and potential circadian disruption: 1) increased risk in night-shift workers, and 2) in flight attendants potentially suffering from both jet-lag and night shift-work; 3) decreased risk in blind women, and 4) with long sleep duration; 5) increased risk with ambient light in the bedroom during the night, and 6) with high community light levels, e.g. in cities; and 7) decreased risk for persons living in the Arctic, with long winters without or with only little light (Stevens 2009). Since light exposures are only measured indirectly in existing epidemiologic studies, factors other than light may, however, be at least partly involved in the observed breast cancer risk (Fritschi et al. 2011; Kantermann & Roenneberg 2009).

It has also been suggested that melatonin deficits, e.g. caused by exposure to light at night, could be part of the etiology of osteoporosis. However, in vitro and experimental in vivo studies are inconsistent in their outcomes (Sánchez-Barceló et al. 2010). A single prospective study on nurses has investigated the association between hip and wrist fractures and the duration of rotating night shift-work. Overall, nurses with at least 20 years of night shift-work, followed up from 1988 to 2000 for hip and wrist fractures, had an adjusted relative risk of 1.10 (0.87-1.42) compared to nurses who had never done shift-work; no dose-response relationship by duration of exposure appeared (Feskanich et al. 2009). In sub-analyses covering 8 years of follow-up and 20 or more years of night shift-work, a significantly increased relative risk (2.36; 1.33-4.20) was observed in nurses who had never used hormone replacement therapy and who had a body mass index <24. Overall, there is inadequate or no evidence for an association between exposure to light and the risk of osteoporosis.

So far, the most comprehensive evidence of an association between circadian disruption and disease is found for breast cancer in night-shift workers. Night-shift work, which may continue for several years and affects about 10-20% of the EU workforce, is the most extreme source of ill-timed exposure to light, entailing simultaneous reduction of melatonin production, sleep deprivation and circadian disruption (Costa et al. 2010). An expert group convened by IARC in October 2007 concluded that "shift-work that involves circadian disruption is probably carcinogenic to humans, Group 2A", based on sufficient evidence in experimental animals for the carcinogenicity of light during the daily dark period (biological night), limited evidence in humans for the carcinogenicity of shift-work that involves night work, and strong bio-mechanistic support (IARC 2010, Straif et al. 2007). In a recent meta-analysis based on eight published studies of shift-work and female breast cancer risk, a significantly increased risk of 40% (relative risk 1.4; 95% confidence interval: 1.2-1.7) was found (Viswanathan and Schernhammer 2009).
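To read such figures: a relative risk of 1.4 with a 95% confidence interval of 1.2-1.7 is "significant" because the whole interval lies above 1.0 (no effect). Since confidence intervals for relative risks are conventionally computed on the log scale, the implied standard error and p-value can be recovered from the reported numbers alone. The sketch below illustrates this back-calculation as an exercise; it is not part of the cited meta-analysis.

```python
import math

# Reported meta-analysis figures (Viswanathan and Schernhammer 2009)
rr, ci_low, ci_high = 1.4, 1.2, 1.7

# CIs for relative risks are symmetric on the log scale:
# log CI = log(RR) +/- 1.96 * SE, so SE follows from the two bounds.
se = (math.log(ci_high) - math.log(ci_low)) / (2 * 1.96)

# Wald z-statistic and two-sided p-value against RR = 1 (no effect)
z = math.log(rr) / se
p = math.erfc(z / math.sqrt(2))  # two-sided normal tail probability

print(f"SE(log RR) = {se:.3f}, z = {z:.2f}, p = {p:.4f}")
```

An interval excluding 1.0 corresponds to a p-value well below 0.05; the same back-calculation applied to the osteoporosis figure above (1.10; 0.87-1.42) gives an interval spanning 1.0, i.e. no significant association.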

The majority of the included shift-work studies were adjusted for potential confounders, including two large, independent, high-quality prospective cohort studies (Schernhammer et al. 2001, Schernhammer et al. 2006). After the IARC evaluation, three studies of shift-work and breast cancer have provided further support for the light-at-night hypothesis (Pesch et al. 2010, Lie et al. 2011, Hansen and Stevens 2011), whereas one new study did not (Pronk et al. 2010). Furthermore, three independent studies of breast cancer risk after exposure to non-occupational light-at-night in the home have recently been published (Davis et al. 2001, Kloog et al. 2011, O'Leary et al. 2006), and significant associations were found for women who did not sleep during the period of the night when melatonin levels normally peak (Davis et al. 2001), or who frequently turned on the light during the night (OR=1.65; 1.02-2.69; O'Leary et al. 2006). An increased breast cancer risk was also correlated with increasing bedroom light levels (Kloog et al. 2011). All results are adjusted for potential confounders, but these three studies are based on self-reports of light exposure and are therefore prone to recall bias, which may limit interpretation. Given the frequent exposure to light at inappropriate times (ill-timed exposure), there is an urgent need for further multidisciplinary research on occupational and environmental exposure to light-at-night and the risk of certain diseases (Blask 2009, IARC 2010, Stevens et al. 2007).

Conclusions

There is a moderate overall weight of evidence that ill-timed exposure to light (light-at-night, indirectly measured by night shift-work), possibly through melatonin suppression and circadian disruption, may increase the risk of breast cancer. There is furthermore a moderate overall weight of evidence that exposure to light-at-night, possibly through circadian disruption, is associated with sleep disorders, gastrointestinal and cardiovascular disorders, and affective disorders. The overall evidence for other diseases is weak due to the lack of epidemiological studies.

3.5.3.3. Sleep

Circadian rhythms, including melatonin rhythms, are involved in different aspects of the facilitation of sleep (Cajochen et al. 2005, Dijk et al. 2001). A number of comprehensive reviews deal with the effects of acute light exposure on sleep (see e.g. Antle et al. 2009, Bjorvatn and Pallesen 2009, Czeisler and Gooley 2007). The effect of blue light on sleep is the subject of some recent work. Mottram et al. (2011) compared the effects of 4-5 weeks of exposure to 17,000 K (blue-enriched white) light with 5,000 K (white) light on personnel at a research station in Antarctica. The blue-enriched, higher colour temperature lamps significantly advanced sleep onset and reduced sleep latency. This result, suggesting that blue-enriched white light synchronized the circadian timing system, is in accordance with other studies showing that blue-enriched light is more efficient at melatonin suppression than other wavelengths (Figueiro and Rea 2010, Gooley et al. 2011) and induces a circadian phase delay persisting into sleep (Münch et al. 2006). The latter study furthermore shows that monochromatic light exposure before bedtime increases slow-wave activity (sleep depth) at the end of the subsequent night of sleep, with a greater effect of blue (460 nm) than green (555 nm) light, suggesting that light before bedtime can affect sleep.

Conclusions

The effects on sleep have been sparsely investigated, making it difficult to draw any conclusions regarding the effects of specific wavelengths, although a single study clearly shows that exposure to light artificially enriched in blue before bedtime affects subsequent sleep structure. However, such blue-enriched light does not emanate from common light sources, which limits the relevance of the study for the general public.

3.5.3.4. Mood, alertness and cognitive functions

Seasonal affective disorder (SAD), "winter depression", is a mood syndrome or depression occurring particularly in people living in areas with significant differences in exposure to natural light between summer and winter. Patients recurrently experience depressive symptoms in winter with remission in summer (Lurie et al. 2006). Disruption of circadian rhythms by insufficient light exposure seems to be involved (Monteleone et al. 2010). Several studies have shown that light therapy may be an efficient treatment for SAD (international committee recommendation; Monteleone et al. 2010, Westrin and Lam 2007, Wirz-Justice et al. 2005). Recent reports have shown that short-wavelength blue light from LED sources (Anderson et al. 2009, Glickman et al. 2006, Howland 2009, Strong et al. 2009) has similar clinical effects to white light sources.

Since humans are day-living organisms, light is linked with a state of wakefulness or alertness. A number of studies have investigated specifically the effect of light on alertness (Dijk et al. 2009). Typically, such studies have used subjective measures to assess the alertness of subjects, but increasingly neurophysiological tools such as EEG and EOG, as well as fMRI and PET, have also been employed. Cajochen (2007) recently reviewed the evidence for the alerting effects of light and pointed out that light exerts an alerting effect under both night-time and daytime conditions. The night-time effect is normally ascribed to suppression of melatonin levels, whereas the daytime effect is more difficult to explain. The intensity requirement has been investigated (Cajochen et al. 2000, Zeitzer et al. 2000), revealing that white light at around 100 lx already produces 50% of the maximal alerting effect (achieved with a 10,000 lx exposure). The wavelength dependency of alertness effects has also been studied: several studies report that shorter wavelengths (460-470 nm) are significantly more efficient in generating alertness responses than longer (555 nm) wavelengths (Cajochen et al. 2005, Lockley et al. 2006, Revell et al. 2006, Vandewalle et al. 2007a). A recent study by Figueiro et al. (2009) recorded alerting effects of both blue (470 nm) and red (630 nm) light. They investigated 14 volunteers with neurophysiological and psychomotor tests, self-reports and measurements of salivary melatonin, in a within-subject study with two levels of intensity (10 and 40 lx at the cornea). The red light exposure also exerted alerting effects at the higher intensity; however, only the blue light reduced melatonin levels. The authors concluded that alertness may be mediated by the circadian system, but that this might not be the only light-sensitive pathway that can affect alertness at night. Viola et al. (2008) performed an occupational study where subjects spent the working day (4 weeks) either in a 17,000 K (blue-enriched white light) environment or in a white light environment (4,000 K). A number of subjective measures of alertness, mood, performance, fatigue, etc. improved in the blue-light condition compared to the white light condition.
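The intensity-response relationship described above (roughly half-maximal alerting effect near 100 lx, saturation towards 10,000 lx) is often modelled as a sigmoid on illuminance. The sketch below is purely illustrative: the half-maximum value (~100 lx) is taken from the text, while the functional form and slope are assumptions, not parameters reported in the cited studies.

```python
def alerting_response(lux, half_max_lux=100.0, slope=1.0):
    """Illustrative sigmoidal dose-response: fraction of the maximal
    alerting effect as a function of illuminance (lx).
    half_max_lux (~100 lx, from the text) and slope are assumptions."""
    return 1.0 / (1.0 + (half_max_lux / lux) ** slope)

for lux in (10, 100, 1000, 10000):
    print(f"{lux:>6} lx -> {alerting_response(lux):.0%} of maximal effect")
```

Under these assumptions the curve rises steeply between typical dim-light and room-light levels and is essentially saturated by 10,000 lx, matching the qualitative description in the cited intensity studies.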

There are a few studies suggesting that short-wavelength (blue) monochromatic light has an effect on cognitive functions, either via effects on circadian rhythms or directly through brain structures involved in memory, cognition and alertness (An et al. 2009, Vandewalle et al. 2006, Vandewalle et al. 2007b, Vandewalle et al. 2010). In a study testing the psychological effects of light, the cognitive effects elicited by blue light exposure were found to differ from those caused by exposure to longer (red) wavelengths (Mehta and Zhu 2009), but neither the light spectra nor the light intensities used were reported, making it difficult to compare with the aforementioned studies.

Conclusions

There is moderate evidence that monochromatic blue light, or light artificially enriched in blue, has a stronger effect on cognitive functions, memory, and mood than other light. Whether these studies are relevant for evaluating the effects of common light sources is unclear, since monochromatic or blue-enriched light of this type is not produced by lamps for the general population.

3.5.3.5. Overall conclusions on circadian rhythms, circadian rhythm disruptions, sleep and mood

Light is typically installed for the beneficial purpose of illuminating space to allow for leisure, entertainment or work. Similarly, shutters and windows are often used to prevent exposure to daylight and facilitate prolonged sleep, particularly with children. Importantly, bright light enables better vision and affects mood, which is desired in almost any illuminated public or private environment. Notably, the colour temperature is also typically adapted to the specific environment, which is an important feature of light design and architecture. In doing so, individuals are exposed to light that affects the circadian rhythm, with immediate and medium-term psychological effects.

This behaviour builds on the cyclic behaviour of the spectrum and intensity of solar light, and has increased with the emergence of artificial light sources. Only recently have these effects on psychological conditions and wake/sleep cycles been studied systematically. In general, artificial light intensities remain well below the peak intensity of the sun on a clear day, although in some applications (stage art, film, TV recordings) it may be essential, and necessary, to surpass this "natural" reference value. In this context, however, it should be noted that these effects are not a feature of any lamp technology of concern, but of lighting and light design in general, which suggests the need to provide appropriate information to citizens, as well as to raise awareness of the issue of light pollution. Light, at night and elsewhere, may be of beneficial or essential use for some while simultaneously negatively affecting others.

Despite the beneficial effects of light, there is mounting evidence suggesting that ill-timed exposure to light (light-at-night), possibly through circadian rhythm disruption, may be associated with an increased risk of breast cancer and may also cause sleep disorders, gastrointestinal and cardiovascular disorders, and possibly affective states. Importantly, these effects are due, directly or indirectly, to light itself, without any specific correlation to a given lighting technology.

Specifically, under certain conditions, blue light may be more effective in influencing human biological systems than other visible wavelengths. Thus, monochromatic blue light, or light artificially enriched in blue, is particularly effective in melatonin phase-shifting and suppression. However, monochromatic or blue-enriched light of this type is not produced by lamps for the general population, so its relevance for the evaluation of the effects of common light sources is unclear.

Source & ©: SCENIHR, Health effects of artificial light, 19 March 2012, 3.5.3 Circadian rhythms, circadian rhythm disruptions, sleep and mood, pp. 53-59.

