What exactly are flickers?
The term "flicker" refers to the subjective perception of a change in light intensity. It is usually caused by periodically recurring voltage fluctuations, i.e. small but frequently occurring voltage dips, as opposed to the typically one-off voltage dips (see also the white paper on voltage dips from Janitza electronics GmbH). The voltage fluctuations that cause flicker can be measured using an algorithm described in DIN EN 61000-4-15. Typical causes are rapid, large load changes (e.g. transformer overloads) produced by welding machines, arc furnaces, photovoltaic systems, wind turbines, magnetic resonance imaging scanners, etc.
Assessing flicker is complicated by the fact that its perception depends on many subjective factors, such as visual acuity, the sensitivity of the retina of the human eye, the ambient lighting conditions in general, and much more. The perception threshold varies considerably from person to person and can therefore only be determined statistically: a flicker level of 1, for example, was defined as the level at which 50% of the test subjects just perceive the flicker. The perception threshold also changes with the frequency of the voltage fluctuation under consideration; it is lowest, i.e. the eye is most sensitive, at 8.8 Hz. This also shows that flicker is a quantity that must be measured across a certain frequency spectrum.
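The statistical evaluation mentioned above is formalized in DIN EN 61000-4-15: the short-term flicker severity Pst is derived from smoothed percentiles of the instantaneous flicker sensation recorded over a 10-minute interval, weighted so that Pst = 1 corresponds to the 50% perception threshold. As a minimal sketch (the function name and the percentile inputs are illustrative; a real flickermeter obtains them from the standard's filter chain and statistical classifier):

```python
import math

def pst(p0_1, p1s, p3s, p10s, p50s):
    """Short-term flicker severity per IEC/DIN EN 61000-4-15.

    Arguments are smoothed percentiles of the instantaneous flicker
    sensation over a 10-minute interval, e.g. p0_1 is the level
    exceeded during 0.1% of the observation time.
    """
    return math.sqrt(
        0.0314 * p0_1
        + 0.0525 * p1s
        + 0.0657 * p3s
        + 0.28 * p10s
        + 0.08 * p50s
    )
```

The weighting coefficients sum to about 0.5, so a constant instantaneous flicker sensation of 1 yields a Pst of roughly 0.71; in practice the percentile spread of real disturbances determines how quickly Pst reaches the perception level of 1.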

But why measure flicker?
Very high flicker levels lead to increased maintenance costs and to malfunctions in, or even the destruction of, electronic equipment (e.g. power supplies).
In addition, flicker causes employees, particularly those at VDU workstations in office buildings, to tire more quickly, become irritable and lose concentration. The constant adaptation of the eye to changing light conditions is tiring and ultimately impairs a person's overall perception. For this reason, limit values have been defined in the power quality standard EN 50160; complying with them helps to avoid the negative effects of flicker.
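For long-term assessment, EN 50160 works with the long-term flicker severity Plt, the cubic mean of twelve consecutive 10-minute Pst values (a 2-hour window), and requires that Plt stays at or below 1 during 95% of the weekly measurement intervals. A minimal sketch of that check (function names and the 95% handling are illustrative simplifications):

```python
def plt_from_pst(pst_values):
    """Long-term flicker severity: cubic mean of 12 consecutive Pst values."""
    if len(pst_values) != 12:
        raise ValueError("Plt is defined over twelve 10-minute Pst values")
    return (sum(p ** 3 for p in pst_values) / 12) ** (1 / 3)

def en50160_compliant(plt_values, limit=1.0, quota=0.95):
    """True if at least 95% of the weekly Plt values stay within the limit."""
    within_limit = sum(1 for p in plt_values if p <= limit)
    return within_limit / len(plt_values) >= quota
```

The cubic mean weights occasional high Pst intervals more strongly than a plain average would, so short but severe flicker episodes are not averaged away over the 2-hour window.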