The BBC News article "Driverless car laser ruined camera" describes a situation in which a particularly powerful infrared laser from the LIDAR of a prototype car at the CES show damaged the sensor of a photographer's camera.
Question: Are there any standards, specifications, or even guidelines anywhere in the sensor or camera manufacturing industries for thermal damage due to intense sources of light?
If a LIDAR manufacturer wanted to be responsible and build a system that they could say probably will not damage security cameras and traffic cameras up and down the street, is there any place they could turn for information or limits on laser emission? Perhaps a maximum radiance value in each of several wavelength ranges?
Or if a camera manufacturer wanted to be responsible and build a camera that they could say probably will not be damaged by car, robot, or other LIDAR systems?
Or if a LIDAR were part of a display of another product (like a car or robot), where it might not be obvious to every member of the public that IR lasers were involved, and the display owners wanted to know what laser power level would warrant including a warning about cameras?
So far, answers to the question "Are there industry standards or specs for image sensor resistance to damage from intense light?" are basically "no", but outdoor photography is so ubiquitous that there is plenty of accumulated experience.
Eye-level infrared laser beams, however, are something new and different: they are invisible, so one doesn't necessarily know one is photographing a laser until the dot shows up in the photo.
If I understand correctly, these LIDAR systems use wavelengths that are absorbed in the front of the eye and so never pass through the lens to be focused to a small spot on the retina. An IR-blocking filter on the camera lens can mitigate the problem, but an IR-blocking filter on the sensor, near the focus, can melt and fail for the obvious reason: it absorbs the same power, now concentrated into a small spot.
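A rough back-of-envelope calculation shows why a filter near the focal plane fares so much worse than one at the front of the lens. This is a minimal sketch, assuming a 50 mm f/2.8 lens and a 1550 nm lidar wavelength (both illustrative numbers of my own choosing, not figures from the article), and a diffraction-limited spot, which is the best case; real spots are larger:

```python
# Back-of-envelope: how much a camera lens concentrates a collimated
# laser beam onto the sensor plane. All numbers are illustrative.
import math

wavelength = 1550e-9    # m; common long-range lidar band (assumption)
focal_length = 0.050    # m; a 50 mm lens
f_number = 2.8          # lens wide open at f/2.8
aperture = focal_length / f_number      # entrance pupil diameter, ~17.9 mm

# Diffraction-limited (Airy) spot diameter at the sensor
spot = 2.44 * wavelength * f_number     # ~10.6 micrometres

# Irradiance gain = aperture area / focused-spot area
gain = (aperture / spot) ** 2
print(f"aperture {aperture*1e3:.1f} mm, spot {spot*1e6:.1f} um, "
      f"irradiance gain ~{gain:.1e}x")
```

Even allowing for aberrations that spread the real spot well beyond the diffraction limit, concentration factors on the order of 10^5 to 10^6 explain why a beam that warms a front-of-lens filter imperceptibly can destroy the same filter material sitting near focus.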
[Image: The lidar system on the top of the demonstration car. Credit: Jit Ray Chowdhury/BBC]
[Image: The purple dots and lines on this photo of the Stratosphere hotel in Las Vegas show the damage… Credit: Jit Ray Chowdhury/BBC]
The article goes on to explain:
Lidar works in a similar way to radar and sonar, using lasers rather than radio or sound waves, explained Zeina Nazer, a postgraduate researcher at the University of Southampton specialising in driverless car technology.
“Powerful lasers can damage cameras,” she said.
“Camera sensors are, in general, more susceptible to damage than the human eye from lasers. Consumers are usually warned never to point a camera directly at laser emitters during a laser show.”
Ms Nazer added that for cameras to be immune to high-power laser beams, they need an optical filter that cuts out infrared, which is invisible to humans. Such a filter, however, hurts night-vision applications, where infrared can be an advantage.
“AEye is known for its lidar units with much longer range than their competitors, ranging 1km compared to 200m or 300m,” she said.
“In my opinion, AEye should not use their powerful fibre laser during shows.”