Autonomous driving development, built on dependable data, will deliver a safer and more reliable driving experience for both passengers and pedestrians.
Humans rely on a complex mix of sight, reasoning, perception, sensing, and reaction when driving. We can see and identify objects instantaneously and, with experience, estimate the distance and velocity of objects to determine if we should slow down, swerve, or stop. However, humans are also subject to error. Poor visibility, lack of experience, distractions, stress, and exhaustion (to name a few) can all slow our reaction time and lead to poor driving decisions and potential accidents.
Autonomous vehicles (AVs) alleviate those issues by relying on a suite of sensors that provide exact measurements of the objects around the vehicle. The AV's perception stack ingests this data and uses technologies such as machine learning to build a digital picture of the world. This picture enables predictive calculations and near-instantaneous decision-making that lead to precise driving behaviors. To achieve full autonomy, AVs need to be equipped with data they can trust.
LiDAR sensors visualize the world around the vehicle. They detect objects near and far, creating a real-time 3D map that lets the vehicle's control center measure, perceive, and respond to objects in the inherently dynamic conditions of the road and achieve superhuman performance.
The data generated from a LiDAR sensor is critical to how the vehicle ultimately responds to its environment—slow down, speed up, change direction, or stop. Because of this, when it comes to LiDAR sensors, there is no room for error.
LiDAR sensors can provide that clarity, but achieving high fidelity without sacrificing other performance requirements has been one of the biggest hardware obstacles to high-level autonomy. While some high-performance, long-range legacy LiDAR sensors can provide reliable data, much of that data is often compromised by issues inherent in the technology's architecture. This can distort or erase objects, making it harder or impossible for perception algorithms to identify their exact positions and increasing the probability of accidents.
There are many reasons why data can be compromised, but one of the most disruptive issues experienced by legacy LiDAR sensors is interference.
External light sources, such as bright sunlight or the beams of a headlight (as well as lasers from other LiDAR sensors on the road), can interfere with a LiDAR system. This interference can confuse the sensor, sometimes creating non-existent objects that cause the vehicle to suddenly swerve or brake. In some instances, the interference can even blind the sensor.
Split seconds matter when driving. Even a brief moment of distortion can be serious.
At Baraja, we’ve designed a LiDAR system with inbuilt interference immunity, providing an accurate and precise record of the vehicle’s external environment regardless of the conditions. Our patented Spectrum-Scan™ technology filters out external light sources. And if any light makes its way through to the detectors, our ranging technique, Random Modulated Continuous Wave (RMCW), uses code and frequency matching to prevent unwanted light from being detected.
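To illustrate the code-matching idea in general terms (this is a simplified sketch, not Baraja's actual signal processing), the snippet below transmits a pseudorandom intensity code and cross-correlates the received signal against it. The sensor's own echo produces a sharp correlation peak at its true delay, while an uncorrelated code from a hypothetical interfering sensor does not:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical pseudorandom on/off code transmitted by "our" sensor
code = rng.choice([0.0, 1.0], size=512)

# Received signal: our echo delayed by 100 samples, overlapped with an
# uncorrelated code from another sensor, plus detector noise
delay = 100
received = np.zeros(1024)
received[delay:delay + code.size] += 0.5 * code
received[:512] += 0.5 * rng.choice([0.0, 1.0], size=512)  # interferer
received += 0.05 * rng.normal(size=received.size)

# Correlating against our own code: only the matching echo peaks sharply,
# so the interfering light never registers as a false return
corr = np.correlate(received - received.mean(),
                    code - code.mean(), mode="valid")
estimated_delay = int(np.argmax(corr))
print(estimated_delay)  # → 100, the true echo delay
```

The uncorrelated interferer contributes only low, random sidelobes to the correlation, which is the intuition behind rejecting other sensors' lasers by code matching.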
Another pair of problems inherent to legacy LiDAR sensors is pointcloud speckle and blur. Both degrade the data, misplacing points or dropping them from the pointcloud entirely, which can prevent the vehicle from seeing the full picture of its surrounding environment.
Speckle manifests as randomly missing data points in the pointcloud and can lead to objects going undetected, which can ultimately result in an accident.
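A back-of-envelope sketch (all numbers assumed for illustration) shows why random dropouts are dangerous: a small, distant object may be covered by only a handful of laser points, so a modest per-point miss probability translates into whole frames where the object disappears entirely.

```python
# Illustrative only: assume speckle independently drops each return
# with some probability. A small object covered by few points can
# then vanish from the pointcloud in a given frame.

dropout_rate = 0.3    # assumed per-point miss probability from speckle
points_on_object = 4  # e.g. a pedestrian at long range (assumed)

# Object is entirely missed only if every one of its points drops out
p_object_missed = dropout_rate ** points_on_object
print(f"{p_object_missed:.4f}")  # → 0.0081, i.e. ~0.8% of frames
```

Even a sub-1% per-frame miss rate is significant when the sensor runs at tens of frames per second over millions of driving hours.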
This artifact is largely a by-product of the industry's shift from direct detection to homodyne detection. Direct detection measures only the amplitude of the returning light and cannot capture the other optical properties needed to detect a Doppler shift. Homodyne detection measures all of the optical properties required for Doppler detection, so it can determine the distance and velocity of every point with a single measurement instead of relying on object tracking across multiple continuous frames and cumbersome calculations.
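The Doppler relationship behind this is straightforward: a target moving with radial velocity v shifts the returned optical frequency by f_d = 2v/λ, and a coherent (homodyne) receiver can read that shift directly. A quick sketch, assuming a 1550 nm operating wavelength (a common choice for long-range LiDAR, not a confirmed Baraja specification):

```python
# Doppler shift measured by coherent (homodyne) detection:
#   f_d = 2 * v / wavelength

wavelength = 1550e-9   # m, assumed telecom-band LiDAR wavelength
velocity = 30.0        # m/s radial velocity (~108 km/h closing speed)

doppler_shift = 2 * velocity / wavelength  # Hz
print(f"{doppler_shift / 1e6:.1f} MHz")    # → 38.7 MHz
```

A tens-of-MHz frequency shift is easily resolved electronically, which is why a single coherent measurement can yield per-point velocity without frame-to-frame tracking.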
Homodyne doesn’t come without its trade-offs, though.
Speckle is an issue inherent to homodyne detection that causes missed detections and, therefore, missed data points. The results look much like a grainy image that lacks accurate detail. At Baraja, we have eliminated the detriments of speckle from our homodyne detection method. By using a custom-integrated photonic receiver, we are able to extract the previously missed points, eliminating the effects of speckle from the pointcloud.
Blur is another common problem that compromises data integrity. Pointcloud blur often occurs in LiDAR systems that use high-speed, mechanical, mirror-based steering to constantly reposition the laser towards objects of interest. Because the mirror keeps moving during the finite flight time of the returning light, the result is a blurred pointcloud, much like a photo taken while swinging the camera from side to side.
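A rough, illustrative estimate of the effect (scan rate and range are assumed figures, not measurements of any specific sensor): while the light travels out and back, the mirror has already rotated, so the return is registered at a slightly wrong angle.

```python
# Pointing error from mirror motion during the light's round trip

C = 299_792_458            # speed of light, m/s
scan_rate_deg_s = 36_000   # assumed mirror sweep: 100 rotations/s
target_range_m = 250.0     # assumed long-range target

time_of_flight = 2 * target_range_m / C           # ~1.67 microseconds
angular_error = scan_rate_deg_s * time_of_flight  # degrees mirror moved
print(f"{angular_error * 1000:.0f} millidegrees")  # → 60 millidegrees
```

At 250 m, a 60-millidegree offset corresponds to roughly a quarter of a metre of lateral smear, enough to blur the edges of distant objects.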
Our Spectrum-Scan™ technology eliminates the potential for pointcloud blur by removing the need for mechanical components in the fast axis of scanning. Spectrum-Scan™ technology instead relies on wavelength-tunable RMCW lasers and prism-like optics to scan the environment.
The unique combination of our proprietary Spectrum-Scan™ steering technology with the RMCW ranging technique has allowed us to develop a LiDAR sensor that delivers incredible resolution and precision, has a fully solid-state fast axis, and is immune to interference from external light.
Our Spectrum-Scan™ LiDAR sensor generates data you can trust without sacrificing power, size, or cost, making it the best option on the market for powering your autonomous projects.
Talk to our sales team today to see how the Baraja Spectrum-Scan™ LiDAR system can provide data you can trust.