In this blog, I will discuss how to calibrate the suite of sensors used in an L2 autonomous prototype vehicle.
Note:
- To ensure a consistent dataset from an L2 autonomous prototype, calibrate all on-board sensors before each trip.
In our autonomous vehicle prototypes, we use:
6 - Cameras
    - resolution cropped from the native 1600x900 to smaller images
    - delivered in BGR format
    - auto-exposure with a maximum exposure time of 20 ms
    - Bayer8 format for 1 byte per pixel encoding (see the decoding sketch after this list)
    - 1/1.8" CMOS sensor capturing at 12 Hz
    - positions:
        - one front center camera
        - one front side mirror camera per side
        - one rear center camera
        - one rear door centered camera per side
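Here is a minimal sketch of decoding the 1-byte-per-pixel Bayer8 stream into a BGR image with OpenCV. The file name, raw dump layout, and Bayer pattern constant are assumptions; your sensor may use a BGGR, GBRG, or GRBG mosaic instead.

```python
import cv2
import numpy as np

h, w = 900, 1600  # native resolution before cropping
# "frame.bayer8" is a hypothetical raw dump: h*w bytes, one byte per pixel.
raw = np.fromfile("frame.bayer8", dtype=np.uint8).reshape(h, w)
# The Bayer pattern is an assumption; swap the constant for BGGR/GBRG/GRBG sensors.
bgr = cv2.cvtColor(raw, cv2.COLOR_BayerRG2BGR)
cv2.imwrite("frame.png", bgr)
```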
5 - Long Range RADARs
    - operating at 77 GHz with a 13 Hz capture frequency
    - measures distance and velocity independently in one cycle
    - positions:
        - one front bumper center radar
        - one front side mirror radar per side
        - one rear door center radar per side
1 - LIDAR
    - 20 Hz capture frequency with 32 channels
    - horizontal FOV: 360 degrees
    - vertical FOV: +10 degrees to -30 degrees
    - range: 80 m to 100 m, usable to about 70 m
    - accuracy: +/- 2 cm
    - up to 1.39 million points per second
Camera Calibration - Extrinsics
- use a cube-shaped target with known ChArUco patterns on three orthogonal planes.
- compute the camera-to-LIDAR transformation matrix by aligning the planes of the target as seen by both sensors.
- chain it with the LIDAR-to-ego transformation to obtain the camera-to-ego transformation matrix (see the sketch after the note below).
Note:
- the ego frame is at the midpoint of the vehicle's rear axle.
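To make the chaining concrete, here is a minimal sketch using NumPy. The rotations and translations are placeholders rather than real calibration results, and make_transform is a hypothetical helper.

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Placeholder extrinsics: identity rotations, made-up translations in meters.
T_cam_lidar = make_transform(np.eye(3), np.array([0.1, 0.0, -0.3]))  # camera -> LIDAR
T_lidar_ego = make_transform(np.eye(3), np.array([1.5, 0.0, 1.8]))   # LIDAR -> ego

# Chaining: camera -> ego is LIDAR -> ego composed with camera -> LIDAR.
T_cam_ego = T_lidar_ego @ T_cam_lidar

# Map a homogeneous point 5 m in front of the camera into the ego frame.
p_cam = np.array([0.0, 0.0, 5.0, 1.0])
print((T_cam_ego @ p_cam)[:3])
```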
Camera Calibration - Intrinsics
- compute the camera intrinsics and distortion parameters using a calibration board with a known pattern, as sketched below.
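Here is a minimal intrinsic-calibration sketch with OpenCV, assuming a standard 9x6 chessboard and a hypothetical calib_images/ directory of captures; the board we actually use may carry a different pattern.

```python
import cv2
import glob
import numpy as np

pattern = (9, 6)  # inner corners per row and column (assumed board geometry)
# 3D coordinates of the board corners in the board frame (z = 0), in square units.
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_points, img_points, size = [], [], None
for path in glob.glob("calib_images/*.png"):  # hypothetical image directory
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    size = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# K is the 3x3 intrinsic matrix; dist holds the distortion coefficients.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, size, None, None)
print("RMS reprojection error:", rms, "\nK:\n", K)
```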
RADAR Calibration
- calibrate the yaw angle with a brute-force search that minimizes the ego-motion-compensated range rates of static objects, as sketched below.
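A minimal sketch of the brute-force search, assuming each detection of a static object provides an azimuth in the radar frame and a Doppler range rate, together with the ego speed at that instant. The function names and data layout are hypothetical.

```python
import numpy as np

def yaw_residual(yaw, azimuths, range_rates, ego_speeds):
    # For a static target, the Doppler range rate should equal the projection
    # of the ego velocity onto the line of sight: -v * cos(azimuth + yaw).
    predicted = -ego_speeds * np.cos(azimuths + yaw)
    return np.sum((range_rates - predicted) ** 2)

def calibrate_yaw(azimuths, range_rates, ego_speeds):
    # Brute-force search over yaw candidates in [-5, +5] degrees.
    candidates = np.deg2rad(np.linspace(-5.0, 5.0, 1001))
    residuals = [yaw_residual(y, azimuths, range_rates, ego_speeds)
                 for y in candidates]
    return candidates[int(np.argmin(residuals))]

# Synthetic demo with a true mounting yaw error of 1.2 degrees.
rng = np.random.default_rng(0)
az = rng.uniform(-0.5, 0.5, 500)              # radar-frame azimuths (rad)
v = rng.uniform(5.0, 15.0, 500)               # ego speeds (m/s)
rr = -v * np.cos(az + np.deg2rad(1.2)) + rng.normal(0.0, 0.05, 500)
print(np.rad2deg(calibrate_yaw(az, rr, v)))   # recovers ~1.2
```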
LIDAR Calibration
- use a laser liner to measure the location of the LIDAR relative to the ego frame, then build the LIDAR-to-ego transform from those measurements, as sketched below.
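A minimal sketch of turning the laser-liner measurements into a LIDAR-to-ego transform, assuming they yield a translation from the rear-axle midpoint and small mounting angles; all values below are placeholders.

```python
import numpy as np
from scipy.spatial.transform import Rotation

# Measured with the laser liner (placeholder values, meters / degrees).
t_lidar_ego = np.array([1.50, 0.00, 1.85])  # offset from the rear-axle midpoint
roll, pitch, yaw = 0.0, 0.0, 0.5            # mounting angles in degrees

T_lidar_ego = np.eye(4)
T_lidar_ego[:3, :3] = Rotation.from_euler(
    "xyz", [roll, pitch, yaw], degrees=True).as_matrix()
T_lidar_ego[:3, 3] = t_lidar_ego

# Map a LIDAR point 10 m ahead of the sensor into the ego frame.
p_lidar = np.array([10.0, 0.0, 0.0, 1.0])
print((T_lidar_ego @ p_lidar)[:3])
```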