Vahid Taimouri and Hope Allen
CALIBRATING AV SENSORS WITH NVIDIA DRIVEWORKS SDK
2
AGENDA
Introduction to NVIDIA DRIVE and DriveWorks SDK
DriveWorks Calibration Overview
Static Calibration in the Garage
In-Vehicle Self-Calibration
3
NVIDIA DRIVE PLATFORM
Open | Scalable | Modular
[Diagram: end-to-end loop of Collect Data, Train Models, Simulate, Drive; perception of lanes, lights, path, signs, pedestrians, and cars]
4
DRIVE DEVELOPER WORKFLOW
Develop on PC | Test on Bench | Deploy to Vehicle | Repeat
[Diagram: Develop/Debug on PC -> Debug/Profile on Bench -> On-Road Testing -> Deploy and Continuously Improve]
5
NVIDIA DRIVE SOFTWARE ARCHITECTURE
Open and Modular Software for AV Development
[Diagram: software stack]
DRIVE AV: Perception | Mapping | Planning
DRIVE IX: Visualization | AI CoPilot | AI Assistant
DRIVEWORKS: Sensor Abstraction | Image/Point Cloud Processing | Vehicle IO | DNN Framework | Recorder | Calibration | Egomotion
DRIVE OS: NvMedia | NvStreams | CUDA | TensorRT | Developer Tools
Hardware: DRIVE AGX DEVELOPER KITS (Xavier/Pegasus) | DRIVE HYPERION DEVELOPER KIT
Sensors: Cameras | Radar | Lidar | IMU/GPS | CAN
6
NVIDIA DRIVEWORKS SDK
Foundation for AV SW Development on NVIDIA DRIVE
SOFTWARE MODULES: Open, modularized library of functions | Optimized for DRIVE AGX | Easy to integrate and build upon
TOOLS: Software dev tools ranging from sensor data capture to calibration and visualization
DEVELOPER SUPPORT: Sample apps | Documentation, tutorials | Active developer forum and community
PRODUCTION GRADE: Architected for automotive safety | Compliant with industry standards (ISO 26262/21448, MISRA)
7
DRIVEWORKS ARCHITECTURE GOALS
Careful architecture of the end-to-end software pipeline
Efficiently utilize the many hardware engines inside the DRIVE AGX platform
Optimize data communication formats between engines
Minimize data copies (zero copy exchange of buffers)
Implement and utilize the most efficient algorithms
Achieve the Full Throughput of NVIDIA DRIVE
8
Sensor Abstraction Layer (SAL)
Generalize Sensor Interfaces
Hardware sensor abstraction
Virtual sensors (for replay)
Raw sensor serialization (for recording)
Seeking capabilities for virtual sensors
Exploitation of hardware engines for acceleration (H264/H265 codec, VIC, ISP)
9
REAL-TIME SENSOR RECORDING & REPLAY TOOLS
Sensor output saved to system’s data storage devices, including SATA/USB 3.0
Recorder performance tuned to avoid glitches during capture and recording
Recording Post-Processing Tools provided for convenience:
Verification of configuration and integrity of each recording session
Support for fast seeking in recorded files
Recording clipping, conversion and time alignment tools
Compressed/uncompressed, synchronized and time stamped sensor data capture
Record | Synchronize | Playback | Time Stamp | Data Integrity Validation
10
CALIBRATION OVERVIEW
11
DRIVEWORKS CALIBRATION
[Diagram: DRIVE software stack, showing the DriveWorks Calibration module in context]
DRIVE AV: Perception | Mapping | Planning
DRIVE IX: Visualization | AI CoPilot | AI Assistant
DRIVEWORKS: Sensor Abstraction | Image/Point Cloud Processing | Vehicle IO | DNN Framework | Recorder | Calibration | Egomotion
DRIVE OS: NvMedia | NvStreams | CUDA | TensorRT | Developer Tools
Hardware: DRIVE AGX DEVELOPER KITS (Xavier/Pegasus) | DRIVE HYPERION DEVELOPER KIT
Sensors: Cameras | Radar | Lidar | IMU/GPS | CAN
12
MOTIVATION
Why calibration is needed:
Perception
Sensor fusion
Localization and mapping
Path planning and control
[Images: lidar-camera fusion, uncalibrated vs. calibrated; localization]
Calibration is the process of bringing the readings of multiple sensors mounted on a vehicle into a common coordinate system.
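Concretely, bringing a sensor reading into the common (rig) coordinate system applies a rigid transform, p_rig = R * p_sensor + t. A minimal C++ sketch using plain arrays (DriveWorks stores the same data in dwTransformation3f; this is only an illustration):

```cpp
#include <cassert>
#include <cmath>

// Apply a rigid sensor-to-rig extrinsic: p_rig = R * p_sensor + t.
// R is a 3x3 rotation matrix, t a translation in meters.
void sensorToRig(const float R[3][3], const float t[3],
                 const float p[3], float out[3])
{
    for (int i = 0; i < 3; ++i)
        out[i] = R[i][0] * p[0] + R[i][1] * p[1] + R[i][2] * p[2] + t[i];
}
```

For a lidar mounted 1.5 m above the rig origin with aligned axes (R = identity, t = (0, 0, 1.5)), a return at (10, -2, 0.3) in the lidar frame lands at (10, -2, 1.8) in the rig frame.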
13
CALIBRATION
Fidelity | Consistency | Real Time
[Diagram: In the garage, static calibration produces nominal parameter estimates stored in the rig file; on the road, self-calibration consumes continuous sensor outputs and feeds parameter corrections to the perception pipeline]
14
DRIVEWORKS COORDINATE SYSTEM
Front-Left-Up (FLU) coordinate system
Orientation
[Figures: vehicle reference point in ISO8855 vs. in DriveWorks]
Follows ISO8855 Standard for X-forward, Y-left, Z-up Coordinate System
15
CAMERA COORDINATE SYSTEM
Camera coordinate system: Defined as x-right, y-down, z-forward [metric]
Image coordinate system: Defined as (u,v) in [pixels]
Ray2pixel (R³ -> R²): Mapping from camera to image coordinate system
Camera model: DriveWorks supports pinhole, f-theta, ocam
F-theta: Default camera model; compute the angle θ between ray R and the optical axis:
θ = cos⁻¹(R.z / |R|)
(u,v) = (u0,v0) + f(θ) · (cos(φ), sin(φ))
where φ is the angle of the projected ray in image space to the u-axis
Camera Coordinate System is Defined as X-right, Y-down, Z-forward
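The ray2pixel mapping above can be sketched directly from the two formulas. The polynomial form and coefficients of f(θ) below are illustrative assumptions; in practice the coefficients come from the rig file's bw-poly property.

```cpp
#include <cassert>
#include <cmath>

// Radial polynomial f(theta): maps the angle to the optical axis (rad)
// to a pixel distance from the principal point. Coefficient count n and
// values are placeholders for illustration.
float fTheta(float theta, const float* c, int n)
{
    float r = 0.0f, p = 1.0f;
    for (int i = 0; i < n; ++i) { r += c[i] * p; p *= theta; }
    return r;
}

// ray2pixel for the f-theta model, following the slide:
//   theta = acos(R.z / |R|)
//   (u,v) = (u0,v0) + f(theta) * (cos(phi), sin(phi))
// phi is the angle of the projected ray to the u-axis (x-right, y-down).
void ray2pixel(const float R[3], float u0, float v0,
               const float* c, int n, float* u, float* v)
{
    const float norm  = std::sqrt(R[0]*R[0] + R[1]*R[1] + R[2]*R[2]);
    const float theta = std::acos(R[2] / norm);
    const float phi   = std::atan2(R[1], R[0]);
    const float r     = fTheta(theta, c, n);
    *u = u0 + r * std::cos(phi);
    *v = v0 + r * std::sin(phi);
}
```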
16
COORDINATE SYSTEMS FOR OTHER SENSORS
LIDAR Coordinate System
Manufacturer defined
DriveWorks SAL output is in FLU
IMU Coordinate System
DriveWorks SAL layer output is right-handed
IMU biases (gyroscope and accelerometer)
RADAR Coordinate System
DriveWorks SAL layer supports multiple radars
Only one real DoF: azimuth angle in (-90°, 90°)
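The camera's optical frame (x-right, y-down, z-forward) and the FLU rig convention (x-forward, y-left, z-up) differ only by a fixed axis remapping with sign flips. A minimal sketch (plain function, not a DriveWorks API):

```cpp
#include <cassert>

// Remap a vector from the optical camera frame (x-right, y-down,
// z-forward) to the FLU convention (x-forward, y-left, z-up).
void opticalToFLU(const float cam[3], float flu[3])
{
    flu[0] =  cam[2];  // forward = optical z (forward)
    flu[1] = -cam[0];  // left    = negated optical x (right)
    flu[2] = -cam[1];  // up      = negated optical y (down)
}
```

For example, the optical right axis (1, 0, 0) maps to (0, -1, 0) in FLU, i.e., toward the vehicle's right side.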
17
MULTIPLE SENSORS AND EGO-VEHICLE
[Two videos]
18
RIG FILE
JSON representation of properties of vehicle and enabled sensors
Different fields are listed as key/value [string] pairs
Protocol and Parameter fields are used to initialize the SAL layer
Nominal transformation (aka sensor2Rig) in FLU coordinate system
Correction is split into rotational and translational part
DriveWorks Convention for Storing Calibration Results
{
  "name": "camera:rear:center:60fov",
  "correction_rig_T": [0.0, 0.0, 0.0],
  "correction_sensor_R_FLU": {
    "roll-pitch-yaw": [3.33e-09, -0.0, 0.0]
  },
  "nominalSensor2Rig_FLU": {
    "roll-pitch-yaw": [0.0, 0.0, -180.0],
    "t": [0.461, 0.0, 1.485]
  },
  "parameter": "camera-type=ar0231-rccb-bae-sf3324,camera-group=a,camera-count=3,siblingIndex=0,output-format=raw,fifo-size=6,format=lraw",
  "properties": {
    "Model": "ftheta",
    "bw-poly": "0.0 5.46e-4 4.91e-9 -1.17e-11 3.65e-15",
    "cx": "967.592407",
    "cy": "628.175537",
    "height": "1208",
    "width": "1920"
  },
  "protocol": "camera.gmsl"
}
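To interpret the roll-pitch-yaw triples in the rig file as a rotation matrix, one common convention is R = Rz(yaw) * Ry(pitch) * Rx(roll) with angles in degrees. The composition order is an assumption here; verify it against the DriveWorks Rig documentation before relying on it.

```cpp
#include <cassert>
#include <cmath>

// Build a rotation matrix from a roll-pitch-yaw triple (degrees), using
// the (assumed) composition order R = Rz(yaw) * Ry(pitch) * Rx(roll).
void rpyToMatrix(float rollDeg, float pitchDeg, float yawDeg, float R[3][3])
{
    const float d2r = 3.14159265358979f / 180.0f;
    const float cr = std::cos(rollDeg * d2r),  sr = std::sin(rollDeg * d2r);
    const float cp = std::cos(pitchDeg * d2r), sp = std::sin(pitchDeg * d2r);
    const float cy = std::cos(yawDeg * d2r),   sy = std::sin(yawDeg * d2r);
    R[0][0] = cy*cp; R[0][1] = cy*sp*sr - sy*cr; R[0][2] = cy*sp*cr + sy*sr;
    R[1][0] = sy*cp; R[1][1] = sy*sp*sr + cy*cr; R[1][2] = sy*sp*cr - cy*sr;
    R[2][0] = -sp;   R[2][1] = cp*sr;            R[2][2] = cp*cr;
}
```

With the rear camera's nominal yaw of -180°, the resulting matrix flips the x and y axes, i.e., the sensor faces backwards.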
19
DRIVEWORKS RIG API

/*
Contains relevant information for a particular sensor,
read from the rig configuration.
*/
struct SensorInfo
{
    uint32_t sensorId;
    std::string name;
    std::string parameters;
    std::string protocol;
    dwTransformation3f nominalSensorToRig;
    dwTransformation3f nominalRigToSensor;
};

// Initialize rig configuration module
{
    dwRig_initializeFromFile(&m_rigConfig, m_context, m_rigFile.c_str());
    m_lidarInfo = readSensorInformation(m_rigConfig, "lidar:*");
    m_imuInfo   = readSensorInformation(m_rigConfig, "imu");
    m_canInfo   = readSensorInformation(m_rigConfig, "can:vehicle");

    const dwVehicle* vehicle;
    dwRig_getVehicle(&vehicle, m_rigConfig);
    m_vehicle = *vehicle;
}

// Initialize EgoMotion
{
    dwEgomotionParameters egomotionParameters{};
    egomotionParameters.imu2rig     = m_imuInfo.nominalSensorToRig;
    egomotionParameters.vehicle     = m_vehicle;
    egomotionParameters.motionModel = DW_EGOMOTION_IMU_ODOMETRY;
    dwEgomotion_initialize(&m_egomotion, &egomotionParameters, m_context);
}
20
DRIVEWORKS RIG API

static SensorInfo readSensorInformation(dwRigHandle_t rigConfig,
                                        const std::string& sensorSearchPattern)
{
    uint32_t sensorId;
    // Find the sensor ID matching the search pattern
    dwRig_findSensorByName(&sensorId, sensorSearchPattern.c_str(), rigConfig);

    SensorInfo sensorInfo;
    sensorInfo.sensorId = sensorId;

    // Read extrinsics
    dwRig_getNominalSensorToRigTransformation(&sensorInfo.nominalSensorToRig,
                                              sensorId, rigConfig);
    Mat4_IsoInv(sensorInfo.nominalRigToSensor.array,
                sensorInfo.nominalSensorToRig.array);

    // Read sensor name
    const char* sensorName;
    dwRig_getSensorName(&sensorName, sensorId, rigConfig);
    sensorInfo.name = sensorName;

    // Read parameter string
    const char* parameters;
    dwRig_getSensorParameter(&parameters, sensorId, rigConfig);
    sensorInfo.parameters = parameters;

    // Read protocol string
    const char* protocol;
    dwRig_getSensorProtocol(&protocol, sensorId, rigConfig);
    sensorInfo.protocol = protocol;

    cout << "Found sensor " << sensorInfo.name
         << " with sensor ID: " << sensorInfo.sensorId << endl;
    return sensorInfo;
}
21
CALIBRATION
Fidelity | Consistency | Real Time
In the garage, static calibration estimates the sensors' nominal parameters and stores them in the rig file.
[Diagram: static calibration -> nominal parameter estimates -> rig file; on the road, self-calibration -> parameter corrections -> perception pipeline]
22
STATIC CALIBRATION — CAMERA INTRINSIC
Camera intrinsic calibration is a mapping between camera and image coordinate systems
For production use cases, camera OEMs estimate intrinsic nominals and store them in camera EEPROM
For R&D use cases, DriveWorks provides tooling to estimate intrinsic nominals
Distortion coefficients for ocam, pinhole, and f-theta camera models
DriveWorks SDK tool: calibration-intrinsic-constraint
Internal Camera Sensor Model
/*
Specifies the supported optical camera models.
The models define the mapping between optical rays
and pixel coordinates, e.g., the intrinsic
parameters of the camera.
*/
typedef enum dwCameraModel {
    DW_CAMERA_MODEL_OCAM    = 0,
    DW_CAMERA_MODEL_PINHOLE = 1,
    DW_CAMERA_MODEL_FTHETA  = 2
} dwCameraModel;
23
STATIC CALIBRATION — CAMERA INTRINSIC
[Two videos]
24
STATIC CALIBRATION — CAMERA EXTRINSIC
Target setup around the car
Stand, wheel, ground
Extrinsic Calibration Process
Calibration graph
DriveWorks SDK tools
calibration-graph-cli
calibration-graph-to-rig
Camera Position and Orientation Parameters on the Vehicle
25
CAMERA STATIC CALIBRATION
Validation
Interpretation of validation
Intrinsic validation images: cover the whole FOV and look for large reprojection errors
DriveWorks Tool: calibration-intrinsics-validator
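"Large reprojection errors" can be made concrete as the mean pixel distance between projected and detected target points. An illustrative helper (not part of the DriveWorks API):

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>

// Mean reprojection error over matched (projected, detected) pixel pairs,
// as one would compute when judging an intrinsic calibration.
float meanReprojectionError(const float (*proj)[2], const float (*det)[2],
                            std::size_t n)
{
    float sum = 0.0f;
    for (std::size_t i = 0; i < n; ++i) {
        const float du = proj[i][0] - det[i][0];
        const float dv = proj[i][1] - det[i][1];
        sum += std::sqrt(du * du + dv * dv);
    }
    return n ? sum / static_cast<float>(n) : 0.0f;
}
```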
26
CAMERA STATIC CALIBRATION
Static calibration is carried out while the car is in a “default state”
Any deviation from default state will influence the calibration parameters
Changes in weight configuration
Changes in wheel air pressure
To compensate for these deviations, self-calibration is applied while the vehicle is driven
Why Do We Need Self-calibration?
27
CALIBRATION
Fidelity | Consistency | Real Time
On the road, self-calibration continuously corrects the parameter estimates as the vehicle operates.
[Diagram: continuous sensor outputs -> self-calibration -> parameter corrections -> perception pipeline; the rig file from static calibration provides the nominal parameter estimates]
28
Goal: correct the nominal calibration values for a particular car during the drive
Parameters: sensor extrinsics and several vehicle parameters can be self-calibrated
Constraints:
  Hand-Eye Constraints
    Camera: motion from feature tracks
    Lidar: motion from ICP
  Ground Constraints
    Camera: ground plane triangulation + fitting
    Lidar: ground plane filtering and fitting
SELF-CALIBRATION
Estimates the Sensor's Calibration Parameters While the Vehicle is Moving
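The lidar ground constraint mentioned above amounts to fitting a plane to ground-classified points. A toy least-squares fit of z = a*x + b*y + c via the 3x3 normal equations, solved with Cramer's rule (an illustration, not the DriveWorks implementation):

```cpp
#include <cassert>
#include <cstddef>

static float det3(const float M[3][3])
{
    return M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
         - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
         + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]);
}

// Least-squares fit of z = a*x + b*y + c to n points (x, y, z).
// Accumulates the normal equations (A^T A) x = A^T z with rows (x, y, 1),
// then solves the 3x3 system with Cramer's rule.
bool fitGroundPlane(const float (*pts)[3], std::size_t n,
                    float* a, float* b, float* c)
{
    float M[3][3] = {}, r[3] = {};
    for (std::size_t i = 0; i < n; ++i) {
        const float x = pts[i][0], y = pts[i][1], z = pts[i][2];
        M[0][0] += x * x; M[0][1] += x * y; M[0][2] += x;
        M[1][1] += y * y; M[1][2] += y;
        r[0] += x * z; r[1] += y * z; r[2] += z;
    }
    M[1][0] = M[0][1]; M[2][0] = M[0][2]; M[2][1] = M[1][2];
    M[2][2] = static_cast<float>(n);
    const float d = det3(M);
    if (d == 0.0f) return false;  // degenerate point set
    float Mi[3][3];
    float* coef[3] = {a, b, c};
    for (int k = 0; k < 3; ++k) {
        for (int i = 0; i < 3; ++i)
            for (int j = 0; j < 3; ++j)
                Mi[i][j] = (j == k) ? r[i] : M[i][j];
        *coef[k] = det3(Mi) / d;
    }
    return true;
}
```

A real pipeline would first filter ground returns and use a robust estimator; this sketch only shows the fitting step.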
29
EXTRINSIC CALIBRATION (DYNAMIC)
Camera Calibration View
Horizon Indicators
Accepted (⬤)
Initialization (⬤)
Nominal (⬤)
Ground ROI (⬤)
30
EXTRINSIC CALIBRATION (DYNAMIC)
Initial Calibration View in Our Own Reference Application
31
EXTRINSIC CALIBRATION (DYNAMIC)
Final Calibration View in Our Own Reference Application
32
CAMERA CALIBRATION API — SAMPLE APP
[Two videos]
33
CALIBRATION KPI
Cross Validation KPIs for each sensor type
Precision
Recall
Accuracy
Refer to the ‘Self-Calibration Tutorial’ in the DriveWorks SDK Reference Guide (https://docs.nvidia.com/drive/) for further information
Camera Cross-Validation KPI
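The three KPIs listed above follow the standard confusion-matrix definitions. A small sketch (the struct and function names are made up for illustration):

```cpp
#include <cassert>

// Precision, recall, and accuracy from confusion-matrix counts:
//   precision = TP / (TP + FP)
//   recall    = TP / (TP + FN)
//   accuracy  = (TP + TN) / total
struct Kpi { float precision, recall, accuracy; };

Kpi computeKpi(int tp, int fp, int fn, int tn)
{
    Kpi k{};
    k.precision = (tp + fp) ? static_cast<float>(tp) / (tp + fp) : 0.0f;
    k.recall    = (tp + fn) ? static_cast<float>(tp) / (tp + fn) : 0.0f;
    const int total = tp + fp + fn + tn;
    k.accuracy  = total ? static_cast<float>(tp + tn) / total : 0.0f;
    return k;
}
```

For example, 8 true positives, 2 false positives, 2 false negatives, and 88 true negatives give precision 0.8, recall 0.8, and accuracy 0.96.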
34
SUMMARY
35
SUPPORTED SENSORS
DriveWorks calibration supports:
Camera
Lidar
Radar
IMU
Vehicle Steering
Please refer to DRIVE Ecosystem Hardware and Software Components for the list of sensors supported by DriveWorks out-of-the-box
36
CALIBRATION
Fidelity | Consistency | Real Time
[Diagram: In the garage, static calibration produces nominal parameter estimates stored in the rig file; on the road, self-calibration consumes continuous sensor outputs and feeds parameter corrections to the perception pipeline]
37
GET STARTED WITH DRIVEWORKS
Summary
Visit the DRIVE training page for upcoming webinars and other resources (https://developer.nvidia.com/drive/learn)
LEARN MORE
Join the NVIDIA DRIVE™ developer program for DRIVE AGX (https://developer.nvidia.com/drive)
Read DriveWorks documentation (https://docs.nvidia.com/drive)
Download DriveWorks SDK (https://developer.nvidia.com/drive/downloads)
GET ACCESS
Use the DriveWorks sample applications as a starting point for development
Develop autonomous vehicle applications with DriveWorks SDK
DEVELOP AND TEST
Get your work featured on the NVIDIA DRIVE partner page (https://www.nvidia.com/en-us/self-driving-cars/partners/)
Browse and connect on the DRIVE AGX forum (https://forums.developer.nvidia.com/c/agx-autonomous-machines/drive-agx/)
SHARE YOUR WORK