Job Description

Key Responsibilities

  • Design, implement, and maintain core autonomy modules that integrate sensing, perception, state estimation, mapping, and planner interfaces into a cohesive real-time system.
  • Develop high-performance computer vision pipelines (classical + AI-based) for detection, segmentation, tracking, and scene understanding, ensuring reliable operation on embedded hardware.
  • Build multimodal perception systems that fuse camera, LiDAR, radar, and IMU data into accurate, navigation-ready environment representations.
  • Deploy, optimize, and maintain autonomy software on embedded platforms (Jetson AGX/Orin), including TensorRT optimization, cross-compilation, CUDA acceleration, and performance tuning for real-time execution.
  • Own sensor bring-up, configuration, calibration, and synchronization (camera, LiDAR, radar, IMU, GPS), ensuring accurate and stable data for downstream modules.
  • Ensure system-level robustness and...
Ready to Apply?

  Take the next step in your AI career. Submit your application to Larsen & Toubro today.