full_stack_AV

This project aims to build a full-stack autonomous driving system end-to-end, covering both the simulation environment and the complete autonomy pipeline. The system is developed on top of NVIDIA Isaac Sim, which serves as the physics-accurate simulation backbone, while all higher-level autonomy components are implemented as custom layers above it.

The project models a realistic urban driving world—roads, lane markings, intersections, buildings, traffic participants, pedestrians, and dynamic obstacles—and equips a simulated vehicle with a rich suite of sensors, including cameras, LiDAR, radar, ultrasonic sensors, IMU, and GPS. These sensors produce realistic, synchronized data streams that are used to drive the full autonomy stack.
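To make the idea of synchronized multi-modal streams concrete, here is a minimal sketch of how a per-tick sensor bundle could be represented. The field names, shapes, and types are illustrative assumptions, not the project's actual interfaces:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class SensorFrame:
    """One synchronized snapshot of all sensors at a single sim timestamp.

    All fields are illustrative; the real project may use different
    shapes, encodings, or message types (e.g. ROS 2 messages).
    """
    timestamp: float              # simulation time in seconds
    rgb: np.ndarray               # HxWx3 camera image
    lidar_points: np.ndarray      # Nx3 point cloud in the sensor frame
    radar_targets: np.ndarray     # Mx4: range, azimuth, radial velocity, RCS
    ultrasonic_ranges: np.ndarray # K range readings, one per sensor
    imu: np.ndarray               # 6-vector: linear accel + angular velocity
    gps: np.ndarray               # 3-vector: latitude, longitude, altitude
```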

On top of this simulation, the project implements the entire autonomous driving pipeline, including the following stages (a sketch of their interfaces follows the list):

Perception (lane detection, pedestrian and vehicle detection, semantic understanding)

Multi-sensor fusion and state estimation

Localization and mapping (SLAM and map-based localization)

Behavior planning and decision-making

Trajectory planning and motion control

Learning-based components, including deep neural networks and reinforcement learning
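The boundaries between these stages can be made explicit as module interfaces. The sketch below is a hedged illustration of one possible decomposition, reusing the hypothetical SensorFrame from the earlier snippet; none of these class or field names come from the project itself:

```python
from __future__ import annotations
from dataclasses import dataclass
from typing import Protocol
import numpy as np

@dataclass
class Detections:          # perception output (illustrative)
    lanes: np.ndarray      # detected lane boundaries
    objects: np.ndarray    # vehicles, pedestrians, obstacles

@dataclass
class VehicleState:        # fusion/localization output (illustrative)
    pose: np.ndarray       # x, y, heading
    velocity: np.ndarray   # vx, vy

@dataclass
class Trajectory:          # planning output (illustrative)
    waypoints: np.ndarray  # Nx2 positions
    speeds: np.ndarray     # N target speeds

@dataclass
class Controls:            # control output (illustrative)
    steering: float
    throttle: float
    brake: float

class Perception(Protocol):
    def detect(self, frame: SensorFrame) -> Detections: ...

class Localizer(Protocol):
    def update(self, frame: SensorFrame, det: Detections) -> VehicleState: ...

class Planner(Protocol):
    def plan(self, state: VehicleState, det: Detections) -> Trajectory: ...

class Controller(Protocol):
    def track(self, state: VehicleState, traj: Trajectory) -> Controls: ...
```

Keeping each stage behind a narrow interface like this makes it straightforward to swap a classical module for a learned one (for example, a rule-based planner for an RL policy) without touching the rest of the stack.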

The system operates in a closed loop, where raw sensor data flows through perception, mapping, planning, and control modules to generate vehicle actions that directly affect the simulated world. The simulator is not treated as a black box; instead, it is used as a controllable, extensible environment for autonomy research, data generation, training, and evaluation.
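The closed loop can be pictured as a simple stepping skeleton like the one below. The `sim` handle and its methods are placeholders standing in for the project's actual Isaac Sim bindings, not a real API:

```python
def run_episode(sim, perception, localizer, planner, controller,
                max_steps=1000):
    """One closed-loop episode: sensors -> autonomy stack -> actuation -> sim.

    `sim` is assumed to expose read_sensors(), apply_controls(), and step();
    these are illustrative stand-ins for the project's Isaac Sim interface.
    """
    for _ in range(max_steps):
        frame = sim.read_sensors()            # synchronized SensorFrame
        det = perception.detect(frame)        # lanes, objects, semantics
        state = localizer.update(frame, det)  # fused ego-state estimate
        traj = planner.plan(state, det)       # behavior + trajectory
        ctrl = controller.track(state, traj)  # steering/throttle/brake
        sim.apply_controls(ctrl)              # act on the simulated world
        sim.step()                            # advance physics one tick
```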

The ultimate goal of the project is to create a research-grade autonomous driving platform that supports experimentation with classical and learning-based methods, multi-modal sensing, risk-aware planning, and sim-to-real transfer. By building both the simulator layer and the full autonomy stack together, the project enables deep understanding and control over every component of an autonomous vehicle system.
