Autonomous UAV for Object Correspondence in Digital Twin Terrains
Links: GitHub Repo | Final Report | Demo Video

This project (RAS 598: Space Robotics and AI) develops an integrated UAV framework for detecting and quantifying rock displacements in a digital twin terrain.
It combines PX4 flight control, ROS 2 middleware, Gazebo simulation, and YOLOv8 object detection in a single workflow.
The UAV autonomously flies over a photogrammetry-generated terrain, detects rocks in real time, estimates 3D world coordinates, and compares positions across terrains to measure displacement.
Methodology
- Terrain Reconstruction
  - 150+ overlapping photos processed in Meshroom (SfM + MVS).
  - Cleaned in Blender, exported as textured meshes.
  - Rocks modeled individually for detection.

- Simulation Setup in Gazebo
  - Terrain + rocks imported as meshes in a custom SDF world.
  - UAV platform: PX4 x500_depth drone.
  - ROS–Gazebo bridge exposed RGB, depth, and odometry topics (see the launch sketch below).

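The ROS–Gazebo bridge can be brought up with a ROS 2 Python launch file. The sketch below is a minimal illustration; the topic names, Gazebo message types, and odometry path are assumptions about the SDF/PX4 configuration, not taken from the project sources.

```python
# bridge.launch.py -- hypothetical launch file; topic names are placeholders
# and must match the sensors defined in the SDF world and the PX4 model.
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    return LaunchDescription([
        Node(
            package='ros_gz_bridge',
            executable='parameter_bridge',
            arguments=[
                # RGB camera image (Gazebo -> ROS 2)
                '/camera@sensor_msgs/msg/Image@gz.msgs.Image',
                # Depth image used for 3D localization
                '/depth_camera@sensor_msgs/msg/Image@gz.msgs.Image',
                # Camera intrinsics, needed for the pinhole back-projection
                '/camera_info@sensor_msgs/msg/CameraInfo@gz.msgs.CameraInfo',
                # Vehicle odometry, consumed by the TF broadcaster
                '/model/x500_depth/odometry@nav_msgs/msg/Odometry@gz.msgs.Odometry',
            ],
            output='screen',
        ),
    ])
```
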
- Data Collection & YOLOv8 Training
  - RGB images captured via teleoperation and annotated in Roboflow.
  - YOLOv8n fine-tuned for 4 rock classes (training sketch below).
  - Achieved mAP@0.5 ≈ 0.995 with near-perfect classification.


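Fine-tuning follows the standard Ultralytics workflow. The sketch below assumes a Roboflow-exported dataset description file named `rocks.yaml` and uses illustrative training settings rather than the project's exact values.

```python
# Hypothetical fine-tuning script for the 4-class rock detector.
from ultralytics import YOLO

# Start from the pretrained YOLOv8n checkpoint and fine-tune on the
# Roboflow-exported dataset (paths and class names live in rocks.yaml).
model = YOLO("yolov8n.pt")
model.train(data="rocks.yaml", epochs=100, imgsz=640)

# Evaluate on the validation split (reports precision, recall, mAP@0.5).
metrics = model.val()
print(metrics.box.map50)
```
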
- Real-Time Detection & Localization
  - Rock centers back-projected to 3D via the pinhole camera model using depth data.
  - Coordinates transformed into the world frame with TF2 (world → base_link → camera_link tree); see the sketch below.
  - Results visualized in RViz2.

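The localization step reduces to a pinhole back-projection followed by a TF2 lookup. The helper below is a sketch, not the project's `coordinates.py`; it assumes a `camera_link` source frame, a `world` fixed frame, and depth values already in metres.

```python
# Hypothetical helper called from the detection callback.
from geometry_msgs.msg import PointStamped
from tf2_geometry_msgs import do_transform_point


def pixel_to_world(u, v, depth_m, camera_info, tf_buffer, stamp):
    """Back-project a detected rock center (u, v) to world coordinates."""
    fx, fy = camera_info.k[0], camera_info.k[4]  # focal lengths (px)
    cx, cy = camera_info.k[2], camera_info.k[5]  # principal point (px)

    # Pinhole model: recover the 3D point in the camera frame from the
    # pixel coordinates and the depth reading at that pixel.
    p = PointStamped()
    p.header.frame_id = 'camera_link'
    p.header.stamp = stamp
    p.point.x = (u - cx) * depth_m / fx
    p.point.y = (v - cy) * depth_m / fy
    p.point.z = depth_m

    # Resolve the camera_link -> base_link -> world chain in one TF2 lookup.
    t = tf_buffer.lookup_transform('world', 'camera_link', stamp)
    return do_transform_point(p, t)
```
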
- Displacement Detection
  - A second world with displaced rocks validated correspondence detection.
  - The UAV detected 0.2–0.4 m displacements with ±0.2 m accuracy (correspondence sketch below).
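
A minimal sketch of the correspondence check, assuming each rock class maps to a single instance (as in the 4-class setup) and reusing the reported ±0.2 m localization accuracy as the displacement threshold; positions and names are illustrative.

```python
# Hypothetical correspondence check between the baseline and displaced worlds.
import numpy as np

DISPLACEMENT_THRESHOLD_M = 0.2  # matches the reported localization accuracy


def flag_displacements(baseline, survey):
    """Match rocks by class across two runs and report those that moved.

    baseline / survey: dict mapping rock class -> (x, y, z) world coordinates.
    """
    moved = {}
    for rock_class, old_pos in baseline.items():
        if rock_class not in survey:
            continue  # rock not re-detected in the second world
        shift = np.linalg.norm(np.array(survey[rock_class]) - np.array(old_pos))
        if shift > DISPLACEMENT_THRESHOLD_M:
            moved[rock_class] = shift
    return moved


# Example: rock_2 displaced by ~0.32 m between the two worlds.
baseline = {'rock_1': (2.00, 1.00, 0.10), 'rock_2': (4.50, -3.00, 0.20)}
survey = {'rock_1': (2.05, 1.02, 0.10), 'rock_2': (4.80, -3.10, 0.20)}
print(flag_displacements(baseline, survey))  # {'rock_2': 0.316...}
```
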
Results
- YOLOv8: 0.99+ precision, recall, and F1 across thresholds.
- Localization: ±0.2 m deviation in world-frame coordinates.
- Change Detection: Correctly flagged displaced rocks across scenes.
- Inference Speed: 15–20 ms per image → real-time capable.
System Pipeline
- ROS 2 Nodes:
  - control.py → UAV teleoperation & arming.
  - yolo.py → real-time inference on the RGB feed.
  - coordinates.py → depth-based 3D estimation + TF transforms.
  - tf_broadcaster → PX4 odometry-to-world transforms (sketch below).
  - ros_gz_bridge → ROS–Gazebo communication.
- Workflow: detect → localize → transform → visualize → compare.
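
For illustration, a minimal `tf_broadcaster` sketch is shown below; the odometry topic and frame names are assumptions, and the project's actual node may differ.

```python
# Hypothetical tf_broadcaster: republishes the bridged odometry as the
# world -> base_link transform that the localization node relies on.
import rclpy
from rclpy.node import Node
from nav_msgs.msg import Odometry
from geometry_msgs.msg import TransformStamped
from tf2_ros import TransformBroadcaster


class OdomToTf(Node):
    def __init__(self):
        super().__init__('tf_broadcaster')
        self.broadcaster = TransformBroadcaster(self)
        self.create_subscription(
            Odometry, '/model/x500_depth/odometry', self.on_odom, 10)

    def on_odom(self, msg):
        # Copy the odometry pose into a world -> base_link transform.
        t = TransformStamped()
        t.header.stamp = msg.header.stamp
        t.header.frame_id = 'world'
        t.child_frame_id = 'base_link'
        t.transform.translation.x = msg.pose.pose.position.x
        t.transform.translation.y = msg.pose.pose.position.y
        t.transform.translation.z = msg.pose.pose.position.z
        t.transform.rotation = msg.pose.pose.orientation
        self.broadcaster.sendTransform(t)


def main():
    rclpy.init()
    rclpy.spin(OdomToTf())


if __name__ == '__main__':
    main()
```
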
Conclusion
This UAV project demonstrates a complete simulation-driven framework for autonomous perception and environmental change detection.
It is directly extensible to:
- Planetary exploration
- Disaster response
- Archaeological site monitoring
- Infrastructure inspection
📄 Full Report: Download PDF
🔗 GitHub Repository: RAS598 Project