Towards generalizable motion planning: Efficient and safe learning-based frameworks

dc.contributor.author: Ghafarian Tamizi, Mehran
dc.contributor.supervisor: Najjaran, Homayoun
dc.date.accessioned: 2025-12-04T23:02:34Z
dc.date.available: 2025-12-04T23:02:34Z
dc.date.issued: 2025
dc.degree.department: Department of Electrical and Computer Engineering
dc.degree.level: Doctor of Philosophy (PhD)
dc.description.abstract: Robotic motion planning remains a fundamental challenge in industrial automation, with manipulators offering a clear example of the need for real-time, collision-free, and safe trajectory generation. Traditional planners often face trade-offs among optimality, adaptability, and computational efficiency, limiting their applicability in cluttered, high-dimensional industrial environments. Furthermore, most learning-based planners generalize poorly, requiring retraining when deployed in new scenes or on different robot platforms. This thesis presents two learning-based frameworks designed to address these challenges. First, it introduces the Path Planning and Collision Checking Network (PPCNet), an end-to-end neural architecture that combines a waypoint generator with a learned collision checker to enable fast, safe, and reliable planning in structured environments. PPCNet is validated in both simulated and real-world bin-picking tasks, demonstrating substantial speedups over classical planners while maintaining path quality. To overcome PPCNet's generalization limitations, the thesis proposes Generalizable and Adaptive Diffusion-Guided Environment-aware Trajectory generation (GADGET), a conditional diffusion-based motion planner guided by control barrier functions. GADGET leverages voxel-based scene encoding and goal conditioning to generate safe trajectories across previously unseen environments and robotic arms without retraining, and its barrier-function-based guidance enables robust collision avoidance during trajectory generation. Extensive experiments demonstrate that both frameworks achieve real-time planning performance and high success rates, with GADGET offering strong generalization to novel settings. This work highlights the potential of combining deep generative models with adaptable design to create scalable, broadly generalizable motion planners capable of transferring across diverse environments and robot platforms with minimal modification.
dc.description.scholarlevel: Graduate
dc.identifier.bibliographicCitation: Mehran Ghafarian Tamizi, Marjan Yaghoubi, and Homayoun Najjaran. "A review of recent trend in motion planning of industrial robots." International Journal of Intelligent Robotics and Applications 7, no. 2 (2023): 253-274.
dc.identifier.bibliographicCitation: Mehran Ghafarian Tamizi, Homayoun Honari, Aleksey Nozdryn-Plotnicki, and Homayoun Najjaran. "End-to-end deep learning-based framework for path planning and collision checking: bin-picking application." Robotica 42, no. 4 (2024): 1094-1112.
dc.identifier.bibliographicCitation: Mehran Ghafarian Tamizi, Homayoun Honari, Amir Mehdi Soufi Enayati, Aleksey Nozdryn-Plotnicki, and Homayoun Najjaran. "A Cross-Environment and Cross-Embodiment Path Planning Framework via a Conditional Diffusion Model." arXiv preprint arXiv:2510.19128 (2025).
dc.identifier.uri: https://hdl.handle.net/1828/22945
dc.language: English
dc.language.iso: en
dc.rights: Available to the World Wide Web
dc.subject: Motion Planning
dc.subject: Path Planning
dc.subject: Diffusion Models
dc.subject: Zero-shot Generalization
dc.subject: Collision Avoidance
dc.title: Towards generalizable motion planning: Efficient and safe learning-based frameworks
dc.type: Thesis

Files

Original bundle
Name: Ghafarian_Tamizi_Mehran_PhD_2025.pdf
Size: 3.78 MB
Format: Adobe Portable Document Format
License bundle
Name: license.txt
Size: 1.62 KB
Format: Item-specific license agreed upon to submission