Goal-driven autonomous reality capture for construction monitoring applications
Ibrahim, Amir Ayman Elsayed Mahfouz Hassan
Permalink
https://hdl.handle.net/2142/115637
Description
- Title
- Goal-driven autonomous reality capture for construction monitoring applications
- Author(s)
- Ibrahim, Amir Ayman Elsayed Mahfouz Hassan
- Issue Date
- 2022-04-17
- Director of Research (if dissertation) or Advisor (if thesis)
- Golparvar-Fard, Mani
- El-Rayes, Khaled
- Doctoral Committee Chair(s)
- Golparvar-Fard, Mani
- Committee Member(s)
- Liu, Liang
- El-Gohary, Nora
- Gupta, Saurabh
- Department of Study
- Civil & Environmental Eng
- Discipline
- Civil Engineering
- Degree Granting Institution
- University of Illinois at Urbana-Champaign
- Degree Name
- Ph.D.
- Degree Level
- Dissertation
- Keyword(s)
- Reality capture; Photogrammetry; View planning; Mission planning; Visual quality; BIM; UAV; UGV; Improving reality capture; UAV-based data collection; Data collection simulation; Multi-objective path planning; Divide-and-conquer; Automatic data collection; Reinforcement learning; Path planning; Robotic navigation policy; SLAM; ROS; LiDAR; RGBD
- Abstract
- The recent growth in reality capture and BIM-enabled workflows on construction projects has created a surge in the development of computer vision solutions that automatically monitor the state of job sites. These solutions offer real-time, actionable insight into the latest state of physical progress, safety, and quality for project teams. While the performance of these methods relies heavily on the accuracy and completeness of the collected data, current techniques for reality capture planning and execution do not satisfy these requirements. Furthermore, current workflows for capturing reality data with camera-equipped drones and rovers impose additional operational constraints, such as navigation safety and battery limits. In the absence of reliable solutions, experienced operators must carefully fine-tune robotic data collection missions by hand. These manual adjustments still do not guarantee that functional and operational requirements are met, and they do not scale to the demands of operating camera-equipped robotic platforms frequently.
To address these gaps in knowledge, this dissertation presents an end-to-end solution for automatically planning and collecting goal-driven reality capture data on construction sites. The solution consists of (1) new metrics to objectively evaluate and compare reality capture plans against both technical and operational requirements; (2) a 3D environment to visualize and simulate reality capture plans against 4D BIM and existing 3D reality models and to provide interactive visual feedback on planned robotic missions; (3) comprehensive optimization methods that improve robotic missions by maximizing the resulting visual quality and minimizing the data collection duration; and (4) an autonomous navigation system that uses cameras, 2D Light Detection and Ranging (LiDAR) sensors, and the Global Positioning System (GPS) to execute reality capture missions automatically in both outdoor and indoor workspaces.
Using prior 4D BIM and 3D reality models, visual quality metrics are proposed to provide prompt feedback on the quality of a reality capture plan. Specifically, the metrics account for the visibility and resolution of the constructed elements in the collected data, the completeness of the capture, and the expected stability of image-based 3D reconstruction techniques. For reality captures conducted with camera-equipped Unmanned Aerial Vehicles (UAVs), operational requirements, including battery capacity and the operator's Line of Sight (LOS), are also carefully considered to enable safe and autonomous navigation. A novel multi-objective optimization method is presented to improve the visual quality of the collected data, reduce the data collection duration, and ensure safe UAV operation. For indoor reality capture, a reinforcement learning approach is presented to generate optimized data collection paths for camera-equipped Unmanned Ground Vehicles (UGVs). This learning model improves indoor navigation policies based on positive and negative reinforcement rewards that target technical goals, operational requirements, and navigation duration. A divide-and-conquer greedy algorithm is adopted to reduce the space and computational complexity of learning the optimal policy.
A mobile application is developed for autonomous outdoor reality capture execution, integrating GPS signals to navigate UAVs along the planned waypoints. In addition, UGVs are programmed to navigate narrow and cluttered indoor construction workspaces automatically. Because satellite signals are limited indoors, Simultaneous Localization and Mapping (SLAM) and 2D LiDAR scan-matching algorithms are used to localize the ground rover. The latter system navigates the optimized waypoints, avoids obstacles, and collects the data. (An illustrative sketch of the plan-scoring idea appears after this record.)
Results from several datasets captured on real-world construction projects demonstrate (1) the significant correlation between the proposed visual quality metrics and the resulting completeness and accuracy of the captured reality data; (2) the potential of the developed 3D environment for interactively and precisely improving reality capture missions; (3) the effectiveness of the proposed outdoor reality capture optimization methods, which reach a 7.65% improvement in visual coverage, a 30.89% enhancement in resolution, and 8.95% more stable 3D reconstruction; (4) the capability of the learned indoor navigation policies to achieve the required visual quality with only 47% of the waypoints of lawnmower patterns and in 38% less time; and (5) the reliability of the integrated navigation system in automatically and accurately collecting construction reality data, with an average localization error of 1 m outdoors and 0.5 m indoors. The benefits and limitations of these techniques for the practice of construction project monitoring are discussed in detail.
- Graduation Semester
- 2022-05
- Type of Resource
- Thesis
- Copyright and License Information
- Copyright 2022 Amir Ayman Elsayed Mahfouz Hassan Ibrahim
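To make the plan-evaluation idea in the abstract concrete, the sketch below scores a set of candidate UAV viewpoints against BIM sample points using a weighted combination of coverage, resolution, and flight duration. This is a minimal illustration under assumptions, not the dissertation's implementation: the `Viewpoint` class, the focal length, the `max_gsd` threshold, and the objective weights are all invented for the example, and visibility is approximated by distance alone without occlusion checks.

```python
# Minimal, illustrative sketch (not the dissertation's implementation) of scoring
# a candidate set of UAV viewpoints against BIM sample points. All names, weights,
# and thresholds here are assumptions made for the example.
import math
from dataclasses import dataclass


@dataclass
class Viewpoint:
    x: float                    # camera position, metres
    y: float
    z: float
    focal_px: float = 2500.0    # focal length in pixels (assumed)


def ground_sample_distance(vp: Viewpoint, pt: tuple) -> float:
    """Approximate GSD (metres/pixel) of a point seen from a viewpoint."""
    return math.dist((vp.x, vp.y, vp.z), pt) / vp.focal_px


def plan_score(viewpoints, bim_points, max_gsd=0.01,
               w_coverage=0.5, w_resolution=0.3, w_duration=0.2):
    """Weighted multi-objective score of a capture plan; higher is better.

    Coverage   : fraction of BIM points seen at an acceptable resolution
                 (visibility approximated by distance only, no occlusion test).
    Resolution : mean normalised GSD quality over the covered points.
    Duration   : penalty proportional to total path length (a battery proxy).
    """
    covered, gsd_quality = 0, []
    for pt in bim_points:
        best = min(ground_sample_distance(vp, pt) for vp in viewpoints)
        if best <= max_gsd:
            covered += 1
            gsd_quality.append(1.0 - best / max_gsd)

    coverage = covered / len(bim_points)
    resolution = sum(gsd_quality) / len(gsd_quality) if gsd_quality else 0.0
    path_len = sum(math.dist((a.x, a.y, a.z), (b.x, b.y, b.z))
                   for a, b in zip(viewpoints, viewpoints[1:]))
    duration_penalty = path_len / (path_len + 100.0)   # normalised, assumed scale

    return (w_coverage * coverage
            + w_resolution * resolution
            - w_duration * duration_penalty)


if __name__ == "__main__":
    plan = [Viewpoint(0, 0, 20), Viewpoint(10, 0, 20), Viewpoint(10, 10, 20)]
    facade = [(float(x), float(y), 0.0) for x in range(0, 12, 2) for y in range(0, 12, 2)]
    print(f"plan score: {plan_score(plan, facade):.3f}")
```

The dissertation's optimization additionally accounts for reconstruction stability, battery capacity, and operator line of sight; the weighted-sum form above only illustrates how competing capture objectives can be traded off in a single score.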
Owning Collections
Graduate Dissertations and Theses at Illinois (primary)