Georeferenced Visualization of Dynamic Construction
Processes in Three-Dimensional Outdoor Augmented Reality
Amir H. Behzadan
A dissertation submitted in partial fulfillment
of the requirements for the degree of
Doctor of Philosophy
(Civil Engineering)
in The University of Michigan
2008
Dr. Vineet R. Kamat, Chair
Dr. Photios G. Ioannou
Dr. John G. Everett
Dr. Klaus-Peter Beier

© Amir H. Behzadan 2008
All Rights Reserved

To my lovely family,
without whose help and support
I could not have made my way to my every success.
To Sara,
who held my hand all along the way,
who never left me alone.
Acknowledgments

This document is my final Ph.D. dissertation. It describes my graduate research, which began in September 2004 and concluded in April 2008.
During this period, I had the privilege of working with Professor Vineet R. Kamat, who served as my research and graduate advisor in the Department of Civil and Environmental Engineering at the University of Michigan. Throughout, he was extremely diligent and unfailingly helpful, and I would like to take this opportunity to thank him especially for his sincere and resolute support. He has been an excellent teacher and an incredible advisor, and I truly appreciate his every effort in helping me achieve the objectives of the presented work. I would also like to thank the other members of my Ph.D. committee, Professors Photios Ioannou, Klaus-Peter Beier, and John Everett, for their contributions to this work and for providing invaluable advice, deep insight, and suggestions that improved its quality. I am also grateful to Professors Sherif El-Tawil and Jerome Lynch at the University of Michigan, Professor Kincho Law at Stanford University (Stanford, CA), and Mr. Parham Khoshkbari at Kleinfelder Construction Company (San Jose, CA) for their confidence in and support of my research. Their insight and advice improved the quality of this research and provided the boost that helped me accomplish the work in a timely manner.
I would also like to thank my parents, Mohammad Ali Behzadan and Mehri Daliri Farahani, my brother Afshin, and my sister Nazanin, who have always been a major source of motivation and support in every minute of my life. My special thanks go to Sara Jabbarizadeh, my wonderful friend, who was a constant inspiration and emotional support during my Ph.D. years and greatly encouraged me to accomplish my goals.
I am very thankful to all my friends, especially Hiam Khoury, Mustafa Saadi, Rita Awwad, and Chachrist Srisuwanrat for their sincere friendship and help during the past years.
Amir H. Behzadan
Figure 2.1 – Previous Work in Application of AR Animation in Construction 20
Figure 2.2 – Relation between the DES, CAD and GPS Data, and AR Animation 24
Figure 2.3 – Profile of the User with Mobile Backpack and Registration Devices 25
Figure 2.4 – ARVISCOPE Animation Trace File Interpretation Cycle 26
Figure 2.5 – Creating an Augmented Scene Using ARVISCOPE Statements 29
Figure 2.6 – ACD for an Earthmoving Operation in STROBOSCOPE 30
Figure 2.7 – Automated Generation of an ARVISCOPE Animation Trace File 31
Figure 2.8 – Main Processing Loop of an Animation Trace File in ARVISCOPE 32
Figure 2.9 – Portion of a Sample ARVISCOPE Animation Trace File 33
Figure 2.10 – Switching from Global to User’s Local Coordinate Frame 35
Figure 2.11 – Sample Scenarios Involving Positional Measurement Problems 37
Figure 2.12 – Calculation of a Global Point’s Coordinates in ARVISCOPE 39
Figure 2.13 – Definition of Reference, Dummy, and Target Points in Planar View 40
Figure 2.14 – Different Syntax of the POSITION and ROUTE Statements 41
Figure 2.15 – Defining a Route Using Relative Coordinate Values 41
Figure 2.16 – Relation between Coordinate Frames in an AR Scene Hierarchy 42
Figure 2.17 – Relation between Coordinate Frames of Different CAD Objects 43
Figure 2.18 – Transformation Chain between the Lowest and Highest Nodes 44
Figure 2.19 – Calculating the Position of a Newly Disassembled Child Node 45
Figure 2.20 – Designed Transformation Chain Algorithm 46
Figure 6.1 – Aerial View of the Offshore Concrete Delivery Operation Experiment 156
Figure 6.2 – Timeline of the Offshore Concrete Delivery Operation Experiment 157
Figure 6.3 – Animated Offshore Concrete Delivery in ARVISCOPE and VITASCOPE 157
Figure 6.4 – Aerial View of the Earthmoving Operation Experiment 159
Figure 6.5 – Timeline of the Earthmoving Operation Experiment 160
Figure 6.6 – Earthmoving Operation with Continuous Change in User’s Global Position 160
Figure 6.7 – Earthmoving Operation with Change in User’s Head Orientation 161
Figure 6.8 – Animated Structural Steel Erection in ARVISCOPE and VITASCOPE 163
Figure 6.9 – Correcting Occlusion between a Virtual Truck and a Real Brick Wall 165
Figure 6.10 – Correcting Occlusion between a Virtual Excavator and a Real Structure 165
Figure 6.11 – Correcting Occlusion between a Virtual Dozer and a Real Tower Crane 166
Figure 6.12 – Correcting Occlusion between a Virtual Forklift and a Real Container 166
Figure C.18 – Contents of the STROBOSCOPE Simulation Input File (Part 2) 216
Figure C.19 – Contents of the STROBOSCOPE Simulation Input File (Part 3) 217
Figure C.20 – Contents of the STROBOSCOPE Simulation Input File (Part 4) 218
Figure C.21 – Contents of the STROBOSCOPE Simulation Input File (Part 5) 218
Figure C.22 – Instrumenting the STROBOSCOPE Simulation Input File (Part 1) 219
Figure C.23 – Instrumenting the STROBOSCOPE Simulation Input File (Part 2) 220
Figure C.24 – Instrumenting the STROBOSCOPE Simulation Input File (Part 3) 221
Figure C.25 – Lines Added to the ARVISCOPE Animation Trace File for DigSoil 221
Figure C.26 – Instrumenting the STROBOSCOPE Simulation Input File (Part 4) 221
Figure C.27 – Lines Added to the ARVISCOPE Animation Trace File for LiftBoom 221
Figure C.28 – Instrumenting the STROBOSCOPE Simulation Input File (Part 5) 222
Figure C.29 – Instrumenting the STROBOSCOPE Simulation Input File (Part 6) 223
Figure C.30 – Larger Portion of the ARVISCOPE Animation Trace File 224
Figure D.19 – Processing of SIDEORIENT Statement 245
Figure D.20 – Processing of ORIENT Statement 246
Figure D.21 – Processing of ORIENTTO Statement (Part 1) 247
Figure D.22 – Processing of ORIENTTO Statement (Part 2) 248
Figure D.23 – Processing of TRAVEL Statement (Part 1) 249
Figure D.24 – Processing of TRAVEL Statement (Part 2) 250
Figure D.25 – Processing of TRANSFER Statement (Part 1) 251
Figure D.26 – Processing of TRANSFER Statement (Part 2) 252
Figure D.27 – Processing of SHIFT Statement 253
Figure D.28 – Processing of SHIFTTO Statement 254
Figure D.29 – Processing of SIZE Statement 255
Figure D.30 – Processing of SIZETO Statement 256
Figure D.31 – Pseudo Code for the Update Process (Part 1) 257
Figure D.32 – Pseudo Code for the Update Process (Part 2) 258
Figure D.33 – Pseudo Code for the Update Process (Part 3) 259
Figure D.34 – Pseudo Code for the Update Process (Part 4) 260
Figure D.35 – Pseudo Code for the Update Process (Part 5) 261
Table 5.1 – Mechanisms for Handling Occlusion in Different Display Systems 127
Table 5.2 – Effect of Depth Representative Point Selection on Occlusion 135
Table 5.3 – Manufacturer’s Properties of Different Flash LADAR Devices 141
Appendix A – ARVISCOPE Language Statements 177
Appendix B – Defining a Global Reference Point
Appendix C – Guide to Create an ARVISCOPE Animation
Appendix D – Flowchart and Pseudo Code
Appendix E – Biography
2D: Two Dimensional
3D: Three Dimensional
3DOF: Three Degree of Freedom
6DOF: Six Degree of Freedom
4D: Four Dimensional
ACA: American Chiropractic Association
ACD: Activity Cycle Diagram
API: Application Programming Interface
AR: Augmented Reality
ARCHEOGUIDE: Augmented Reality Based Cultural Heritage On-Site Guide
ARION: Augmented Reality for Intra-Operative Navigation
ARVISCOPE: Augmented Reality Visualization of Simulated Construction Operations
ASCII: American Standard Code for Information Interchange
BARS: Battlefield Augmented Reality System
CAD: Computer-Aided Design
CAVE: Computer-Aided Virtual Environment
COTS: Commercial-off-the-Shelf
CPU: Central Processing Unit
CRC: Cyclic Redundancy Check
CT: Computed Tomography
DES: Discrete Event Simulation
DGPS: Differential Global Positioning System
DLL: Dynamic Link Library
EOL: End of Line
GPS: Global Positioning System
HFOV: Horizontal Field of View
HMD: Head Mounted Display
HTML: Hyper Text Markup Language
IT: Information Technology
LADAR: Laser Detection and Ranging
LO: Local Origin
MRI: Magnetic Resonance Imaging
MSL: Mean Sea Level
MTM: Modified Transformation Method
NMEA: National Marine Electronics Association
OOD: Object Oriented Design
P&P: Plug and Play
RGB: Red-Green-Blue
RGP: Rear Guide Point
RTK: Real Time Kinematics
SVGA: Super Video Graphics Array
STROBOSCOPE: State and Resource Based Simulation of Construction Processes
URL: Uniform (Universal) Resource Locator
USB: Universal Serial Bus
VFOV: Vertical Field of View
VGA: Video Graphics Array
VITASCOPE: Extensible and Scalable 3D Visualization of Simulated Construction Operations
VR: Virtual Reality
VRML: Virtual Reality Modeling Language
WAAS: Wide Area Augmentation System
X3D: Extensible 3D
XML: Extensible Markup Language
Construction processes can be conceived as systems of discrete, interdependent activities.
Discrete Event Simulation (DES) has thus evolved as an effective tool to model operations that compete over available resources (personnel, material, and equipment). A DES model has to be verified and validated to ensure that it reflects the modeler’s intentions and faithfully represents the real operation. 3D visualization is an effective means of achieving this and of facilitating the communication and accreditation of simulation results.
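To make the DES concept concrete, the sketch below is a minimal, illustrative discrete-event simulation of trucks competing over a single loader, as in a simple earthmoving cycle. It is not the dissertation’s STROBOSCOPE model; the event names and all durations are assumptions chosen for the example.

```python
import heapq

LOAD_TIME = 5.0    # minutes to load one truck (assumed)
HAUL_TIME = 12.0   # minutes to haul, dump, and return (assumed)

def simulate(num_trucks, end_time):
    """Return the number of completed loading cycles by end_time."""
    events = []  # priority queue of (time, sequence, truck_id, event_type)
    seq = 0
    for truck in range(num_trucks):
        heapq.heappush(events, (0.0, seq, truck, "arrive"))
        seq += 1
    loader_free_at = 0.0  # the single shared resource
    cycles = 0
    while events:
        time, _, truck, kind = heapq.heappop(events)
        if time > end_time:
            break
        if kind == "arrive":
            # A truck waits if the loader is busy: resource contention.
            start = max(time, loader_free_at)
            loader_free_at = start + LOAD_TIME
            heapq.heappush(events, (loader_free_at, seq, truck, "loaded"))
        else:
            # "loaded": the truck hauls, dumps, and returns to the loader.
            cycles += 1
            heapq.heappush(events, (time + HAUL_TIME, seq, truck, "arrive"))
        seq += 1
    return cycles
```

Running `simulate` with different fleet sizes shows the kind of resource-utilization question a DES model answers: adding a second truck raises throughput, but less than double, because both trucks queue for the same loader.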
Visualization of simulated operations has traditionally been achieved in Virtual Reality (VR). Creating convincing VR animations requires detailed information about an operation and its environment: the data must describe the simulated processes and provide 3D CAD models of project resources, the facility under construction, and the surrounding terrain (Model Engineering). As the size and complexity of an operation increase, such data collection becomes arduous, impractical, and often impossible, which translates directly into a loss of financial and human resources that could otherwise be used productively. To remedy this situation, this dissertation proposes an alternative approach: visualizing simulated operations in Augmented Reality (AR) to create mixed views of real, existing jobsite facilities and virtual CAD models of construction resources. Applying AR to the animation of simulated operations has significant potential to reduce the aforementioned Model Engineering and data collection effort, while at the same time helping to create visually convincing output that can be communicated effectively.
This dissertation presents the design, methodology, and development of ARVISCOPE, a general purpose AR animation authoring language, and ROVER, a mobile computing hardware framework. When used together, ARVISCOPE and ROVER can create three-dimensional AR animations of any length and complexity from the results of running DES models of engineering operations. ARVISCOPE takes advantage of advanced Global Positioning System (GPS) and orientation tracking technologies to accurately track a user’s spatial context, and georeferences superimposed 3D graphics in an augmented environment. In achieving the research objectives, major technical challenges such as accurate registration, automated occlusion handling, and dynamic scene construction and manipulation have been successfully identified and addressed.
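The georeferencing described above ultimately reduces to converting GPS readings into a local frame around the user. The following is an illustrative sketch, not ARVISCOPE’s actual implementation, of one common way to do this: converting a virtual object’s geodetic position (latitude and longitude, in degrees) into east/north offsets in meters relative to the user, so the object can be placed in the user’s local coordinate frame. The spherical-Earth radius and the small-area (equirectangular) approximation are assumptions adequate only over the short distances typical of a single jobsite.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius, spherical approximation

def geodetic_to_local(user_lat, user_lon, obj_lat, obj_lon):
    """Return (east, north) offsets in meters of the object from the user.

    Uses an equirectangular approximation: latitude differences map
    directly to northing, and longitude differences are scaled by the
    cosine of the user's latitude before mapping to easting.
    """
    d_lat = math.radians(obj_lat - user_lat)
    d_lon = math.radians(obj_lon - user_lon)
    north = EARTH_RADIUS_M * d_lat
    east = EARTH_RADIUS_M * d_lon * math.cos(math.radians(user_lat))
    return east, north
```

For example, at roughly the latitude of Ann Arbor, a 0.001° difference in latitude corresponds to about 111 m of northing, while the same difference in longitude yields only about 83 m of easting because of the latitude scaling.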