September-October 2018 NPJ

Virtual Reality for Education, Training and Dose Reduction

By Rizwan-uddin, University of Illinois at Urbana-Champaign.

Dr. Rizwan-uddin is professor and head of the Department of Nuclear, Plasma, and Radiological Engineering at the University of Illinois at Urbana-Champaign (UIUC). He directs the Virtual Education and Research Lab (VERL) and is also the director of the Master of Engineering in Energy Systems program. Professor Rizwan-uddin has made seminal research contributions in the development and analysis of: two-phase flow and BWR stability; advanced numerical methods for thermal hydraulics problems, computational fluid dynamics (CFD), and large-scale, high-performance computing for nuclear applications; analytical benchmarks for heat transfer problems; and new self-consistent turbulence models and associated closure laws for flow in porous media. He is a Fellow of the American Nuclear Society.

Virtual reality (VR) and augmented reality (AR) have seen increasing use across industries. They have been used for workplace training in, among others, the medical field, the aerospace field, and even professional sports. Several universities and research labs have explored the use of VR/AR for the nuclear industry. This technology has proven most useful in outreach, education, and maintenance training, specifically for dose minimization efforts.

Virtual Education and Research Lab (VERL)

The Virtual Education and Research Laboratory (VERL) in the Department of Nuclear, Plasma, and Radiological Engineering at the University of Illinois at Urbana-Champaign (UIUC) has been at the forefront of VR/AR applications in the nuclear field, publishing its first paper on the subject in 2001.
With the rapid evolution of 3D modeling and VR technology, one of the challenges over the years has been to stay abreast of the latest gadgets, and to ensure that models and tools developed earlier for older technology remain usable on newer, more advanced hardware. The bright side is that the cost impediment is rapidly going away: one can have a better VR experience today with a few-hundred-dollar VR gadget than was possible with a million-dollar CAVE a decade ago. Technology advancement in VR is primarily driven by the video-game industry. Final products for outreach or for training of NPP personnel look like a (training or educational) video game, which can be played on a laptop, tablet, or smartphone, or in a 3D immersive and interactive environment.

3D Models and Features

Development of 3D models of work spaces, essentially the first step in any VR application, has benefitted from several technology advances over the last two decades. With 3D CAD software and software specifically designed to develop video games, building realistic-looking 3D models of existing facilities has become much easier. Another common method to create 3D models of facilities is laser scanning, which produces point clouds. It is now also possible to use photogrammetry (mapping spaces by stitching 2D photos together) to create 3D objects. Photogrammetry can be used for interior spaces as well as for outdoor spaces and large buildings. Drones can be used to take overhead photographs of buildings or outdoor environments, which can then be stitched together to build the 3D model. Easy-to-use software to create realistic 3D models from pictures taken with a smartphone appears to be around the corner. Being able to create 3D spaces from smartphone pictures, and then creating interactive virtual spaces from those models, will be a very efficient way to build training models of actual facilities at NPPs.
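To make the point-cloud idea concrete, here is a minimal, engine-free JavaScript sketch (not tied to any particular scanner or photogrammetry package, which is what would normally estimate the alignment automatically): two scans taken from different positions are merged into one cloud by applying a known rigid translation. The data and the offset are illustrative assumptions, not from the article.

```javascript
// Illustrative sketch: merging two laser scans into one point cloud.
// A point cloud is simply an array of {x, y, z} points.
const scanA = [{ x: 0, y: 0, z: 0 }, { x: 1, y: 0, z: 0 }];
const scanB = [{ x: 0, y: 0, z: 0 }, { x: 0, y: 1, z: 0 }];

// Rigid translation mapping scan B's frame into scan A's frame.
// Assumed known here (e.g., from tracked scanner positions); real
// registration software estimates this transform from the data.
const offsetB = { x: 5, y: 0, z: 0 };

function translate(points, t) {
  return points.map(p => ({ x: p.x + t.x, y: p.y + t.y, z: p.z + t.z }));
}

// Combine both scans into a single cloud in scan A's frame.
const merged = scanA.concat(translate(scanB, offsetB));
console.log(merged.length); // 4 points in the combined cloud
```

The merged cloud can then be surfaced into a mesh and imported into a game engine as the basis for the virtual facility model.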
After the creation of the 3D model, the next major step toward a VR app is the development of specialized features relevant to the intended educational or training goals. These features include the ability to turn a switch on or off, turn a knob, pick up an object, turn a screwdriver, etc., in the virtual space. The virtual model should also respond to the action by, for example, turning the lights on or off. Interactive features in virtual models are implemented using scripts (user-written code embedded in the model). The two most common scripting languages for Unity3D are C# and JavaScript. Scripts are written and embedded in virtual models to allow interactivity as well as data display. Data displayed in these virtual environments are calculated using the corresponding physics models, and are displayed in various forms, including virtual LED displays, virtual computer monitors, and even virtual models of analog meters. Scripts have been written, for example, to watch an
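The switch-and-light interaction described above can be sketched in plain JavaScript. This is a simplified, engine-free sketch: in an actual Unity3D script, the toggle would be triggered by an engine event (such as a mouse click or a VR controller press) and would enable or disable a Light component in the scene; the class and property names below are illustrative, not from the article.

```javascript
// Engine-free sketch of an interactive virtual switch.
class VirtualLight {
  constructor() {
    this.enabled = false; // whether the virtual light is lit
  }
}

class VirtualSwitch {
  constructor(light) {
    this.on = false;    // current switch position
    this.light = light; // the virtual light this switch controls
  }
  // In Unity3D this would be called from an input callback.
  toggle() {
    this.on = !this.on;
    this.light.enabled = this.on; // the model responds to the action
  }
}

// Usage: flipping the switch turns the virtual light on and off.
const roomLight = new VirtualLight();
const wallSwitch = new VirtualSwitch(roomLight);
wallSwitch.toggle();
console.log(roomLight.enabled); // true
wallSwitch.toggle();
console.log(roomLight.enabled); // false
```

The same pattern extends to data display: a script polls or receives values from a physics model (dose rate, flow rate, temperature) and writes them to a virtual LED display or analog meter each frame.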
