Sutjipto, S, Tish, D, Paul, G, Vidal Calleja, T & Schork, T 2018, 'Towards Visual Feedback Loops for Robot-Controlled Additive Manufacturing', Springer, Cham, Switzerland, pp. 85-97.
Robotic additive manufacturing methods have enabled the design and fabrication of novel forms and material systems that represent an important step forward for architectural fabrication. However, a common problem in additive manufacturing is predicting and incorporating the dynamic behavior of the material, which results from the complex confluence of forces and material properties that occur during fabrication. While there have been some approaches to verification systems, to date most robotic additive manufacturing processes lack verification to ensure deposition accuracy. Inaccuracies, or in some instances critical errors, can occur due to robot dynamics, material self-deflection, material coiling, or timing shifts in the case of multi-material prints. This paper addresses that gap by presenting an approach that uses vision-based sensing systems to assist robotic additive manufacturing processes. Using online image analysis techniques, occupancy maps can be created and updated during the fabrication process to document the actual position of the previously deposited material. This development is an intermediary step towards closed-loop robotic control systems that combine workspace sensing capabilities with decision-making algorithms to adjust toolpaths and correct for errors or inaccuracies where necessary. The occupancy grid map provides a complete representation of the print that can be analyzed to determine key aspects such as print quality, extrusion diameter, adhesion between printed parts, and intersections within the meshes. This quantitative information about system robustness can be used to influence the system's future actions. This approach will help ensure consistent print quality and sound tectonics in robotic additive manufacturing processes, improving on current techniques and extending the possibilities of robotic fabrication in architecture.
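The occupancy-map idea described above can be illustrated with a minimal sketch: a 2D grid over the print workspace whose cells are marked occupied as sensed deposition points arrive, plus a simple deviation measure against the planned toolpath. The class name, cell size, and workspace bounds are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

class OccupancyGrid:
    """Minimal 2D occupancy grid for logging deposited material positions.

    A hypothetical sketch of the occupancy-map concept; cell size,
    workspace bounds, and the sensing interface are assumptions.
    """

    def __init__(self, width_m=1.0, height_m=1.0, cell_m=0.005):
        self.cell = cell_m
        rows = int(round(height_m / cell_m))
        cols = int(round(width_m / cell_m))
        self.grid = np.zeros((rows, cols), dtype=np.uint8)

    def mark_deposit(self, x, y):
        """Mark the cell containing workspace point (x, y) as occupied."""
        r, c = int(y / self.cell), int(x / self.cell)
        if 0 <= r < self.grid.shape[0] and 0 <= c < self.grid.shape[1]:
            self.grid[r, c] = 1

    def deviation(self, planned_points):
        """Fraction of planned deposition points not observed as occupied."""
        missed = sum(1 for x, y in planned_points
                     if not self.grid[int(y / self.cell), int(x / self.cell)])
        return missed / len(planned_points)
```

In a closed-loop setting, a deviation above some tolerance for the most recent toolpath segment would trigger a correction or re-deposition pass.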
Alvarez, JK, Sutjipto, S & Kodagoda, S 2017, 'Validated ground penetrating radar simulation model for estimating rebar location in infrastructure monitoring', Proceedings of the 2017 12th IEEE Conference on Industrial Electronics and Applications, ICIEA 2017, IEEE Conference on Industrial Electronics and Applications, IEEE, Siem Reap, Cambodia, pp. 1460-1465.
© 2017 IEEE. Biogenic sulphide corrosion of reinforced concrete sewer pipes is an ongoing problem for wastewater governing bodies. Ensuring Workplace Health and Safety (WHS) is also an issue due to the harsh nature of sewer environments. As such, research into technologies that allow automatic unmanned site assessments is a major priority for wastewater managing utilities. The use of Ground Penetrating Radar (GPR) is currently being investigated for its ability to provide subsurface images. However, GPR technology has not been tested and validated in harsh sewer environments, where it is anticipated that interpretation can be hindered by a low signal-to-noise ratio. As data-driven machine learning techniques have proven to work on highly challenging data, our intention is to apply such techniques to GPR data processing. This is hindered, however, by the lack of large amounts of training data, as real experimental data are prohibitively hard to collect. Thus, the aim of this study is to validate a ground penetrating radar simulation software package, gprMax, and test its suitability for generating realistic, large data sets with which to train the aforementioned data-driven machine learning models, supplemented with actual sewer crown data. The result of the study is a validated GPR simulator, tuned and able to generate reasonably realistic data. A novel concrete analog was also developed to allow easy testing of various parameters such as rebar cover depth and rebar spacing.
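As a sketch of what the simulation-side data generation might look like, the snippet below writes a gprMax input file for a simple 2D model of rebar embedded in a concrete slab, parameterized by cover depth and bar spacing. The geometry, material values, and 1.5 GHz Ricker waveform are illustrative placeholders, not the paper's validated parameters.

```python
def write_rebar_model(path, cover_m=0.05, spacing_m=0.15, n_bars=3):
    """Write a gprMax input file: steel rebar in a concrete slab.

    Illustrative sketch only; domain size, permittivity, and antenna
    placement are assumptions for demonstration.
    """
    lines = [
        "#title: Rebar in concrete (illustrative sketch)",
        "#domain: 0.6 0.3 0.002",            # 2D model: one cell thick in z
        "#dx_dy_dz: 0.002 0.002 0.002",
        "#time_window: 6e-9",
        # Homogeneous lossy concrete; real sewer concrete would differ.
        "#material: 6 0.01 1 0 concrete",
        "#box: 0 0 0 0.6 0.25 0.002 concrete",
        "#waveform: ricker 1 1.5e9 my_pulse",
        "#hertzian_dipole: z 0.05 0.27 0 my_pulse",
        "#rx: 0.09 0.27 0",
    ]
    top = round(0.25 - cover_m, 4)           # bar depth from slab surface
    for i in range(n_bars):
        x = round(0.15 + i * spacing_m, 4)
        # Perfectly conducting cylinder of 6 mm radius as a rebar proxy.
        lines.append(f"#cylinder: {x} {top} 0 {x} {top} 0.002 0.006 pec")
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")
```

Sweeping `cover_m` and `spacing_m` over a range of values and running gprMax on each generated file is one way such a validated simulator could mass-produce labeled training data.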
Lai, Y, Sutjipto, S, Clout, M, Carmichael, M & Paul, G 2018, 'GAVRe2: Towards Data-driven Upper-Limb Rehabilitation with Adaptive-Feedback Gamification', 2018 IEEE International Conference on Robotics and Biomimetics (ROBIO), IEEE, Kuala Lumpur, Malaysia, pp. 164-169.
This paper presents Game Adaptive Virtual Reality Rehabilitation (GAVRe2), a framework to augment upper limb rehabilitation using Virtual Reality (VR) gamification and haptic robotic manipulator feedback. GAVRe2 integrates independent systems in a modular fashion, connecting patients with therapists remotely to increase patient engagement during rehabilitation.
GAVRe2 exploits VR capabilities not only to increase the productivity of therapists administering rehabilitation, but also to improve rehabilitation mobility for patients. Conventional rehabilitation requires face-to-face physical interaction in a clinical setting, which can be inconvenient for patients. The GAVRe2 approach provides an avenue for rehabilitation in a domestic setting by remotely customizing a routine for the patient. Results are then reported back to therapists for data analysis and future training regime development.
GAVRe2 is evaluated experimentally through a system that integrates a popular VR system, an RGB-D camera, and a collaborative industrial robot, with results indicating potential benefits for long-term rehabilitation and the opportunity for upper limb rehabilitation in a domestic setting.
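The adaptive-feedback element of gamified rehabilitation can be hinted at with a minimal sketch: a controller that nudges task difficulty toward a target success rate so the exercise stays challenging but achievable. The function name, target rate, dead band, and step size are hypothetical; the paper's actual adaptation logic is not reproduced here.

```python
def adapt_difficulty(level, success_rate, target=0.7, step=0.1):
    """Nudge game difficulty (0.0-1.0) toward a target success rate.

    Hypothetical sketch: raise difficulty when the patient is coping
    well, lower it when they struggle, with a small dead band around
    the target to avoid oscillation.
    """
    if success_rate > target + 0.05:
        level = min(1.0, level + step)   # coping well: make it harder
    elif success_rate < target - 0.05:
        level = max(0.0, level - step)   # struggling: make it easier
    return round(level, 2)
```

In a remote setting, the measured success rate per session would feed this kind of rule, and the resulting difficulty trace would be part of the data reported back to the therapist.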