Jesse Clark is a software engineer who has built distributed systems for a number of Silicon Valley startups. At NASA he developed databases for the International Space Station and robotic simulations for the Hubble Space Telescope.
Herse, S, Vitale, J, Ebrahimian, D, Tonkin, M, Ojha, S, Sidra, S, Johnston, B, Phillips, S, Gudi, SLKC, Clark, J, Judge, W & Williams, MA 2018, 'Bon Appetit! Robot Persuasion for Food Recommendation', ACM/IEEE International Conference on Human-Robot Interaction, pp. 125-126.
The integration of social robots within service industries requires social robots to be persuasive. We conducted a vignette experiment to investigate the persuasiveness of a human, robot, and an information kiosk when offering consumers a restaurant recommendation. We found that embodiment type significantly affects the persuasiveness of the agent, but only when using a specific recommendation sentence. These preliminary results suggest that human-like features of an agent may serve to boost persuasion in recommendation systems. However, the extent of the effect is determined by the nature of the given recommendation.
Vitale, J, Tonkin, M, Herse, S, Ojha, S, Clark, J, Williams, M, Wang, X & Judge, W 2018, 'Be More Transparent and Users Will Like You: A Robot Privacy and User Experience Design Experiment', Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, ACM, Chicago, pp. 379-387.
Tonkin, M, Vitale, J, Ojha, S, Clark, J, Pfeiffer, S, Judge, W, Wang, X & Williams, M 2017, 'Embodiment, Privacy and Social Robots: May I Remember You?', Social Robotics: 9th International Conference, ICSR 2017, Springer International Publishing, Tsukuba, Japan, pp. 506-515.
As social robots move from the laboratory into public settings, the possibility of unwanted intrusion into a user's personal privacy is magnified. Social interaction between human and robot may lead the user to anthropomorphise the robot, which in turn may prompt the user to disclose private or sensitive information. To understand these possible impacts, we conducted an exploratory study with a novel privacy measure, examining how users' privacy considerations change when interacting with an embodied robotic system versus a disembodied one. In this paper we measure the difference in the personal information users provide to each system, and discuss the idea that embodiment may increase users' risk tolerance and reduce their privacy concerns.
Raza, SA, Clark, J & Williams, M 2016, 'On Designing Socially Acceptable Reward Shaping', Social Robotics, International Conference on Social Robotics (ICSR), Springer, Kansas City, MO, USA, pp. 860-869.
For social robots, learning from an ordinary user should be socially appealing. Unfortunately, machine learning demands an enormous amount of human data, and a prolonged interactive teaching session becomes anti-social. We have addressed this problem in the context of reward shaping for reinforcement learning. For efficient reward shaping, a continuous stream of rewards is expected from the teacher. We present a simple framework which seeks rewards for a small number of steps from each of a large number of human teachers. Therefore, it simplifies the job of an individual teacher. The framework was tested with online crowd workers on a transport puzzle. We thoroughly analyzed the quality of the learned policies and crowd's teaching behavior. Our results showed that nearly perfect policies can be learned using this framework. The framework was generally acceptable in the crowd's opinion.
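The framework described in the abstract above can be sketched in miniature. This is our own toy illustration, not the authors' implementation: it runs Q-learning on a hypothetical 5-state chain task, where each episode a fresh simulated "teacher" supplies feedback for only the first few steps, and that feedback is added to the environment reward (reward shaping). All names and parameters here are invented for illustration.

```python
import random

N_STATES = 5          # states 0..4; reaching state 4 ends the episode
ACTIONS = [-1, +1]    # move left or right along the chain
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1

def teacher_feedback(state, action):
    # Hypothetical crowd teacher: approves moves toward the goal.
    # A real teacher's signal would be noisier and task-specific.
    return 1.0 if action == +1 else -1.0

def run(episodes=300, steps_per_teacher=3):
    Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
    for _ in range(episodes):
        s, t = 0, 0  # each episode, a new teacher labels a few steps
        while s != N_STATES - 1:
            if random.random() < EPS:
                a = random.choice(ACTIONS)
            else:
                a = max(ACTIONS, key=lambda x: Q[(s, x)])
            s2 = min(max(s + a, 0), N_STATES - 1)
            r = 10.0 if s2 == N_STATES - 1 else 0.0
            if t < steps_per_teacher:
                # Shaping reward from the teacher, only for a short window,
                # keeping each individual teaching session brief.
                r += teacher_feedback(s, a)
            t += 1
            # Standard Q-learning update with the shaped reward.
            Q[(s, a)] += ALPHA * (
                r + GAMMA * max(Q[(s2, x)] for x in ACTIONS) - Q[(s, a)]
            )
            s = s2
    return Q
```

After training, the greedy policy moves right in every non-terminal state; the short per-teacher windows are the point of the framework, since no single teacher has to sit through a prolonged session.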
Cloud Robotics (CR) is an emerging and successful approach to robotics. The number of robots and other IoT devices may increase drastically in the future, which could demand enormous bandwidth and raise security concerns. If robots in CR are not secured, hackers can turn them into surveillance bots. Moreover, if the internet connection is lost due to network hitches, the robot may be unavailable to complete its given task at a crucial moment. For example, a robot assisting a person could stop working unexpectedly, or act on a hacker's instructions. To address such problems, in this paper we propose a new approach to robotics, Fog Robotics (FR), so that a network of robots can be used more securely and efficiently than in CR.
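The connectivity failure mode described above can be illustrated with a toy fallback controller. This is not the paper's FR architecture; the class, method names, and plan format are invented for illustration. The idea shown is simply that a robot which prefers a nearby fog node for planning, but keeps a locally cached plan, can degrade gracefully instead of stopping when the network drops.

```python
class FogRoboticsController:
    """Toy robot controller: plan via a fog node when online,
    fall back to the last cached plan when offline."""

    def __init__(self):
        self.cached_plan = ["stop"]  # safe default stored on the robot

    def query_fog_node(self, task, network_up):
        # Stand-in for a real RPC to a nearby fog node; raises on failure.
        if not network_up:
            raise ConnectionError("fog node unreachable")
        plan = [f"{task}:step{i}" for i in range(3)]
        self.cached_plan = plan      # refresh the local cache while online
        return plan

    def plan(self, task, network_up=True):
        try:
            return self.query_fog_node(task, network_up)
        except ConnectionError:
            # Degrade gracefully: reuse the cached plan rather than
            # abandoning the task mid-way when connectivity is lost.
            return self.cached_plan
```

For example, a controller that planned `"fetch"` while online will return the same three-step plan from its cache if the link then drops, rather than halting.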