[2001.02366v1] What can robotics research learn from computer vision research?
Abstract. We have shown how simulation can be used to rigorously evaluate a standard SLAM algorithm under varying lighting and viewpoint conditions (Section ??) and to enable a new competition that advances the performance of robotic vision systems in dealing with uncertainty (Section ??).

Keywords: robotics, computer vision, research methodology
Fig. 1: Performance in ILSVRC over 2011-2016. The lowest datapoint for 2012 is AlexNet. (A short history of computer vision and robotics research)
Fig. 6: Simulated environment used to evaluate ORB-SLAM performance. (Applying the evaluation methodology to robotics research)
Fig. 7: Example images at the same point showing, from top to bottom, the effect of time of day, lateral offset, camera yaw angle and camera pitch angle. (Applying the evaluation methodology to robotics research)
Fig. 8: ORB-SLAM average trajectory error over time of day, lateral offset and camera orientation change. (Applying the evaluation methodology to robotics research)
Fig. 9: Probabilistic object detectors express semantic and spatial uncertainty. Object locations are reported as probabilistic bounding boxes whose corners are modelled as 2D Gaussians (left), used to express a spatial uncertainty over the pixels (centre). Semantic uncertainty is represented as a full label probability distribution (right). (A simulation-based robotic vision competition)
Fig. 10: An overview of the BenchBot system architecture. The client uses BenchBot's API to test their algorithm on our robots, packaging the code in a Docker file that is uploaded to our server through a website. The code is then executed on the robot and the evaluation results are returned to the client at the end of the task. (Revisiting Assertion 1)
Fig. 11: BenchBot can sit on top of ROS on real robots, or on robotic simulators that use ROS such as NVIDIA Isaac, enabling the same client code to run seamlessly on real or simulated robots. (BenchBot: access to robots for everyone)
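The probabilistic detection format described in Fig. 9 can be sketched in code. The `ProbabilisticBox` class below is illustrative only, not the competition's actual API; it also assumes axis-aligned, diagonal-covariance corner Gaussians for simplicity, whereas the figure describes full 2D Gaussians.

```python
import math
from dataclasses import dataclass


def normal_cdf(x, mean, std):
    """CDF of a 1D Gaussian, computed via the error function."""
    return 0.5 * (1.0 + math.erf((x - mean) / (std * math.sqrt(2.0))))


@dataclass
class ProbabilisticBox:
    """Illustrative probabilistic bounding box (not the competition API).

    Corners are Gaussian: (x, y) means plus per-axis standard deviations
    (a diagonal-covariance simplification of the 2D Gaussians in Fig. 9).
    Semantic uncertainty is a full distribution over class labels.
    """
    tl_mean: tuple  # top-left corner mean (x, y)
    tl_std: tuple   # top-left corner std dev per axis
    br_mean: tuple  # bottom-right corner mean (x, y)
    br_std: tuple   # bottom-right corner std dev per axis
    label_probs: dict

    def pixel_inside_prob(self, px, py):
        """Probability that pixel (px, py) lies inside the box, treating
        the corner coordinates as independent Gaussians."""
        p_left = normal_cdf(px, self.tl_mean[0], self.tl_std[0])        # TL x <= px
        p_top = normal_cdf(py, self.tl_mean[1], self.tl_std[1])         # TL y <= py
        p_right = 1.0 - normal_cdf(px, self.br_mean[0], self.br_std[0]) # px <= BR x
        p_bottom = 1.0 - normal_cdf(py, self.br_mean[1], self.br_std[1])# py <= BR y
        return p_left * p_top * p_right * p_bottom


box = ProbabilisticBox(
    tl_mean=(10.0, 10.0), tl_std=(2.0, 2.0),
    br_mean=(50.0, 40.0), br_std=(3.0, 3.0),
    label_probs={"chair": 0.7, "sofa": 0.2, "table": 0.1},
)
print(box.pixel_inside_prob(30.0, 25.0))  # near 1.0 for a pixel well inside the box
```

Under this sketch, the per-pixel spatial uncertainty map in the centre panel of Fig. 9 corresponds to evaluating `pixel_inside_prob` over the image grid, and the right panel to reading off `label_probs`.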