Leading research. The University of Sheffield is ranked number 1 for EPSRC research income in robotics (in 2015 and 2016, the University of Sheffield received more EPSRC funding for robotics research than any other university).
Growth. There are currently approximately 70 academic and research leads at the University of Sheffield, and about 40 members at Sheffield Hallam University.
Field Robotics. Launch of the Sheffield Robotics Field Robotics Facilities – a large indoor test and development facility complemented by a 12-acre rural site for larger-scale aerial and ground robotics.
Growth. The membership of Sheffield Robotics grew to include 65 academic members, split across the two universities.
Sheffield Robotics. Relaunched with the name Sheffield Robotics to reflect the expansion of the organisation to include a wide range of centres and groups in Sheffield conducting research in the area of Robotics and Autonomous Systems.
Northern Robotics Network. The Northern Robotics Network, which is linked to the Innovate UK Robotics and Autonomous Systems Special Interest Group, was established together with groups at the Universities of Liverpool, Leeds, Manchester, Salford and York.
Festival of the Mind. For the 2014 Sheffield Festival of the Mind, the Sheffield Centre for Robotics organised a week-long series of events on the theme “New Age of Robotics”. The robot-themed events were widely reported, including articles in the Telegraph and El País.
New Laboratory, the “Robot Foundry”. TUOS established a new 300 m² laboratory in the newly constructed Pam Liversidge Building, representing a University investment in robotics of ~£3M.
Capital Investment. Together with the University of Liverpool, the Sheffield Centre for Robotics won a £1M investment from the EPSRC Capital for Great Technologies Call. The capital funds have been used to purchase equipment for research in collective robotics, flexible automation, human-robot interaction, cognitive robotics, assistive robotics and field robotics.
The Tactile Helmet. A sensory augmentation device developed to assist fire-fighters moving through smoke-filled buildings was piloted with South Yorkshire Fire and Rescue and exhibited at the Gadget Show Live in 2013, and at the “Living Machines 2013” exhibition at the London Science Museum. Development of the helmet was supported by an EPSRC/TUOS proof-of-concept grant. A field test with the helmet was featured by Discovery Channel Canada.
The G2 BIOTACT Sensor. The TUOS Adaptive Behaviour Research Group and the Bristol Robotics Lab developed a version of their modular artificial vibrissal system that could be attached to a robot arm and that included additional degrees of freedom for whisker positioning. The G2 Sensor was used to validate a model of tactile attention derived from studies of rodent orienting and whisking behaviour, and was demonstrated at the “Living Machines 2013” exhibition at the London Science Museum. The sensor was developed as part of the EU FP7 BIOTACT Project, which was led by TUOS. Artificial vibrissal sensors continue to be developed, in partnership with BRL, for applications such as fault detection, underwater sensing, and structural health monitoring.
Shrewbot. The third-generation whiskered mobile robot was designed to emulate the prey-catching behaviour of the smallest terrestrial mammal, the Etruscan shrew. Collaborating with biologists at the Bernstein Centre for Computational Neuroscience in Berlin, researchers at the TUOS Adaptive Behaviour Research Group and the Bristol Robotics Lab developed the most complete brain-based robot control system to date, including algorithms for tactile simultaneous localisation and mapping (tactile SLAM). Shrewbot was demonstrated at the FET (Future and Emerging Technologies) 2011 exhibition in Budapest and at the 2012 “Living Machines” exhibition in Barcelona, and was featured on the BBC current affairs programme The One Show. Shrewbot was developed as part of the EU FP7 BIOTACT Project, which was led by TUOS.
REINS. The REINS project, funded by an Engineering and Physical Sciences Research Council grant at the Centre for Automation and Robotics Research, develops a semi-autonomous mobile robot with sensory capabilities that can be shared with humans. It focuses on haptic and tactile human-robot cooperation; the main aim is to design and investigate haptic communication interfaces (reins) between a human agent and a mobile robot guide.
Sheffield Centre for Robotics (SCentRo). SCentRo was created to bring together and promote research activities in advanced robotics across Sheffield's two universities. Membership of the centre was initially drawn from research groups in Automatic Control and Systems Engineering, Computer Science, and Psychology at TUOS, and from the Centre for Automation and Robotics Research and the Art & Design Research Centre at SHU.
TAROS 2011. Conference organisation.
26th Nov 2010: Sheffield Hallam University, via the MMVL group, is co-organising the TAROS 2011 conference in collaboration with the University of Sheffield.
Natural Robotics Lab. With funding from the EU FP7 EVOLVINGROBOT project, the Natural Robotics Lab was founded. It investigates robotic systems inspired by nature and models of natural systems.
22nd January 2010: The Centre for Automation and Robotics Research (CARR) was opened by Professor Noel Sharkey from the University of Sheffield.
ViewFinder. Successful completion of the cutting-edge ViewFinder project by the Mobile Machine and Vision Laboratory at the Centre for Automation and Robotics Research at Sheffield Hallam University. The project built an autonomous robotic system to establish ground safety in the event of a fire, gathering visual and chemical data to assist fire-rescue personnel. The ViewFinder robots use chemical sensors and video cameras to map safe locations for crews to access in partially destroyed industrial sites, after events such as explosions.
Guardians. Successful completion of the ground-breaking GUARDIANS project at the Mobile Machine and Vision Laboratory at the Centre for Automation and Robotics Research at Sheffield Hallam University. A swarm of autonomous robots was developed to navigate and search urban ground; the main scenario was an industrial warehouse filled with smoke – a very dangerous situation in which toxins can be released and human senses can be severely impaired. The robots warn of toxic chemicals, provide and maintain mobile communication links, infer localisation information and assist in searching; they enhance operational safety and thus indirectly save lives. GUARDIANS was discussed on ‘More 4’ news, featured in the Science Museum, and in ‘Security Europe’.
Scratchbot. A second biomimetic whiskered robot, co-developed with the Bristol Robotics Laboratory, combining animal-like vibrissal sensing with brain-based models of sensorimotor control. Scratchbot won a 2009 award from Popular Science, was reported in Science magazine, inspired a children’s book, appeared on the cover of a Cambridge University Press book on brain-based robots, and was featured in a documentary for Discovery Channel Canada. Scratchbot was developed with funding from the EU FP6 ICEA Project.
The Nanorobotics Project. Successful completion.
The large nanotechnology research programme Nanorobotics – technologies for simultaneous multidimensional imaging and manipulation of nanoobjects, led by the Engineering Materials Department of the University of Sheffield, was a collaboration of project partners from Sheffield (project leader), Sheffield Hallam and Nottingham.
Many new nanotechnology research fields require a high degree of precision in both observing and manipulating materials at the atomic level. The advanced nanorobotics technology needed to manipulate materials at this scale, a million times smaller than a grain of sand, was developed in the new Sheffield Nanorobotics group. The integration of different technologies to act as simultaneous real-time nanoscale “eyes” and “hands” – including advanced nanorobotics, high-resolution ion/electron microscopy, image processing/vision control and sophisticated sensors – will lead to the ability to manipulate matter at the scale of atoms or molecules.
I-SWARM. The Intelligent Small World Autonomous Robots for Micro-Manipulation (I-SWARM) project, in which the Mobile Machine and Vision Laboratory of Sheffield Hallam participated, aimed at technological advances to facilitate the mass production of microrobots, which could then be employed as a “real” swarm consisting of up to 1,000 robot clients, each equipped with limited, pre-rational on-board intelligence. The swarm would consist of a huge number of heterogeneous robots, differing in the type of sensors, manipulators and computational power. The project was featured on numerous news channels, including the Discovery Channel and Robots.net.
Whiskerbot. This biomimetic whiskered robot was co-developed by the TUOS Adaptive Behaviour Research Group with the Bristol Robotics Laboratory and funded by the EPSRC. The robot was developed to investigate animal-like vibrissal tactile sensing and its possible translation into robotics. Whiskerbot was used to validate a model of early sensory processing in the facial nerve, and to investigate control strategies for vibrissal active touch. The research was featured by the BBC Radio programme Material World.
MiCRoN. The European Union IST project MiCRoN, in which the Mobile Machine and Vision Laboratory of Sheffield Hallam took part, developed a multi-robot manipulation system capable of handling µm-sized objects. The system is based on a small cluster of robots, each a few cubic centimetres in size, and each equipped with onboard electronics for communications and control. The robots are controlled via infrared communication and can be equipped with various tools such as syringe chips, grippers or AFM probes.
The robot reticular formation. Reviving and re-interpreting a classic computational neuroscience model, researchers at the TUOS Adaptive Behaviour Research Group demonstrated that a functional model of the brainstem reticular formation can perform effective action selection in a mobile robot. This research, which was sponsored by the EPSRC, also led to the identification of the reticular formation as a small-world network and to the invention of novel methods for measuring ‘small-worldness’.
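A common way to quantify ‘small-worldness’ (though not necessarily the exact method used in this research) is the index σ = (C/C_rand)/(L/L_rand), which compares a network's clustering C and mean path length L against a size-matched random graph; σ > 1 indicates small-world structure. A minimal pure-Python sketch, with an illustrative ring-lattice-plus-shortcuts graph:

```python
import random
from collections import deque
from itertools import combinations

def clustering(adj):
    """Mean local clustering coefficient of an undirected graph (dict of neighbour sets)."""
    total = 0.0
    for v, nbrs in adj.items():
        k = len(nbrs)
        if k >= 2:
            links = sum(1 for a, b in combinations(nbrs, 2) if b in adj[a])
            total += 2.0 * links / (k * (k - 1))
    return total / len(adj)

def bfs_dists(adj, src):
    """Hop distances from src to every reachable node."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                q.append(w)
    return dist

def mean_path_length(adj):
    """Mean shortest-path length over all ordered pairs (graph must be connected)."""
    n = len(adj)
    total = sum(sum(bfs_dists(adj, s).values()) for s in adj)
    return total / (n * (n - 1))

def small_world_sigma(adj, rand_adj):
    """sigma = (C/C_rand) / (L/L_rand); sigma > 1 suggests a small-world network."""
    return (clustering(adj) / clustering(rand_adj)) / (
        mean_path_length(adj) / mean_path_length(rand_adj))

def ring_lattice(n, k):
    """Each node linked to its k nearest ring neighbours (k even)."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(1, k // 2 + 1):
            adj[i].add((i + j) % n)
            adj[(i + j) % n].add(i)
    return adj

def connected_random_graph(n, m, seed=0):
    """Random reference graph with n nodes and m edges; retries seeds until connected."""
    while True:
        rng = random.Random(seed)
        adj = {i: set() for i in range(n)}
        while sum(len(s) for s in adj.values()) // 2 < m:
            a, b = rng.sample(range(n), 2)
            adj[a].add(b)
            adj[b].add(a)
        if len(bfs_dists(adj, 0)) == n:
            return adj
        seed += 1

# Small-world example: high-clustering ring lattice plus a few long-range shortcuts.
G = ring_lattice(30, 4)
for a, b in [(0, 15), (5, 22), (10, 27), (3, 18), (8, 24)]:
    G[a].add(b)
    G[b].add(a)
m = sum(len(s) for s in G.values()) // 2
R = connected_random_graph(30, m, seed=1)
print(small_world_sigma(G, R))  # comfortably above 1 for this graph
```

The shortcuts keep the lattice's high clustering while collapsing its path lengths towards those of the random reference, which is exactly the combination σ rewards.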
MINIMAN. The Miniaturised Robot for Micro Manipulation (MINIMAN) project, in which the Mobile Machine and Vision Laboratory of Sheffield Hallam took part, developed a smart microrobot with 5 degrees of freedom and a size of a few cubic centimetres, capable of moving and manipulating objects using tube-shaped and multilayered piezoactuators.
The robot basal ganglia. A functional model of the vertebrate basal ganglia, developed by the TUOS Adaptive Behaviour Research Group, was demonstrated to be capable of performing action selection in a Khepera mobile robot configured for a foraging task. The model, which has implications for understanding failures of action selection in brain disorders such as Parkinson’s disease, has been incorporated in various forms into a number of later brain-based robots and tested on commercial robot platforms in partnership with BAE Systems.
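The core idea of basal-ganglia action selection is disinhibition: every action channel is tonically inhibited, a focused pathway releases the channel with the strongest salience, and a diffuse pathway raises inhibition on all channels in proportion to total demand. The toy feed-forward sketch below illustrates this off-centre/on-surround competition only; it is not the published Sheffield model, and the weights (`w_sel`, `w_stn`, `tonic`) are purely illustrative:

```python
def basal_ganglia_gate(saliences, w_sel=1.0, w_stn=0.8, tonic=0.2):
    """Toy off-centre/on-surround action-selection gate.

    Each channel's output inhibition falls with its own salience
    (focused 'selection' pathway) and rises with the mean salience
    across channels (diffuse, STN-like pathway). A channel counts
    as selected when its output inhibition is driven to zero,
    i.e. when its action is disinhibited.
    """
    n = len(saliences)
    surround = w_stn * sum(saliences) / n  # diffuse excitation of output inhibition
    inhibition = [max(0.0, tonic - w_sel * s + surround) for s in saliences]
    return [i == 0.0 for i in inhibition]

print(basal_ganglia_gate([0.1, 0.2, 0.8]))  # only the strongest channel is released
print(basal_ganglia_gate([0.8, 0.8]))       # strong conflict: selection is withheld
```

A useful property of this arrangement, shared with the biological circuit it caricatures, is that two equally strong competitors raise the surround inhibition enough that neither is released, and uniformly weak saliences select nothing at all.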
The Sheffield Arm. Researchers at the SHU Art & Design Research Centre developed an anthropomorphic robot arm and hand accurately modelled on human physiology. The arm design was later commercialised by Elumotion, and different versions have been used in laboratories around the world, including NASA’s Jet Propulsion Laboratory.
A behaviour-based robot, designed by Jacques Penders, was built with the help of colleagues at the KPN-Research lab.
The TINA image processing system. The AI Vision Research Unit (AIVRU), in the TUOS Department of Psychology, was founded in the 1970s to develop computer vision systems that exploited the principles underlying human vision.
AIVRU combined the development of computer vision algorithms with psychophysical studies of human vision, establishing a close relationship between experimental research and the development of biomimetic AI that has been a hallmark of a significant strand of robotics research at Sheffield.
AIVRU was particularly well known for the development of stereo vision algorithms. Designed with robotic applications in mind, the AIVRU TINA vision system has been used for applications such as pick-and-place with a variety of industrial partners, including Toyota. TINA continues to be developed and supported as an open-source image processing package.