The modern world faces numerous grand challenges, from climate change and energy security to sustainability and healthy ageing. This generation also benefits from exciting advances in computing, robotics, artificial intelligence and virtual engineering. The research undertaken by IDEAS academics, combined with the VEC's expertise in translating blue-sky research into applied industrial applications, provides a new model for research industrialisation, giving academics and industrial partners a unique opportunity for impact.
Research at IDEAS is organised into six different labs:
The Mixed Reality Laboratory (MR Lab)
The Mixed Reality Laboratory within the DIF incorporates a large-scale Virtual Reality (VR) Powerwall that provides up to three independent viewers with their own accurate, head-tracked perspective, enabling activities such as collaborative design, object interaction, design reviews and multi-person working to be experienced and characterised.
Images are displayed on a 5 x 2.5m screen in native 4K resolution at a refresh rate of 360Hz. The MR Laboratory also features a large tracked space in which physical objects can be placed to enhance the simulation experience. Participant and object tracking in this space is accurate to sub-millimetre level: the system can capture the whole-body motion of individuals and represent it in real-time simulations, or, at enhanced resolution, track the motion of a participant's fingers and represent it in high fidelity within the immersive simulation.
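To illustrate how a head-tracked perspective can be rendered on a fixed screen such as a Powerwall, the sketch below computes a generalized off-axis projection matrix from the screen's corner positions and the tracked eye position, following Kooima's well-known formulation. The function name and the example dimensions are illustrative assumptions, not the DIF system's actual calibration.

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def powerwall_projection(pa, pb, pc, pe, near, far):
    """Generalized off-axis projection for a planar screen.

    pa, pb, pc: lower-left, lower-right, upper-left screen corners
    (world coordinates); pe: tracked eye position.
    Returns a 4x4 view-projection matrix (OpenGL conventions).
    """
    pa, pb, pc, pe = (np.asarray(p, float) for p in (pa, pb, pc, pe))
    vr = normalize(pb - pa)           # screen right axis
    vu = normalize(pc - pa)           # screen up axis
    vn = normalize(np.cross(vr, vu))  # screen normal, towards the viewer
    va, vb, vc = pa - pe, pb - pe, pc - pe
    d = -(va @ vn)                    # eye-to-screen distance
    # Frustum extents on the near plane, offset by the eye position
    l = (vr @ va) * near / d
    r = (vr @ vb) * near / d
    b = (vu @ va) * near / d
    t = (vu @ vc) * near / d
    # Standard off-axis frustum matrix
    P = np.array([
        [2*near/(r-l), 0.0,          (r+l)/(r-l),           0.0],
        [0.0,          2*near/(t-b), (t+b)/(t-b),           0.0],
        [0.0,          0.0,          -(far+near)/(far-near), -2*far*near/(far-near)],
        [0.0,          0.0,          -1.0,                  0.0]])
    # Rotate the world into screen-aligned axes, then move the eye to the origin
    R = np.eye(4); R[:3, :3] = np.vstack([vr, vu, vn])
    T = np.eye(4); T[:3, 3] = -pe
    return P @ R @ T
```

Re-evaluating this matrix each frame from the tracker's latest eye position is what makes the image on the fixed screen appear perspective-correct as the viewer moves.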
Furthermore, the lab is equipped with an array of Virtual and Augmented Reality head mounted displays for individual experiences, 4k touch screens for visualisation and interaction, and scanning technologies to allow the accurate capture and visualisation of existing environments or objects.
Activities within the MR lab are focused on the solution of real-world industrial problems across all industry sectors, with particular emphasis on the realisation of complex high-fidelity digital test beds and twins of industrial systems and processes.
Autonomous Chemistry Laboratory
The Autonomous Chemistry Laboratory is led by Professor Andy Cooper, working with the Leverhulme Research Centre for Functional Materials Design and the Materials Innovation Factory. Together they are creating the "Laboratory of the Future": a self-driving, autonomous scientific laboratory powered by artificial intelligence and robotics. It will develop heterogeneous robotic platforms ("robotic chemists") and custom-designed automated workflows to provide a step change in the use of automation in chemistry and materials science.
Extreme Environment Laboratory
The Extreme Environment Laboratory’s research focuses on the design, manufacture, analysis and testing of highly automated or autonomous land-based and aerial robotic systems for use in real-world missions.
The laboratory will focus on the development and evaluation of novel algorithms, sensors and, where necessary, platforms that will work in environments beyond the laboratory. It is reconfigurable and specifically designed so that complex industrial structures can be introduced into it, allowing the ability of both the robotic system and its component parts to deal with realistic scenarios to be fully investigated and assessed.
Trustworthy Autonomous Cyber-Physical Systems Laboratory
The Trustworthy Autonomous Cyber-Physical Systems Laboratory is focused on developing theories, algorithms, and tools to detect, verify, and enhance the safety and security of learning-enabled autonomous systems. It is also developing techniques to enhance the transparency, fairness, and privacy of such systems. In particular, the laboratory has world-leading research on the safety verification and safety assurance of robotics and autonomous systems (RAS).
The laboratory currently works across several application areas, including Robotics (for example, self-driving cars and underwater vehicles), the Internet of Things (for example, wireless sensor networks), Healthcare (for example, medical diagnosis through brain imaging and human activity recognition), and Data Analytics (for example, natural language processing, social media analytics, data mining and knowledge discovery).
The research activities are currently supported by a number of funding agencies including the European Commission, EPSRC, and the Ministry of Defence (through Defence Science and Technology Laboratory, Dstl).
The Immersive Laboratory
The Immersive Laboratory was founded on collaborative multisensory VR research from a recent EPSRC project, "CASMS: Context Aware network architectures for the Sending of Multiple Senses", and is part of the University of Liverpool's Advanced Network Research Group, which also researches network security and the Internet of Things (IoT).
The Immersive Laboratory investigates the sharing, synchronisation, and replication of sensory information at multiple sites (currently Liverpool and London) for future "Tactile Internet" applications beyond basic teleconferencing, such as online shopping, entertainment (gaming), delivery of training, and remote medical care (e.g., patient isolation). Users' immersion, sense of presence, and co-presence with other users are then evaluated through user trials in collaboration with the University's Psychology department.
This includes multidisciplinary research spanning the traditional visual and audio senses as well as less conventional ones: haptics and the tactile sense (touch), through haptic manipulators, thermal gloves, and proxy-object presentation and interaction using collaborative robotics; olfaction (smell), using our in-house-developed "Odour Displays" and gas-detector signal processing; and, as part of a future project proposal, gustation (taste), through VR-driven automated food-preparation technologies.
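One common way to keep sensory cues aligned across distant sites is a fixed playout deadline: each event is timestamped at capture and rendered only once a shared delay budget has elapsed, so that network jitter is absorbed and all sites play the same cue at (approximately) the same moment. The sketch below is a minimal illustration of that idea; the class name, delay value, and event labels are illustrative assumptions, not the CASMS architecture.

```python
import heapq

class PlayoutBuffer:
    """Delay-equalising playout buffer for multi-site sensory events.

    Events are stamped at capture time and released only after a fixed
    playout delay, chosen to exceed the worst-case network delay, so every
    site renders a given cue at the same wall-clock deadline.
    """

    def __init__(self, playout_delay=0.15):  # seconds; an assumed budget
        self.playout_delay = playout_delay
        self._heap = []  # min-heap of (playout_deadline, event)

    def push(self, capture_ts, event):
        """Schedule an event captured at capture_ts for later playout."""
        heapq.heappush(self._heap, (capture_ts + self.playout_delay, event))

    def pop_due(self, now):
        """Return all events whose playout deadline has passed, in order."""
        due = []
        while self._heap and self._heap[0][0] <= now:
            due.append(heapq.heappop(self._heap)[1])
        return due
```

The trade-off is latency versus synchrony: a larger playout delay tolerates more jitter between sites but makes every interaction feel less immediate, which matters for tactile feedback far more than for audio or video.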
SmARTLab
SmARTLab is a state-of-the-art robotics laboratory in the Department of Computer Science.
The lab is affiliated with the Centre for Autonomous Systems Technology (CAST) and the Digital Innovation Facility (DIF). Research focuses on robot perception and learning, with applications to autonomous manipulation and navigation.