L-CAS as a partner in NCNR

The Lincoln Centre for Autonomous Systems Research (L-CAS), the University of Lincoln’s cross-disciplinary research group in robotics, specialises in technologies for perception, learning, decision-making, control and interaction in autonomous systems, especially mobile robots and robotic manipulators, and in the integration of these capabilities in application domains including agriculture, nuclear robotics and space robotics, i.e., applications of robots in extreme environments. Beyond NCNR, L-CAS participates in a large number of collaborative research projects with other academic and industry partners, funded by the UK Research Councils, Innovate UK and Horizon 2020, among others. For instance, L-CAS leads the £6.9m EPSRC Centre for Doctoral Training in Agri-Food Robotics, together with the Universities of Cambridge and East Anglia, and co-leads a £6.3m award from the UK Government’s Expanding Excellence in England (E3) Fund to create Lincoln Agri-Robotics, a global centre of excellence for agricultural robotics, in collaboration with the Lincoln Institute for Agri-Food Technology. L-CAS is one of the largest robotics labs in the UK and a member of UKRAS and euRobotics.

Focus of L-CAS in NCNR

Within NCNR, L-CAS has focused on the specific challenges of deploying robots in nuclear applications, in particular sorting and segregation, disaster response, and decommissioning tasks. The main contributions are in the research areas of “Grasping and manipulation (WP7 + WP5)”, “Mobility and navigation (WP6)”, and “Telepresence and Variable Autonomy (WP8 + WP9)”. Within NCNR, L-CAS has collaborated mainly with the Bristol Robotics Lab (Prof Giuliani), the Extreme Robotics Lab at the University of Birmingham (Prof Stolkin) and the Centre for Advanced Robotics @ Queen Mary (Prof Althöfer), and with the GOALS group (Prof Hawes) of the Oxford Robotics Institute in inter-hub collaboration.

The Lincoln NCNR Team


Prof Gerhard Neumann

Prof Marc Hanheide
Dr Amir Ghalamzan Esfahani
Dr Ayse Kucukyilmaz

Dr Mohamed Sorour

Dr Harit Pandya

Dr Riccardo Polvara

Dr Aravinda R. Srinivasan
Dr Sariah Mghames
Vaisakh Shaj

Dr Baris Serhan
Soran Parsa
The L-CAS Team in NCNR (some affiliated staff and students included)

Selected Research Results from NCNR@L-CAS

Outdoor mobile manipulation demonstrator

The objective of this demo is to show a mobile manipulator, remotely controlled by a human operator, clearing a specific area of dangerous items. The operator is located at a distance and controls the mobile base through a web interface and the robotic arm through a second arm acting as master. In this way, a space can be decontaminated without putting the operator at risk. The application is useful not only for nuclear decommissioning tasks but for any long-distance object-retrieval task (e.g., over the network). The demo uses two Franka Emika arms and a Thorvald robot from Saga Robotics, extended with a custom frame supporting one of the arms.

  • Upon identification of a target area, the robot navigates there autonomously to deploy the mobile manipulator under closed-loop teleoperation
  • Haptic-feedback teleoperation of the arm to retrieve items onto the mobile platform

S. Parsa et al., “Haptic-guided shared control grasping: collision-free manipulation,” 2020 IEEE 16th International Conference on Automation Science and Engineering (CASE), 2020, pp. 1552-1557, doi: 10.1109/CASE48305.2020.9216789.

Secure remote assisted teleoperation of a mobile sensing platform

This collaboration with the GOALS group of the Oxford Robotics Institute is one of the cross-hub collaborations, here working with researchers in the RAIN hub. The collaboration facilitated remote inspection and radiation mapping over long distances through a secure web interface. The robot navigates autonomously along a topological graph, under the supervision of a remote operator. This autonomous operation eases the operator’s workload, allowing them to define a reconnaissance mission on a web interface and have it executed autonomously by the distant mobile robot.

Force-guided teleoperation

  • Bilateral teleoperation using Franka Emika Panda arms.
  • The torque-controlled robotic arms are programmed to generate feedback forces so that each arm acts as a haptic device, enhancing the user experience while performing the teleoperation task.
  • A method has been developed that incorporates robot trajectories learned from human demonstrations and dynamically adjusts the level of robotic assistance based on how closely the detected intentions match these trajectories.
  • A method enabling real-time control with reduced communication latency has been developed, tested and verified for long-distance teleoperation tasks.
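The feedback forces in such a bilateral setup can be sketched as a spring-damper coupling between the two arms. This is a minimal illustration only, not the project’s actual controller: the function name and the gains `k` and `d` are made-up values for the sketch.

```python
import numpy as np

def bilateral_feedback(x_master, x_slave, v_master, v_slave,
                       k=50.0, d=2.0):
    """Spring-damper coupling between master and slave end-effector
    positions. The returned forces would be rendered as joint torques on
    each torque-controlled arm (illustrative gains, not tuned values)."""
    error = x_slave - x_master
    f_master = k * error - d * v_master   # pulls the operator toward the slave
    f_slave = -k * error - d * v_slave    # pulls the slave toward the operator
    return f_master, f_slave
```

The symmetric, opposing forces are what make the master arm double as a haptic device: any position mismatch is felt by the operator and simultaneously corrected on the remote arm.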

Ly, Kim Tien, Mithun Poozhiyil, Harit Pandya, Gerhard Neumann, and Ayse Kucukyilmaz. “Intent-Aware Predictive Haptic Guidance and its Application to Shared Control Teleoperation.” In 2021 30th IEEE International Conference on Robot & Human Interactive Communication (RO-MAN), pp. 565-572. IEEE, 2021.

Grasp planning for unknown objects

As part of sorting and segregation, flexibility in grasping objects is key. Different manipulators and grippers are deployed in such settings, and the objects to be handled vary greatly in shape and size. A universal grasp planner has been developed that works with a large variety of robotic manipulators and objects; it is integrated with 3D sensing and fuses 3D point clouds from different viewpoints to identify the best grasp position for individual objects.

  • Efficient grasp planning for unknown objects from 3D point clouds
  • Support for large variety of different grippers and manipulators, ranging from dexterous hands to industrial pinch grippers
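To give a flavour of grasp planning from a raw point cloud: the published method searches gripper workspace spheres, but a much simpler toy heuristic already illustrates the problem setting. The sketch below, with hypothetical function and parameter names, grasps an unknown object at its centroid, closing across its thinnest principal axis.

```python
import numpy as np

def plan_grasp(points, max_width=0.08):
    """Toy grasp selector for an unknown object given its point cloud
    (N x 3). Grasps at the centroid across the thinnest principal axis --
    a stand-in illustration, NOT the workspace-sphere algorithm from the
    cited papers."""
    centroid = points.mean(axis=0)
    # PCA: eigenvectors of the covariance give the object's principal axes
    cov = np.cov((points - centroid).T)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    close_axis = eigvecs[:, 0]               # smallest extent -> closing direction
    extent = points @ close_axis
    width = extent.max() - extent.min()      # required gripper opening
    feasible = width <= max_width
    return centroid, close_axis, width, feasible
```

A real planner additionally has to reason about the gripper’s reachable workspace, collisions with the clutter, and the kinematics of the specific manipulator, which is exactly what makes supporting many gripper types non-trivial.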

M. Sorour, K. Elgeneidy, M. Hanheide, M. Abdalmjed et al., “Enhancing Grasp Pose Computation in Gripper Workspace Spheres,” 2020 IEEE International Conference on Robotics and Automation (ICRA), 2020.
M. Sorour, K. Elgeneidy, A. Srinivasan, M. Hanheide et al., “Grasping unknown objects based on gripper workspace spheres,” 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2019.

Autonomous Radiation Mapping

With this demonstrator, we address the challenge of surveying an outdoor site to uncover radiation sources and localise them autonomously from a wheeled robot base that can traverse challenging terrain. A probabilistic radiation map is built incrementally, and sensing points are chosen automatically to drive the robot’s autonomous exploration. A hybrid digital-twin/real-world system is deployed to allow safe experimentation with simulated radiation sources in controlled positions, while testing navigation and movement behaviours in the real world under realistic conditions.

  • Robot navigation uses a topological map, a graph-based representation that exploits the structure of the environment.
  • Robot path planning uses Next-Best-Sense, a framework for combining multiple criteria and expressing the preference of one over the others.
  • Kriging (also known as Gaussian Process regression) interpolates the sensor readings, building two maps (mean and covariance) of where the radiation sources are located.
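
The kriging step can be sketched with a small hand-rolled Gaussian-process regressor that turns sparse dose-rate readings into a mean map and a variance map. The kernel choice and hyperparameters below are illustrative assumptions, not the values used on the robot.

```python
import numpy as np

def kriging_map(train_xy, train_dose, query_xy, length_scale=2.0, noise=1e-2):
    """Gaussian-process (kriging) interpolation of sparse dose readings at
    2D positions train_xy into predictive mean and variance at query_xy.
    Squared-exponential kernel with unit signal variance (illustrative)."""
    def rbf(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / length_scale ** 2)

    K = rbf(train_xy, train_xy) + noise * np.eye(len(train_xy))
    Ks = rbf(query_xy, train_xy)
    alpha = np.linalg.solve(K, train_dose)
    mean = Ks @ alpha
    # predictive variance: k(x,x) - k_s K^-1 k_s^T (diagonal terms only)
    v = np.linalg.solve(K, Ks.T)
    var = 1.0 - np.einsum('ij,ji->i', Ks, v)
    return mean, var
```

The variance map is what drives the exploration: a Next-Best-Sense-style planner would rank candidate sensing points by, among other criteria, how uncertain the map still is at each location.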

Polvara, Riccardo, Del Duchetto, Francesco, Neumann, Gerhard and Hanheide, Marc (2021) Navigate-and-Seek: a Robotics Framework for People Localization in Agricultural Environments. IEEE Robotics and Automation Letters, 6 (4). pp. 6577-6584. ISSN 2377-3766
Polvara, Riccardo, Fernandez-Carmona, Manuel, Hanheide, Marc and Neumann, Gerhard (2020) Next-Best-Sense: a multi-criteria robotic exploration strategy for RFID tags discovery. IEEE Robotics and Automation Letters, 5 (3). pp. 4477-4484. ISSN 2377-3766

Push to See

Searching for and grasping irregularly shaped objects in dense clutter is a crucial task for autonomous manipulators, especially when handling radiation-sensitive materials. In this work, we use deep reinforcement learning techniques to learn skills for decluttering the scene and grasping the desired object.

  • Deep-learning-based segmentation for identifying graspable textureless objects in clutter using only depth measurements.
  • Reinforcement-learning-based non-prehensile interactions for searching and sorting.

Serhan, B., Pandya, H., Kucukyilmaz, A., & Neumann, G. (in press). Push-to-See: Learning Non-Prehensile Manipulation to Enhance Instance Segmentation via Deep Q-Learning. In IEEE International Conference on Robotics and Automation (ICRA 2022).

Recurrent Kalman Networks to Learn Robot Dynamics

We developed a new RNN architecture based on factorized Kalman updates:

  • Memory neurons and uncertainty neurons
  • Factorized representation of the covariance
  • Simple updates: No matrix inversions needed (just scalar divisions)
  • Can be learned end-to-end
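
The factorized update at the core of this architecture can be illustrated as follows: with a factorized (diagonal) covariance, the Kalman gain, mean and covariance updates all reduce to elementwise scalar operations. This is a simplified sketch of the idea only, omitting the memory/uncertainty neuron split and the learned transition model of the actual network.

```python
import numpy as np

def factorized_kalman_update(mu, sigma, y, r):
    """Kalman-style observation update with per-dimension variances.
    mu, sigma: prior mean and variance vectors (latent_dim,)
    y, r:      observation and its variance vector (latent_dim,)
    No matrix inversion is needed -- only scalar divisions."""
    gain = sigma / (sigma + r)          # elementwise Kalman gain
    mu_new = mu + gain * (y - mu)       # innovation-weighted mean update
    sigma_new = (1.0 - gain) * sigma    # posterior variance shrinks
    return mu_new, sigma_new
```

Because every operation is elementwise and differentiable, the update can sit inside an RNN cell and be trained end-to-end with standard backpropagation, which is what makes the factorization attractive.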

Shaj, Vaisakh, et al. “Action-Conditional Recurrent Kalman Networks For Forward and Inverse Dynamics Learning.” arXiv preprint arXiv:2010.10201 (2020).

The Future: Exploitation of Fundamental Research

NCNR has laid the foundations for further developments and innovations. Building on its outcomes, L-CAS has engaged further with the nuclear industry to exploit the technologies developed. A new project with Veolia Nuclear Solutions is already delivering results, and exploitation of NCNR outcomes has also contributed directly to new industry-focused projects, e.g. in our space robotics portfolio (clearing debris in low Earth orbit) and in agricultural applications (e.g. the IUK “FastPick” project, building upon the agile robotic manipulation technologies developed within NCNR).

A new IUK-funded project (2004_SBRI_NDA), a collaboration between Veolia Nuclear Solutions and L-CAS, focuses on the “Dexter” platform, making sorting and segregation tasks safer and easier through an integrated approach of blended autonomy.