Light Field Imaging for Autonomous and Assisted Driving

A fully funded scholarship is available for this project; applications are due Jun 19, 2022. Details at the link below.

Challenges arise in devising long-range and wide-field-of-view 3D sensing solutions that offer low latency and operate under a wide variety of illumination and weather conditions.

The project’s aims include:

  • Designing and evaluating LF camera architectures suited to the challenges of autonomous and assisted driving
  • Developing computationally efficient 3D vision pipelines that exploit the rich information that LF cameras capture to deliver next-generation robustness and performance
  • Running lab and field trials to rigorously evaluate novel sensing technologies for autonomous and assisted driving
  • Depending on interest and ability, exploring novel optics for light field capture and/or digital architectures for low-power, low-latency light field vision

About the Robotic Imaging Lab: roboticimaging.org

Scholarship application and enquiries: https://roboticimaging.org/Join/LFDriving

Contact: donald.dansereau@sydney.edu.au

Control of Walking Robots

As robots move out of controlled environments like factories and into the wider world, many creative methods of locomotion are being explored. In particular, legged robots are suitable for traversing terrain too rough or irregular for wheels. The dynamics of legged locomotion presents many exciting challenges for planning and control: it is nonlinear, uncertain, high-dimensional, non-smooth, and underactuated.

Candidates will investigate one or more of:

  • Integrated perception and motion planning over uneven terrain
  • Optimization and learning paradigms for provably-robust control policies
  • Experimental investigations with the ACFR’s Agility Robotics Cassie robot

Contact: ian.manchester@sydney.edu.au

Safeguarding Our Online Social Networks

Online social networks have fundamentally changed how our society is organized. However, misinformation and disinformation, in the form of fake customer reviews, manipulated news, and disingenuous recommendations, pose systemic risks to the well-being of individuals in the short term and to the values our society holds in the long term. Moreover, because we live in an interconnected network, social interactions between peers can accelerate the spread of misinformation and amplify the harm from adversarial actions. Recently, in collaboration with researchers from the University of Texas at Austin, the University of Oxford, and the Chinese University of Hong Kong, we uncovered the hidden risk of our social interactions being identified from public records on social networks (https://dx.doi.org/10.2139/ssrn.3875878). This project aims to develop algorithms and optimization frameworks that slow the spread of misinformation and safeguard our social networks, using tools from control theory, optimization, and machine learning.

Contact: guodong.shi@sydney.edu.au

LiDAR Pointcloud Perception and Deep Learning in Forests

The ACFR is currently engaged in several national and international collaborations with research, industry and government partners in sensing and robotic applications in commercial forestry, forest health, ecology and management. Forests are structurally diverse environments that pose unique challenges for robotic sensing and perception. This research will aim to develop new methods for sensing, perception and navigation using LiDAR, photogrammetry and hyperspectral imaging in forests.

Candidates will investigate one or more of the following topics:

  • New developments in deep learning models for 3D pointcloud data
  • Human-computer interaction for 3D deep learning using virtual reality
  • Applications of 3D robotic perception and learning in forest environments

Contact: mitch.bryson@sydney.edu.au

Sensing and Mapping the Dynamic World

Dynamic scenes challenge a number of mature research areas in computer vision and robotics, including simultaneous localisation and mapping (SLAM), 3D reconstruction, and multiple object tracking. Most existing solutions rely on the assumption that either the environment or the sensor is static, which drastically reduces the amount of information that can be obtained in complex environments cluttered with moving objects. To achieve safe autonomy, obstacle avoidance and path planning techniques must integrate this information.

Candidates will investigate one or more of the following topics:

  • Robust segmentation and tracking of moving objects sensed by a camera or laser in motion
  • Simultaneous localisation and mapping in dynamic environments
  • Novel representations of dynamic scenes directly connected with the requirements of autonomous vehicles

Contact: viorela.ila@sydney.edu.au

Contacts

Sydney Institute for Robotics and Intelligent Systems
info@acfr.usyd.edu.au