Research
Main research focus areas:
- Autonomy and cooperation of mobile robots in 3D space
- Control and motion planning taking state estimation into account
- On-board processing of environment perception and autonomous decision-making
- Long-term mission execution in changing environments
- Networked mission planning for robot swarms
Ongoing Projects
Automated Log Ordering through robotic Grasper
Project lead
Duration
01.04.2018 - 31.03.2021
Funding
Produktion der Zukunft
AutoLOG will focus on research towards automating challenging and DDD (dull, dangerous, dirty) tasks in the handling of raw material in production lines that are currently executed manually. We will develop artificial-intelligence-inspired, vision-based approaches to categorize and segment the raw material and its geometry and subsequently define, again through AI, optimal handling and grasping poses for the automation machinery. We seek to automate existing infrastructure in a versatile and cost-effective way; we will therefore investigate retrofittable sensors and robust control strategies for seamless and cost-efficient upgrading, retrofitting, and automating of existing infrastructure. As a specific application scenario with immediate benefit to Austria's industry, we will tackle the problem of autonomously grasping logs and moving them from the truck to the processing machinery.
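To make the grasp-pose step concrete, the following is a minimal, hypothetical sketch (not the project's actual pipeline): assuming a log has already been segmented into a 3D point cloud, its main axis can be estimated with a principal-component fit and a top-down gripper approach direction derived from it.

```python
import numpy as np

def grasp_pose_from_log_points(points: np.ndarray):
    """Fit the main axis of a segmented log point cloud (N x 3) and return a
    grasp position at the centroid, the log axis, and an approach direction.
    Illustrative sketch only, not the AutoLOG method."""
    centroid = points.mean(axis=0)
    centered = points - centroid
    # Principal axis = right singular vector with the largest singular value
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    axis = vt[0] / np.linalg.norm(vt[0])          # unit vector along the log
    # Approach direction perpendicular to the log axis, as close as possible
    # to "straight down" (-z), i.e. a top grasp.
    down = np.array([0.0, 0.0, -1.0])
    approach = down - np.dot(down, axis) * axis
    approach /= np.linalg.norm(approach)
    return centroid, axis, approach

# Example: noisy points sampled along a synthetic, roughly horizontal log
rng = np.random.default_rng(0)
t = rng.uniform(-2.0, 2.0, size=(500, 1))
pts = t * np.array([[1.0, 0.2, 0.0]]) + 0.05 * rng.standard_normal((500, 3))
position, log_axis, approach_dir = grasp_pose_from_log_points(pts)
print(position, log_axis, approach_dir)
```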
Cooperation partners
Forest Inventorying with Micro Aerial vehicles for autonomous Tree-parameter Estimation
Project lead
Project staff
Duration
01.09.2016 - 31.03.2020
Funding
Österreichische Forschungsförderungsgesellschaft mbH (FFG)
Website
We will pursue a concentrated research effort to enable a research platform prototype consisting of one unmanned aerial vehicle (UAV) navigating autonomously through managed mature forest, providing sufficiently dense visual data for accurate 3D reconstruction and subsequent autonomous extraction of ecological data from objects of interest. The ecological data include the estimation of the position of the trees, the diameter at breast height, the stem shape, and the coverage of the herb layer. The project will yield new, innovative algorithms for GPS-independent, vision-based autonomous UAV navigation, including self-healing state estimation, vision-based obstacle avoidance, and adaptive path planning. In addition, novel 3D reconstruction algorithms will enable on-site extraction of ecological forest parameters with unprecedented precision and efficiency in both time and cost.
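As an illustration of one of the ecological parameters above, the diameter at breast height can be estimated from a reconstructed stem by fitting a circle to a thin horizontal slice of the point cloud. The sketch below is a hypothetical example assuming an already isolated stem with the ground at z = 0; it is not the project's algorithm.

```python
import numpy as np

def dbh_from_stem_slice(points_xyz: np.ndarray, breast_height: float = 1.3,
                        slice_half_width: float = 0.05) -> float:
    """Estimate the diameter at breast height (DBH) from a reconstructed
    stem point cloud by fitting a circle (algebraic least-squares fit) to
    the points in a thin horizontal slice around 1.3 m above ground."""
    z = points_xyz[:, 2]
    mask = np.abs(z - breast_height) < slice_half_width
    xy = points_xyz[mask, :2]
    if len(xy) < 10:
        raise ValueError("not enough points in the breast-height slice")
    x, y = xy[:, 0], xy[:, 1]
    # Circle model: x^2 + y^2 = 2*a*x + 2*b*y + c, with centre (a, b)
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b_vec = x**2 + y**2
    (a, b, c), *_ = np.linalg.lstsq(A, b_vec, rcond=None)
    radius = np.sqrt(c + a**2 + b**2)
    return 2.0 * radius

# Example: synthetic stem of 0.4 m diameter with measurement noise
rng = np.random.default_rng(1)
theta = rng.uniform(0, 2 * np.pi, 2000)
z = rng.uniform(0.0, 3.0, 2000)
r = 0.2 + 0.005 * rng.standard_normal(2000)
stem = np.column_stack([r * np.cos(theta), r * np.sin(theta), z])
print(f"estimated DBH: {dbh_from_stem_slice(stem):.3f} m")
```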
Cooperation partners
MOdular, Distributed and Uncertainty aware sensor-fusion for Long-term Exploration with autonomous Systems
MODULES tackles the challenges of high-precision state estimation applied to the approach, landing, and subsequent take-off of a multi-copter on a landing pad for autonomous recharging. The goal is to develop a reliable algorithm for state estimation using multiple sensor modalities in a consistent, real-time multi-sensor fusion framework, which will serve as the backbone for the high-precision maneuvers. Beyond the data fusion of multiple sensors in a given sensor suite, we will analyze how to increase the modularity of such a framework so that sensor signals and entire sensor modules can be added and removed adaptively in flight without disabling the state estimation required for reliable UAV navigation. System self-calibration, fast state convergence, and consistency are key aspects to be considered.
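A minimal sketch of the modularity idea, under the assumption of a simple linear Kalman filter: sensor modules carry their own measurement model and can be registered or removed at run-time while the core prediction step keeps running. The names (ModularFilter, SensorModule, add_sensor, remove_sensor) are hypothetical and do not refer to the project's framework.

```python
import numpy as np

class SensorModule:
    """A pluggable measurement model: z = H x + noise with covariance R."""
    def __init__(self, name: str, H: np.ndarray, R: np.ndarray):
        self.name, self.H, self.R = name, H, R

class ModularFilter:
    """Minimal linear Kalman filter whose sensor modules can be added or
    removed at run-time without touching the core prediction step."""
    def __init__(self, x0, P0, F, Q):
        self.x, self.P, self.F, self.Q = x0, P0, F, Q
        self.sensors = {}

    def add_sensor(self, sensor: SensorModule):
        self.sensors[sensor.name] = sensor     # hot-plug a sensor module

    def remove_sensor(self, name: str):
        self.sensors.pop(name, None)           # drop a module in-flight

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, name: str, z: np.ndarray):
        s = self.sensors[name]
        y = z - s.H @ self.x                   # innovation
        S = s.H @ self.P @ s.H.T + s.R
        K = self.P @ s.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(len(self.x)) - K @ s.H) @ self.P

# Example: constant-velocity model; a GPS-like sensor is added, then removed
dt = 0.1
f = ModularFilter(x0=np.zeros(2), P0=np.eye(2),
                  F=np.array([[1.0, dt], [0.0, 1.0]]), Q=0.01 * np.eye(2))
f.add_sensor(SensorModule("gps", H=np.array([[1.0, 0.0]]), R=np.array([[0.5]])))
f.predict()
f.update("gps", np.array([1.2]))
f.remove_sensor("gps")   # state estimation keeps running on prediction only
```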
Cooperation partners
Tightly Coupled Visual-Inertial Multi-Sensor Fusion for Robust Navigation of Computationally Constrained UAVs
Project lead
Project staff
Duration
01.08.2016 - 31.12.2021
Funding
U.S. Army International Technology Center – Atlantic
Visual information fused with inertial cues has proven able to provide pose information to robots in a variety of scenarios. However, current real-time capable solutions still consume most of the resources of computationally constrained platforms, require well-textured and low-clutter areas, and do not make use of the dense information the camera image provides. VI-MuSe will investigate further reducing the computational complexity of visual-inertial state estimation while increasing the information used from the camera image, in order to mitigate these limitations. In addition, the project will investigate the use of other sensors to mitigate failure modes in visually homogeneous areas. In contrast to state-of-the-art multi-sensor fusion algorithms, the goal is to develop a method to seamlessly add sensors at run-time without the need for pre-calibration.
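A hypothetical sketch of adding a sensor at run-time without pre-calibration: the filter state is augmented with a self-calibration state (here a single unknown measurement offset) initialized with a large prior variance, so the calibration converges from flight data rather than from an offline procedure. Function names and the scalar measurement model are illustrative assumptions, not the project's method.

```python
import numpy as np

def augment_with_new_sensor(x: np.ndarray, P: np.ndarray,
                            calib_prior_var: float = 10.0):
    """Append one self-calibration state (e.g. an unknown offset of a newly
    attached sensor) to the filter state, initialised at zero with a large
    prior variance instead of requiring pre-calibration."""
    x_aug = np.append(x, 0.0)
    P_aug = np.zeros((len(x) + 1, len(x) + 1))
    P_aug[:len(x), :len(x)] = P
    P_aug[-1, -1] = calib_prior_var
    return x_aug, P_aug

def update_with_offset_sensor(x, P, z, meas_var=0.25):
    """Linear update for a sensor measuring position plus its own unknown
    offset: z = x[0] + x[-1] + noise. The offset (last state) is estimated
    jointly with the pose, i.e. the sensor self-calibrates during flight."""
    H = np.zeros((1, len(x)))
    H[0, 0] = 1.0
    H[0, -1] = 1.0
    S = H @ P @ H.T + meas_var
    K = (P @ H.T) / S
    x = x + (K * (z - H @ x)).ravel()
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Example: position/velocity state, then a new offset-type sensor is attached
x, P = np.zeros(2), np.eye(2)
x, P = augment_with_new_sensor(x, P)          # no pre-calibration needed
x, P = update_with_offset_sensor(x, P, z=1.5)
print(x, P[-1, -1])                           # offset estimate and its variance
```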
Cooperation partners
A complete list of all research projects of the Institute of Smart Systems Technologies can be found in the research documentation (FoDok).
Completed Projects