In search and rescue (SAR) missions, every minute counts. Semi-collapsed buildings are among the most difficult scenarios encountered by search and rescue teams. A UAV-based exploration system can provide crucial information on the accessibility of different sectors, on hazards, and on injured people. The research project “UAV-Rescue” aims to provide UAV-borne sensing and to investigate the use of AI to support this powerful tool. The sensor suite contains a radar sensor for detecting people based on breathing and pulse movement. A neural network interprets the extracted data to identify signs of human life and thereby locate persons in need of rescue. We also fuse radar and lidar data to explore the environment of the UAV and obtain a robust basis for simultaneous localization and mapping even under restricted visibility conditions. Additionally, we plan to use AI to support the path planning of the drone, taking the digital map as input. Furthermore, AI is leveraged to map intact and damaged building structures. Potentially hazardous gases common to urban settings are tracked. We fuse the acquired information into a model of the explored area with marked locations of potential hazards and people to be rescued. The project also addresses ethical and societal issues raised by the use of UAVs close to people as well as by AI-supported decision making. The talk will present the system concept, including interfaces and sensor fusion approaches. We will show first results from static and dynamic measurement campaigns demonstrating the capability of radar- and lidar-based sensing in a complex urban environment.
In this paper, integrated radar modules are presented that are suitable for collision avoidance and imaging for small UAVs. A short introduction to electronic beam steering is given, and different approaches for angle-resolving imaging are shown. Two sensors, a monostatic 80 GHz radar sensor and a polarization-resolving bistatic 94 GHz radar module, are presented; both can be valuable elements of sensor suites for modern UAV-based imaging, surveillance, and autonomous operation.
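As a minimal illustration of the electronic beam steering mentioned above, the sketch below computes the per-element phase shifts that steer the main beam of a uniform linear array, assuming half-wavelength element spacing. The array size, steering angle, and function name are illustrative choices, not parameters from the paper.

```python
import math

def steering_phases(n_elements: int, spacing_m: float,
                    wavelength_m: float, steer_deg: float) -> list[float]:
    """Per-element phase shifts (radians) steering a uniform linear
    array's main beam to steer_deg off boresight:
    phi_n = -2*pi * n * d * sin(theta) / lambda."""
    theta = math.radians(steer_deg)
    return [-2 * math.pi * n * spacing_m * math.sin(theta) / wavelength_m
            for n in range(n_elements)]

# Example: 8-element array at 80 GHz (wavelength ~3.75 mm),
# half-wavelength spacing, beam steered 20 degrees off boresight.
wl = 3e8 / 80e9
phases = steering_phases(8, wl / 2, wl, 20.0)
```

With half-wavelength spacing the phase progression reduces to -pi * n * sin(theta), so each element is delayed by a constant increment relative to its neighbor.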
A MIMO radar imaging system at 360 GHz is presented as part of the comprehensive approach of the European FP7 project TeraSCREEN, which uses multiple frequency bands for active and passive imaging. The MIMO system consists of 16 transmitter and 16 receiver antennas within a single array. Using a bandwidth of 30 GHz, a range resolution of 5 mm is obtained. With the 16×16 MIMO system, 256 different azimuth bins can be distinguished. Mechanical beam steering is used to measure 130 different elevation angles, where the angular resolution is obtained by a focusing elliptical mirror. With this system, a high-resolution 3D image can be generated at 4 frames per second, each frame containing 16 million points. The principle of the system is presented, starting from the functional structure, covering the hardware design, and including the digital image generation. This is supported by simulated data and discussed using experimental results from a preliminary 90 GHz system, underlining the feasibility of the approach.
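The headline figures in this abstract follow from two standard relations: the range resolution of a wideband radar is c/(2B), and a MIMO array with M transmitters and N receivers forms M×N virtual channels. A short sketch checking both numbers (function names are illustrative, not from the paper):

```python
C = 3e8  # speed of light in m/s

def range_resolution(bandwidth_hz: float) -> float:
    """Radar range resolution: dR = c / (2 * B)."""
    return C / (2 * bandwidth_hz)

def virtual_channels(n_tx: int, n_rx: int) -> int:
    """A MIMO array forms one virtual channel per TX/RX pair."""
    return n_tx * n_rx

# 30 GHz bandwidth -> 5 mm range resolution, as stated in the abstract
print(range_resolution(30e9))   # 0.005 m

# 16x16 MIMO -> 256 virtual channels, i.e. 256 azimuth bins
print(virtual_channels(16, 16))  # 256
```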