Projects

ASTRIL Projects

Vision based GPS-denied Object Tracking and Following for UAV

We present a vision-based control strategy for tracking and following objects with an Unmanned Aerial Vehicle (UAV). We have developed an image-based visual servoing method that uses only a forward-looking camera to track and follow objects from a multi-rotor UAV, with no dependence on GPS. The proposed method continuously tracks a user-specified object while maintaining a fixed distance from it and simultaneously keeping it centered in the image plane. The algorithm is validated on a Parrot AR.Drone 2.0 in outdoor conditions while tracking and following people and fast-moving objects, including through occlusions, demonstrating the robustness of the proposed system against perturbations and illumination changes. Our experiments show that the system can track a great variety of objects present in suburban areas, among others: people, windows, AC machines, cars, and plants. Please visit the project page for more details.
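For illustration, below is a minimal sketch of how an image-based visual servoing loop of this kind can map pixel-space errors to vehicle commands. The bounding-box structure, gain values, and command ranges are assumptions for demonstration, not the lab's implementation.

# Minimal image-based visual servoing sketch (illustrative, not the project's code).
# Assumes a tracker supplies the target's bounding box each frame; `BoundingBox`
# and the gain values are hypothetical.

from dataclasses import dataclass

@dataclass
class BoundingBox:
    cx: float      # box center, pixels
    cy: float
    area: float    # box area, pixels^2

def servo_commands(box: BoundingBox, img_w: int, img_h: int, ref_area: float,
                   k_yaw: float = 0.002, k_alt: float = 0.002, k_fwd: float = 1e-5):
    """Map pixel-space tracking errors to normalized UAV velocity commands."""
    err_x = box.cx - img_w / 2.0          # horizontal offset -> yaw rate
    err_y = box.cy - img_h / 2.0          # vertical offset   -> climb rate
    err_a = ref_area - box.area           # apparent size     -> forward speed
    yaw_rate = -k_yaw * err_x
    climb = -k_alt * err_y
    forward = k_fwd * err_a               # object looks small -> move closer
    clamp = lambda v: max(-1.0, min(1.0, v))   # keep commands in [-1, 1]
    return clamp(forward), clamp(climb), clamp(yaw_rate)

# Example: target drifted right of center and appears smaller than the reference size.
print(servo_commands(BoundingBox(cx=420, cy=180, area=9000), 640, 360, ref_area=12000))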

Ars Robotica
Ars Robotica is a collaboration between ASTRIL and the School of Film, Dance and Theatre at ASU. Using the Rethink Robotics Baxter as a test platform, Ars Robotica investigates whether a human quality of movement can be defined and achieved through robots. Training data is obtained through various modes of sensing, ranging from simple devices such as the Microsoft Kinect to high-speed, precise tracking setups such as a 12-camera OptiTrack system. This data is used to define a vocabulary of human motion, helping to create a framework for the autonomous interpretation and expression of human-like motion through Baxter. Please visit the project page for more details.

3D Mapping
Three-dimensional mapping is an extremely important aspect of geological surveying, but current methods often pose practical challenges. We introduce a technique for terrain mapping using unmanned aerial vehicles (UAVs) and standard digital cameras. Using a photogrammetric process called structure from motion (SfM), three-dimensional data can be inferred from aerial images. Please visit the project page for more details.
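As a rough illustration of the structure-from-motion idea, the sketch below recovers sparse 3-D points from just two overlapping aerial images using OpenCV; the camera intrinsics and file names are assumptions, and a full pipeline would process many images and bundle-adjust the result.

# Minimal two-view structure-from-motion sketch (illustrative only).
import cv2
import numpy as np

K = np.array([[800.0, 0, 320.0],
              [0, 800.0, 240.0],
              [0, 0, 1.0]])                     # assumed pinhole intrinsics

img1 = cv2.imread("aerial_001.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("aerial_002.jpg", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Match features and keep only unambiguous matches (Lowe's ratio test).
matches = cv2.BFMatcher().knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.7 * n.distance]
pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

# Recover relative camera pose, then triangulate sparse 3-D terrain points.
E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
_, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
points3d = (pts4d[:3] / pts4d[3]).T            # N x 3 point cloud (up to scale)
print(points3d.shape)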

Autonomous Kite Plane for Aerial Surveillance

The development of an autonomous fixed-wing motorized kite plane, the Autokite, offers a unique approach to aerial photography in the field. Its inexpensive and lightweight design makes the Autokite ideal for deployment in remote or extreme environments.

Autonomous shipboard landing of a VTOL UAV


The autonomous landing of Vertical Take-Off and Landing (VTOL) Unmanned Aerial Vehicles (UAVs) is a very important capability for autonomous systems. Autonomous landing on a ship deck platform continues to be studied and has only recently been solved for very favorable weather conditions. Our challenge is to give the UAV the ability to land autonomously on ship deck platforms in extreme weather conditions.

EGGS (Exploration Geology & Geophysics Sensors) 


Coming soon… Please visit the project page for more details.

Using UAVs to Assess Signal Strength Patterns for Radio Telescopes
This work considers the design of flight hardware for measuring the signal strength field pattern of an array of radio telescopes. The ultra-stable and robust aerial platform offered by a multi-rotor craft makes this task possible.
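As a simple illustration of how such flight data could be reduced, the sketch below grids position-tagged received-power samples into a coarse field-pattern map. The sample format, coordinates, and grid resolution are assumptions, not the flight hardware's interface.

# Hypothetical post-processing sketch: grid logged (position, received power) samples
# into a coarse map of the antenna field pattern.
import numpy as np

# Each row: east (m), north (m), received power (dB) relative to the array center.
samples = np.array([
    [ 5.0,  0.0, -1.2],
    [10.0,  5.0, -3.8],
    [-5.0, 10.0, -6.1],
    [ 0.0, -8.0, -2.4],
])

res = 5.0                                   # grid cell size in metres
ix = np.floor(samples[:, 0] / res).astype(int)
iy = np.floor(samples[:, 1] / res).astype(int)

# Average all measurements that fall into the same cell.
pattern = {}
for x, y, p in zip(ix, iy, samples[:, 2]):
    pattern.setdefault((x, y), []).append(p)
pattern_db = {cell: float(np.mean(vals)) for cell, vals in pattern.items()}
print(pattern_db)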

Change Detection using airborne Lidar
We work with geologists to develop algorithms for finding local displacements of the topography during earthquakes. The algorithms use Digital Elevation Models of earthquake sites (before and after the earthquake) obtained from Lidar scanners mounted on aerial vehicles. Please visit the project page for more details.
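One common building block for this kind of change detection is patch matching between the pre- and post-event DEMs; the sketch below, which is illustrative rather than the lab's algorithm, searches for the pixel shift that best aligns a small DEM patch.

# Estimate the local horizontal offset between pre- and post-earthquake DEM patches
# by brute-force search for the shift minimizing the height mismatch.
import numpy as np

def local_displacement(dem_before, dem_after, row, col, half=16, search=5):
    """Return the (drow, dcol) shift, in pixels, that best aligns a patch of
    dem_after onto dem_before around (row, col)."""
    ref = dem_before[row - half:row + half, col - half:col + half]
    best, best_shift = np.inf, (0, 0)
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            cand = dem_after[row - half + dr:row + half + dr,
                             col - half + dc:col + half + dc]
            cost = np.mean((ref - cand) ** 2)   # height mismatch for this shift
            if cost < best:
                best, best_shift = cost, (dr, dc)
    return best_shift

# Toy example: the "after" DEM is the "before" DEM shifted by 2 pixels eastward.
before = np.random.rand(128, 128)
after = np.roll(before, 2, axis=1)
print(local_displacement(before, after, 64, 64))   # expect (0, 2)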

NIR Camera
The objective of the NIR project was to construct an equivalent of the MER PANCAM from readily available commercial parts, for use in the science and study of Earth's atmosphere and geological features. Please visit the project page for more details.

Path Planning for Ground Vehicles
The objective of this project is to study and devise new motion-planning methods for ground vehicles, using Raven as the prototype vehicle. More specifically, we determine smooth paths for Raven to follow as it traverses waypoints; such paths have wide use in applications, for instance following an astronaut as they walk along an arbitrary path. Please visit the Project Page for more details.
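For illustration, one simple way to obtain a smooth waypoint-following path is to fit a cubic spline through the waypoints, parameterized by cumulative distance; the sketch below shows this idea and is not necessarily the planner used on Raven.

# Smooth path through waypoints via cubic splines (illustrative sketch).
import numpy as np
from scipy.interpolate import CubicSpline

waypoints = np.array([[0.0, 0.0], [4.0, 1.0], [6.0, 5.0], [9.0, 4.0]])  # x, y in metres

# Parameterize by cumulative chord length so waypoint spacing is respected.
seg = np.linalg.norm(np.diff(waypoints, axis=0), axis=1)
s = np.concatenate([[0.0], np.cumsum(seg)])

spline_x = CubicSpline(s, waypoints[:, 0])
spline_y = CubicSpline(s, waypoints[:, 1])

# Sample the smooth path at 0.5 m intervals for the vehicle's path follower.
s_dense = np.arange(0.0, s[-1], 0.5)
path = np.column_stack([spline_x(s_dense), spline_y(s_dense)])
print(path[:3])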

Plume Detection
The objective of this work was to autonomously detect manually verified features (plumes) in images under onboard conditions. Success enables these methods to be applied to future outer solar system missions and facilitates onboard autonomous detection of transient events and features regardless of viewing and illumination effects, electronic interference, and physical image artifacts. Autonomous detection allows the maximization of the spacecraft's memory capacity and downlink bandwidth by prioritizing data of the utmost scientific significance. Please visit the project page for more details.
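As a toy illustration of onboard-style detection, the sketch below flags bright, connected regions standing out from a mostly dark frame; the synthetic image, thresholds, and minimum blob size are assumptions for demonstration, not the mission pipeline.

# Simple bright-feature detector: threshold above background statistics, then
# keep connected blobs large enough to be credible candidates.
import numpy as np
from scipy import ndimage

# Synthetic frame: dark sky with a faint bright blob standing in for a plume.
frame = np.random.normal(10.0, 2.0, (200, 200))
frame[40:55, 120:130] += 30.0

thresh = frame.mean() + 5.0 * frame.std()       # robust-ish global threshold
mask = frame > thresh

# Connected-component labelling; report centroids of sufficiently large blobs.
labels, n = ndimage.label(mask)
candidates = [np.argwhere(labels == i).mean(axis=0)     # blob centroid (row, col)
              for i in range(1, n + 1)
              if (labels == i).sum() >= 20]
print(candidates)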

R.A.V.E.N.
RAVEN (Robotic Assist Vehicle for Extraterrestrial Navigation) was designed for the 2010 Revolutionary Aerospace Systems Concepts Academic Linkage (RASC-AL) contest.  Please visit the project page for more details.

Road Detection from UAV Aerial Imagery
We use aerial images taken from UAVs to detect the presence of roads. In this work, we developed variations of algorithms suited to different road types and detection tasks. Please visit the project page for more details.
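As a baseline illustration of one classical approach to this problem (edge and line extraction), the sketch below marks long straight segments as road candidates; the image file name and thresholds are assumptions, and the project's algorithms vary by road type.

# Classical road-candidate extraction: Canny edges + probabilistic Hough lines.
import cv2
import numpy as np

img = cv2.imread("uav_frame.jpg", cv2.IMREAD_GRAYSCALE)
blur = cv2.GaussianBlur(img, (5, 5), 0)          # suppress texture noise
edges = cv2.Canny(blur, 50, 150)                 # road boundaries show up as edges

# Long, roughly straight segments are good candidates for paved roads.
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                        minLineLength=100, maxLineGap=10)
overlay = cv2.cvtColor(img, cv2.COLOR_GRAY2BGR)
if lines is not None:
    for x1, y1, x2, y2 in lines[:, 0]:
        cv2.line(overlay, (int(x1), int(y1)), (int(x2), int(y2)), (0, 0, 255), 2)
cv2.imwrite("road_candidates.jpg", overlay)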

Autonomous Sampling
Autonomous Underwater Vehicles have proven themselves to be indispensable tools for mapping and sampling aquatic environments. However, these sensing platforms can only travel as far as their stored energy capacity allows. We are therefore researching both offline and online adaptive sampling strategies that optimize both the estimation accuracy of the models derived from the samples and the energy consumption of the vehicle. Please visit the project page for more details.
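To give a flavor of online adaptive sampling, the sketch below uses a greedy rule: pick the next sample location offering the most model uncertainty per metre travelled. The Gaussian process model, the measured quantity, and the weighting are illustrative assumptions, not the lab's method.

# Greedy adaptive sampling sketch: uncertainty reduction per unit travel cost.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Samples collected so far: (x, y) positions and the measured quantity (e.g. salinity).
X_seen = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
y_seen = np.array([1.2, 0.8, 1.9])

gp = GaussianProcessRegressor(kernel=RBF(length_scale=5.0)).fit(X_seen, y_seen)

# Candidate waypoints on a coarse grid over the survey area.
gx, gy = np.meshgrid(np.linspace(0, 20, 11), np.linspace(0, 20, 11))
candidates = np.column_stack([gx.ravel(), gy.ravel()])

current_pos = X_seen[-1]
_, std = gp.predict(candidates, return_std=True)
travel = np.linalg.norm(candidates - current_pos, axis=1) + 1.0   # +1 avoids divide-by-zero

# Greedy choice: highest predictive uncertainty per unit of travel (energy proxy).
best = candidates[np.argmax(std / travel)]
print("next sample location:", best)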

 

Other Projects

ASU Lunabotics Competition
Arizona State University is participating in NASA's Lunabotics Mining Competition, which is designed to promote interest in space activities and STEM (Science, Technology, Engineering, and Mathematics) fields. The competition focuses on excavation, a necessary first step towards extracting resources from the regolith and building bases on the Moon. Click the banner or visit http://robots.asu.edu/lunabotics/ to find out more.

High Altitude Turbine Survey // H.A.T.S.
The High Altitude Turbine Survey (HATS) is an altitudinal wind-velocity experiment aimed at understanding thrust and other performance characteristics of micro-propellers along a vertical profile (up to 35 km). The survey data will have a wide range of applications, including wind power generation from airborne turbines, propeller-driven airships, and micro-propeller performance at extended altitudes. Click the banner or visit http://robots.asu.edu/hats/ to find out more.
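One standard way such thrust measurements can be compared across altitudes is through the non-dimensional thrust coefficient, C_T = T / (rho * n^2 * D^4). The worked example below uses made-up sample values, not HATS data, to show how the same thrust and RPM yield different coefficients as air density thins with altitude.

# Propeller thrust coefficient example (illustrative numbers only).
def thrust_coefficient(thrust_n, rho_kg_m3, rev_per_s, diameter_m):
    """Non-dimensional thrust coefficient: C_T = T / (rho * n^2 * D^4)."""
    return thrust_n / (rho_kg_m3 * rev_per_s ** 2 * diameter_m ** 4)

# Same measured thrust and rotation rate, evaluated at two air densities.
print(thrust_coefficient(2.0, 1.225, 120.0, 0.25))   # sea-level ISA density
print(thrust_coefficient(2.0, 0.74, 120.0, 0.25))    # roughly 5 km ISA density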