Waypoint

Modeling for Mars – Using Drones to Test a Space Rover

We talked to Martin Azkarate of the European Space Agency about his work developing rover prototypes for planetary exploration, including how he employs a senseFly drone to test these cutting-edge robots.

Martin (pictured above right) is a Space Automation and Robotics Engineer at the European Space Agency. Based in the Netherlands at the European Space Research and Technology Centre (ESTEC), he is in charge of the laboratory that develops and tests the agency’s rover prototypes—the cutting-edge ground robots that explore faraway planets—with a particular focus on the rover that will be used on the agency’s forthcoming Mars mission: ExoMars.

We caught up with him to learn more about his role, planetary robotics, and how a drone aids the lab’s research activities.

Hi Martin and thanks so much for speaking with us. Why don’t you start by telling us a little about your role?

Sure. I’m responsible for our Planetary Robotics Laboratory, where we mainly focus on space exploration using rovers. I’ve worked here for three and a half years, having started as a trainee on a scholarship from Spain. We use rovers, robots, to explore faraway planets, which these days usually means Mars.

What mission are you currently working towards?

The Agency’s next mission is called ExoMars [which stands for Exobiology on Mars]. This features two separate phases: the first, in 2016, will launch an ‘orbiter’ out to Mars, a satellite that will relay the future rover’s data back to Earth, because the rover doesn’t have the means—the power or antenna—to communicate directly. Then in 2018 there will be a follow-up launch, where we will send the final landing module containing the rover.

What will your rover do on the red planet? What are its goals?

It will explore a specific area of the red planet, taking samples, drilling down to two metres below the surface, and then analysing these materials on-board the rover itself. Our lab mainly focuses on the robotics technology that can be applied to this rover system.

There are already some areas that the Agency has selected as preliminary target locations, but the final landing location has not been chosen yet. Whatever that final location is, the rover will drive around that area and perform its drilling and sample analysis. Its drill system is pretty complex: it takes samples from up to two metres below the surface, which is deep for a rover system. Then there is a full miniature lab, or ‘analytical drawer’, onboard the rover, which it will use to analyse these samples, searching for signatures of life using instruments such as microscopes and spectrometers. The results of this analysis will then be sent back to Earth via the orbiter.

What are the technical challenges you face when developing a rover? There must be so many…

It depends; doing the science in-situ, as with ExoMars, or one day sending samples back to Earth, would mean different kinds of systems and navigation. But generally speaking, a key requirement is some kind of autonomous navigation. We can tell the rover which area to explore, and give it coordinates, but it has to be autonomous enough to understand directions, measure distances, and recognise and navigate around obstacles. We can’t control a rover on Mars remotely, like we could on the Moon.

How does a rover localise itself?

There are different ways of doing this. The main method is via visual information, sourced from its cameras. First, the rover uses visual odometry. This means it updates its relative position based on what it saw previously. It will take an image, move a metre, take another image, compare these two images and then compute the transformation matrix that matches the motion it has performed from one step to the next. By doing this repeatedly, it can constantly update its position with respect to its original position. This is all relative positioning, of course, based on where it landed.
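
To make that dead-reckoning idea concrete, here is a minimal Python sketch of how step-by-step transformation matrices can be chained into a position estimate. The relative motions below are invented numbers used purely for illustration, not real rover data or ESA code.

```python
import numpy as np

def se3(yaw_rad, translation_xyz):
    """Build a 4x4 homogeneous transform from a yaw angle and a translation."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0.0],
                 [s,  c, 0.0],
                 [0.0, 0.0, 1.0]]
    T[:3, 3] = translation_xyz
    return T

# Relative motions estimated between consecutive image pairs (in reality these
# come from matching features across the two images); numbers are made up.
relative_motions = [
    se3(0.00, [1.0, 0.0, 0.0]),   # drive one metre straight ahead
    se3(0.10, [1.0, 0.0, 0.0]),   # one metre while turning slightly left
    se3(0.10, [0.9, 0.1, 0.0]),   # another step, slipping a little sideways
]

# Chain the step-by-step transforms to track the pose relative to the landing site.
pose = np.eye(4)                  # start at the origin, i.e. where the rover landed
for step, T_step in enumerate(relative_motions, start=1):
    pose = pose @ T_step          # compose the latest relative motion onto the pose
    x, y = pose[0, 3], pose[1, 3]
    print(f"after step {step}: x = {x:.2f} m, y = {y:.2f} m")
```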

We also need to know where its first position was, called global positioning. One way is to find reference objects in the nearby surroundings; on Earth we use ground control points. But on Mars it’s not easy to locate a nearby tower! We don’t have a very detailed map of the terrain on Mars either, but we do have maps of up to one metre per pixel thanks to satellite imagery. On these maps we can identify big rocks, craters and so on. Then if we can see any of these with the rover, we can triangulate its position with respect to these landmarks.
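
As an illustration of that triangulation step, here is a small Python sketch that recovers a position from measured ranges to landmarks whose map coordinates are known (a least-squares multilateration, one common way of doing it). The landmark coordinates and ranges are made-up values, roughly consistent with a rover standing near (100, 180).

```python
import numpy as np

# Map coordinates (metres) of landmarks identified in the orbital or drone imagery.
landmarks = np.array([
    [120.0,  40.0],   # large boulder
    [ 35.0, 210.0],   # crater rim
    [250.0, 180.0],   # rock outcrop
])

# Ranges (metres) from the rover to each landmark, e.g. estimated by stereo vision;
# these values are consistent with a rover standing near (100, 180).
ranges = np.array([141.4, 71.6, 150.0])

# Each landmark gives a circle equation; subtracting the first one linearises the
# system, which can then be solved for the rover position in a least-squares sense.
x0, y0 = landmarks[0]
r0 = ranges[0]
A = 2.0 * (landmarks[1:] - landmarks[0])
b = (r0**2 - ranges[1:]**2
     + np.sum(landmarks[1:]**2, axis=1) - (x0**2 + y0**2))
position, *_ = np.linalg.lstsq(A, b, rcond=None)
print(f"estimated rover position: x = {position[0]:.1f} m, y = {position[1]:.1f} m")
```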

So where do flying robots, meaning your drone, come into play?

Our eBee has two main applications. The first is creating high-resolution maps of our rover test sites.

In the case of Mars, there is an orbiting satellite that takes images of the areas we want to explore. Here on Earth, however, we need something to capture imagery of the areas we will explore when testing the rover; the higher the resolution the better. At a smaller scale, we need to identify and geo-reference landmarks that the rover has to be able to see and target.

Let’s say we choose a parking lot. We first use the eBee to map this area, maybe covering a hectare or even larger (up to a square kilometre, our rover’s realistic maximum). From there we can identify landmarks that the rover can use to localise itself.
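
To give a feel for that geo-referencing step, here is a minimal sketch that converts a landmark picked out in an orthomosaic, by pixel position, into map coordinates using a GDAL-style affine geotransform. The map origin, the 3 cm ground sampling distance and the pixel position are all illustrative values, not the team’s actual data.

```python
# A GDAL-style geotransform for the orthomosaic: top-left corner in map
# coordinates, pixel size in metres, and no rotation (values are illustrative).
geotransform = (584000.0, 0.03, 0.0,      # origin easting, pixel width, row rotation
                5776000.0, 0.0, -0.03)    # origin northing, column rotation, pixel height

def pixel_to_map(col, row, gt):
    """Convert a (col, row) pixel in the orthomosaic to (easting, northing)."""
    easting = gt[0] + col * gt[1] + row * gt[2]
    northing = gt[3] + col * gt[4] + row * gt[5]
    return easting, northing

# A boulder picked out at pixel (col=4210, row=1875) in the 3 cm/pixel orthomosaic.
easting, northing = pixel_to_map(4210, 1875, geotransform)
print(f"landmark at easting {easting:.2f} m, northing {northing:.2f} m")
```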

The second application, however, is all about enriching the work of the operator. We use the digital elevation models the eBee generates to feed the rover’s ground control station. This gives the operator a better understanding of where the rover is and how well it is traversing the terrain. With a full DEM of the terrain, once we place the rover somewhere we get a much better idea of how it is operating: whatever the rover sees, in terms of obstacles, terrain and so on, we can cross-check against the DEM.
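
A minimal sketch of that kind of cross-check might look like the following, with a toy elevation grid standing in for an eBee-derived DEM; the grid resolution, positions and heights are all invented.

```python
import numpy as np

# Toy DEM standing in for an eBee-derived model: 200 x 200 elevations (metres)
# on a 0.10 m grid. A real DEM would be loaded from the photogrammetry output.
gsd = 0.10
dem = np.random.default_rng(0).normal(0.0, 0.02, (200, 200)).cumsum(axis=0)

def dem_height(x, y):
    """Nearest-cell elevation under a map position (x east, y north, in metres)."""
    row = int(round(y / gsd))
    col = int(round(x / gsd))
    return dem[row, col]

# The rover reports where it thinks it is and the ground height it perceives;
# the operator can compare that against the drone-derived terrain model.
rover_x, rover_y, rover_reported_z = 12.4, 7.9, 1.15
terrain_z = dem_height(rover_x, rover_y)
print(f"DEM elevation under rover: {terrain_z:.2f} m, rover reports: {rover_reported_z:.2f} m")
```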

What are you checking when conducting such rover tests?

Everything within the scope of autonomous navigation: for instance, how well the rover can localise itself, its performance when traversing different terrains, and how accurately it estimates where it is. The rover may believe it is in a particular position or facing a particular direction, but if we see that differently on the DEM, we know there’s a problem.
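
In code, a simplified version of that comparison could look like the sketch below; the poses and the drift thresholds are illustrative, not the project’s actual requirements.

```python
import math

# Pose the rover believes it has (from its own localisation) versus the pose
# measured externally against the drone-derived map, both in site coordinates.
believed = {"x": 42.1, "y": 17.6, "heading_deg": 91.0}
measured = {"x": 41.3, "y": 18.4, "heading_deg": 86.5}

position_error = math.hypot(believed["x"] - measured["x"],
                            believed["y"] - measured["y"])
heading_error = abs(believed["heading_deg"] - measured["heading_deg"])
heading_error = min(heading_error, 360.0 - heading_error)   # wrap to [0, 180]

# Flag a problem when the drift exceeds what this particular test allows.
if position_error > 0.5 or heading_error > 5.0:
    print(f"localisation drift: {position_error:.2f} m, {heading_error:.1f} degrees")
```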

The ExoMars rover is due to land on the red planet in 2018 (image: ESA).

Why did you choose to use an eBee specifically?

In the past we used a quadcopter with a balloon to create our orthophotos and DEMs, but this wasn’t practical for larger sites because it was time-consuming and operationally complex.

We moved to an eBee at the end of 2014 because it is a full, end-to-end system, including all the software, at a price we couldn’t beat. It creates maps with high enough precision, it’s a stable system that produces imagery we can use, and it copes well with the challenges of a fixed-wing system.

What kind of areas are you mapping to test the rover on? You mentioned a parking lot?

Yes, we use a very nice parking area, owned by a company called DECOS, which is close to ESTEC. This has the colour and characteristics of a Martian landscape, a design choice by the owner.

We also target more sandy areas like nearby beaches. Mars has a mix of different terrain, from hard rock to sandy areas, so it’s difficult to find natural terrain that has all of these. We have also tested in the Atacama desert in Chile, areas of Spain south of the Pyrenees, and the Canary Islands.

Using the eBee in such areas, you might face certification or approval issues. We had that here in the Netherlands at first, before we specified that our use of the eBee was non-commercial.

Field testing is something we plan over the entire year. We’ll typically carry out two big field testing campaigns, of one to two weeks each, per year. That’s when we’ll use the eBee to create those maps and models. Our internal activities, which relate more to sub-system work such as testing cameras and algorithms, happen every day here in the lab.

Traversing one of the team’s test sites.

Lastly, looking even further ahead than 2018, what might future Mars missions look like?

In the future we’re looking to bring full physical samples of Mars back to Earth. This is a very complex undertaking, because it requires first having a full working launcher system, usually called an ‘Ascent Vehicle’, on the surface of Mars.

Martin, thank you, we’ve learned so much from speaking with you!

You’re welcome.  My pleasure.

Read more about the ExoMars 2018 mission

Learn more about the eBee.
