Wednesday, September 30, 2015

Activity 3: Using a multi-rotor to gather imagery/Using mission planning software

Introduction

This week's activity was all about conducting a flight with an Unmanned Aerial System. Last week we focused on the planning and pre-flight checks that need to be done before a mission but were unable to fly because of poor weather conditions. This week the students went through all the pre-flight and mission planning procedures again to emphasize safety and awareness of surroundings, and then we conducted the missions they planned. I was the Pilot at the Controls (PAC), actually piloting the flights, for all of the missions, with other students serving as the Pilot in Command (PIC) at the ground station and as spotters. While I was working with students on creating missions to fly with the Matrix (Figure 1), Dr. Hupy was showing another group of students how to plan and fly a mission with the 3DR Iris (Figure 2), and Dr. Pierson was going over the basics of batteries with the other students. All the students rotated through each of these stations over the duration of the class. Dr. Pierson also brought along an RC plane (Figure 3) just like the one the students are currently building in class to show them how it flies and give them some tips. It was a busy class period, but I think the students really gained a lot of knowledge through everything being hands-on and interactive.
Figure 1 This is our Matrix platform. It is a quad-copter that is very stable and can carry a decent payload. It has a flight time of about a half hour, which gives us the capability to fly larger missions with it.
Figure 2 This is the 3DR Iris. It is an out-of-the-box, ready-to-fly platform. It is easy to deploy and highly automated, so it is easy to use. It has about a 15 minute flight time and can carry a GoPro or the GEMS sensor, so we use it for small 5 to 10 minute missions.
Figure 3 This is the model of plane that Dr. Pierson brought and that the students are learning how to build and fly in the class. It is made by a company called Test Flite. It is a very basic and easy-to-fly RC plane for beginners.


Study Area

The class met at the Eau Claire Indoor Sports Complex again at the soccer fields (Figure 4) as we have in weeks past. The weather forecast changed about 10 times, so we didn't know if this activity was going to happen or not, but the weather turned out to be perfect for conducting the missions. It was very cloudy and a little bit hazy, with a dew point around 70°F. Rain was expected sometime in the evening; however, we got class in before it rained. There was very little to no wind, which is ideal for flying a UAS. The lack of wind also allowed Dr. Pierson to fly his RC plane. It needs to be almost calm to be able to fly because the plane is so small and light.
Figure 4 This is the soccer complex here in town where we had class again this week as in weeks past. The red area is where the flights were conducted.

Methods

Many of the procedures this week were the same as or very similar to last week's, because the students went through the pre-flight and mission planning last week; we just didn't get a chance to actually fly. Please refer to my Activity 2 blog for the methods related to mission planning and the pre-flight checklist. This week was a little bit different, however, because the students got to see the sensors and cameras that take an Unmanned Aerial Vehicle (UAV) and make it an Unmanned Aerial System (UAS). A UAV with no cameras or other sensors to collect data is basically a glorified, really high-end toy. You can fly them around, but what is the point if you aren't collecting data of some kind to be used later? The real money in this field is in the data, not in the platform. The students got some time seeing and working with the different sensors we use for collecting data. We used four sensors during class: the GoPro HERO3, the Canon SX260, the Canon S110, and the GEMS sensor from Sentek Systems in the Twin Cities. The GoPro (Figure 5) is a very easy way to collect aerial images and is especially good for capturing video in a follow-me type application. It is light and compact, so it can be put on almost any platform; however, it cannot be triggered and must collect in continuous interval mode. It doesn't work very well when trying to collect nadir aerial imagery. The Canon SX260 (Figure 6) and S110 (Figure 7) are much better suited for collecting nadir imagery. They both geotag the images as they are collected, which works much better with the image processing software, and they both can be triggered to take images or be set on a time interval collection like the GoPro. They are 12 and 16 megapixels respectively and have a fairly wide view, which makes them good for increasing the distance between flight paths while still getting enough overlap. The GEMS (Figure 8) is a custom sensor built by Sentek Systems out of the Twin Cities.
It is a self-contained unit that has its own GPS, accelerometer, and dual cameras, RGB and infrared, which makes it ideal for collecting aerial imagery. The sensor is also very light, allowing us to put it on our big platforms like the Matrix, a small platform like the Iris, or even in a fixed-wing aircraft. One drawback to this sensor is its view width. It is very narrow and basically only gathers data directly below the sensor, which increases flight time because the flight paths have to be so close together to get the proper amount of overlap. Better pixel quality from the cameras would also be a nice addition to this sensor. The ease of use is what has made this sensor our go-to for data collection in the past.
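The effect of a narrow view width on flight planning can be sketched with a little geometry. This is a rough illustration only; the field-of-view angles below are made-up comparison values, not the actual specs of the GEMS or the Canon cameras.

```python
import math

def ground_footprint(altitude_m, fov_deg):
    """Width of ground (meters) covered by a nadir-pointing camera."""
    return 2 * altitude_m * math.tan(math.radians(fov_deg / 2))

def line_spacing(footprint_m, side_overlap):
    """Spacing between parallel flight lines for a given side overlap (0-1)."""
    return footprint_m * (1 - side_overlap)

# Hypothetical FOVs for comparison only -- not real sensor specs.
wide_fp = ground_footprint(70, 60)    # wide-view camera at 70 m: ~81 m swath
narrow_fp = ground_footprint(70, 20)  # narrow sensor at 70 m: ~25 m swath

# At 80% side overlap the narrow sensor needs flight lines roughly three
# times closer together, so many more passes over the same area.
print(line_spacing(wide_fp, 0.8))    # ~16 m between lines
print(line_spacing(narrow_fp, 0.8))  # ~5 m between lines
```

The same calculation explains why the wide-view Canons let us spread the flight lines out while the GEMS forces a tighter, longer flight.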
Once the students had been introduced to these different sensors, it was time to fly. Four flights were conducted with the Matrix: two with the GEMS sensor and two with the SX260 camera. One of the SX260 cameras we have is infrared, so one flight was flown in RGB and then the same flight path (Figure 9) was flown with the infrared SX260. All of the data collected is on the computers at school and will be processed later this semester when we move from the data collection portion of the class to the analysis and processing portion. The Iris was also flown by Dr. Hupy, who demonstrated the mission planning procedure for that platform and conducted an auto mission with the GoPro attached to gather the data. He also demonstrated some of the other functionalities of the Iris, including the follow-me function, which is often used by people who are more interested in creating videos than in collecting still images.
Figure 5 This is the GoPro HERO3. We use this every once in a while for imagery collection, but we have better sensors that are easier to work with when it comes to imagery processing.
Figure 6 This is the Canon SX260 camera we use for much of our data collection. 
Figure 7 This is the Canon S110, which is pretty similar to the SX260 but slightly smaller and 16 megapixels compared to 12.
Figure 8 This is the GEMS sensor made by Sentek Systems. You can see the two cameras side by side; it collects RGB and IR images simultaneously.
Figure 9 This is a flight path similar to the ones flown during class to collect the imagery. This one is with the Canon SX260 camera on the Matrix platform at 70 meters, with a speed of 5 m/s and an approximate flight time of 5 minutes.

Discussion

This exercise really made this class and material more interesting for me, and from the comments I heard during class from other students, they are starting to understand the different parts and procedures of this technology. This comes back to the hands-on approach of this class. It was disappointing last week when we didn't get to fly; however, I think splitting the mission planning and pre-flight checks from the actual flying of the mission is a good thing. It gave the students two chances to really learn the procedures, and it also tested how much they remembered from last week when they had to use those skills this week to do the flights. As the GEI technician, and in doing my research with Dr. Hupy, I have flown many flights, but it is always exciting when we get the chance to go and fly. Weather and mechanical obstacles have often prevented us from being able to collect data, but when we finally get a chance and it all goes smoothly, it is fun. It is rare to find an occupation that is very serious and demanding but also fun, and that is what data collection with UAS is. We strictly follow our safety procedures and we document all of our flights, but at the same time it is fun because I enjoy what I am doing. I think this week the students got to experience that too. Last week was very serious about the safety and planning aspect, and this week they got to actually see missions be flown, which is fun and exciting yet serves a good and serious purpose. The next couple of weeks, when we get to process the data, will be cool too, because you get to see a final product of your work. The students will know how to go through the whole UAS process from pre-flight all the way through data processing and analysis, which is a very valuable skill set to have once they are in the workplace.

Wednesday, September 23, 2015

Activity 2: Using a multi-rotor to gather imagery/Using mission planning software

Introduction

In this week's activity we explored the topic of mission planning and preparation for a data collection mission using an Unmanned Aerial System. We started in the lab getting the basic knowledge needed to plan a mission in the 3DR Mission Planner software. Then the students were split into groups of three, and each one experienced the different positions and roles that members of a flight crew serve during an actual mission. They went through all the pre-flight checks, planned the mission, and finished by arming and disarming the UAS. We were unable to fly because of poor weather conditions but hope to do that in the upcoming weeks of class. My role in this exercise was to watch as the students did all the inspections and to give them assistance and tips as needed. As the UAV technician and pilot for the geography department, I have done this same procedure many times during real flights, but helping teach the students the process forced me to pay more attention to detail and not just go through the motions.
Having the students learn these procedures and techniques gives them a much better understanding of everything that is involved in data collection with UAS. It is more time consuming and complex than many people think. There are many safety, privacy, and other considerations that need to be taken into account before, during, and after a mission. Seeing this whole process from start to finish gives them a new area of knowledge which they could apply to research here at UWEC or out in the workforce once they graduate.

Study Area

This exercise was conducted here on campus in the campus mall area (Figure 1). The weather was not ideal, which is why we did not fly any of the missions the students created, as previously planned. It was in the mid 70s, very cloudy, and humid, with winds around 5 MPH out of the SSE. Since we were on campus, there were many people walking past and observing what we were doing. This is a bit of a safety hazard, but since we were not actually flying the missions it was OK. Being aware of your surroundings at all times when out in the field is essential, especially when using UAVs. A few of the platforms we use could very seriously injure or kill someone, so vigilance and taking every precaution to make sure the study area is clear are vital to safe data collection. Another reason we would not have flown in this area is that there are numerous trees, bushes, garbage cans, and other objects that could get in the way of landing or takeoff. GPS health is another concern in the campus mall because there are so many tall brick buildings. Large structures will often reduce or even completely block the GPS signal from the UAS platform. Strong GPS health is vital to a successful mission, so again the campus mall is not a good place to fly. I hope all the students made some of these observations as well. Being a pilot is much more than just knowing how to fly; you have to be aware of everything going on around you: weather, people, obstacles, and the aircraft.
Figure 1 This is an aerial view of lower campus at UWEC. The orange square labeled study area is where the activity took place. As you can see there are lots of large buildings around this area which make it a very bad place to fly. Issues with GPS health and also being able to see the UAV as it is flying are two of many factors that make this a very poor place to actually fly a mission. 

Methods

Planning Missions

The first part of the activity was indoors in the lab, learning how to use Mission Planner, which is open source software created by 3DR. The first step to planning a mission is knowing where you want to fly, i.e., your study area. Once you have your study area chosen and travel to that location, the next thing to do is create your flight path and data collection mission. When creating the mission there are a few things to keep in mind as far as legal areas to fly. Built into this software are the no-fly zones in the United States. No-fly zones are areas over airports, military bases, the White House, and other important high-security areas. It is illegal to fly a UAV in these areas, and if caught you can be severely fined or sent to jail. In Mission Planner these areas show up as pink circles and purple areas (Figure 2). Make sure your study area is not in one of these spaces. The Mission Planner software communicates with the autopilot on board the UAV and tells it where to fly. There are many other parameters that can be set in this program as well, such as altitude, flight speed, how often to take a picture, and what to do at each waypoint (ascend, descend, land, etc.). To draw a flight path you draw a rectangle with the mouse on the screen, and then, based on the parameters you have set, Mission Planner fills in the flight grid (Figure 3). You can also manually enter the waypoints like I have done in Figure 4. Once it creates the grid it will tell you an estimated flight time, which is good to know because these vehicles do have a limit on flight time. Once you have your mission drawn, you can get ready to fly with the next step.
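The flight time estimate Mission Planner gives can be approximated by hand from the same parameters. Here is a minimal sketch that ignores turns, climb, and descent; the grid dimensions in the example are made up for illustration.

```python
import math

def estimate_flight_time_min(area_width_m, area_length_m, line_spacing_m, speed_ms):
    """Rough flight time in minutes for a back-and-forth survey grid.

    Total path length is the number of parallel lines times the line
    length; turns, takeoff, and landing are ignored.
    """
    n_lines = math.ceil(area_width_m / line_spacing_m) + 1
    total_path_m = n_lines * area_length_m
    return total_path_m / speed_ms / 60

# Example: a hypothetical 100 x 150 m field, 20 m between lines, at 5 m/s.
print(estimate_flight_time_min(100, 150, 20, 5))  # -> 3.0 minutes
```

Halving the line spacing roughly doubles the estimate, which is why checking it against the battery's flight time limit matters before launch.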
Figure 2 This is showing those no fly zones that are built into the software. You can see the pink circles indicating airports and the purple circles are over government buildings and other no fly areas.

Figure 3 This is what an auto waypoint mission looks like. I tell Mission Planner the area, sensor, altitude, and platform, and it creates a mission grid based on those parameters. Just like when I manually made a mission, you can see the list of waypoints at the bottom of the screen; I can still go in and make changes to those even though the mission was auto-created.

Figure 4 This is a screenshot of Mission Planner. I have manually drawn out a flight path for a mission over the soccer park here in Eau Claire. You can see at the bottom of the image where you can set all the parameters for each point in the flight. If I were connected to an aircraft, it would also tell me an estimated air time for this flight. It is important that these flight lines are close enough together to ensure the correct amount of image overlap. We shoot for 80% overlap, and this program will adjust the flight path to accommodate that depending on which sensor you are using.

Pre-Flight Checklist and Notes

This next step is the most important. After some crashes and other growing pains in our research with UAVs, Professor Hupy and I came up with a pretty extensive checklist to go through prior to putting any of the UAVs in the air. Once the mission is created and the UAV is assembled, going through the checklist is essential. We use the checklist shown in Figure 5. We check the weather, all connections both frame and electrical, props and motors, battery and antenna, controller and command center, battery voltage, number of satellites, and that the mission was sent to the UAV; then we check the takeoff area for people and are clear for launch. I haven't listed the whole checklist, but this gives you an idea of how many things we check. We check everything we can think of every time we fly. This is the golden rule of using UAVs. It is overlooked many times, and that leads to accidents which cost money and can damage property or even kill someone. Notice how the majority of this checklist happens before the aircraft is powered up; there is a reason for that. Once the UAV is powered up, the less time someone spends around it adjusting things the better. An accidental arming of the aircraft could result in loss of fingers or worse, depending on the platform. After every flight we also take notes about the flight in a log book we have created. This covers battery charging, post-flight maintenance of the vehicles if needed, and data processing. Figure 6 is a screenshot of the Excel sheet we use to keep track of all of our flight data.

Figure 5 This is the pre-flight checklist we go through before every flight. We do not fly until there is a check by each row. Like I stated earlier, this helps us fly safely, which is many times overlooked because people think UAVs are just toys and that they aren't dangerous. We know the dangers and use our checklists!
Figure 6 This is the spreadsheet where we keep all the information about each flight. We record battery voltages, weather info, maintenance needs, and suggestions for what to do better on the next flight.

Mission Execution

We did not fly any missions in class, but this is the procedure if we had done so. Once the pre-flight checklist is complete, the next step is to fly the mission. Part of the checklist is sending the mission to the UAV's autopilot. Once this is done, the aircraft is armed. After we are sure the area is safe, we prefer to take off manually to a height of about 10 meters and do a stability and normal function test. We place it into loiter to see if it is functioning properly and can hold its position; if everything looks good, it is flipped to autopilot and it begins to fly the previously created flight path. During the flight there are a few different jobs the crew flying the UAV needs to perform. These are the three positions we had each of the students go through to learn what their role would be during an actual flight. One is the Pilot at the Controls (PAC), who has the controller in their hands and is piloting the aircraft. Another is the spotter, who keeps an eye on the UAV at all times while it is in the air to make sure it is flying normally. The third person is the Pilot in Command (PIC). This person sits at the command station and watches the diagnostics on the Mission Planner screen (Figure 7). The PIC watches diagnostics like battery voltage, number of satellites, altitude, air speed, pitch, yaw, and roll, and verifies that the aircraft is following the planned flight path. If anything looks suspicious, this is the person who can abort a mission. Conducting research with Dr. Hupy, we have aborted missions and safely landed. One time I was at the laptop and saw that the UAV was deviating from the flight path, and the spotter observed it acting rather strange in the air. I made the decision to say "abort mission," and the Pilot at the Controls, Professor Hupy, said "aborting mission" and put the aircraft in RTL mode. RTL stands for return to launch.
If at any time during the mission this switch is flipped, the UAV stops what it is doing, returns to where it took off from, and lands itself. The mission may not have needed to be aborted; however, as a safety precaution I decided to abort.
Figure 7 This is the display that the Pilot in Command watches while the UAV is in the air. You can see that altitude, speed, battery voltage, and other vital information are displayed here. If there is a catastrophic failure on board the UAV, it will tell you on this display. If the GPS failed, for example, where it says DISARMED on the screen it would say NO GPS or GPS FAILURE in big red letters, at which point the mission would be aborted.

Discussion

I think this activity was very good for showing the other students how complex a procedure collecting aerial imagery with a UAS really is. It is not an activity that should be taken lightly or done by someone who does not have much experience in the field, because it can be very dangerous for both the flight crew and observers if done incorrectly. I hope it sparked some interest in the students to pursue this technology as part of their research. It is a very valuable skill set to have and will make them much more marketable once they graduate from UWEC. In my time as the GEI technician, I have seen opportunities and had many valuable experiences in connection with UAS technology that have increased my knowledge of the topic and helped me to better connect it to my GIS and other geography techniques. I hope we can conduct an actual mission with the students in the next few weeks so they can see how this process goes from start to finish, and even into the processing of the imagery collected during that flight in class.

Sunday, September 13, 2015

Activity 1: Image Gathering Fundamentals: Using a Kite (Weather Balloon) to Gather Aerial Imagery

Introduction

This was the first week of this Unmanned Aerial Systems class in which we did a field activity. This week we focused on collecting aerial imagery in an old-school fashion. This served the purpose of demonstrating to the students that a more complicated and automated Unmanned Aerial Vehicle (UAV) isn't the only way that useful aerial data can be collected. In the next few weeks they will be working with and learning how the same techniques used in this week's low-tech activity also apply to the much more advanced and technical platforms available today for data collection.

Study Area

For this week's activity we went to the Eau Claire Sports Center complex (Figure 1), which is about 5 minutes from campus. The complex has a large area of multiple soccer fields, which was good for this activity, giving us plenty of space to collect data in. It was a cloudy day with temperatures in the lower 70s and very little to no wind. The original plan was to fly a very large kite (Figure 2) with the cameras attached, but because of the lack of wind a weather balloon (Figure 3) filled with helium was used instead.
Figure 1 is a Google Earth image showing the sports center complex (lower left) and the soccer fields we used during this activity. The yellow box is the area that we walked back and forth over with the balloon to collect imagery. The huge amount of open space was ideal for this exercise. 
Figure 2 The original plan was to use a kite very similar to the one pictured here. It is about 6 feet across, which helps with lift and makes it more stable.

Figure 3 This is a weather balloon similar to the one used in this activity.

Methods

The methods for this activity are fairly straightforward and not overly complex. Dr. Hupy assigned some pre-class readings on how to set up the balloon with cameras using a device called a Picavet rig (Figure 4). The purpose of the Picavet rig is to keep the cameras facing down at all times to collect vertical imagery. The strings are tied in such a way that they can slide back and forth as the balloon or kite tips, keeping the cameras 90° to the ground at all times (Figure 4). Dr. Hupy had a Picavet rig set up from past activities that we used for this activity; however, he showed the students how it was strung up to help them understand the concept. Once the rig was set up, the next step was to inflate the balloon. The balloon was filled with helium, which is lighter than air, so that the balloon would float. We had to put in enough helium so that the balloon would not only float but also be able to carry the two Canon SX260 cameras (Figure 5) mounted on the Picavet rig. There are two cameras because one is a regular true color (RGB) camera and the other is an infrared camera. This allows us to look at the area in normal color but also in infrared, from which we can do vegetation health analysis. Once the balloon is inflated, a string is attached so that we can control how high the balloon goes and where it goes. The Picavet rig is attached to the string about 15 feet below the balloon. For our purposes the balloon was flown at approximately 50 meters (about 165 feet). This is a good height which allows us to get the desired amount of overlap between images. Image overlap is essential for creating accurate and usable data layers, such as a mosaic of the whole area where data is collected. 75% overlap is a good amount, but during this activity we were getting around 80 to 85% overlap. Again, that overlap comes from flying at an adequate altitude as well as the correct spacing between passes.
Data was collected by walking straight lines back and forth across the soccer complex, keeping the balloon at a constant altitude and keeping the same spacing between passes, which in this case was 10 meters (about 33 feet). Figure 6 is a video of the balloon while it was collecting imagery.
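The side overlap we were seeing can be checked with a quick calculation. This is a sketch under stated assumptions: the ~75° horizontal field of view used below is a rough figure I am assuming for a compact camera like the SX260, not a value from its spec sheet.

```python
import math

def footprint_width_m(altitude_m, fov_deg):
    """Ground width covered by one image from a nadir-pointing camera."""
    return 2 * altitude_m * math.tan(math.radians(fov_deg / 2))

def side_overlap(footprint_m, pass_spacing_m):
    """Fraction of each image shared with the neighboring pass."""
    return 1 - pass_spacing_m / footprint_m

# Assumed ~75 deg FOV at 50 m altitude, with 10 m between walking passes.
fp = footprint_width_m(50, 75)  # ~77 m of ground per image
print(round(side_overlap(fp, 10), 2))  # ~0.87, consistent with the 80-85% we saw
```

A wider FOV or a higher balloon both grow the footprint, which is why a generous altitude makes the 10 m pass spacing forgiving.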
Figure 4 This is a diagram of how to string up a Picavet rig. It is very simple to set up and is made of simple string, a metal cross piece, and some special clips used to attach it to the string on the balloon.


Figure 5 This is a Canon SX260 camera like the one used in this activity. It is a fairly inexpensive camera that takes good quality images and is easy to use. This camera is bolted to the metal plate on the Picavet rig at 90° to the ground, with the lens pointing at the ground.
Figure 6 This is a video Dr. Hupy took with a DJI Phantom, which is a small drone with a very good camera on it for collecting video and images. You can see the weather balloon and the cameras attached to the Picavet rig.

Discussion

Later in the semester we will be processing the imagery we collected during this activity, and it will be interesting to see how the data turns out. With a multi-rotor or fixed-wing platform, the amount of side-to-side movement the cameras experience is pretty minimal compared to a platform like a balloon, where the Picavet rig and cameras can sway side to side quite a bit. The Picavet rig is supposed to minimize that movement, but we will see how well it did when we process the data. Another consideration when using this platform compared to a more sophisticated one is the time between images. With a multi-rotor moving at 15 to 20 MPH, you will want pictures taken more often, usually around every 0.7 seconds; with a slow-moving balloon, that can be adjusted to every 3 to 5 seconds or even longer, depending on how fast you are walking. The speed and timing of image capture is crucial because, just as we want 80% overlap side to side on the images, we also need the images to overlap front to back a decent amount. Again, it will be interesting to see how good our timing and overlap were when the data is processed in a couple of weeks.
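The difference in trigger interval between platforms falls straight out of the forward-overlap requirement. A minimal sketch; the 30 m forward image footprint below is an assumed value for illustration, not a measured one.

```python
def trigger_interval_s(footprint_m, speed_ms, forward_overlap):
    """Seconds between shots so consecutive images share the given overlap."""
    return footprint_m * (1 - forward_overlap) / speed_ms

# Assumed 30 m forward footprint and 80% forward overlap.
print(trigger_interval_s(30, 8.0, 0.8))  # multi-rotor at ~8 m/s: ~0.75 s
print(trigger_interval_s(30, 1.5, 0.8))  # walking a balloon at ~1.5 m/s: ~4 s
```

Under these assumptions the numbers land close to the intervals mentioned above: well under a second for the fast multi-rotor, and a few seconds at walking pace.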

Processed Data

I ran the imagery data we collected during this exercise, and it turned out pretty well. Figure 7 is a 3D mosaic of all the RGB photos taken during the exercise. There were 570 photos that I brought into an image processing program called Pix4D, which takes all those individual JPG files and combines them into one large image. It does this by using the geotags that the camera places on each image. A geotag is the lat/long position where the image was taken. Figure 8 is the same mosaic, but you can see in the red circles that there are some areas of the image that the program didn't mosaic properly. There are a large number of reasons why this happens, both data collection issues and software errors. In this instance you can see some issues with elevation. Just as the camera records a geotag, it also records an elevation value for each image. If these values get thrown off, you will get areas in the image like the red circles, where the ground is obviously flat but the mosaic shows drastic elevation changes. These errors can be erased, so they aren't too big of a deal. For being attached to a balloon, this data turned out extremely well. Figure 9 is a 2D version of the mosaic brought into ArcGIS, where it can be manipulated and have various GIS tools and processes run on it, such as a simple elevation map (Figure 10). Along with these RGB images there were also 570 near-infrared images taken that I am currently working on mosaicking. From what I can tell so far, the mosaic isn't going to turn out as well for the IR, but we will have to wait and see until it is done processing.
Figure 7 This is the 3D mosaic created in Pix4D of all the RGB JPG files taken with the weather balloon of the Eau Claire Sports Center soccer complex.

Figure 8 This is the same 3D mosaic but you can see the errors discussed above in the post.

Figure 9 This is the 2D version of the mosaic brought into ArcGIS.

Figure 10 This is a very simple elevation map created from the DSM file, which is the elevation file created when the images were run in Pix4D. The green areas are higher elevation and the red areas are lower elevation. The green area circled in black is one of the errors seen in the 3D mosaics above.


Conclusion

This was an interesting exercise for me. I am quite familiar with collecting aerial imagery using UAVs, as I am the technician for the Geography departments UAV lab and program, however I have never collected imagery in this low tech way. All of the concepts are the same it just comes down to making slight adjustments for the difference in platform technology, speed and stability. It is good for the students to see this method before they see the more high tech platforms later in the semester so that they realize that collecting aerial imagery can be done cheaply and effectively without using a platform that costs thousands of dollars. It also gives them a good idea of how drastically the technology has advanced in this area over the last couple years.