This is the course outline.
Welcome to Lesson 1! In this lesson, you will become familiar with the history behind the use of the UAS and with the current status of UAS development. You will also be introduced to the different classes of UAV/UAS according to their size, weight, and missions.
By the end of this lesson, you will have a working knowledge of how unmanned aerial missions began, the current status of the technology, and the classes of UAV/UAS.
At the successful completion of this lesson, you should be able to:
Lesson Activities
In this section, you will learn about the history of UAS development and its introduction to civilian and military applications.
The history of flying objects, or the unmanned aerial vehicle in its rudimentary forms, extends way back to ancient civilizations. The Chinese, around 200 AD, used paper balloons (equipped with oil lamps to heat the air) to fly over their enemies after dark, which caused fear among the enemy soldiers who believed that there was divine power involved in the flight.
In the United States, during the Civil War, both Union and Confederate forces launched balloons laden with explosives and attempted to land them in supply or ammunition depots and explode them.
As a matter of fact, the idea of unmanned aerial objects long predates manned flight, for the obvious reason of removing the risk of loss of life associated with these experimental craft. In modern times, the idea of unmanned flying objects has come to mean aerial vehicles, or aircraft, flown without a pilot on board. Thanks to advancements in technology, the maneuvering and control of piloted flight can now be sufficiently mimicked.
Names like aerial torpedo, radio-controlled vehicle, remotely piloted vehicle (RPV), remote controlled vehicle, autonomous controlled vehicle, pilotless vehicle, unmanned aerial vehicle (UAV), unmanned aircraft system (UAS), and drone are names that may be used to describe a flying object or machine without a pilot on board.
The main challenge that faced early aerospace pioneers of piloted and pilotless airplanes alike was the issue of controlling flight once the flying object was up in the air. The Wright Brothers (1903), and at about the same time, Dr. Samuel Pierpont Langley, taught the aviation world a lot about the secrets of controlled flight. Afterwards, the war machine of WWI put intense pressure on inventors and scientists to come up with innovations in all aspects of flight design including power plants, fuselage structures, lifting wing configurations and control surface arrangements. By the time WWI ended, modern day aviation had been born.
In late 1916, the U.S. Navy funded the Sperry Gyroscope Company (later named the Sperry Corporation) to develop an unmanned aerial torpedo that could fly a guided course of 1,000 yards and detonate its warhead close enough to an enemy warship. Almost two years later, on March 6, 1918, after a series of failures, Sperry's efforts succeeded: an unmanned torpedo flew a 1,000-yard course in stable, guided flight, dived onto its target at the desired time and place, and was later recovered and landed. With this successful flight, the world's first unmanned aircraft system, the Curtiss N-9, was born.
In the late 1930s, the U.S. Navy returned to the development of drones, highlighted by the Naval Research Lab's development of the Curtiss N2C-2 drone (see Figure 1). The 2,500-lb biplane was instrumental in testing the accuracy and efficiency of the Navy's anti-aircraft defense system.
World War II accelerated the development of aviation science in general and the unmanned aircraft in particular. Both the Germans and the allies successfully utilized unmanned combat aircraft. The most extensive program came about during the Vietnam War, as advances in technologies made UAVs more effective. Ryan Firebee drones by Teledyne-Ryan Aeronautical of San Diego, California were flown extensively over North Vietnam and conducted various tasks, such as reconnaissance and signals intelligence missions, leaflet drops, and surface-to-air missile radar detection.
More recently, US forces have used drones in the wars in Bosnia, Iraq, and Afghanistan, and drones remain in continuous use in the war on terrorism around the world.
The way a pilotless aircraft is controlled determines its categorization. In general, there are three main names for pilotless aircraft:
Whether it is named a UAV, an RPV, or a drone, at a minimum, the pilotless aircraft should include the following elements:
More details will be provided in Lesson 2.
Although studying the origins of UAS development is crucial to understanding the evolution of UAV/UAS, its status in modern times is what we’re concerned with in this course.
Historically, UAV/UAS development appeared only sporadically, and individual efforts lacked momentum and continuity. The pattern was for a UAV/UAS to serve a limited purpose and then be discontinued once that purpose was satisfied. The utilization of UAS during the Vietnam War is a good example of such a sporadic rise in the use of drones: very little UAS development took place after the war ended. This is not the case with the current period of unmanned aircraft development.
In the last two decades, UAV technology has matured rapidly, due mainly to advancements in GPS, inertial measurement units (IMUs), and electronics. Since the wars in Bosnia, Iraq, and Afghanistan, the pilotless aircraft industry has witnessed sizable and growing investment that continues to the present time.
The use of pilotless aircraft in Desert Shield (1990) and, later, Desert Storm (1991) can be considered the first wide-scale deployment of UAS/UAV. During Desert Storm, some 500 UAS sorties were conducted to support intelligence gathering and to guide heavy artillery fire from battleships in the Persian Gulf. This success convinced militaries around the world of the usefulness of UAS in spotting enemy locations and directing artillery units.
Strong opposition to the use of UAS for defense purposes came from manned aircraft pilots and their leadership, who built their case on a real weakness of the technology: the vulnerability of the data link, especially for UAS that rely on line-of-sight operation. However, advances in space-based communications, especially GPS, weakened this claim, as a satellite data link makes the UAS no more vulnerable than a piloted aircraft.
The United States has committed valuable resources and investment to the development of the modern UAS. NASA has been deeply involved in such developments, as is clear in the following video clip.
This video has clips of the following types of UAVs.
Lifting Body Remotely Piloted Vehicle (Hyper III, 1969)
Remotely Piloted Research Vehicle (Piper PA-30, 1970)
F-15 Remotely Piloted/Spin Research Vehicle (SRV, 1975)
Drones for Aerodynamic and Structural Testing (Firebee/DAST, 1977)
Highly Maneuverable Aircraft Technology (HiMAT, 1979)
Controlled Impact Demonstration (Boeing 720 CID, 1984)
Expendable Air-Launched Orbital Booster (Pegasus, 1990)
Spacecraft Autoland Demonstrator (Space Wedge, 1991)
Environmental Research Aircraft & Sensor Technology (Perseus, 1993)
Environmental Research Aircraft & Sensor Technology (Theseus, 1996)
Environmental Research Aircraft & Sensor Technology (Altus, 1996)
Environmental Research Aircraft & Sensor Technology (Pathfinder, 1997)
Environmental Research Aircraft & Sensor Technology (Centurion, 1998)
Environmental Research Aircraft & Sensor Technology (Helios, 2001)
High Altitude Endurance Unmanned Aerial Vehicle (Tier III, 1996)
Tailless Fighter Agility Research Aircraft (X-36, 1997)
Lifting Body Crew Return Vehicle (X-38, 1998)
Space Maneuvering Vehicle (X-40 SMV, 2000)
Inflatable Wing Technology Demonstrator (I-2000, 2001)
High Altitude Long-Endurance Research Aircraft (Altair, 2002)
Scramjet Engine Experiment (X-43 Hyper-X, 2001)
Unmanned Combat Aerial Vehicle (X-45, 2002)
MQ-9 Predator B UAS (Ikhana, 2007)
Blended Wing Body [BWB] (X-48B, 2007)
RQ-4 Environmental Science Aircraft (Global Hawk, 2007)
UCAV Demonstrator (Phantom Ray, 2011)
Hydrogen Powered HALE (Phantom Eye, 2007)
Hybrid Wing Body [HWB] (X-48C, 2012)
The use of UAS in US Army combat operations grew from 51 operational UAS in 2001 to over 4000 in 2010. Studying the “US Army Unmanned Aircraft Systems Roadmap 2010-2035” can tell a lot about the importance of UAS in the US Army’s current and future activities. The Roadmap states that “Army UAS are the ‘Eye of the Army’ and support information dominance by providing the capability to quickly collect, process, and disseminate information to reduce the sensor-to-shooter timeline. In addition, UAS support tactical echelons in Army and joint operations and provide the warfighter a tactical advantage through near real-time situation awareness, multi-role capabilities on demand (including communications, reconnaissance, armed response, and sustainment applications), and the ability to dynamically retask.”
UAS use for daily civilian activities is no less important than military use. UAS are used around the world for different tasks, most recently including package delivery. The first video below (1:12) provides a fairly good idea of the different types and uses of the modern-day UAS. UAS commercial use outside the United States is growing rapidly, as illustrated in the second video clip. Growth in the commercial UAS market within the United States has been slower than one would like to see due to tight FAA regulations, which at the time of this writing largely prohibited commercial UAS operations without specific exemption. In a recent report, 6 Predictions for 2016: UAV Experts Discuss Important Developments for Commercial Drone Applications [4], Jeremiah Karpowicz of Commercial UAV News discusses the latest developments in the UAS market and technologies, as well as predictions on the status of UAS use for commercial applications.
Video of unmanned aerial systems taking off, in flight, and landing. The UASs are flying over mountains, neighborhoods, airports, roads, forest fires, lakes, oceans, and military operations. The following words appear on the screen, indicating how UASs are used: Disaster Response, Weather Monitoring, Maritime Security, Border Security, Firefighting, Drug Interdiction, Search and Rescue, and Wildlife Management.
[MUSIC PLAYING] BRIAN A. ANDERSON: Hey, it's Brian with Motherboard. I've got one word for you-- drones. Now, you've probably heard a little bit about drones in the news lately. These things fly all throughout the Middle East and the Horn of Africa. When they're not spying on suspected terrorists, they're probably killing them with Hellfire missiles.
But here's the thing. Drones are coming to the States. They're actually already here. They're being used to keep an eye on things, so they're not going to kill you, at least not yet. Motherboard has been fascinated with drones for a while now. It seems there's some misconceptions about the age of unmanned aerial vehicles.
To try and clear the air just a little bit, we're going to head out and talk to some people who are building drones, who are selling drones all over the world. With any luck, we hope to fly some drones, as well. We have absolutely no idea what we're getting into.
New York City captured by a Swiss drone hobbyist. As you're probably thinking, yes, this is illegal as all hell. And I'll be the first to say that doing this sort of thing over the site of the worst terrorist attacks on American soil? Probably not the best idea. The drone view that you've seen probably looks a bit more like this. Or more accurately, this.
The grainy, pixelated, bird's-eye views that Unmanned Aerial Vehicles, or UAVs, offer have become wildly popular on the internet. Maybe you've heard of the grim footage under its nom de YouTube, drone porn. How did we arrive at the robo wars? And where are they taking us?
To get an idea, we left our Brooklyn offices for Washington, DC, to meet up with PW Singer, one of the world's foremost experts on military robotics.
PW SINGER: We are wrestling with what it means to live, work, and even fight through a robotics revolution. The technology that we're using, with things like the Predator or the PackBot-- those are Model T Fords. Those are Wright brothers equivalents. But even with that first generation, we're seeing impact on questions like, how do we catch up our laws in war, but also how do we start to catch up our laws domestically as we start to see that technology move over to the domestic side?
We're seeing an evolution that is following many other technologies. And the story of the airplane is, I think, a good illustration of where we're at and the impact of the war on an industry that becomes a game-changer. The flying machine was once thought as mere science fiction. Then the Wright brothers make it real. Within a couple years, it's utilized in war.
In World War I, at the start, they're not armed. They're just used for observation. Then they jerry-rig arm them. Then they start to specially design them to be armed. And then by the end of World War I, you see all these other roles being visualized for planes that soon move over to the commercial sector. Passenger, postal delivery, medical evacuation-- you name it.
Same exact thing is happening with robotics. First science fiction, then becomes real. The Predator was originally unarmed, just used for observation. Then they jerry-rig arm it. Then they specially design them to be armed. Now we're seeing all sorts of other roles.
BRIAN A. ANDERSON: One of the latest developments in militarized drones is autonomy-- being able to tell your drone where to go and then basically setting the thing on cruise control until it gets there is a game-changer. At the same time, drone technology is doing what most any other killer app does as it proliferates. It's becoming smaller.
We actually noticed this evolution last year when Vice was in Amman, Jordan, home to SOFEX, the world's largest military weapons expo.
SHANE SMITH: You know when you were a kid, you used to have those little model airplanes and somebody's dad would be a real nerd and have a model airplane? Now, it's all model airplane-style drones that can take pictures or drop bombs.
BRIAN A. ANDERSON: We want to check out some of these drones and size up their market, so we decided to go back.
[AIRPLANE ENGINES WHINING]
Drones are becoming hot commodities for armed forces around the world. Some 600 companies from well over 50 countries are dabbling with drone tech for both spying and killing. And nowhere is this more evident than among the trade booths at SOFEX, where we first meet this guy, a rep for a Turkish drone company.
FATIH SENKUL: My name is Fatih Senkul. I am working for Atlantis Unmanned Vehicle Solutions developing unmanned vehicles, like Aeroseeker. Some photographers want to use it for surveillance purposes, military issues, and maybe some go track-and-seek missions. Some of the military, even the Turkish Army-- what's the payload? They asked. We said, 500 grams. So let's put a very little camera and just put 500 grams of bomb and they will do a suicide attack.
That's one of the issues they offered we hadn't thought of. This is something that the military's thinking.
BRIAN A. ANDERSON: If that sounds crazy, well, then there's this.
FATIH SENKUL: I am a fan of Terminator and I love these movies, and I really would like to see some of them in the future, like 2030 maybe. So I am trying to do my best to see them, yes.
BRIAN A. ANDERSON: Unlike Fatih, I'm in no rush to hasten the rise of the machines. The next guy we meet at SOFEX maybe isn't either. Then he says something almost as crazy.
CHRIS BARTER: To me, drone means you've got something that's operational on its own. It's kind of doing its own thing, like HAL of 2001: A Space Odyssey.
BRIAN A. ANDERSON: Here's hoping his robot, the Scout, has no intentions of becoming self-aware, like HAL, and refusing to open the pod bay doors. Now to be clear, the Scout is built by Aeryon Labs, a Canadian drone firm. Datron and its reps, like Chris, work with Aeryon on the supply side of the chain. Chris Barter is a drone dealer.
The Scout is the flagship UAV in Datron's suite of tactical robotics. Take one look at its size, and it's pretty clear that the Scout is nothing more than a surveillance system. It can fly at speeds up to 30 miles per hour. It's fully operational from negative 22 to 122 degrees Fahrenheit. And it can withstand wind gusts up to 50 miles per hour.
It's a compact, capable machine, and has been sought after by the likes of NOAA, the US Coast Guard, and FEMA.
PRESENTER 1: Is there anything that you can give me?
CHRIS BARTER: Like a hand--
PRESENTER 1: Yeah.
CHRIS BARTER: Like a brochure or something like that?
PRESENTER 1: I think we're going to hopefully be contacting you very, very soon.
CHRIS BARTER: Sounds like a plan.
BRIAN A. ANDERSON: After Chris closes the deal, he invites us back to Datron's headquarters just outside of San Diego. In addition to drones, he promises there's going to be some pretty decent surfing.
[AIRPLANE ENGINE WHINING]
CHRIS BARTER: I'm a pretty Buddhist guy. There's not much that makes me tick out there, outside of bad driving and bad surfing.
BRIAN A. ANDERSON: Would you ever use a Scout to shoot some pretty gnarly, big wave surfing footage?
CHRIS BARTER: Oh absolutely, man. That's actually one of my dreams, is to take it out to Pipeline or [INAUDIBLE].
BRIAN A. ANDERSON: So even though Chris exudes the calculating precision of a drone capitalist, he's a surfer dude at heart, and maybe even a drone hobbyist. And he isn't the only one who views drones as being a whole lot more than killers and spies. This is Justin Wellender. Notice those goggles he's putting on. Those allow for what's known as first-person viewing. So suddenly, what the drones camera sees transmits back to Justin's goggles, in effect allowing him to fly.
But we'll get back to the hobbyist later.
[MUSIC PLAYING]
Southern California has long been a hub of aerospace R&D. And today, drone firms like Datron are popping up all over the region. You can call it Drone Valley or even the Drone Zone. Datron's campus is in one of these cookie-cutter industrial parks. But soon enough we find the place and we're greeted by Chris and two of his colleagues. We'll get down to the brass tacks. Who buys a Scout?
CHRIS BARTER: I will not go into specific customers by name, but I can address customer bases that we will go after.
PRESENTER 2: The Scout is man-packable, and offers fast setup, ease of use, and hot-swappable payload capabilities. The snap-together assembly requires no tools, and total assembly-to-takeoff time can be measured in seconds.
CHRIS BARTER: We're targeting the guy, be it the law enforcement officer, be it the squad guy who's out in a combat theater, who doesn't want to rely on some guy flying a system in Las Vegas that's being launched out of an airport that's 7,000 miles away.
BRIAN A. ANDERSON: Unlike the Scout, most so-called hunter-killer drones are flown out of Force bases throughout the American West. Many people lose sleep over the thought of these hulking drones, but many others accept the new bug-splat warfare.
CHRIS BARTER: I have no qualms when I read the news about a drone strike in Pakistan. What troubles me is that people have a tendency to kind of lump in a lot of these unmanned systems, one with another. So a Scout, which is unarmed and will probably always be unarmed, is meant specifically for surveillance, will never be harming any individual, for the most part, unless any kind of accident.
PAUL WILSON: The unit really and truly flies itself. It just waits for us to tell it when to take off, how high to go, how fast to fly, where we want it to go, and what to look at. All of our status says we're OK. We've got a GPS accuracy of 2.6 meters. So we're ready to take off. It spins up. It says, I've done all my check, so now I'm ready to take off. So I take off.
The vehicle is very good at flying itself, and it just listens to the directions of how high we want it to go, where we want it to go to, and what we want it to look at.
CHRIS BARTER: We've had a lot of interest in special use cases, like in Nome, Alaska, where they actually had an oil tanker trying to ship oil into Nome. Unfortunately, the harbor froze really early in the year. And what they actually did with the Scout was they took it off and they took photographs of the ice surrounding this harbor. And using post-processing software, they were able to actually map out the sea ice thickness so they could navigate this tanker accordingly. So it's a pretty diverse system.
BRIAN A. ANDERSON: It takes some convincing, but eventually Chris and his team let us take this diverse system of theirs for a spin. So I'm gazing up at this airborne robot, only to see it looking back down at me. I begin to feel the sting of my own privacy potentially being compromised, and I can't help but wonder if Chris and Datron feel the same.
CHRIS BARTER: Yes, we do empathize with the security and privacy rights. But we're more so focused on supporting that agency, supporting that firefighter, or supporting that law enforcement officer going into the building who needs to know either what's happening in that building, in a tactical type of situation, or what's happening on the other side. So really, it's in the court of public opinion how that gets flushed out.
BRIAN A. ANDERSON: Datron doesn't want to talk about privacy, but Chris hopes everyday civilians will come to see something like the Scout as a friend, not big brother.
CHRIS BARTER: As we deploy these into real world environments, what I hope happens is that people obtain an understanding of how these systems are actually working for them, as opposed to against them.
BRIAN A. ANDERSON: How do you think we did?
PAUL WILSON: You guys did pretty good. You took off and you landed exactly where you wanted to, and you didn't crash a thing. You did good.
BRIAN A. ANDERSON: No blood.
PAUL WILSON: No blood. No blood, no dents, no scratches.
BRIAN A. ANDERSON: Now that we've gotten a glimpse of the defense and professional side of this equation, we decide to check out some of the folks at the leading edge of hobby drones. A few miles down the road from Datron is 3D Robotics, a company that represents a drastic culture shift in drone tech. Alan and Sam, two engineers of the company, give us a quick run of the lab.
ALAN SANCHEZ: This is where we design all the frames, the autopilot, all the circuits, and also where we play around. So this is just where everything starts. And then the manufacturing, shipping, and testing's on the other side.
So right now, the word drone I feel has a negative connotation, especially with all the wars that have been going on, and military drones being the most common use of the word. But really, a drone is a machine that can pre-program or that has a level of autonomy that can do a job that the user can't do or doesn't want to do.
So what we're doing is turning regular RC aircraft, or even helicopters, quadcopters, into autonomous vehicles. With our autopilot, you just drop it into your existing vehicle and turn it into a fully autonomous aircraft, something that wasn't available for the masses before. And then what to do with that? That's where the user comes in. We're selling the tool, and it's up to the user to come up with a use for it. And you go buy scissors and do something bad with it, so it's basically the same thing.
CHRIS ANDERSON: I'm Chris Anderson. I'm the co-founder of 3D Robotics and I founded DIY Drones, the community that spawned us initially. This is not my day job. My day job is I'm the editor of Wired.
BRIAN A. ANDERSON: Shortly after taping this interview, Chris Anderson announced his departure from Wired to focus on 3D Robotics full-time.
CHRIS ANDERSON: Well, what you're looking at here is what we think of as something like the Apple from 1977, coming out of the Homebrew Computer Club, amateurs, hobbyists, not the IBMs of the day. It's the technology in your cell phone-- the sensors, and the processors, and the wireless, et cetera. The fact that this has become cheap, and available, and ubiquitous is the enabling technology of the personal drone movement.
And we don't come out of the aerospace industry. We definitely don't come out of military. We come out of the hobbyist world, and what you're seeing here is just a bottom-up, open source, community-based attempt to take a technology that was once a military-industrial one and democratize it, make it available to everybody, and introduce the word personal to drone.
BRIAN A. ANDERSON: Minutes later, we're heading to a nearby field that serves as one of the main proving grounds for 3D's aircraft. Alan and Sam bring along two drones-- a small quad copter and a more traditional RC glider. We're curious to see how these guys stack up against the pro model, like the Scout.
CHRIS ANDERSON: These things are light. The planes are foam. They hit you on the head, they'll just bounce off. They won't hurt you. But they don't have weapons. They can't carry anything very heavy. They're designed basically like radio-controlled toy airplanes, but they just have a brain.
ALAN SANCHEZ: Yeah. So these things take a while to get some altitude.
PRESENTER 3: [INAUDIBLE]
Yeah.
BRIAN A. ANDERSON: And just like that, our graceful flight is cut short. The guys are spooked by a small private plane passing through our airspace, which brings us to the Federal Aviation Administration's stance on drones.
ALAN SANCHEZ: Since the FAA doesn't really have rules for what we make, we just piggyback onto RC aircraft. And so we're limited by altitude. We can only fly 400 feet or below. We have to fly within line-of-sight. Just various rules that are there that maybe we can do away with because our drones are more capable than that.
BRIAN A. ANDERSON: While 3D's quad copter and the Scout might look similar, their differences far outweigh any similarities. 3D's shoots seemingly better looking footage than the Scout that we flew, for one thing. Just compare the two. Then again, what the Scout might lack in visuals, it makes up in durability. And of course, the GPS and the slick user interface allow for reconnaissance and search-and-rescue capabilities that put it above and beyond 3D's systems, which are more or less pimped-out aircraft.
We fly 3D's drones the old-fashioned way-- with RC controllers. But thanks to 3D's autopilot, these are autonomous aircraft, meaning that just like the Scout, you can tell your DIY drone where to go, let it get there by itself, and then regain control once it's at point B. I can see how it would be easy to drop out and kick it in the Drone Zone forever, but it's time we get back to New York.
[PLANE ENGINE WHINING]
So we've made it back to Brooklyn. Before this trip, a lot of my thinking about today's drone world came with a certain alarmism. And to a degree, I think it still does, and for good reason. When you're playing with toys like these, it can be hard to forget that drone technologies are evolving in large part to be really, really good at killing a lot of people.
ALAN SANCHEZ: So if you want it to come back to you--
BRIAN A. ANDERSON: Yeah, I will.
ALAN SANCHEZ: --take this switch and pull it all the way down.
BRIAN A. ANDERSON: So I think that rigging up big Predator and Reaper drones to incinerate innocent civilians, and American citizens on foreign soil, is too much of the stuff of war crimes and extrajudicial killings. And I certainly don't sit well with the thought of a spying robot peering into my apartment window. But when it comes to some of the tactical and hobbyist drone deployments that I saw out in Jordan and San Diego, I kind of caught the bug.
It's getting harder and harder to argue against the fact that for certain scenarios, drones just make sense. Think about Aeryon giving a couple of Scouts to Libyan rebels last year to help aid their fight against Muammar Gaddafi's forces. We all know how that story ended.
[GUNFIRE]
But beyond the war time theater, think of the myriad possibilities that drones open up for research, filmmaking, even the next generation of taco delivery. Or think about a guy like Justin, who's just really, really stoked on flying. My guess is what the domestic drone scape is going to look like in the next 5 to 10 years is about as good as yours.
But having spoken with people like Chris Barter and Sam and Alan out at 3D, I can say with relative assurity that drones are going to become more a part of our everyday lives than they already are. Should we be concerned about that? Absolutely. But so long as these drones are being put to legitimate uses, that's maybe not the worst thing, is it?
[DRONE ENGINES WHINING]
There is no single standard for the classification of UAS. (In this course, the terms UAS and UAV will be used interchangeably.) Defense agencies have their own standards, and civilians have their own ever-evolving, loose categories for UAS. Common approaches classify them by size, by range and endurance, or by the tier system employed by the military. The US National Aviation Intelligence Integration Office [9] website provides a good overview of the global UAS classification categories. Classifying by size yields the following sub-classes:
UAVs can also be classified according to the ranges they can travel and their endurance in the air, using the following subclasses developed by the US military:
According to the U.S. Department of Defense, UAVs are classified into five categories, as shown in Table 1:
Category | Size | Maximum Gross Takeoff Weight (MGTW) (lbs) | Normal Operating Altitude (ft) | Airspeed (knots) |
---|---|---|---|---|
Group 1 | Small | 0-20 | <1,200 AGL* | <100 |
Group 2 | Medium | 21-55 | <3,500 | <250 |
Group 3 | Large | <1320 | <18,000 MSL** | <250 |
Group 4 | Larger | >1320 | <18,000 MSL | Any airspeed |
Group 5 | Largest | >1320 | >18,000 | Any airspeed |
*AGL = Above Ground Level; **MSL = Mean Sea Level |
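The DoD grouping in Table 1 is essentially a set of weight thresholds. As a minimal illustrative sketch (the function and its name are my own, not an official DoD tool), the weight column of the table can be encoded as:

```python
def dod_uas_group(mgtw_lbs: float) -> str:
    """Classify a UAS by maximum gross takeoff weight, per Table 1.

    Weight alone cannot separate Groups 4 and 5, which share the
    >1,320 lb boundary and differ by normal operating altitude.
    """
    if mgtw_lbs <= 20:
        return "Group 1 (Small)"
    if mgtw_lbs <= 55:
        return "Group 2 (Medium)"
    if mgtw_lbs < 1320:
        return "Group 3 (Large)"
    return "Group 4 or 5 (altitude decides)"

print(dod_uas_group(4.2))    # a small quadcopter -> Group 1 (Small)
print(dod_uas_group(1950))   # an RQ-5A Hunter-sized system (~885 kg)
```

Note that a complete classifier would also check normal operating altitude and airspeed, since those columns of Table 1 are what distinguish the heaviest groups.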
The very small UAV class applies to UAVs ranging in size from that of a large insect to 30-50 cm long. The insect-like UAVs, with flapping or rotary wings, are a popular micro design. They are extremely small, very lightweight, and can be used for spying and biological warfare. Larger ones use a conventional aircraft configuration. The choice between flapping and rotary wings is a matter of desired maneuverability; flapping-wing designs allow perching and landing on small surfaces. Examples of very small UAVs are the Israeli IAI Malat Mosquito (with a wingspan of 35 cm and endurance of 40 minutes), the US Aurora Flight Sciences Skate (with a wingspan of 60 cm and length of 33 cm), and the Australian Cyber Technology CyberQuad Mini (42 x 42 cm) and its latest model, the CyberQuad Maxi. See Figure 1.1, below.
The small UAV class (also sometimes called the mini-UAV class) applies to UAVs that have at least one dimension greater than 50 cm but no larger than 2 m. Many of the designs in this category are based on the fixed-wing model, and most are hand-launched by throwing them into the air, as shown in Figure 1.2. Examples of members of this small UAV class are:
The AiRanger™ [14], by American Aerospace, shown in Figure 1.4. The AiRanger is a crossover UAV between a small and a medium-sized system.
Some of the UASs of this class are based on a rotary-wing design.
The medium UAV class applies to UAVs that are too heavy to be carried by one person but are still smaller than a light aircraft. They usually have a wingspan of about 5-10 m and can carry payloads of 100 to 200 kg. Examples of medium fixed-wing UAVs (see Figure 1.6, below) are the Israeli-US Hunter and the UK Watchkeeper. Other designs used in the past include the US Boeing Eagle Eye, the RQ-2 Pioneer, the BAE Systems Skyeye R4E, and the RQ-5A Hunter. The Hunter has a wingspan of 10.2 m, is 6.9 m long, and weighs about 885 kg at takeoff. The RS-20, by American Aerospace, is another example of a crossover UAV that spans the specifications of small and medium-sized UAVs. Many other medium UAVs can be found in the reading assignment. There are also a number of rotary-wing medium-sized UAVs.
The large UAV class applies to the large UAVs used mainly for combat operations by the military. Examples of these large UAVs are the US General Atomics Predator A and B and the US Northrop Grumman Global Hawk (Figures 1.7 and 1.8).
This class includes UAVs that have a range of 5 km, endurance time of 20 to 45 minutes, and cost of about $10,000 (2012 estimate). Examples of UAVs in this class are the Raven and Dragon Eye. UAVs in this class are very close to model airplanes.
This class includes UAVs that have a range of 50 km and endurance time of 1 to 6 hours. They are usually used for reconnaissance and surveillance tasks.
This class includes UAVs that have a range of 150 km or longer and endurance times of 8 to 12 hours. Like the close range UAV, they are mainly utilized for reconnaissance and surveillance purposes.
The mid-range class includes UAVs that have very high speed and a working radius of 650 km. They are also used for reconnaissance and surveillance purposes, in addition to gathering meteorological data.
The endurance class includes UAVs that have an endurance of 36 hours and a working radius of 300 km. This class of UAVs can operate at altitudes of 30,000 feet. They are also used for reconnaissance and surveillance purposes.
In this section, we will discuss the different missions of the UAS.
Naming the different missions for UAVs is a difficult task, as there are so many possibilities and there have never been enough systems in use to explore all the possibilities. However, the two main classifications for UAS missions are the following:
On the geospatial and mapping applications side, the UAS can be used for the following activities:
Military and civilian missions of UAS overlap in many areas. They both use UAS for reconnaissance and surveillance. In addition, they both use UAS as a stationary platform over a point on the ground from which to perform many of the communications or remote sensing satellite functionalities with a fraction of the cost.
We have now concluded the materials for Lesson 1, which walked us through the early history of UAS development. As is the case with most emerging modern technologies, we find the US defense program behind UAS development and its introduction to the civilian market. In addition, we learned about the different classifications for UAS. We also learned about the current status and the different applications of UAS.
One thing I would like to emphasize here is that there is no single civilian owner of a large UAS (such as those used by the military, which can be the size of a Boeing 737). In other words, there is a large gap between the size and sophistication of the UAS used by the military and those used by civilians, which are smaller and less sophisticated. I believe that the reason behind this gap is the strict regulation surrounding the operation of UAS in the National Airspace System (NAS). This gap will diminish once civilian UAS have access to the NAS.
As for this lesson’s readings, try to read as much as you can through the materials available on the Internet, as it is a great resource. There is no one good textbook available so far on the subject. That is why I recommend buying, if you can, the two supplementary references listed under the course requirements in addition to the designated textbook.
(Note: Unless it is an online quiz or assignment, all deliverables should be organized and submitted in a Word document. Figures should be scanned and inserted in the document.)
1 | Complete the Lesson 1 Quiz by the end of Lesson 2 |
---|---|
2 | Complete your participation in the discussion forum on the "Agreement and Differences in UAS Classification" detailed in Classification of the Unmanned Aerial Systems [19] by the end of Lesson 2 |
3 | Review the final project details in Canvas. |
Welcome to Lesson 2! In this lesson, you will become familiar with the elements that combine to create an operational Unmanned Aerial System (UAS). Most UASs consist of an Unmanned Aerial Vehicle (UAV), human elements, payload, control elements (for a larger system it will be a ground control station (GCS) or mission planning and control station (MPCS)), and a data link communication unit (Figure 2.1). Military versions of the UAS will have an additional weapon system platform and supporting soldiers as part of the human element.
In addition, you will understand and develop knowledge about the different acquisition and auxiliary aerial sensors that are usually carried on board the UAS payload. Finally, at the end of this lesson, you will have a working knowledge of the different components forming a UAS and how the different components relate and interact with one another, the data acquisition sensors, and the auxiliary sensors that accompany a UAS mission, such as GPS and IMU.
At the successful completion of this lesson, you should be able to:
The air vehicle is the airborne part of the system. The air vehicle here means the aircraft, in conjunction with the payload, that forms an Unmanned Aerial System (UAS). In general, the unmanned aircraft is usually called an Unmanned Aerial Vehicle (UAV) and can be either a fixed-wing or rotary-wing aircraft that flies without a human on board.
The UAV is a complicated system including structures, aerodynamic elements such as wings and control surfaces, propulsion systems, control systems, communication elements, and launch and recovery subsystems. Larger UAVs use fuel-powered engines to attain flight, while smaller UAVs typically use either small gasoline engines or electric motors. When the UAV carries sensors and payloads, it is customarily called an Unmanned Aerial System (UAS). In this course, the terms UAS and UAV will be used interchangeably. Due to the inclusion of the word "unmanned," there has been some resistance in recent years to the names Unmanned Aircraft and Unmanned Aerial Vehicle. There is a push to adopt the term Remotely Piloted Aircraft (RPA) or Remotely Piloted Vehicle (RPV) because of the crucial human involvement in the operation of the system. UAVs come in all different sizes and shapes; however, the following are the major factors to be considered in designing a UAV:
The term data link is used to describe how commands are communicated back and forth between the ground control system and the autopilot. The data link is a key subsystem for any UAS, as it provides a two-way communication to ensure that missions are executed safely and according to plan. A good data link is illustrated in Figure 2.2:
There are two different modes for operating a UAS. Those are:
More details on the two operating modes will be covered in lesson 4.
The command and control element is the nerve center for the UAS operation. It controls the following tasks:
The command and control element utilizes several subsystems to accomplish its missions. They are:
The most important parts of the command and control element are the Autopilot and the ground control station, as described in the following subsections:
The autopilot is the subsystem that enables partially or fully autonomous flight. A UAV can be operated completely by remote control, where an operator steers the air vehicle at all times, or it can be flown autonomously, where a pre-programmed path is fully executed from takeoff to landing by the autopilot subsystem without any pilot intervention. Small, lightweight autopilots are readily available from a few manufacturers. Besides guiding the air vehicle throughout the pre-set flight path, the autopilot also executes a “lost link” routine if the UAV loses contact with the ground control station. The lost link procedure guides the UAV to a known waypoint, where contact with the ground control station can be re-established. The following scenario was developed as a typical emergency procedure based on loss of link between the Yamaha RMAX UAS and the Ground Control Station:
The RMAX utilizes a redundant communication system to ensure constant contact between the aircraft and the remote pilot. The ground control station provides real-time data regarding aircraft location, altitude and flight characteristics. The pilot constantly monitors the flight information provided to the ground control station, and through the assistance of a trained observer, maintains a visual line of sight to the aircraft. In the event of a loss of link between the aircraft and the ground control station, the subsequent procedures are followed:
Problem: | Sign of Problem: | Monitored throughout: | Solution: |
---|---|---|---|
Low Signal | Vehicle is slow to respond to manual commands or PCC commands. Autopilot terminates steering mode. Audible and warning light alarms. | Yes, signal strength displayed in percentage and packet update rate. | Turn Autopilot on and abandon manual flight. Initiate auto-land. |
Loss of Communication | Autopilot terminates manual control or fails to respond to PCC commands. Audible communication alarm and warning light. | Yes. | The vehicle returns to loss communication waypoint, hovers until elapse of flight timer, then commences auto-land procedure. |
Loss of GPS | First indication is poor altitude hold performance, also poor position hold during hover. | Yes, indicated by the number of satellites tracked and GPS Quality PDOP. | Assume manual control of aircraft and land. |
Low Power Avionics | Lower than nominal voltage displayed. | Yes. | Land Immediately. |
Engine Failure | Noise level or RPM changes, engine loses power. | Yes, monitored by rotor RPM through the RPM sensor. | Return and land immediately. If the engine dies, initiate autorotation procedure. |
Tail Rotor Failure | Loss of tail control. | No. | Switch to manual control and initiate autorotation procedure. |
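The loss-of-communication row in the table above can be read as a small state machine. The sketch below is illustrative only, not Yamaha's actual flight software; the mode names and the `lost_link_step` function are hypothetical, chosen to mirror the table's logic: return to the lost-link waypoint, hover until the flight timer elapses, then auto-land.

```python
from enum import Enum, auto

class Mode(Enum):
    MISSION = auto()
    RETURN_TO_WAYPOINT = auto()   # fly to the lost-link waypoint
    HOVER = auto()                # hover until the flight timer elapses
    AUTO_LAND = auto()

def lost_link_step(mode: Mode, link_ok: bool, at_waypoint: bool, timer_elapsed: bool) -> Mode:
    """One decision step of the illustrative lost-link routine."""
    if mode is Mode.MISSION and not link_ok:
        return Mode.RETURN_TO_WAYPOINT
    if mode is Mode.RETURN_TO_WAYPOINT and at_waypoint:
        return Mode.HOVER
    if mode is Mode.HOVER and timer_elapsed:
        return Mode.AUTO_LAND
    return mode

# Walk through a lost-link event:
m = lost_link_step(Mode.MISSION, link_ok=False, at_waypoint=False, timer_elapsed=False)
m = lost_link_step(m, link_ok=False, at_waypoint=True, timer_elapsed=False)
m = lost_link_step(m, link_ok=False, at_waypoint=True, timer_elapsed=True)
print(m)  # Mode.AUTO_LAND
```

A real autopilot would of course add the other failure modes from the table (low signal, GPS loss, engine failure) as additional transitions.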
The ground control station (GCS) is the site where the pilot controls the UAV during flight. The GCS size and sophistication depend on the category of the UAS/UAV. Some large UASs require a formal facility with multiple workstations and personnel, while a GCS for a small UAS can be a handheld transmitter. Most UASs used by the geospatial community are small UASs that do not require a dedicated GCS.
Payload refers to air vehicle (aircraft) cargo. It is also defined as the amount of cargo weight an air vehicle can safely carry. Carrying a payload on board is the sole purpose for most UASs. Payloads come in a variety of sizes, weights, and functions. In our business of geospatial remote sensing, we focus on remote sensing sensors and the necessary navigation systems accompanying them. A UAS dedicated to remote sensing and mapping missions is usually equipped with one or more of the following sensors.
Auxiliary sensors here mean the navigation sensors that are necessary to determine the location and orientation of the UAS and the remote sensors mentioned earlier in this section. For determining the position of the UAS and its onboard sensors, the Global Positioning System (GPS) is used; for the attitude or orientation of the UAS and the onboard sensors, the Inertial Measurement Unit (IMU) is used.
GPS needs no introduction, as everyone is familiar with it. It is the same GPS that you might use to drive around town. However, GPS data used to determine a remote sensor's position usually undergoes post-processing to enhance the accuracy of the position.
UASs are offered with two grades of GPS accuracy. The most common is the single-frequency GPS receiver, as it is cheaper and does not require post-processing or a real-time correction service. Such receivers provide location accuracy of around 1 to 2 meters. For generating more accurate geospatial products, the more accurate dual-frequency receivers and precise correction services are needed. The latter receivers offer two modes of operation, both of which yield positional accuracy of 1 to 3 cm with little or no ground control required for the project. UAS vendors are fielding systems with two operational modes:
In principle, both RTK and PPK promise positional accuracies at the 1-3cm level. The main purpose of RTK and PPK is to minimize or eliminate the need for ground control points, thereby reducing cost. For more details on GPS, please visit GPS Defined. [42]
An inertial measurement unit, or IMU [43], is an electronic device that measures and reports an aerial vehicle's velocity, orientation, and gravitational forces using accelerometers and gyroscopes. IMUs are typically used to control and maneuver manned aircraft, unmanned aerial vehicles (UAVs), and satellites. Another important use for the IMU is that it helps IMU-enabled GPS devices maintain positioning information when GPS signals are unavailable, such as in tunnels, inside buildings, or when electronic interference is present.
The IMU is the main component of inertial navigation systems (INS) used in aircraft, spacecraft, watercraft, and guided missiles, as well as in geospatial mapping activities. The data collected from the IMU's sensors allows us to determine the orientation of the sensor, which is an important aspect in geolocating each pixel of the sensor on the ground. The IMU, like other components necessary for the operation of UASs, has been miniaturized in weight and size to fit on small UASs. An example of these small IMUs, which are mainly designed for UASs, is the SBG 500E [44], illustrated in Figure 2.12.
For more details on the IMU, you can visit the IMU Wikipedia page [43].
The launch and recovery element is the area that requires the most human interaction. Some UASs require elaborate launching procedures, while others can be hand-thrown toward the sky. Some large UASs require long runways and other field support equipment such as fuel trucks, ground power units, and ground tugs. Similarly, the requirements for recovery procedures vary widely. Most small UASs used for geospatial projects require simple procedures and can be hand-launched or launched with the use of a catapult.
Some UASs, such as target drones, are air-launched from fixed-wing aircraft. Usually, large UASs are equipped with wheels for takeoff and landing and do not need special equipment, while smaller UASs need a variety of launch and recovery strategies depending on the complexity of the system.
A truck driven at a speed of 60 mph can be used to launch a small UAS, assuming that the launching site provides a smooth surface for the truck to use. In this launching method, the UAS is held in a cradle above the truck cab with its nose pointed high toward the launch path (Figure 2.13). Once the speed is sufficient for takeoff, the UAS is released and lifts upward along its takeoff path.
Many small and medium-sized UAS launch systems have a requirement to be mobile, or in other words, to be mounted on a truck or a trailer. Such mobile launchers fall within one of the following types:
For more details on these launchers, refer to chapter 17 of the supplemental textbook Introduction to UAV Systems, 4th edition.
As with any technology that requires human intervention for safe operation, human involvement is considered the most important element in the successful and safe operation of a UAS. Even with autonomous flights using an autopilot, the human role during launch and recovery is crucial to the operation of the UAS. As navigation technology develops further, the human role in operating a UAS will diminish dramatically.
The human element is key in almost all operational aspects of any UAS and plays a great role in the success and survival of its operation. Starting with mission planning, humans have to design and arrange a concept of operation in order to guarantee success. Equally important is the human role in the flight control process. Autopilot can do only so much without the guidance and intervention of the operator.
The role of the pilot and the observer cannot be overstated, as without them the flight will not occur. This is true even for the most sophisticated drones, such as the Predator. Even the Predator, with all its built-in sophistication and automation, needs a pilot to fly it. The human element is involved in all of the following aspects of operating a UAS:
Automation in operating a UAS results in less human intervention, but it will never eliminate the role of the human in such an operation. Imagine that an airline invites you to be on board an airplane flown solely by autopilot. There are no pilots on board. Would you accept such an invitation? I am certain your answer would be a big NO. Using the same analogy, could you imagine operating a UAS, which is less sophisticated than a jetliner, without a pilot and without an observer? That is how important the human role is in operating a UAS. That is at least true for the time being. Who knows what the future may bring to this field.
Congratulations! You've finished most of the Lesson 2 material. What I hope you learned from this lesson is all you need to know about the different elements that form a UAS. The payload section is very valuable to individuals with a background in geospatial mapping, as it goes through the different sensors utilized by the industry today. Understanding the functionality of each of the UAS elements will help you in the common lessons, where we are going to talk about Concepts of Operation (CONOP), risk assessment, and Certificate of Authorization (COA). Therefore, please make sure that you understand the different topics of this lesson and do not hesitate to ask questions.
Task | Description |
---|---|
1 | Complete Lessons 1 & 2 Quizzes |
2 | Complete the discussion assignment on SWOT analysis in lesson 2 on CANVAS |
3 | Install Pix4D software. Pix4D is the data processing software you will use to process UAS imagery. Follow the instructions in Canvas. |
Welcome to Lesson 3! In this lesson, you will become familiar with the concept of operating a UAS and how to design a Concept of Operation (CONOP). The CONOP subject focuses on the pre-flight description of the mission that a UAS operation will go through. You will also learn how to analyze risks surrounding UAS operations and how to assess and mitigate the impact of such risks.
At the successful completion of this lesson, you should be able to:
The term CONOP means a complete description of the mission that a UAS operation goes through from launch to recovery. The CONOP includes a procedure for carrying out the mission to achieve its objectives. The procedure depends on the system configuration and capabilities. UAS capabilities, which are determined by components such as sensors, guidance, endurance, weather limitations (ceilings, wind speeds, etc.), and navigation and control, play a key role in defining the CONOP. The CONOP may also depend on other factors, such as safety considerations for the UAS as well as for lives and property along the mission's flight path. The procedure will also address weather conditions such as wind speed and visibility, as the mission may be halted or terminated if favorable weather conditions are not met.
The FAA expectations from the provided CONOP are:
There are many ways to design a CONOP, one of which is described in the final report published by the ITS Research Institute of the University of Minnesota entitled “Analysis of Unmanned Aerial Vehicles Concept of Operations in ITS Applications”. In that report, the process comprises the following main design elements:
Figure 3.1 illustrates a flow chart for small UAS concept of operation (CONOP) design process.
Many details need to be identified to complete the CONOP development, including information such as:
The block diagram in Figure 3.2 illustrates the components that make up most of the UASs available in the market today. Such a diagram is very beneficial to CONOP analysis and development, as it lists all the sub-systems included in a UAS. As you can see below, the main components that concern us in this course are the mission sensor payload, airborne data link, and navigation and control sensors. The mission sensor payload represents the highest priority for geospatial data users. Types and quality of sensors within the mission sensor payload block are directly linked to the end user's needs and expectations. The other two blocks, the airborne data link and navigation and control sensors, mainly concern the FAA and its regulations. Main FAA concerns lie in the quality of the communications and the navigational systems that steer and control the aircraft.
Read the report "Analysis of Unmanned Aerial Vehicles Concept of Operations in ITS Applications [46]," which discusses in detail the Concept of Operations for the UAS.
Review the FAA document “Integration of Unmanned Aircraft Systems into the National Airspace System Concept of Operations [47]”. When reading through the FAA document, focus on the components that the FAA considered when developing these Concepts of Operations. These considerations are key and beneficial to the development of your CONOP and Risk Assessment analysis that follows later in this lesson.
In this section, you will explore potential risks surrounding UAS operations. The block diagram provided in section 3.1 illustrates the UAS system components, each of which carries its own operational risk. In addition, there are many external risks surrounding the operational environment, such as weather and other aircraft sharing the same airspace. Review the document assigned for this section to understand the different types of risk associated with UAS operations.
Read chapter 7 of textbook 1, Introduction to Unmanned Aircraft Systems, 2nd edition. While you are reading through the chapter, focus on hazard recognition and risk assessment, as you will need them for the section "Concept of Operation (CONOP) and Risk Assessment for UAS."
Review the report "Analysis of Unmanned Aerial Vehicles Concept of Operations in ITS Applications [46]" that was provided to you in the section "CONOP Elements." Focus on the discussions concerning risk assessment.
It's time to develop your own CONOP and Risk Assessment methodology for the UAS you selected for the SWOT analysis in Lesson 2, by doing the following:
Your document, at minimum, should include the following sections:
Submit your report as a Word document, NOT a PDF, to the drop box. The report should not exceed 3,500 words or 15 pages (single-spaced). You will have 3 weeks to complete this assignment. (7 points or 7%)
The deadline for this assignment is at the end of Lesson 5.
Congratulations! You have finished Lesson 3, CONOP and Risk Assessment for UAS. As you may have noticed from the materials provided in this lesson, the development and completion of a CONOP is an important milestone for the successful operation of UASs, especially here in the United States, where the FAA has strict rules and regulations governing UAS operation. Without a CONOP, a mission may end in chaos and disastrous results. You also noticed that recognizing and mitigating the risks surrounding a UAS operation not only results in a safe operation, but also encourages the FAA to issue the required permissions to operate a UAS in the National Airspace System (NAS).
1 | Complete the Lesson 3 Quiz. |
---|---|
2 | Preliminary Project Idea Milestone: Submit Preliminary project idea/proposal in the "Preliminary project idea" dropbox. |
3 | Pix4D is the data processing software you will use to process UAS imagery. Follow the instructions in Canvas. |
4 | Download and practice Mission Planner Software, following these instructions. [50] |
5 | Complete your CONOP and Risk assessment analysis. Submit your completed MS Word document to the drop box in Lesson 5. (7 points) |
6 | Participate in the discussion assignment in Canvas. |
Welcome to Lesson 4! In this lesson, you will practice planning and designing a UAS mission. For this lesson, we will focus on imaging sensors (digital cameras), as they are widely used for geospatial projects. Successful execution of any mapping project requires a tremendous amount of planning prior to mission execution. Planning must be done by an experienced person who is familiar with all aspects of mapping. Mission planning includes the following categories:
At the successful completion of this lesson, you should be able to:
In this section, you will understand the value of studying area maps for a project prior to the development of the flight plan.
Flight planners should acquaint themselves with the project area through two types of maps before proceeding with further steps of the design; those are U.S. Topo Quadrangle Maps and Sectional Aeronautical Charts.
The U.S. Topo Quadrangle Map, mainly a topographic map, shows the details of the contours of the land (terrain elevation). See Figure 4.1. This type of map reveals all the information a planner needs about the topography of the project area. Topography affects flight plan parameters such as flight lines, spacing, and imagery spacing. Quad maps can be downloaded from the USGS [53]. You can also review a sample of such maps for the State College area [54].
Sectional Aeronautical Charts, which are also called VFR charts (Figure 4.2), are described as “the primary navigational reference medium used by the VFR pilot community. The 1:500,000 scale Sectional Aeronautical Chart Series is designed for visual navigation of slow to medium speed aircraft. The topographic information featured consists of the relief and a judicious selection of visual checkpoints used for flight under visual flight rules. The checkpoints include populated places, drainage patterns, roads, railroads, and other distinctive landmarks. The aeronautical information on Sectional Charts includes visual and radio aids to navigation, airports, controlled airspace, restricted areas, obstructions, and related data. These charts are updated every six months, most Alaska Charts annually.” To better understand these charts, review the FAA “Aeronautical Chart User Guide [55]”. You can also watch this YouTube video on learning how to read sectional charts [56].
The VFR acronym is adopted from “Visual Flight Rules [57]” where a pilot relies on the visual see-and-avoid rule during flight. To download such charts, visit the FAA site [58].
The topographic map and the aeronautical chart provide an overview of the area and the contents of the ground cover (both natural and man-made), restricted airspace such as airport approaches, high towers, etc.
No less important than reviewing a sectional chart is utilizing the online FAA sites and other services, which allow you to zoom in to your geographic location to check the airspace status and the allowed flight ceiling. Here are a few of the free services available to the public:
1. The AirMap [60] App
2. Visualize it: See FAA UAS Data on a Map [61]
3. B4UFLY [62]
The focal plane of an aerial camera is the plane where all incident rays coming from the object are focused. The focal plane is where the film of a film-based camera is placed. With the introduction of digital cameras, the focal plane is occupied by the CCD array, replacing the film.
A digital camera like the ones we use at home is called a “digital frame” camera just to distinguish it from other designs of digital cameras such as “push broom” cameras. Digital frame cameras [63] have the same geometric characteristics as the film camera that employs the film as the recording medium.
A digital frame camera consists of a sensor that is a two-dimensional array of charge-coupled device [64] (CCD) elements (a CCD is also called a pixel). The sensor is mounted at the focal plane of the camera. When an image is taken, all CCDs of the sensor are exposed simultaneously, thus producing a digital frame. Figure 4.3 (from Wolf, page 75) illustrates how a digital camera captures an area on the ground that falls within the lens' field of view (FOV).
The size of a digital camera is measured by the size of its sensor. The higher the number of CCDs (pixels) in the sensor, the bigger and more expensive the camera. A camera with a sensor of 4,000 by 4,000 pixels is called a 16-megapixel camera, because it has 16,000,000 pixels. UAS imaging productivity, i.e., how many acres the UAS can cover in an hour, depends on the sensor size, battery life, and the lens focal length. The article "DJI Phantom 4 RTK vs. WingtraOne [65]" clearly illustrates the difference in UAS productivity based on sensor and UAS capabilities. In that article, you will also learn about some fundamental capabilities that we usually expect from a mapping drone.
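The megapixel arithmetic above, and its practical consequence for ground coverage, can be checked with a few lines of Python. The 2.5 cm ground sampling distance (GSD) used below is a hypothetical value chosen only for illustration:

```python
# Sensor size: a 4,000 x 4,000 pixel CCD array.
width_px, height_px = 4000, 4000
megapixels = width_px * height_px / 1_000_000
print(megapixels)  # 16.0 -> a "16-megapixel" camera

# With a hypothetical GSD of 2.5 cm/pixel, one image covers
# a ground square this wide:
gsd_m = 0.025
footprint_m = width_px * gsd_m
print(footprint_m)  # 100.0 m per image side
```

A larger sensor at the same GSD covers more ground per exposure, which is one reason sensor size drives the acres-per-hour productivity mentioned above.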
The lens for a mapping camera usually contains compound lenses put together to form the lens cone. The lens cone also contains the shutter and diaphragm.
The lens is the most important and most expensive part of a mapping aerial camera. Cameras on board the UAS are not of that level of quality, as they were not manufactured to be used as mapping cameras. Mapping cameras, called metric cameras, are built so that the internal geometry of the camera holds its characteristics despite harsh working conditions and changing operational environments. Lenses for cameras on board the UAS are smaller in size and lighter in weight. They are also less expensive than those of standard mapping cameras. Lenses for mapping cameras should be calibrated to determine the accurate value of the focal length and the lens distortion (imperfection) characteristics.
Shutters are used to limit the passage of light to the focal plane. The shutter speed of aerial cameras typically ranges between 1/100 and 1/1000 of a second. Shutters are of two types: focal-plane shutters and between-the-lens shutters; the latter is the most common type used for aerial cameras. Most digital camera shutters are designed according to one of two mechanisms: the leaf shutter (also called a mechanical or global shutter, or a dilating aperture shutter) or the electronic rolling shutter (curtain or sliding shutter). The leaf shutter exposes the entire sensor array at once, while the rolling shutter exposes one line of pixels at a time. For aerial imaging from a moving platform such as a UAS, a leaf shutter is recommended because it minimizes image blur. To understand the shortcomings of the rolling shutter, watch this video [66].
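One way to see why shutter speed matters on a moving platform is to estimate how far the aircraft travels during the exposure. The sketch below uses a hypothetical 15 m/s ground speed together with the 1/100 to 1/1000 s shutter range quoted above:

```python
# Ground-projected image motion = ground speed x exposure time.
ground_speed_mps = 15.0   # hypothetical UAS ground speed, m/s

for exposure_s in (1 / 100, 1 / 1000):
    blur_cm = ground_speed_mps * exposure_s * 100
    print(f"1/{round(1 / exposure_s)} s shutter -> about {blur_cm:.1f} cm of motion")
```

If the motion during exposure is much smaller than the GSD, the blur is generally negligible, which is why fast shutters (and global exposure) are preferred for mapping from a UAS.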
It is important to know which shutter is used in your camera, as most processing software, including Pix4D, provides a correction for the rolling shutter effect. However, the software does not apply the correction automatically, and you will need to activate that option before you start processing the imagery.
More information on different types of shutter mechanisms can be found on Wikipedia's Shutter (photography) page [67].
In order to understand mission flight planning, you need to understand the geometry of the image as it is formed within the camera. The size of the CCD array and lens focal length coupled with flying altitude (above ground) determines the image scale or the ground resolution of the image. Therefore, it is essential to the work of the flight planner to have all of this information understood and available before starting to design a mission.
In photogrammetry, we usually deal with three types of imagery (photography). They are defined in terms of the angle that the camera's optical axis makes with the vertical (nadir):
For the purpose of this course, we will focus only on the first two types, and that is vertical and near-vertical photography.
Figure 4.3 illustrates the basic geometry of a vertical photograph or image. By vertical photograph or image, we mean an image taken with a camera that is looking down at the ground. As the aircraft moves, so does the camera, and this makes it impossible to take a truly vertical image. Therefore, the definition of a vertical image allows a few degrees of deviation from the nadir (the line connecting the lens frontal point and the point on the ground exactly beneath the aircraft). In summary, a vertical image is one that looks either straight down at the ground or a few degrees to either side.
As the sun's rays hit the ground, they reflect back toward the camera, and some actually enter the camera through the lens. This physical phenomenon enables us to express the ground-image relation using trigonometric principles. In Figure 4.3, ground point A is projected at image location a' and ground point B is projected at image location b' on the film. From such geometry, the film's four corners a', b', c', d' cover an area on the ground represented by the square ABCD. Such relations not only enable us to compute the ground coverage of a photograph (image) but also enable us to compute the scale of such a photograph or image.
The scale of an image is the ratio of the distance on the image to the corresponding distance on the ground. In Figure 4.4, the distance on the ground AB will be projected on the image on line ab, therefore, the image scale can be computed using the following formula:
Equation 1: Scale = ab / AB
Analyzing the two triangles (the small triangle with base ab and the large triangle with base AB) of Figure 4.4, one can also conclude, using the similarity of triangles principle, that the scale is also equal to:
Equation 2: Scale = f / H (where f is the lens focal length and H is the flying height above ground)
Scale is expressed either as a unitless ratio such as 1/12,000 (or 1:12,000) or in stated units, such as 1 in. = 1,000 ft (or 1" = 1,000').
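The two scale formulas can be sketched in Python. This is a minimal illustrative helper (not part of any course software); the only requirement is that the units match within each ratio:

```python
def scale_from_distances(image_dist, ground_dist):
    """Equation 1: scale = image distance / ground distance (same units)."""
    return image_dist / ground_dist

def scale_from_geometry(focal_length, flying_height):
    """Equation 2: scale = f / H (same units), from similar triangles."""
    return focal_length / flying_height

# A 6 in. (0.5 ft) focal length flown 6,000 ft above mean terrain:
s = scale_from_geometry(0.5, 6000.0)
print(f"1:{1 / s:,.0f}")  # -> 1:12,000
```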
The following two examples will walk you step by step through the process of computing scales for imagery produced from a film-based camera and from a digital camera. In digital cameras, the scale does not play any role in defining the image quality, as it does with film-based cameras. In digital cameras, we use the Ground Sampling Distance (GSD) to describe the resolution quality of the image, while in film-based cameras we use the film scale.
Aerial photographs were acquired from an altitude of 6,000 ft AMT (Above Mean Terrain) with a film-based aerial camera with lens focal length of 6 inches. Determine the scale of the resulting photography.
Solution:
From Figure 4.4 and equations 1 & 2,

Scale = f / H = 6 in. / 6,000 ft = 0.5 ft / 6,000 ft = 1 / 12,000

Therefore,

Scale = 1:12,000 or 1" = 1,000'
Scale is less meaningful in digital mapping products, as the scale concept was created to represent measured distances on old-style maps plotted on paper. However, people still use scale, and it will take time before the new generation of mappers fully embraces the digital representation of the new geospatial products.

Digital camera manufacturers provide information on the sensor used in their cameras, but in inconsistent forms. Some express it as a pixel count, e.g., 16 megapixels, which could be a square array of 4,000x4,000 pixels or a rectangular array with any width/height ratio, such as 8,000x2,000 pixels (a width/height ratio of 4). Some camera manufacturers provide the sensor array size both in pixels and in millimeters, while others provide a combination of pixel count and sensor size in inches, leaving you wondering about the physical size of the CCD (see Figure 4.5). Figure 4.6 illustrates camera information where you need to dig deep into the provided specifications to obtain what you want. Figure 4.6 represents the information provided for the multi-spectral camera on board the DJI Phantom 4 agricultural UAS. From it, you can indirectly derive the sensor dimensions from the given array size in pixels and the CCD (pixel) size of 3 um, which is embedded in the focal length information. The sensor dimensions in pixels are not provided directly, so you need to figure them out from the two values provided for the optical center. The optical center, or the origin of the image coordinates at (0,0), is usually located in the middle, i.e., the center of the array; therefore the total width of the array is 800 pixels x 2 = 1,600 pixels, while the sensor height is 650 pixels x 2 = 1,300 pixels. Knowing the number of pixels in the width direction (1,600) and the pixel size of 3 micrometers, the sensor width is 1,600 x 0.003 mm = 4.8 mm; similarly, the sensor height is 1,300 x 0.003 mm = 3.9 mm.
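The bookkeeping described above can be scripted. The sketch below is a hypothetical helper using the Phantom 4 multi-spectral values quoted above (optical center at 800, 650 pixels and 3-um pixels):

```python
# The optical center sits at the middle of the array, so each of its
# coordinates is half the array dimension in pixels.
cx_px, cy_px = 800, 650
pixel_size_mm = 0.003  # 3 micrometers

width_px, height_px = 2 * cx_px, 2 * cy_px   # 1600 x 1300 pixels
width_mm = width_px * pixel_size_mm          # ~4.8 mm
height_mm = height_px * pixel_size_mm        # ~3.9 mm
print(width_px, height_px, round(width_mm, 1), round(height_mm, 1))
```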
The following example shows how to calculate the scale of imagery acquired with a digital camera:
Aerial imagery was acquired with a digital aerial camera with lens focal length of 100 mm and CCD size of 0.010 mm (or 10 microns). The resulting imagery had a ground resolution of 30 cm (1 ft). Determine the scale of the resulting imagery.
Solution
From Figure 4.4 and equation 1, assume that the distance ab represents the physical size of one pixel or CCD, which is 0.010 mm, and the distance AB is the ground coverage of the same pixel or 30 cm.
Therefore,

Scale = ab / AB = 0.010 mm / 300 mm = 1 / 30,000

Scale = 1:30,000 or 1" = 2,500'
Aerial imagery was acquired with a digital aerial camera with lens focal length of 50 mm and CCD size of 0.020 mm (or 20 microns). The resulting imagery had a ground resolution of 60 cm (2 ft). Determine the scale of the resulting imagery.
Solution:

Scale = ab / AB = 0.020 mm / 600 mm = 1 / 30,000

Scale = 1:30,000 or 1" = 2,500'
Imagery acquired for photogrammetric processing is flown with two types of overlap: Forward Lap and Side Lap. The following two subsections will describe each type of imagery overlap.
Forward lap, which is also called end lap, is a term used in photogrammetry to describe the amount of image overlap intentionally introduced between successive photos along a flight line (see Figure 4.7). Figure 4.7 illustrates an aircraft equipped with a mapping aerial camera taking two overlapping photographs. The centers of the two photographs are separated in the air by a distance B, also called the air base. Each photograph in Figure 4.7 covers a distance on the ground equal to G. The overlapping coverage of the two photographs on the ground is what we call forward lap.
This type of overlap is used to form stereo-pairs for stereo viewing and processing. The forward lap is measured as a percentage of the total image coverage. The typical value of forward lap for photogrammetric work is 60%. Because of the light weight of the UAS, we expect substantial air dynamics and therefore substantial rotations of the camera (i.e., crab); therefore, I recommend a forward lap of at least 70%.
Side lap is a term used in photogrammetry to describe the amount of overlap between images from adjacent flight lines (see Figure 4.8). Figure 4.8 illustrates an aircraft taking two overlapping photographs from two adjacent flight lines. The distance in the air between the two flight lines (W) is called the line spacing.
This type of overlap is needed to make sure that there are no gaps in the coverage. The side lap is measured as a percentage of the total image coverage. The typical value of side lap for photogrammetric work is 30%. However, because of the light weight of the UAS, we expect substantial air dynamics and therefore substantial rotations of the camera (i.e., crab), so I recommend using at least 40% side lap.
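The overlap percentages translate directly into exposure spacing: the air base B and the line spacing W are the image ground coverages reduced by the forward lap and side lap, respectively. A minimal sketch with hypothetical helper names and illustrative numbers:

```python
def airbase(coverage_along_ft, forward_lap_pct):
    """Air base B between successive exposures for a given forward lap."""
    return coverage_along_ft * (1 - forward_lap_pct / 100)

def line_spacing(coverage_across_ft, side_lap_pct):
    """Spacing W between adjacent flight lines for a given side lap."""
    return coverage_across_ft * (1 - side_lap_pct / 100)

# 6,000 ft along-track coverage at 60% forward lap; 12,000 ft across-track
# coverage at 30% side lap (illustrative numbers):
print(round(airbase(6000, 60)))        # ~2,400 ft between exposures
print(round(line_spacing(12000, 30)))  # ~8,400 ft between flight lines
```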
Ground coverage of an image is the area on the ground (the square ABCD of Figure 4.3) covered by the four corners of the photograph a'b'c'd' of Figure 4.3. Ground coverage of a photograph is determined by the camera internal geometry (focal length and the size of the CCD array) and the flying altitude above ground elevation.
Example on Image Ground Coverage:
A digital camera has an array size of 12,000 pixels by 6,000 pixels (Figure 4.9). If the physical CCD size is 0.010 mm (10 um), how much area in acres will each image cover on the ground if the resulting ground resolution (GSD) of a pixel is 1 foot?
Solution
Ground coverage across the width (W) of the array = 12,000 pixels x 1 ft/pixel = 12,000 ft
Ground coverage across the height (L) of the array= 6,000 pixels x 1 ft/pixel = 6,000 ft
Covered area per image = 12,000 ft x 6,000 ft = 72,000,000 sq ft = 72,000,000 / 43,560 ≈ 1,653 acres (1 acre = 43,560 sq ft)
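The same arithmetic in Python:

```python
width_px, height_px = 12_000, 6_000
gsd_ft = 1.0

ground_w_ft = width_px * gsd_ft        # 12,000 ft across track
ground_h_ft = height_px * gsd_ft       #  6,000 ft along track
area_sqft = ground_w_ft * ground_h_ft  # 72,000,000 sq ft
area_acres = area_sqft / 43_560        # 1 acre = 43,560 sq ft
print(round(area_acres, 1))  # -> 1652.9
```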
In this section, we start the practical work for flight planning an imagery mission. By the end of this section, you should be able to develop a flight plan for an aerial imagery mission. Successful execution of any photogrammetric project requires thorough planning prior to the execution of any activity in the project.
The first step in the design is to decide on the scale of imagery or its resolution and the required accuracy. Once those two requirements are known, the following processes follow:
For the flight plan, the planner needs to know the following information, some of which he or she ends up calculating:
Figure 4.8 shows three overlapping squares with light rays entering the camera at the lens focal point. Successive overlapping images form a strip of imagery that we usually call a "strip" or "flight line." A photogrammetric strip (Figure 4.8) is therefore formed from multiple overlapping images along a flight line, while a photogrammetric block (Figure 4.9) consists of multiple overlapping strips (or flight lines).
Once we compute the ground coverage of the image, as discussed in the "Geometry of Vertical Image" section, we can compute the number of flight lines and the number of images, draw them on the project map (Figure 4.10), and determine the aircraft speed, flying altitude, etc.
Before we start the computations of the flight lines and images numbers, I would like you to understand the following helpful hints:
Now, let us figure out how many flight lines we need for the project area illustrated in Figure 4.13. Figure 4.13 shows rectangular project boundaries (black dashed lines) with length equal to LENGTH and width equal to WIDTH, designed to be flown with 6 flight lines (red lines with arrowheads). To figure out the number of flight lines needed to cover the project area, we go through the following computations:
In Figure 4.13, you may have noticed that the flight direction alternates between North-to-South and South-to-North from one flight line to the adjacent one. Flying the project in this manner increases the aircraft's fuel efficiency, so the aircraft can stay in the air longer.
Once we determine the number of flight lines, we need to figure out how many images will cover the project area. To do so, we need to go through the following computations:
Figure 4.14 is the same as Figure 4.13 with added blue circles that represent photo centers of the designed images. The circles are only given to one flight line, and I will leave it to your imagination to fill all the flight lines with such circles.
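The counting logic above can be sketched as follows. Note that edge-handling conventions vary between firms; here I assume the common practice of rounding up and adding one extra line (and one extra photo per line) so coverage extends past the project edges. These helper names and numbers are illustrative, not from any specific flight-planning package:

```python
import math

def num_flight_lines(project_width_ft, line_spacing_ft):
    """Lines needed to span the project width, rounded up, plus one edge line."""
    return math.ceil(project_width_ft / line_spacing_ft) + 1

def num_photos_per_line(project_length_ft, airbase_ft):
    """Exposures along one line, rounded up, plus one to close the last stereo pair."""
    return math.ceil(project_length_ft / airbase_ft) + 1

# Illustrative: 68,640 ft wide project, 8,400 ft line spacing, 2,800 ft airbase
lines = num_flight_lines(68_640, 8_400)
photos = lines * num_photos_per_line(105_600, 2_800)
print(lines, photos)  # -> 10 390
```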
Flying altitude is the altitude above a certain datum at which the UAS flies during data acquisition. The two main datums used are the average (mean) ground elevation and mean sea level. Figure 4.15 illustrates the relationship between the aircraft and these two datums. In Figure 4.15, the aircraft flies 3,000 feet above the average (mean) ground elevation, which is represented by the blue horizontal line; that mean terrain elevation is in turn situated 600 feet above mean sea level. Therefore, the flying altitude can be expressed in two ways:
We now need to determine at what altitude the project should be flown. To do so, we go back to the camera internal geometry and scale as discussed in section 4.3. Assume the imagery is to be acquired with a camera with a lens focal length of f and a CCD size of ab. We also know in advance what the imagery ground resolution or GSD should be. From equations 1 and 2:

ab / GSD = f / H

OR

H = (f / ab) x GSD

from which H can be determined.
Here, we need to make sure that both f and b are converted to have the same linear unit, in which case the resulting altitude will be in the same linear unit of the GSD. If we assume the following values:
f = 50mm
ab = 0.010mm (or 10um)
GSD = 0.30 meter, the flying altitude will be:
H = (50 mm / 0.010 mm) x 0.30 m = 5,000 x 0.30 m = 1,500 meters above ground level
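The altitude formula in code (a hypothetical helper; f and the CCD size must share a unit, and H comes out in the unit of the GSD):

```python
def flying_altitude(focal_length_mm, ccd_size_mm, gsd):
    """H = (f / ab) * GSD: the scale ratio f/ab times the ground sampling distance."""
    return focal_length_mm / ccd_size_mm * gsd

H = flying_altitude(50, 0.010, 0.30)  # result in meters, since GSD is in meters
print(round(H))  # -> 1500
```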
Controlling the aircraft speed is important for maintaining the forward (end) lap expected for the imagery. Fly the aircraft too fast and you end up with less forward lap than anticipated; fly it too slowly and you get too much overlap between successive images. Both situations are harmful to the anticipated products and/or the project budget: too little overlap reduces the capability of using the imagery for stereo viewing and processing, while too much overlap results in many unnecessary images that may affect the project budget negatively. In the previous subsections, we computed the airbase, or the distance between two successive images along one flight line, that satisfies the amount of end lap necessary for the project. Computing the time between exposures is a simple matter once the airbase is determined and the aircraft speed is decided upon.
When the camera exposes an image, we need the aircraft to move a distance equal to the airbase before it exposes the next image. If we assume the aircraft speed is v, then the time t between two consecutive images is calculated from the following equation:

t = B / v
For example, if we computed the airbase to be 1,000 ft and we used an aircraft with a speed of 150 knots (150 x 1.68781 ≈ 253.2 ft/sec), the time between exposures is equal to:

t = 1,000 ft / 253.2 ft/sec ≈ 3.95 seconds
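The interval computation, with the knots-to-ft/sec conversion made explicit (1 knot = 1.68781 ft/sec); a hypothetical helper:

```python
KNOT_TO_FT_PER_SEC = 1.68781  # 1 nautical mile (6,076.1 ft) per hour

def exposure_interval_sec(airbase_ft, speed_knots):
    """t = B / v, with the speed converted from knots to ft/sec."""
    return airbase_ft / (speed_knots * KNOT_TO_FT_PER_SEC)

t = exposure_interval_sec(1000, 150)
print(round(t, 2))  # -> 3.95
```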
In the navigation world, way points [68] are defined as "sets of coordinates that identify a point in physical space." Close to this definition is the one used by mapping professionals, which involves using sets of coordinates to locate the beginning and end points of each flight line. Way points help the pilot and camera operator execute the flight plan. In manned-aircraft imagery acquisition, way points are usually located a couple of miles outside the project boundary on both sides of the flight line (i.e., a couple of miles before approaching the project area and a couple of miles after exiting it); for UAS operations, they would be a couple hundred meters before approaching and after exiting the project area. The pilot uses way points to align the aircraft to the flight line before entering the project area. In UAS operation, a "way point" marks the beginning or end of a flight line, where the UAS either positions itself before it starts taking pictures or ends taking pictures on a certain flight line.
A project area is 20 miles long in the east-west directions and 13 miles in the north-south direction. The client asked for natural color (3 bands) vertical digital aerial imagery with a pixel resolution or GSD of 1 ft using a frame-based digital camera with a rectangular CCD array of 12,000 pixels across the flight direction (W) and 7,000 pixels along the flight direction (L) and a lens focal length of 100 mm. The array contains square CCDs with a dimension of 10 microns. The end lap and side lap are to be 60% and 30%, respectively. The imagery should be delivered in tiff file format with 8 bits (1 byte) per band or 24 bits per color three bands (RGB). Calculate:
Solution:
Looking at the project size (20x13 miles) and the one-foot GSD requirement, a mission planner should realize right away that the image acquisition task for such a project size and specifications can only be achieved using a manned aircraft.
The camera should be oriented so the longer dimension of the CCD array is perpendicular to the flight direction (see Figure 4.12).
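The full chain of computations for this example can be sketched end to end. The +1 edge-handling terms below are my assumption (conventions differ between firms), so the counts illustrate the method rather than the one "correct" answer:

```python
import math

MI_TO_FT = 5280

# Project and camera parameters from the example above
length_ft = 20 * MI_TO_FT          # east-west extent (flight-line direction)
width_ft = 13 * MI_TO_FT           # north-south extent
gsd_ft = 1.0
across_px, along_px = 12_000, 7_000
forward_lap, side_lap = 0.60, 0.30

coverage_across = across_px * gsd_ft          # 12,000 ft across track
coverage_along = along_px * gsd_ft            #  7,000 ft along track
airbase = coverage_along * (1 - forward_lap)  #  2,800 ft between exposures
spacing = coverage_across * (1 - side_lap)    #  8,400 ft between lines

n_lines = math.ceil(width_ft / spacing) + 1   # +1 edge line (assumed rule)
n_photos = n_lines * (math.ceil(length_ft / airbase) + 1)

image_mb = across_px * along_px * 3 / 1e6     # 3 bytes per pixel (24-bit RGB)
print(n_lines, n_photos, round(image_mb))     # -> 10 390 252
```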
Past experience with projects of a similar nature is essential in estimating cost and developing delivery schedule. In estimating cost, the following main categories of efforts and materials are considered:
Once quantities are estimated as illustrated in the above steps, hours for each phase are established. Depending on the project deliverables requirements, the following labor items are considered when estimating costs:
The table in Figure 4.16 provides an idea of the going market rates for geospatial products, which can be used as guidelines when pricing a mapping project using manned-aircraft operation with a metric digital camera and lidar. The industry still needs to come up with a comparable table based on unmanned operations. There is no good pricing model established for UAS operation, as the standards and product quality vary widely depending on who offers such services and whether they fall strictly under the "Professional Services" designation.
Product | GSD (ft) | Price per sq. mile | Comments
---|---|---|---
Ortho | 0.5 | $150-$200 | Based on large projects
Ortho | 1.0 | $80-$100 | Based on large projects
Ortho | 2.0 | $30-$60 | Based on large projects
Lidar | 3.2 | $100-$500 | Depends on accuracy, terrain, and required details
After the project hours are estimated, each phase of the project may be scheduled based on the following:
The schedule will also consider the constraints on the window of opportunity due to weather conditions. Figure 4.17 illustrates the number of days, per state/region, available annually for aerial imaging campaigns. Areas like the state of Maine have only 30 cloudless days per year that are suitable for aerial imaging activities.
Chapter 18 of Elements of Photogrammetry with Applications in GIS, 4th edition
For practice, develop two flight plans for your project, one by using manual computations and formulas as described in this section and one by using "Mission Planner" software. Compare the two.
In this section, we will discuss the topics of camera calibration and sensor boresighting.
Most existing UASs that are dedicated to photogrammetric imaging carry on board less expensive cameras that we call nonmetric cameras. Nonmetric cameras are cameras with variable interior geometry (i.e., unknown focal length) and with relatively large lens distortion. In order to conduct photogrammetric mapping from the resulting imagery from such cameras, we need to determine to a known accuracy all interior camera parameters such as the focal length and the coordinates of the principal point, and to model the lens distortion.
The principal point of a camera is the point where the lines from opposite corners of the CCD array, or the lines connecting the opposite mid-way points of the CCD array's sides, intersect (Figure 4.18). However, when the lens is fitted on the camera body, it is practically impossible to perfectly align the center of the lens with the principal point described above, resulting in offset distances xp and yp as illustrated in Figure 4.18. Those two values are determined in the process of camera calibration and must be represented in the photogrammetric mathematical model during computations.
Mapping film-camera calibration was usually performed in special laboratories dedicated to this task, such as the USGS calibration lab for film cameras, which was shut down permanently on April 1, 2017 after decades of service to the mapping community. However, with the advancements in the analytical computational models of photogrammetry, we can now determine the camera parameters analytically through a process called camera self-calibration, performed within the aerial triangulation process. Most UAS data processing software, such as the software used in this course, supports camera self-calibration.
The term “boresighting” is usually used to describe the process of determining the differences between the rotational axes of the sensor (such as a camera) and the rotational axes of the Inertial Measurement Unit (IMU), which is usually bolted to the camera body. The IMU [43] is a device containing gyros and accelerometers, used in photogrammetry and lidar to sense and measure a sensor's rotations and accelerations. In photogrammetry, where the IMU is mounted on an imaging camera, the boresight parameters are determined by flying over a well-controlled site (a site with accurate ground controls) and then conducting aerial triangulation on the resulting imagery.
The aerial triangulation process computes the six exterior orientation parameters (X, Y, Z, omega, phi, kappa), while the IMU measures the three orientation parameters: roll, pitch, and heading (or yaw). By comparing the two sets of camera orientation angles, one computed by the aerial triangulation and one measured by the IMU, one can establish the differences in the rotations of the camera in reference to the inertial system (from the IMU). These differences (or offset values) are then used to correct all future IMU-derived orientations, converting the rotation angles from the inertial to the photogrammetric system so they can be utilized in the mapping process.
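A deliberately simplified sketch of that comparison: average the per-exposure differences between the aerial-triangulation angles (omega, phi, kappa) and the IMU attitude (roll, pitch, heading). Real boresight estimation composes rotation matrices rather than subtracting angles, and all numbers below are fabricated for illustration:

```python
# Per-exposure orientation angles in degrees (fabricated example values):
at_angles = [(1.204, -0.352, 89.971),   # from aerial triangulation
             (1.198, -0.349, 89.968)]
imu_angles = [(1.150, -0.300, 89.900),  # from the IMU
              (1.152, -0.305, 89.903)]

n = len(at_angles)
boresight = tuple(
    sum(at[i] - imu[i] for at, imu in zip(at_angles, imu_angles)) / n
    for i in range(3)
)
# These mean offsets would then correct future IMU-derived orientations.
print([round(a, 3) for a in boresight])  # -> [0.05, -0.048, 0.068]
```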
A similar process is followed for determining the offset values for the IMU used in a lidar system. For lidar offset determination, no aerial triangulation is used, as lidar follows different processing steps. To determine the boresight offset values in lidar, the system has to be flown in a certain configuration over a well-controlled site. Figure 4.19 represents an ideal design for lidar boresight determination. In the figure, two lines are flown in the east-west direction (one flight line flown due east and the other flown the opposite way, due west) at a certain altitude, and two flight lines are flown in the perpendicular direction (north-south) at an altitude that is nearly double that of the east-west flight lines.
Congratulations! You have just finished Lesson 4, UAS Mission Planning and Control. I hope that you appreciate the importance of this lesson's material in relation to the Concept of Operation for any UAS. UAS projects based on poor planning mean nothing but guaranteed failure and/or poor-quality derived products. The computations may seem complicated, but I have tried to walk you through the different steps in detail. However, if you feel overwhelmed by the design concepts, please do not hesitate to write to me.
1 | Complete the Lesson 4 Quiz.
---|---
2 | Start Pix4D processing for Exercise 1 (Wiregrass Gravel Mine, Alabama) using these instructions [69]. Submit your reports in Lesson 6 (5 points)
3 | Start Pix4D processing for Exercise 2 (County Line Road, Dayton, Ohio) using these instructions [70]. Submit your reports in Lesson 8 (8 points)
4 | Practice the use of "Mission Planner" software to develop a flight plan.
5 | Participate in the "Human Elements of UAS" Discussion Forum
Welcome to Lesson 5! In this lesson, you will become familiar with all aspects of operating a UAS, starting with the obstacles in the face of the UAS and its operations and moving to subjects like guidelines to UAS operations, definition of airspace, launch and recovery, line of sight (LOS) operation, beyond line of sight (BLOS) operation, and personnel qualifications. All the topics mentioned above are crucial to any individual involved in operating a UAS, especially here in the United States. I would like to emphasize the topic of understanding the national airspace (NAS) and the rules surrounding the operation of an aircraft in each of its classes. In this lesson, you will be asked to express your opinion on the newly released FAA roadmap for integrating the UAS in the NAS.
At the successful completion of this lesson, you should be able to:
In the following sections, you will become familiar with the FAA regulations that restrict the operation of the UAS in the national airspace, especially for commercial use. Whether we all agree with them or not, the FAA restrictions stem from one or more of the following:
I would like to add here that even though the FAA restricted the use and operation of UASs in U.S. airspace, there is a growing willingness among consumers, who have found practical uses for UASs, to break the FAA rules and fly UASs without a COA or a special airworthiness certificate. Before the FAA changed its pace in recent years in dealing with UAS issues, people were frustrated with the sluggish pace of progress by the FAA to integrate the UAS into the NAS. To understand such "unlawful" use of the UAS, read the following article:
Section 5.1 of Introduction to Unmanned Aircraft Systems.
Review the TRB2013 Paper presentation slides "Addressing the Operational and Technical UAS Airspace Integration Challenges. [71]"
Prior to August 29, 2016, when the latest FAA regulations regarding UAS operation went into effect, FAA document number N 8900.227, entitled “National Policy: The Unmanned Aircraft Systems (UAS) Operational Approval,” was used to describe the regulations surrounding UAS operation in the United States. The policy carefully explained all aspects of UAS operation, from the airworthiness of the aircraft to operator training and risk mitigation. Getting familiar with these regulations was necessary for anyone who was planning to own or operate a UAS. The document was issued as a temporary measure until the regulations proposed in the FAA roadmap replaced what it mandated. It took the FAA a few years to amend its regulations to allow the legal operation of small unmanned aircraft systems in the National Airspace System. The new rules were published in the Federal Register (Vol. 81, Number 124, Part II [82]) on June 28, 2016 and went into effect on August 29, 2016. They were added as a new Part 107 to Title 14 of the Code of Federal Regulations (14 CFR) to allow routine civil operation of small UAS in the NAS and to provide safety rules for those operations. The new rules, publicly known as Part 107, became the latest official policy governing the commercial operation of small UAS in the National Airspace System (NAS). The article "What You Need to Know to Legally Operate Your Drone Under New FAA Regulation [77]" briefly describes the new rules and is a good read for anyone trying to understand Part 107.
Prior to issuing Part 107, the FAA achieved one of its most important milestones: the selection of the six sites for the "UAS Test Site Program." The selection of the six sites represented the first serious step by the FAA toward the integration of the UAS into the NAS. From among tens of applicants, on December 30, 2013 the FAA announced the selection of the following six agencies to operate UAS test sites, as quoted below:
In totality, these six test applications achieve cross-country geographic and climatic diversity and help the FAA meet its UAS research goals of System Safety & Data Gathering, Aircraft Certification, Command & Control Link Issues, Control Station Layout & Certification, Ground & Airborne Sense & Avoid, and Environmental Impacts.
Each test site operator manages the use and scheduling of the test site in a way that it gives access to parties interested in using the site. The FAA’s role is to ensure that each operator sets up a safe testing environment and to provide oversight that ensures each site operates under strict safety standards.
Watch the hearing in the U.S. Senate Committee on Commerce, Science, and Transportation on March 15, 2017, "Unmanned Aircraft Systems: Innovation, Integration, Successes, and Challenges" [80].
In order to understand the UAS operations within the United States, you will need to be familiar with the way the NAS is classified and managed. Figure 5.1 schematically illustrates the different classes of the NAS, while table 5.1 provides details on the different classes of the NAS. Each class has its own rules and restrictions. The Wikipedia web site contains good details on the US national airspace classes [84]. The materials given in the assignment will provide you with additional details about the NAS classes.
Class | Description |
---|---|
Class A | Generally, airspace from 18,000 feet mean sea level (MSL) up to and including flight level (FL) 600, including the airspace overlying the waters within 12 nautical miles (NM) of the coast of the 48 contiguous states and Alaska. Unless otherwise authorized, all pilots must operate their aircraft under instrument flight rules (IFR). (Instructor added note: FL 600 or Flight Level 600, means a flying altitude of 60,000 ft. MSL, for more details, check out this website [78].) |
Class B | Generally, airspace from the surface to 10,000 feet MSL surrounding the nation’s busiest airports in terms of airport operations or passenger enplanements. The configuration of each Class B airspace area is individually tailored, consists of a surface area and two or more layers (some Class B airspace areas resemble upside-down wedding cakes), and is designed to contain all published instrument procedures once an aircraft enters the airspace. An air traffic control (ATC) clearance is required for all aircraft to operate in the area, and all aircraft that are so cleared receive separation services within the airspace. |
Class C | Generally, airspace from the surface to 4,000 feet above the airport elevation (charted in MSL) surrounding those airports that have an operational control tower, are serviced by a radar approach control, and have a certain number of IFR operations or passenger enplanements. Although the configuration of each Class C area is individually tailored, the airspace usually consists of a surface area with a 5 NM radius, an outer circle with a 10 NM radius that extends from 1,200 feet to 4,000 feet above the airport elevation and an outer area. Each aircraft must establish two-way radio communications with the ATC facility providing air traffic services prior to entering the airspace, and thereafter maintain those communications while within the airspace. |
Class D | Generally, that airspace from the surface to 2,500 feet above the airport elevation (charted in MSL) surrounding those airports that have an operational control tower. The configuration of each Class D airspace area is individually tailored, and when instrument procedures are published, the airspace will normally be designed to contain the procedures. Arrival extensions for instrument approach procedures (IAPs) may be Class D or Class E airspace. Unless otherwise authorized, each aircraft must establish two-way radio communications with the ATC facility providing air traffic services prior to entering the airspace and thereafter maintain those communications while in the airspace. |
Class E | Generally, if the airspace is not Class A, B, C, or D, and is controlled airspace, then it is Class E airspace. Class E airspace extends upward from either the surface or a designated altitude to the overlying or adjacent controlled airspace. When designated as a surface area, the airspace will be configured to contain all instrument procedures. Also in this class are federal airways, airspace beginning at either 700 or 1,200 feet above ground level (AGL) used to transition to and from the terminal or en route environment, and en route domestic and offshore airspace areas designated below 18,000 feet MSL. Unless designated at a lower altitude, Class E airspace begins at 14,500 feet MSL over the United States, including that airspace overlying the waters within 12 NM of the coast of the 48 contiguous states and Alaska, up to but not including 18,000 feet MSL, and the airspace above FL 600. |
Class G | Airspace not designated as Class A, B, C, D, or E. Class G airspace is essentially uncontrolled by ATC except when associated with a temporary control tower. |
In most cases, operating a UAS requires employment of similar logistics as those needed for manned aircraft. Large UASs such as the Northrop Grumman’s Global Hawk call for operation requirements similar to those needed to fly a large Boeing aircraft. The Global Hawk, which is the size of a Boeing 737, requires runways for takeoff and landing. It can fly over 60,000 feet, cruise at 310 knots, and has an endurance of 36 hours. On the other hand, small UASs weigh only a few pounds and do not need airports or runways for takeoff and landing. Different UAS sizes and sophistication also require different personnel skills and requirements.
There are many ways in which a UAV can be launched, some of which are very complex while others are as simple as a hand toss into the air. Some UASs, such as target drones, are air-launched from a fixed-wing aircraft. Usually, large UASs are equipped with wheels for takeoff and landing and do not need special equipment, while smaller UASs need a variety of launch and recovery strategies depending on the complexity of the system. Many small and medium-size UAS launch systems have a requirement to be mobile, or in other words, to be mounted on a truck or a trailer. Such mobile launchers fall within one of the following types:
For more details on these launchers, refer to chapter 17 of the supplemental textbook Introduction to UAV Systems, 4th edition.
Line-of-sight (LOS) operation refers to operating the UAS through direct radio waves. The LOS link provides command and control uplink and product downlink while the UAS operates within a certain distance from the GCS. The link is used to launch and recover the aircraft and perform data acquisition according to the type of payload mission of the system. In the United States, civilian operations are usually conducted on 915 MHz, 2.45 GHz, and 5.8 GHz.
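Since the usable LOS range is ultimately bounded by the radio horizon, a quick first estimate can be made from the antenna and aircraft heights. The sketch below is a hypothetical helper (not from the lesson text) using the common 4/3-earth-radius approximation; the actual usable range also depends on transmit power, antenna gain, and terrain.

```python
import math

def radio_horizon_km(antenna_height_m: float, aircraft_height_m: float) -> float:
    """Approximate radio line-of-sight range in km using the standard
    4/3-earth-radius model: d ~ 3.57 * (sqrt(h1) + sqrt(h2)),
    with both heights in meters."""
    return 3.57 * (math.sqrt(antenna_height_m) + math.sqrt(aircraft_height_m))

# A GCS antenna at 2 m and a small UAS at 120 m AGL:
print(round(radio_horizon_km(2.0, 120.0), 1))  # roughly 44 km
```

In practice, small-UAS command links on 915 MHz or 2.4 GHz rarely reach this geometric limit; the estimate is an upper bound for link planning.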
Beyond line-of-sight (BLOS) operation refers to operating the UAS through satellite communications or a relay vehicle such as another aircraft. Recent advancements in SwiftBroadband service and hardware, including smaller, lighter avionics that do not compromise performance or data capacity, allow near-global connectivity to support and enhance UAV operations. SwiftBroadband service is provided by Inmarsat satellite broadband communications. BLOS is usually limited to military UAS operations. Civilian UAS operations do not need BLOS systems for the time being, as their missions are conducted within line-of-sight range. Civilian operations have access to BLOS via the Iridium satellite system, which is owned and operated by Iridium LLC.
The FAA, through its "Partnership for Safety Plan (PSP)" program, continues its efforts to team with industry to help with UAS integration. The following organizations are among the entities the FAA is working with to test BVLOS operations and address many of the other UAS integration issues:
1. Amazon Prime Air
2. Burlington Northern Santa Fe (BNSF) Railway
3. Drone Racing League (DRL)
4. Florida Power and Light
5. UPS Flight Forward Inc.
6. Wing (an Alphabet company)
7. Xcel Energy
For more information on the PSP, visit this FAA website [86].
In mid-June 2021, the FAA announced that it was forming a new Aviation Rulemaking Committee, or ARC, to provide recommendations to help the agency develop a regulatory path for routine Beyond Visual Line of Sight drone flights. The committee will consider the safety, security, and environmental needs, as well as the societal benefits, of these operations.
Personnel Qualifications
Unmanned aerial system crews, including remote pilots, visual observers, mission planners, and other support staff, are responsible to:
According to FAA PART 107, the job descriptions for the following jobs are specified:
According to the FAA, the following operational restrictions apply to all UAS pilots:
As for the visual observer job, the FAA requires:
As for the crew in general:
Several agencies have started providing training and issuing UAS operator certifications to support newcomers to the UAS business, such as those in the following links:
Congratulations! You've finished Lesson 5, Fundamentals of Unmanned Aerial System Operations. You should find by now that you are comfortable with describing different UAS classes, listing UAS system elements, designing a concept of operating a UAS, assessing risk surrounding a UAS operation, understanding FAA regulations, defining the operation criteria for different classes of the national airspace, and understanding the guidelines for UAS flight operations. If you feel that you are not comfortable with any of the previously listed subjects, then you need to review the lesson notes and/or contact me.
Task | Description |
---|---|
1 | Complete the Lesson 5 Quiz |
2 | FAA Roadmap Discussion Forum: Review the 2013 FAA roadmap document in Canvas. As you may notice, the FAA is serious about integrating commercial UASs into the National Airspace System (NAS) starting in 2015. Considering the state of technology in manufacturing small UASs and the lack of Sense-and-Avoid instruments on board such small UASs, do you expect any obstacles to integrating small commercial UASs into the NAS according to the FAA proposed road map (i.e., the FAA may restrict the use of certain sizes of UASs for not being equipped with enough safety features)? Post your opinion on the discussion and respond to at least one peer posted opinion on the subject. The deadline for submission is at the end of Lesson 6; you will have two weeks to participate. (3 points or 3%) |
3 | Final Project Milestone: Submit final project idea/proposal in the Final Project Proposal drop box in Lesson 5 |
4 | Submit your "CONOP and Risk Assessment" assignment report |
Welcome to Lesson 6! In this module, you will become familiar with the current FAA regulations that govern UAS operations and the ongoing efforts to integrate their operations into the National Airspace System (NAS), chief among them the latest rules known as Part 107. You will also explore the current recreational versus public or commercial operations of UAS; become familiar with the Certificate of Authorization (COA), Certificate of Waiver, and Airworthiness Certificate and how to apply for one; and examine issues related to privacy that are of concern to both the government and industry. You will be asked to choose your application materials and organize them for your COA or Part 107 waiver application. The COA/Part 107 waiver project includes a few graded components that will be submitted in the different sections of the lesson. During this lesson, you will be engaged in discussions with fellow students on several topics related to the lesson objectives. Participation in these discussions is mandatory wherever it is requested.
At the successful completion of this lesson, you should be able to:
The Federal Aviation Administration (FAA) was created in 1958 in response to a series of fatal accidents and midair collisions involving commercial aircraft. The FAA was mandated to develop plans and policies for the use of navigable airspace to ensure the safety of aircraft and the efficient use of airspace. Prescribed air traffic regulations should cover the flight of aircraft (such as safe altitudes) for navigating, protecting, and identifying aircraft; protecting individuals and property on the ground; using the navigable airspace efficiently; and preventing collision between aircraft, between aircraft and land or water vehicles, and between aircraft and airborne objects.
Since the creation of the FAA, American airspace has become one of the most regulated fields in the United States. With the introduction of UASs, the FAA has had to examine and ensure that these pilotless aircraft can operate safely and meet all the above mentioned regulations. The NAS is already congested with piloted aircraft, and adding a swarm of UAVs requires thoughtful planning. The FAA's main mandate is to ensure that UASs do not endanger current users of the NAS (including manned or other unmanned aircraft) nor compromise the safety of the people and property on the ground.
When it comes to the safe operation and integration of the UAS into the NAS, one of the main concerns the FAA has is the lack of detect, sense, and avoid capability in current UAS technology. The FAA did a thorough literature review to establish what is and is not possible along these lines. The article listed in the reading assignment of this section details the FAA's quest for detect, sense, and avoid possibilities.
In this section, you will explore the current regulations that govern UAS operations and the efforts underway to integrate their operations into the National Airspace System (NAS). The status of UAS regulations can be considered in relation to two different eras: the first preceded the provisions of the FAA Modernization and Reform Act of 2012 (P.L. 112-95), and the second is what we are currently dealing with after the 2012 provisions. During both eras, the FAA regulations on operating a UAS in the NAS were very strict and in fact prohibited civilians from flying UASs until Part 107 went into effect on August 29, 2016. In 2008, the Aviation Safety Unmanned Aircraft Program Office (UAPO) of the FAA issued the Interim Operational Approval Guidance 08-01. "Interim Operational Approval Guidance, Unmanned Aircraft Systems Operations in the U.S. National Airspace System" provided guidance to help determine if unmanned aircraft systems (UAS) should be allowed to conduct flight operations in the U.S. national airspace system (NAS). On July 30, 2013, the FAA issued a national policy (N 8900.227) for reviewing and evaluating the safety and interoperability of proposed Unmanned Aircraft Systems (UAS) flight operations conducted within the United States (U.S.) National Airspace System (NAS) under the subject "Unmanned Aircraft Systems (UAS) Operational Approval." The new national policy defined in detail the methods of UAS operational approval through the issuance of either a COA for public aircraft operations or a Special Airworthiness Certificate for civil operations. All guidelines and regulations are jointly developed by the following entities within the FAA:
Originally, the Certificate of Authorization, or COA, was limited to public agencies, and no commercial agency was granted a COA. Even for public agencies, a COA cannot be guaranteed, and COAs may take different lengths of time or have restrictions built in, according to the FAA document N 8900.227, which states: "because of the uniqueness of various UAS flight operations, each application must be evaluated on its own technical merits, including operational risk management (ORM) planning. Each application may require unique authorizations or limitations directly related to the specific needs or capabilities of the UAS and/or the proposed specific mission and operating location." However, during 2015, the FAA started issuing grant exemptions for commercial entities to fly UAS for commercial use under strict limitations. The FAA based such grant exemptions on Section 333 of the FAA Modernization and Reform Act of 2012. [99] An exemption under Section 333 allows commercial companies to fly UAS for commercial use, after they apply for a COA, of course. Even with the heavy restrictions that surrounded these exemptions, the move was welcomed by companies planning to use UAS for various commercial tasks, and it was considered the first small step they had been waiting for.
This surprising move by the FAA was followed by three unprecedented moves.
As you may have noticed from the materials you reviewed in the previous section, no one is allowed to fly a UAS without prior approval from the FAA. Any UAS operation in the United States has to occur in one of two ways. Either the UAS belongs to a public agency (i.e., governmental) and requires a COA or operates under Part 107 rules, or it belongs to a civilian entity and therefore requires adherence to Part 107 rules and perhaps a special airworthiness certificate or a waiver. For manned aircraft, the FAA requires several basic steps to obtain an airworthiness certificate in either the Standard or Special class. The FAA may issue an applicant an airworthiness certificate when:
The process for a UAS is different for the time being, as it is approached through either a COA or a special airworthiness certificate, as discussed above. For a UAS, the FAA may consider an airworthiness letter like the following:
"To Whom It May Concern:
The eBee small Unmanned Aircraft System has been inspected and reviewed on behalf of XY organization by qualified individuals and a determination has been made based on testing data and evaluation data provided by the manufacturer that the aircraft is serviceable and airworthy for the intended use as advertised by the manufacturer, subject to the warrantees and representations offered by said manufacturer.
Sincerely,
John Doe, System Engineer, XY organization"
Just to reiterate, the process of requesting a UAS operation within the territorial airspace of the United States (the airspace above the contiguous United States, Alaska, Hawaii, U.S. territories, and U.S. territorial waters) differs depending on whether the applicant is a public agency or a civilian entity. The method of operational approval for public aircraft operations is the issuance of a COA; civilian operators either operate under Part 107 for UAS that weigh less than 55 lbs or apply for an exemption under the Special Authority for Certain Unmanned Systems (49 U.S.C. §44807) [100]. A Special Airworthiness Certificate [101] is needed for civil operations under certain conditions. The FAA website allows civil users to apply for a COA through a dedicated portal, although a COA is no longer needed for most civil operations. This Form shows the web application interface [102]; it is provided to show the actual interface for the COA application and the required materials, and all applicants must provide the required submissions through the portal. To apply for a COA, go to the FAA UAS Civil COA Portal [103]. You will need to create an account on the FAA website before you proceed with your application. If you are planning to apply for a COA, be prepared to provide the following materials and information through the portal and/or later if the FAA asks for them:
This link provides a sample COA application from the FAA website [104].
Certificate of Authorization Application Components
Make sure that your COA application provides the FAA with the following components:
- Applicant Contact Information:
Public agencies, private individuals, or businesses that want to be exempted to fly UAS under certain conditions can apply for a Certificate of Authorization (COA) [105]. The introduction of Part 107 removed many hurdles from operating a civilian UAS under many conditions. However, for conditions that are not listed or described directly under the Part 107 regulations, a civilian operator can apply for a waiver. The FAA states: "A waiver is an official document issued by the FAA which approves certain operations of aircraft outside the limitations of a regulation. You may request to fly specific drone operations not allowed under part 107 by requesting an operational waiver. These waivers allow drone pilots to deviate from certain rules under part 107 by demonstrating they can still fly safely using alternative methods." The following table illustrates the conditions under which one needs to apply for a waiver to operate under Part 107.
List of operations that require a waiver under Part 107 (source FAA [106])
How To Apply For a Waiver?
One can apply for a waiver through the FAA website [106]. The FAA details the guidelines for the waiver application and the required information. Pay close attention to the "Waiver Safety Explanation Guidelines for Part 107 Waiver Applications" that you may encounter in the DroneZone operational waiver application. For the waiver application, the FAA requires extensive details on:
The following items are required for the "Waiver Safety" part of the application as adopted from the FAA website [107]:
Describe, to the greatest extent possible, how you propose to address or lessen the possible risks of your proposed operation. This could include using operating limitations, technology, additional training, equipment, personnel, restricted access areas, etc. When reviewing the questions for each section below, the FAA's primary concerns are:
The following questions [109] are associated with each waivable section of part 107. Only answer the questions for the regulatory section applicable to the application you will submit:
NOTE: The list of questions may not be all-inclusive. You may need to provide additional information based on your specific operation.
To Do:
In this section, you are expected to develop and submit the required materials for the COA or Part 107 waiver application for the platform you selected in the activities of Lesson 1. It is helpful to review previously submitted COA or Part 107 waiver applications available on the FAA website before populating your own documentation, so you can become familiar with the format, required materials, and depth of information. The following is a brief list of the materials you may need in order to complete the COA or Part 107 waiver application for your platform:
Make sure to incorporate risk mitigation strategies and address the integration of automation and autonomy in your system in the various sections as appropriate. More details on the information required for a COA or Part 107 waiver application can be found in the template provided. The link to the FAA site provided above also provides plenty of examples on COA and Part 107 waiver applications.
Congratulations! You've finished Lesson 6, Aviation Regulatory and Certificate of Authorization Process (COA). I hope you digested the materials well, as they are essential to understanding the circumstances of operating any UAS in the U.S. The exercise of developing your own COA or Part 107 waiver application will enable you to manage a UAS operation, as it has provided you with crucial knowledge about logistics and safety concerns regarding UAS operations. The exercise not only provided you with FAA rules and regulations but also gave you the necessary technical knowledge about the different sub-systems of the UAS.
1 | Complete the Lesson 6 Quiz. |
---|---|
2 | For this exercise, you can claim that you are representing a public agency, in which case you need to apply for a COA, or a civilian operator, in which case you need to apply for a Part 107 waiver. Recognizing that a public agency can also apply for a Part 107 waiver, start working on your "COA application draft" or "Part 107 waiver draft". Develop draft materials for your application for the UAS you selected in section 2.7 of Lesson 2. Choose a geographic location for your system operations. Name the civilian application you are going to use the system for. You may find that not all technical specifications/information about your UAS are publicly available, in which case you need to adopt published technical specifications for a similar UAS or from an existing COA or Part 107 waiver. Your report should contain at least 20 pages, but not exceed 30 pages (single line spacing). Upon completion, submit your completed Word document(s) in the drop box. (7 points) |
3 | Submit your results for exercise 1 data processing in Pix4D |
4 | Participate in the discussion for Lesson 6 "Differences Between Rules and Regulations". Deadline for this assignment is at the end of Lesson 7. |
5 | Start processing the dataset for exercise 2 using Pix4D software. |
Welcome to Lesson 7! In this lesson, you will become familiar with the photogrammetric process, the processing systems, and data generation from an image-based UAS. Most applications of the UAS today include one form or another of a camera system (video or still camera), from which different interpretations and therefore different applications have evolved. You will also develop an understanding of processes such as aerial triangulation and ortho rectification, which are the backbone of any image processing facility. The photogrammetric textbook Elements of Photogrammetry with Applications in GIS will be your companion, besides the lesson notes, in understanding the topic.
At the successful completion of this lesson, you should be able to:
In this section, you will learn about the photogrammetric process and the different steps the imagery goes through in order to develop an ortho photo or digital elevation model.
Figure 7.1 illustrates the different steps of processing that imagery from a UAS is subject to in order to produce a mapping product such as an ortho photo or digital elevation model.
As we learned in Lesson 4, the process starts with mission planning. Once all the parameters and requirements are defined for the mission, a flight plan is developed and aerial imagery is acquired according to the project specifications. The resulting imagery will be reviewed to ensure the expected quality. Following the image QC, field work will be conducted to survey the necessary ground controls. The ground control survey can be conducted either before the imagery acquisition or after it is completed.
Once the imagery acquisition and the ground control survey are completed, work can begin on the process of aerial triangulation. Aerial triangulation, as it will be described in section 7.2, is performed to determine the position and the orientation of the camera at the moment of exposure of each image. It includes a few processing concepts, such as interior and exterior orientations, relative orientation, and absolute orientation. Aerial triangulation is achieved through processing software built on rigorous least squares mathematical models. Once the aerial triangulation is completed, the imagery is ready to go through other processing steps such as ortho rectification and digital elevation modeling.
In this section, you will learn about the concept of geo-referencing imagery, which is an important concept. Without it, no further photogrammetric processing of the imagery can take place.
In order to utilize the photogrammetric mathematical model, i.e., the collinearity condition, for the production of any mapping products, the following information needs to be made available:
In this section, we will focus on the process of determining the six exterior orientation parameters. The camera position can be measured accurately using the airborne GPS technique with a GPS antenna on board the UAS. The three camera position coordinates can also be computed through the process of aerial triangulation, as we will discuss soon. However, there are two methods for determining the camera attitude or orientation: the aerial triangulation process and direct measurement from the IMU, as we discussed in Lesson 6.
Aerial triangulation is usually performed on a photogrammetric block (Figure 7.2), which consists of all the imagery acquired over the project area. Figure 7.2 illustrates a photogrammetric block of imagery consisting of three strips, each of which has multiple overlapping images. Also shown are the different types of image overlaps. The top and middle strips contain images with 60% forward lap, while the bottom strip contains imagery with 80% forward lap. You may also notice in the figure that the middle and the bottom strips are overlapping by the amount of 30%. Such overlap is called side lap.
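The overlap percentages above translate directly into exposure spacing along and across the strips. As a rough illustration, the hypothetical helper below (footprint assumed square for simplicity; not from the lesson text) computes the air base between exposures and the spacing between flight lines:

```python
def photo_base_and_spacing(ground_coverage_m, forward_lap, side_lap):
    """Given the ground footprint of one image (assumed square here for
    simplicity), return the air base (distance between exposures along a
    strip) and the spacing between adjacent flight lines."""
    base = ground_coverage_m * (1.0 - forward_lap)
    spacing = ground_coverage_m * (1.0 - side_lap)
    return base, spacing

# 60% forward lap and 30% side lap over a 200 m footprint:
base, spacing = photo_base_and_spacing(200.0, 0.60, 0.30)
print(base, spacing)  # roughly 80 m between exposures, 140 m between lines
```

Higher overlap, such as the 80% forward lap in the bottom strip of Figure 7.2, shortens the air base and increases the number of images per strip.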
In the last section (the photogrammetric process), we mentioned a few terms related to aerial triangulation. We will briefly describe these terms in the following sub-sections:
Relative Orientation is the process of orienting images relative to one another (i.e., it recreates the “relative” position and attitude of the images at the instants of exposure), as illustrated below. Figure 7.3 shows four images that are connected to each other in space through the aircraft/GPS trajectory but are not necessarily connected to the ground datum (i.e., they are floating in space).
Relative orientation is an important process that must be performed before we scale the imagery to the ground datum through the process of absolute orientation, which will be discussed in the next section. To form a cohesive block, all images in the block should be relatively oriented with respect to each other through the process of relative orientation.
Absolute Orientation is the process of leveling and scaling the stereo model (formed from two images) with respect to a reference plane or datum using ground control points, as shown in Figure 7.4. Figure 7.4 represents the same four images as Figure 7.3, but this time the block was tied to the ground datum through the use of seven ground control points (represented by the black stars).
Without performing the absolute orientation process, the generated map would not be specifically associated with a certain location in space. Generating maps that have geo-location information such as datum and coordinates systems can only happen after the process of absolute orientation is performed following relative orientation.
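Mathematically, absolute orientation amounts to a 3D similarity (seven-parameter Helmert) transform: a scale, a rotation, and a translation that tie the relatively oriented model to the ground datum. The sketch below is a hypothetical helper that only applies a known transform to illustrate the model; estimating the parameters from ground control points is the harder least squares problem solved inside aerial triangulation software.

```python
import numpy as np

def similarity_transform(points, scale, R, t):
    """Apply a 7-parameter (Helmert) similarity transform:
    ground = scale * R @ model + t, applied row-wise to an (n, 3) array."""
    return scale * (np.asarray(points) @ np.asarray(R).T) + np.asarray(t)

# Two model-space points; an already-level model needs no rotation:
model_pts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
R = np.eye(3)
ground = similarity_transform(model_pts, 500.0, R, [1000.0, 2000.0, 50.0])
print(ground)  # the second point lands 500 m east of the first
```

After this step, every model coordinate carries real-world datum coordinates, which is exactly what Figure 7.4 depicts with its control points.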
Exterior orientation of a photograph defines its position and orientation in the object space. There are six elements of exterior orientation, X, Y, and Z of the exposure station position, and the three angles that define the angular orientation: ω, φ, and κ. The six elements of exterior orientation are not known and must be computed through a process called space resection within the aerial triangulation process. Here is the definition of the three orientation angles illustrated in Figure 7.5:
Omega (ω): Rotation about the x axis. It is equivalent to the angle Roll of the navigation system.
Phi (φ): Rotation about the y axis. It is equivalent to the angle Pitch of the navigation system.
Kappa (κ): Rotation about the z axis. It is equivalent to the angle Yaw of the navigation system.
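Using the angle definitions above, the photo orientation matrix can be assembled by composing the three elementary rotations. The sketch below follows one common photogrammetric convention, M = M_kappa @ M_phi @ M_omega; conventions vary between texts, so treat this as illustrative rather than the lesson's definitive formulation.

```python
import math
import numpy as np

def rotation_matrix(omega, phi, kappa):
    """Build the 3x3 photo orientation matrix M = M_kappa @ M_phi @ M_omega
    (a common photogrammetric convention; angles in radians)."""
    so, co = math.sin(omega), math.cos(omega)
    sp, cp = math.sin(phi), math.cos(phi)
    sk, ck = math.sin(kappa), math.cos(kappa)
    M_omega = np.array([[1, 0, 0], [0, co, so], [0, -so, co]])   # roll
    M_phi   = np.array([[cp, 0, -sp], [0, 1, 0], [sp, 0, cp]])   # pitch
    M_kappa = np.array([[ck, sk, 0], [-sk, ck, 0], [0, 0, 1]])   # yaw
    return M_kappa @ M_phi @ M_omega

# With all three angles zero, the matrix is the identity:
print(np.allclose(rotation_matrix(0.0, 0.0, 0.0), np.eye(3)))  # True
```

Because each factor is a pure rotation, the resulting matrix is orthonormal, a property aerial triangulation software exploits when parameterizing the adjustment.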
Knowing the six exterior orientation parameters for an image is necessary for any photogrammetric processing aimed at creating products from such an image. Whether you perform map compilation on a stereo plotter or generate an ortho image, the six exterior orientation parameters need to be computed before you start the production process.
Space Resection is the process of determining ray intersections in space to determine the camera position. See Figure 7.6. The method of space resection is a purely numerical method that uses the collinearity equations to simultaneously yield all six elements of exterior orientation (X, Y, Z, omega, phi, and kappa). Once these elements are known, a stereo plotter can measure the photo coordinates of any point in a photo (x, y), and the ground coordinates can be computed. Ortho rectification software also utilizes space resection for ortho-rectifying an image. Figure 7.6 illustrates six images. Each of them has rays from the ground entering the camera through the lens. The intersection of the rays entering the camera at point "O" represents the photo center location, which is important for the determination of the exterior orientation parameters described earlier.
Aerial triangulation can be defined as the process of densification of a sparsely distributed horizontal and vertical control network through:
A conventional (film based) aerial triangulation process consists of the following steps:
Data Preparation: Using a stereoscope, three points are selected down the center of each photo, approximately 1” from the top and bottom and at the center. These points are also marked on every overlapping photo on which they occur. They are often called “pass points” along strips and “tie points” between strips. See Figure 7.7. Ideally, pass points are selected in flat areas of high contrast that are free of obstructions and shadows.
Figure 7.7 represents three overlapping photos that are used to extract pass points between them. Notice that the three middle points for the middle photo (a, b, c) were located and marked on the same locations in the overlapping right and left image. This process is called point marking.
Point Marking: A good point marking device is characterized with:
One of the earliest commercially successful point marking devices was the P.U.G., manufactured by Wild Heerbrugg Instruments, Inc. See Figure 7.8. Over time, pass points marked on diapositives became known simply as pug points.
Point Measurement: A skilled technician with analytical stereo plotting instruments records the location of each previously marked pass point and tie point on each photograph.
Numerical Computation of Aerial Triangulation: Here is a summary for the steps taken within the processing software:
Unlike the aerial triangulation of the past, which was performed on film-based imagery with optical-mechanical instruments, today aerial triangulation is performed on digital imagery using a complete softcopy approach called softcopy aerial triangulation. In softcopy aerial triangulation, all the manual work of point marking and measurement is left to the automation of the software. It is more efficient and more accurate.
The backbone of the computational model in photogrammetry is a pair of equations called the collinearity equations, which express the collinearity condition. In standard form, consistent with the variable definitions below, the two collinearity equations are:

x = x0 − f [m11(X − Xc) + m12(Y − Yc) + m13(Z − Zc)] / [m31(X − Xc) + m32(Y − Yc) + m33(Z − Zc)]

y = y0 − f [m21(X − Xc) + m22(Y − Yc) + m23(Z − Zc)] / [m31(X − Xc) + m32(Y − Yc) + m33(Z − Zc)]

Where,
Xc, Yc, Zc = camera perspective center coordinates
X, Y, Z = ground point position
x, y = point position on image
mij = elements of the photo orientation matrix
f = camera lens focal length
x0, y0 = principal point of autocollimation
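The collinearity equations can be sketched in a few lines of code. The function below is a hypothetical helper (not from the lesson) that projects a ground point into photo coordinates, given the perspective center, a 3x3 orientation matrix, the focal length, and the principal point:

```python
import numpy as np

def project_to_image(ground_pt, cam_pos, M, f, x0=0.0, y0=0.0):
    """Collinearity equations: project a ground point (X, Y, Z) into photo
    coordinates (x, y) given the camera perspective center, the 3x3 photo
    orientation matrix M, the focal length f, and the principal point."""
    dX, dY, dZ = np.asarray(ground_pt) - np.asarray(cam_pos)
    u = M[0, 0]*dX + M[0, 1]*dY + M[0, 2]*dZ
    v = M[1, 0]*dX + M[1, 1]*dY + M[1, 2]*dZ
    w = M[2, 0]*dX + M[2, 1]*dY + M[2, 2]*dZ
    return x0 - f * u / w, y0 - f * v / w

# Vertical photo (identity orientation), camera 100 m above the datum,
# ground point 10 m east of the nadir, 50 mm lens:
M = np.eye(3)
print(project_to_image([10.0, 0.0, 0.0], [0.0, 0.0, 100.0], M, f=0.05))
# x ≈ 0.005 m (5 mm), y ≈ 0 — i.e., photo scale 1:2000, as expected
```

Space resection inverts this relationship: given measured (x, y) for several known ground points, it solves for the six exterior orientation parameters.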
In the last two decades, navigation technologies have advanced to the point that manufacturers of Inertial Navigation Systems (INS), usually used for missile and submarine navigation, can produce an Inertial Measurement Unit (IMU) to accurately measure the orientation of airborne sensors such as cameras and LiDAR. The IMU, which we briefly described in Lessons 2 and 6, is used either to replace the process of aerial triangulation or to assist its solution. Most UASs, including the small ones, carry on board a GPS unit and an IMU unit. Unfortunately, most of the miniaturized, low-cost IMUs used on UASs are not accurate enough to replace aerial triangulation. Such low-accuracy IMUs are usually used to navigate the UAS but not to support the aerial triangulation. On the other hand, the GPS antenna on most UASs is of survey-grade quality and can receive signals from both GPS and GLONASS. Some UASs can receive signals from OMNISTAR with real-time corrections.
In this section, we will discuss an important topic to any photogrammetric work: ground controls.
A ground control, which we introduced in the last section, is a target in the project area with known coordinates (X,Y,Z). Accurate, well-placed ground controls are essential elements for any photogrammetric project utilizing aerial triangulation.
There are two standard types of ground control points (Figure 7.9), those are:
Many projects make use of one type or the other, or a combination of the two.
The leftmost image in Figure 7.9 represents a pre-marked control point set on black and white fabric, while the image next to it represents a pre-marked control point that is spray-painted on a sidewalk. The rightmost images represent different types of photo-identifiable ground control points. On these images, the user can pick any visible ground feature (such as a parking strip or the edge where the concrete meets the asphalt pavement on a bridge) to use as a control point.
There are two techniques for surveying ground control points. The most common one is RTK GPS, as it is the fastest and least expensive. An RTK survey results in a horizontal accuracy of about 2 cm and a vertical accuracy of about 3 cm, and it is widely used for mapping projects. The second survey technique, which is much more expensive, is differential leveling for height determination combined with static GPS for the horizontal survey. Differential leveling results in around 1 cm vertical accuracy. Here in the United States, surveying a point using RTK GPS usually costs between $150 and $300, depending on the location and terrain. Differential leveling costs around $1,000 to $2,000 per point, again depending on location and terrain. Selecting one surveying technique over another depends on the expected mapping product's accuracy. Consult the American Society for Photogrammetry and Remote Sensing (ASPRS) Positional Accuracy Standards for Digital Geospatial Data [113] and chapter 9 to determine the ground control accuracy requirements based on product accuracy.
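Using the per-point cost ranges quoted above, a quick budget comparison for a control survey can be sketched as follows. This is a hypothetical helper; the figures are the rough ranges from the text, not vendor quotes.

```python
def survey_cost(n_points, method):
    """Rough per-point cost ranges from the text: RTK GPS about $150-$300
    per point; differential leveling about $1,000-$2,000 per point.
    Returns (low, high) total cost estimates in dollars."""
    per_point = {"rtk": (150, 300), "leveling": (1000, 2000)}
    low, high = per_point[method]
    return n_points * low, n_points * high

# Budgeting 12 control points for a small mapping project:
print(survey_cost(12, "rtk"))       # (1800, 3600)
print(survey_cost(12, "leveling"))  # (12000, 24000)
```

The order-of-magnitude gap explains why RTK is the default choice unless the product's vertical accuracy specification demands leveled control.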
Ground control requirements vary from one project to another depending on the project specifications and its geographic extent. Projects with high geometrical accuracy requirements require more ground controls. Figure 7.10 illustrates typical distribution of ground controls in a rectangular shaped project when the aircraft does not carry on board a GPS antenna, resulting in a non-GPS supported aerial triangulation, or what is usually called “conventional aerial triangulation.”
However, most aerial triangulation today is solved with airborne GPS data. Having GPS data in the aerial triangulation process saves a tremendous number of ground controls. Figure 7.11 illustrates the low density of ground controls required for GPS-based aerial triangulation.
Despite having ground controls only at the edges of the flight lines as shown in Figure 7.11, adding a few controls along the interior of the block (see Figure 7.12) is a wise strategy, especially when high accuracy is expected from the aerial triangulation. Savings can be made in the control survey by replacing most of the ground control points at the edges of flight lines with imagery taken along a flight line perpendicular to the project flight lines at each end of the block (see Figure 7.13). Such additional flight lines, perpendicular to the normal project flight lines, are called "cross flight lines."
Adding two cross flights (strips) at each edge of the photogrammetric block not only saves on the number and cost of the ground control points but also strengthens the mathematical model within the bundle block adjustment computations. It helps in modeling and solving GPS and IMU problems.
To summarize the subject of ground control requirements for a block, we start with Figure 7.10, which represents the most control-consuming case: conventional aerial triangulation, where no GPS is used on the camera during imagery acquisition. Then comes the most efficient method of aerial triangulation, GPS-based aerial triangulation. Figures 7.11 through 7.13 represent different distributions of ground controls for GPS-based aerial triangulation. Each configuration has its strengths and weaknesses; however, the configuration in Figure 7.13 is the most economical when it comes to reducing the ground control requirement.
In this section, we will discuss products generated from image-based UAS. Although imagery collected by UAS can be used in a variety of applications in the field of remote sensing, we will focus in this lesson on two main mapping products: the ortho photo and the digital elevation model.
Digital ortho, ortho photo, orthographic image, and ortho map are different names for the same thing. An ortho photo, the term I will use most of the time, is an image that is corrected (through the process of ortho-rectification) for the effects of terrain relief and sensor tilt, converting it into a map with a unified scale. Raw images taken over variable terrain have different scales at different locations in the image. A pixel covering the ridge of a mountain covers a smaller ground spot, as it is closer to the sensor (aircraft), than a pixel covering a valley.
Performing ortho-rectification resamples all these pixels so that each pixel covers exactly the same ground resolution, or GSD, regardless of where it falls in the image or what terrain it covers. In other words, ortho-rectification means reprocessing the raw digital image to eliminate the scale variation and image displacement resulting from terrain relief and sensor (camera) tilt.
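The relationship between flying height and the ground footprint of a pixel can be sketched with the standard pinhole-camera ratio. The pixel pitch, focal length, and flying height below are illustrative assumptions for the demo, not values from the text:

```python
def ground_sample_distance(pixel_pitch_m, focal_length_m, flying_height_m):
    """Ground footprint of one pixel for a nadir-looking frame camera:
    GSD = pixel pitch x flying height / focal length (similar triangles)."""
    return pixel_pitch_m * flying_height_m / focal_length_m

# Illustrative values: 4.4-micron pixels, 16 mm lens, 120 m above ground
gsd = ground_sample_distance(4.4e-6, 0.016, 120.0)  # 0.033 m, i.e., 3.3 cm
```

Over flat terrain this ratio holds for the whole frame; over relief, the flying height above each ground point differs, which is exactly the scale variation ortho-rectification removes.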
Because ortho photos are geometrically corrected, they can be used as map layers in GIS, overlaying, management, update, analysis, or display operations. This is a great advantage offered by the ortho photo as compared to the raw imagery.
The five primary ingredients for the ortho photo generation are the following:
An ortho photo produced using a digital elevation model of the bare earth (no buildings or trees in it) is usually called a "ground ortho." In a ground ortho, building lean is not removed during ortho-rectification, and buildings appear to lean radially away from the center of the image, as you can see in the image of the World Trade Center in Baltimore on the left side of Figure 7.14. On the other hand, a "true ortho" is an ortho in which the buildings look as if they are standing straight up, as if you were looking at them from directly above the roofs, as illustrated in the right image of Figure 7.14. True ortho is very useful in urban areas, such as downtowns with tall buildings, as it reveals all the information in the streets and pathways surrounding the buildings. True ortho is computationally intensive and needs three-dimensional models of all buildings in the image, which makes it more costly than ground ortho.
It is very important to evaluate the quality of ortho-rectification, as it may cause some defects. Examples of such common defects are the following:
Similar to LiDAR, stereo imagery can be used to generate accurate digital elevation models. Most software used for UAS data processing includes image matching techniques that produce fine-quality elevation models, which can be used for the ortho-rectification process and other terrain modeling purposes. The main ingredients for digital terrain data generation are:
Until recently, users did not trust auto-correlated digital terrain data because of its poor quality. However, in the last couple of years, software development companies adopted a new algorithm called "Semi-Global Matching," or SGM, which produces fine-quality elevation data that in some ways competes with elevation models generated by LiDAR. This made users excited again about using imagery to develop fine-quality digital elevation data. The SGM algorithm is an image matching approach that originated in the computer vision community. It aggregates per-pixel matching costs along image paths, something that was not possible with the old auto-correlation algorithms.
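A minimal single-path sketch of SGM cost aggregation may make the idea concrete. This is a toy illustration on two 1-D scanlines, not the implementation of any particular package; production SGM aggregates costs along 8 or 16 paths over full 2-D images, and the penalty values p1/p2 and the synthetic scanlines below are assumptions made for the demo:

```python
def sgm_1d(left, right, max_d, p1=1, p2=4):
    """Toy semi-global matching on two 1-D scanlines.
    C[x][d] is the per-pixel matching cost; L[x][d] is that cost aggregated
    along one left-to-right path, with a small penalty (p1) for disparity
    changes of 1 and a large penalty (p2) for bigger jumps.
    Returns winner-take-all disparities per pixel."""
    n, BIG = len(left), 10**6
    C = [[abs(left[x] - right[x - d]) if x - d >= 0 else BIG
          for d in range(max_d + 1)] for x in range(n)]
    L = [row[:] for row in C]
    for x in range(1, n):
        prev = L[x - 1]
        best_prev = min(prev)
        for d in range(max_d + 1):
            step = [prev[d]]                       # keep the same disparity
            if d > 0:
                step.append(prev[d - 1] + p1)      # small change, small penalty
            if d < max_d:
                step.append(prev[d + 1] + p1)
            step.append(best_prev + p2)            # large change, large penalty
            L[x][d] = C[x][d] + min(step) - best_prev
    return [min(range(max_d + 1), key=lambda d: L[x][d]) for x in range(n)]

# A scanline and a copy shifted by one pixel: the true disparity is 1
left = [5, 8, 2, 9, 4, 7, 1, 6]
right = [8, 2, 9, 4, 7, 1, 6, 0]   # left shifted by one, padded with 0
disparities = sgm_1d(left, right, max_d=2)
```

The aggregation is what distinguishes SGM from plain per-pixel auto-correlation: the penalties encourage a smooth disparity (elevation) surface while still allowing sharp breaks where the image evidence demands them.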
As with ortho photo production, digital elevation data needs to be evaluated to verify its quality.
There are a couple of terms used in the geospatial community to describe digital terrain data; those are:
Triangulated Irregular Network (TIN): The term TIN describes the method most software uses to model digital terrain data and to present it on the screen. A TIN surface represents a set of adjacent, non-overlapping triangles computed from irregularly spaced data points, each with x, y horizontal coordinates and a z vertical elevation (Figure 7.17).
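Once the triangles are formed, an elevation anywhere on the surface comes from planar interpolation within the containing triangle. A small sketch of that interpolation step, with made-up triangle and point coordinates for illustration:

```python
def tin_interpolate(p, triangle):
    """Interpolate elevation at 2-D point p inside one TIN triangle.
    triangle is three (x, y, z) vertices; the weights are barycentric
    coordinates, so the result is the planar facet's elevation at p."""
    (x1, y1, z1), (x2, y2, z2), (x3, y3, z3) = triangle
    px, py = p
    det = (y2 - y3) * (x1 - x3) + (x3 - x2) * (y1 - y3)
    w1 = ((y2 - y3) * (px - x3) + (x3 - x2) * (py - y3)) / det
    w2 = ((y3 - y1) * (px - x3) + (x1 - x3) * (py - y3)) / det
    w3 = 1.0 - w1 - w2
    return w1 * z1 + w2 * z2 + w3 * z3

tri = ((0.0, 0.0, 10.0), (10.0, 0.0, 20.0), (0.0, 10.0, 30.0))
z = tin_interpolate((10.0 / 3, 10.0 / 3), tri)  # centroid -> mean of the three z's
```

Locating which triangle contains a query point is a separate search problem that real terrain software handles with spatial indexing.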
Congratulations! You have just completed Lesson 7. I hope that you appreciated the value of UAS imagery in producing geospatial data that is suitable for many applications in our day-to-day life. The ortho photo and the digital elevation model are indispensable tools used in many environmental and engineering projects. Without them, we would have to put many boots on the ground to survey the terrain and provide the necessary data for engineering and planning. Practicing with the processing software Pix4D, which I selected for the course, will help you tremendously in appreciating the quality and value of the digital ortho photo and digital elevation model.
1. Complete the Lesson 7 Quiz.
2. Work on your final project report and presentation (due next week).
3. Practice generating DSM and ortho mosaic at different GSDs using Pix4D and evaluate the differences in quality.
4. Continue working on your COA Application and the Final Project Report.
Welcome to Lesson 8! In this lesson, you will become familiar with the different applications that the UAS is utilized for. The list of commercial and civilian applications grows by the day, and it is difficult, if not impossible, to nail down a complete list. The low cost and easy deployment of the UAS have encouraged many people to utilize unmanned aircraft in place of manned aircraft for their activities. Users are discovering new applications every day; however, in this lesson we will only cover the most obvious ones. We will not cover the military applications, but we will consider, for the purpose of this lesson, the security and surveillance use of UAS as a civilian/commercial application, since some of these services are offered commercially. Much of the commercial and scientific use of UAS that concerns us is in the field of geospatial data acquisition for remote sensing activities. The term "geospatial data" refers to any dataset that is referenced spatially (i.e., geolocated or geo-referenced) with a known coordinate system and datum. In this lesson, I expect you to read chapter 6 of the textbook Introduction to Unmanned Aircraft Systems and several external readings I will point out in the lesson notes.
At the successful completion of this lesson, you should be able to:
Digital Image Classification is an information extraction process (machine or automated interpretation) that involves the application of pattern recognition theory to multispectral imagery. It analyzes spectral properties of various surface features (e.g., crops) in a multiband image and sorts spectral data into spectrally related categories by the use of predefined, numerical decision rules.
The process involves:
The process utilizes one or more of the following recognition types:
Among the difficulties usually encountered with this technique are the following:
There are two types of image classification algorithms, those are:
In the unsupervised process, a user directs a computer software package to automatically identify and categorize pixels in an image. This is done on a purely statistical basis, though the user has control over the number of statistical classes, or clusters, to be created. With different classification algorithms, the user will also have control over statistical parameters, such as how much variation is permitted in a single class.
While there are no set rules on how many classes should be defined, a general rule of thumb is that the classes should total three times the number of final land cover categories sought. This allows for the possibility of different spectral signatures pertaining to the same land cover type (for example, if forest is sought as a class, deciduous and coniferous forests may require more than one spectral signature to accurately categorize them as forest). A number of these classes will likely represent meaningless categories or mixed pixels that may then be thrown out at a later point in the process.
Once a set of signatures has been defined, they may then be used to classify the entire image. Pixels with statistical characteristics similar to those in the signature set will be assigned the appropriate class. The resulting thematic layer has every pixel assigned a value representing the signature that best represents it (Figure 8.1). This dataset is then evaluated by the user to determine which land cover type each class represents.
Processing Steps:
a. clustering or grouping
b. coloring
c. identification
Advantages and Disadvantages of Unsupervised Classification
Pros:
+ no extensive prior knowledge of the region is required
+ opportunities for human error are minimized
+ unique classes are recognized as distinct units
+ logistically less cumbersome
Cons:
- natural groupings do not necessarily correspond nicely with desired information classes
- no control over the menu of classes and their specific identities
- spectral properties of informational classes vary over time, and relationships between information and spectral classes change, making it difficult to compare unsupervised classes from one image/date to another
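The unsupervised workflow above (cluster first, identify later) can be sketched with a minimal k-means pass over pixel spectra. This is a generic illustration, not the algorithm of any particular package; the pixel values and the deterministic farthest-first seeding are assumptions made so the toy example behaves predictably:

```python
def squared_distance(a, b):
    """Squared Euclidean distance between two pixel spectra."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def unsupervised_classify(pixels, k, iters=10):
    """Minimal k-means clustering of pixels, each a tuple of band values.
    Farthest-first seeding keeps this sketch deterministic."""
    centers = [pixels[0]]
    while len(centers) < k:
        centers.append(max(pixels,
                           key=lambda p: min(squared_distance(p, c) for c in centers)))
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in pixels:  # assign each pixel to its nearest cluster center
            nearest = min(range(k), key=lambda i: squared_distance(p, centers[i]))
            clusters[nearest].append(p)
        # recompute each center as the mean spectrum of its cluster
        centers = [tuple(sum(band) / len(cl) for band in zip(*cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return [min(range(k), key=lambda i: squared_distance(p, centers[i])) for p in pixels]

# Two spectrally distinct groups of 3-band pixels (hypothetical values)
pixels = [(10, 10, 10), (12, 11, 9), (11, 10, 12),
          (200, 198, 202), (199, 201, 200), (201, 200, 199)]
labels = unsupervised_classify(pixels, k=2)
```

The returned labels are arbitrary cluster numbers; attaching a land cover meaning to each cluster remains the analyst's identification step described above.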
The main steps in supervised classification are the following:
Figure 8.2 Left: Training samples Right: Supervised Classification on an image
Additional Remarks:
Good Strategy for Supervised Classification
Read more on Digital Image Classification [116].
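The supervised steps (train on labeled samples, then assign every pixel by a decision rule) can be sketched with a minimum-distance-to-means classifier, one of the simplest supervised decision rules. The class names and sample spectra below are hypothetical:

```python
def class_means(training_samples):
    """training_samples: {class_name: [pixel tuples]} -> per-class mean spectra."""
    return {name: tuple(sum(band) / len(samples) for band in zip(*samples))
            for name, samples in training_samples.items()}

def classify_pixel(pixel, means):
    """Assign the class whose training-sample mean is spectrally closest."""
    return min(means,
               key=lambda name: sum((p - m) ** 2 for p, m in zip(pixel, means[name])))

# Hypothetical 3-band training signatures
training = {"water": [(10, 20, 5), (12, 18, 7)],
            "vegetation": [(40, 90, 30), (44, 86, 26)]}
means = class_means(training)
label = classify_pixel((11, 19, 6), means)  # -> "water"
```

Real packages usually offer richer rules (maximum likelihood, parallelepiped), but the train-then-assign structure is the same.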
In this section, you will become familiar with and understand the different civilian and commercial applications of the UAS as it stands today. The UAS applications that concern us the most are the remote sensing applications, where the UAS replaces manned aircraft as an acquisition platform. Remote sensors such as cameras and LiDAR systems have shrunk in size and weight to make them more suitable for lightweight small UAS, as was mentioned in the Payload section of Lesson 2. Remote sensing applications derived from sensors onboard a UAS are more or less similar to the applications one can expect from a manned system. Manned aircraft can carry a larger and heavier payload, which opens the door for additional applications that require large sensors, such as IFSAR. Reported applications for the UAS include the following:
Details on some of these applications are given in chapter 6 of the textbook and assigned readings listed below. Try to visit the UAV Applications [126] in this site, as it has interesting information about different aspects of the UAS and its applications. Another way to explore potential applications of UAS-derived products is to look into the different applications of Geographic Information System (GIS) as they are closely related. In this regard, ESRI published on their website a good educational overview to highlight the different applications of GIS [127].
Watch the webinar: "Applying Drones to Surveying and Engineering Projects Today" [117]
In this section, you will become familiar with a widely used application of the UAS: the UAS for disaster response.
One of the most widely utilized applications for the UAS is disaster response. The UAS is particularly useful for tasks that involve one or all of the "three Ds" -- dirty, dangerous, and dull:
Dirty: open to interpretation and to the operational environment, but exemplified by flying over oil, nuclear, or gas installation sites where accidents have occurred, such as the Japanese Fukushima Daiichi nuclear plant, to take air samples or imagery.
Dangerous: refers to situations where a pilot on a similar mission could become a casualty due to dangerous operations.
Dull: repetitive tasks that are required over and over again. Examples of dull missions are border surveillance and maritime patrols, which need eyes in the sky for hours at a time.
For a UAS to suitably serve disaster response, it needs more capabilities beyond its suitability for the three Ds. Such capabilities are defined by survivability, durability, and adaptability.
Survivability: The survivability of a UAS in a disaster response scenario relies on an efficient communications system. A UAS search and rescue mission should consider three forms of communications. Those are:
Durability: The system's ability to survive a harsh or unpredictable operational environment, with hazards such as falling debris, a changing environment, and loss of signal. UAS operation designers in such environments usually rely on multi-level UASs. An example is the utilization of a High Altitude Long Endurance (HALE) UAS in the operation to carry equipment, provide a backup communication link, and provide a high-altitude overview of the site for planning emergency exit routes.
Adaptability: The ability of a mini-UAS, with its small size, to overcome fallen debris and unpredictably narrow spaces while maintaining its ability to sense changes in an unpredictable and uncertain environment.
As examples of the use of UAS for disaster response, we will single out the UAS use for forest fire disasters.
Remote sensing techniques have proven to be very effective in mapping and monitoring fires and in giving feedback to first responders. Satellite remote sensing has limited capabilities in supporting fire response. This is because most available satellites have limited spatial resolution (limited detail) and only occasionally orbit over the fire site, while fire monitoring needs continuous (24/7) coverage. Satellite imagery can, however, be useful in monitoring fires on a regional or national level, but not at the fire-front micro level. Thermal imagery from the MODIS sensors on board the Terra and Aqua satellites, with a resolution of 1 km, was used by the U.S. Department of Agriculture Forest Service Active Fire Mapping Program to monitor regional fires across the U.S. Besides the coarse resolution of its imagery, MODIS passes over any given location only twice daily, which is too infrequent to track the evolution of a fire and to support firefighters in real time.
As an alternative to satellite imagery, aerial imagery from manned and unmanned aircraft is frequently used to provide the needed frequent aerial observations of a fire. Two approaches have been utilized in using the UAS for fire monitoring. The first uses a High Altitude Long Endurance (HALE) UAS, which can fly high and provide imagery with better resolution and frequency than satellites. However, a HALE UAS is expensive to procure and maintain.
The second approach uses fleets of small UAS working cooperatively to provide more detailed information on the fire and its perimeter. In some cases, both approaches are utilized together, with the HALE providing an overview image of the fire while small UASs transmit high-definition imagery in real time for the perimeter areas of the fire.
Here in the U.S., several wildfire monitoring programs have been adopted over the years. An example of such programs is the joint cooperation between NASA, General Atomics Aeronautical Systems, Inc., and various government agencies involved in fire research. The project used the General Atomics ALTUS II UAS, the civilian version of the Predator. Among the sensors in the ALTUS II payload was a thermal multispectral scanner. Imagery was transmitted to the ground station through INMARSAT geostationary satellites. Once received at the ground station, the imagery goes through geo-referencing and ortho-rectification processes, which convert it to a geo-referenced map before it goes into the hands of the field team. NASA published images (Figure 8.3) of the Grass Valley/Slide fire near Lake Arrowhead/Running Springs in the San Bernardino Mountains of Southern California acquired by the thermal-infrared imaging sensors on board NASA's Ikhana unmanned research aircraft. For more information on past NASA collaborative efforts in the field of different applications for UAS, visit UAS Integration in the NAS [128].
In this section, we will discuss operational challenges in using the UAS for certain applications.
So far, we have read and discussed materials about the successful utilization of unmanned aircraft for a variety of applications. However, some of these applications are found to be challenging for different reasons. Among these reasons are the following:
Congratulations! You have just finished Lesson 8, Civilian and Commercial Applications of the Unmanned Aerial System. You may notice that the use of UAS for civilian applications extends to almost any application offered by manned aircraft. In fact, the UAS provides more opportunities than manned aircraft: its small maneuverable size and low-cost operation make it more useful and more affordable, especially for small projects and projects that may involve hazardous operational conditions. UAS applications are expanding, and we hear about new applications every day. Amazon, for example, recently unveiled plans for a UAV package delivery service [130]. What do you think is the coolest application that the UAS should be used for and that no one has thought about until now? Post your opinion in the discussion forum.
1. Complete the Lesson 8 Quiz.
2. Submit your results for exercise 2 data processing in Pix4D.
3. Submit your COA Application.
Welcome to Lesson 9! In this lesson, you will understand and become familiar with the main parameters that need to be considered when selecting a UAS for geospatial business activities. You will also recognize the main manufacturers of UAVs, aerial acquisition sensors, and processing software. There is not much material in the course textbooks that directly deals with these subjects, but one can indirectly derive some information from them. In addition, several research studies on the status of the market and future predictions have been conducted by private and public groups.
Unmanned Aerial Vehicles (UAVs) are becoming the most dynamic growth sector, and based on a research study conducted by the Teal Group Corporation, it is expected that the global UAV market will top US $54 Billion in the next decade or so.
At the successful completion of this lesson, you should be able to:
In this section, you will understand the requirements for selecting a UAS. Selecting a UAS depends on many factors that are closely related to the intended use of the UAS. Such use requirements will determine the size and weight of the UAS, and its endurance and range of flight, among other factors. In the following sections, we will briefly discuss each of these factors.
Size and weight play a great role in determining payload size and weight and in limiting the UAS's range and endurance. Large UASs can carry a larger and heavier payload, including the power source. The larger the UAS, the more fuel or battery power it can carry on board; and the more power it can carry, the better its range and endurance.
The range of a UAS is an important performance characteristic. It depends on a number of basic aircraft parameters and on the weight of the payload. Maximum UAS range and endurance can be achieved with high propeller efficiency, low fuel consumption, and large onboard fuel (or battery power) capacity. A project that requires long hours in the air will need a larger UAS. However, most UASs employed for geospatial mapping purposes nowadays have an endurance of about 90 minutes and a maximum range of around 50 miles.
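How onboard power translates into endurance and range can be illustrated with a back-of-the-envelope estimate. The battery capacity, average power draw, reserve fraction, and cruise speed below are illustrative assumptions, not specifications from the text:

```python
def endurance_minutes(battery_wh, avg_power_draw_w, reserve_fraction=0.25):
    """Rough endurance estimate: usable energy / average power,
    holding back a landing reserve. All numbers are illustrative."""
    usable_wh = battery_wh * (1.0 - reserve_fraction)
    return 60.0 * usable_wh / avg_power_draw_w

def one_way_range_miles(endurance_min, cruise_speed_mph):
    """Out-and-back mission: one-way reach is half the total distance flown."""
    return (endurance_min / 60.0) * cruise_speed_mph / 2.0

t = endurance_minutes(200.0, 100.0)  # 90.0 minutes
r = one_way_range_miles(t, 40.0)     # 30.0 miles one way
```

With these made-up numbers the estimate lands near the 90-minute endurance quoted above; a heavier payload raises the average power draw and shrinks both figures.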
In physical mechanics, stability refers to the tendency of an object to stay in its present state of rest or motion despite small disturbances. An aircraft must be stable in order to remain in flight. The forces acting on the aircraft, such as thrust, weight, and aerodynamic forces, have to act in certain directions in order to restore the aircraft to its original equilibrium position after it has been disturbed by wind or other forces. An aircraft has three angular degrees of freedom: rotation around the X-axis, or roll; rotation around the Y-axis, or pitch; and rotation around the axis vertical to the ground, or yaw. The aircraft has to remain stable around each of these axes. The most critical rotation is pitch, and stability about it is called longitudinal stability. Some instability can be tolerated in roll and yaw.
Stability is essential for aerial data such as imagery acquisition in order to achieve gap-free imaging results. The use of a gyro-stabilized mount for the camera or the imaging sensor is preferred for mapping missions, as it results in uniform coverage free of gaps.
UAS cost plays a great role in the decision to acquire one. The price of a large UAS sometimes exceeds the price of a typical manned aircraft used for aerial imaging, such as various models of Cessnas. However, the cost of a UAS is justified by the type of jobs expected for it. Smaller UAS-based aerial imaging jobs are only justified through the use of a small UAS that costs under $100,000. It is worth mentioning here that, due to strict FAA regulations on flying UAS, there are no large jobs for the UAS at the current time within the geospatial mapping community. No one can commercially utilize UASs for money-making projects; therefore, only smaller UASs are utilized by the mapping community. Once the FAA eases the regulations, we should expect larger demand for medium and large UASs.
The maximum weight that a UAS can carry on board also plays an important role in UAS selection. Different applications require different sensors and therefore different payload capacities. Current UASs used by the mapping community can carry payloads varying in weight from a few pounds to 100 lbs. The payload capacity directly affects the cost of the UAS, as it limits the UAS's range and endurance. UASs with longer range and endurance cost more than those that fly a maximum distance of 35 miles for a period of 60 minutes.
Read the article "Five Things to Consider when Adopting Drones for Your Business" [131] by Drone Analyst.
Practice with the use of Pix4D software to process the sample data.
In this section, you will gain an understanding of the different brands and makers of the UAV, payload sensors, and processing software.
Large UASs, used mainly for defense purposes, have been around for a long time and have sophisticated technologies built into them. Examples of the manufacturers of such UAS are AAI Corporation, AeroVironment, Aurora Flight Sciences, BAE Systems, Boeing, Elbit Systems, General Atomics Aeronautical Systems, Inc., Israel Aerospace Industries, Northrop Grumman, Raytheon, Rotax, Sagem, Selex Galileo, and many others. Within the last decade, many startup companies have started manufacturing low-cost UASs intended mainly for civilian purposes. Examples of those manufacturers are Trimble, Altavian, Sensefly Ltd, American Aerospace Advisors, Prioria, Uconsystem, Idetec, and many more.
The following four resources contain good information on existing systems and manufacturers:
The sensors required for UASs utilized for mapping purposes are mainly limited to cameras (visible, near-infrared, and thermal infrared). The second resource provided in the previous section offers a list of sensor manufacturers whose products are used in UAS payloads. UAS payloads used by the mapping community mainly include imaging cameras with a variety of spectral bands, such as visible (red, green, blue), near-infrared (NIR), and thermal infrared. There is only one LiDAR system developed mainly for the UAS: the VUX-1 manufactured by Riegl, which was described in Lesson 2. The most obvious providers of digital cameras small enough to fit within UAS payloads (without endorsing any of them) are the following:
For image-based mapping products generation, users will need efficient photogrammetric processing software. Such software should be capable of performing the following operations, among others:
Among the most obvious data processing software that are optimized for UAS data processing in the market (without endorsing any of them) are the following:
Each of these five software packages meets most of the capabilities listed above. However, some of them may be more suitable than others, depending on the situation and the nature of the project.
Evaluating the quality and accuracy of geospatial data is one of the most important topics among geospatial data users. Geospatial data are used for diverse applications, including engineering and positioning applications. Knowing how accurate the measurements derived from geospatial data are is a matter of life or death in some applications, such as locating gas pipelines. In this section, you will be introduced to various statistical concepts related to determining geospatial data accuracy. You will also learn about the latest map accuracy standards designed for digital geospatial data, published by the American Society for Photogrammetry and Remote Sensing (ASPRS).
For any geospatial data product, collecting metrics about a dataset revolves around the following questions:
Errors exist in any product we produce, no matter how accurate the instrument or the process we utilize. This is because no measuring instrument is perfect, not even laser instruments. Figure 1 illustrates common instruments used in surveying and mapping practice that we may think of as perfect measurement devices.
There are two types of errors that concern us most in geospatial data generation: random error and systematic error. A third type, the blunder, is not considered an error, but we need to understand it and deal with it appropriately.
Random Error (or accidental error) is the type of error that happens randomly in nature due to our, or the instrument's, inability to realize the true value. The true value in any measurement process is elusive to us and is beyond our metaphysical power; in a measuring process, we are only estimating the true value. Random error can be reduced by training, experience, and improved quality, but it cannot be eliminated.
Systematic Error: An error that has a repeated, constant value and follows a mathematical logic. It can be reduced through calibration.
Blunders: A blunder is not an error; it is a mistake, resulting from carelessness or negligence, that resembles an error. Common causes of blunders in surveying and mapping are:
Accuracy: The closeness of results of observations, computations, or estimates of graphic map features to their true value or position on the ground.
Precision (Repeatability): The closeness with which measurements agree with each other.
To illustrate the concepts of accuracy and precision in a practical fashion, let us evaluate the results of the four shooting sessions in Figure 2, which a sharp dart shooter completed at different times. In session A, the shooter's shots are scattered around the bullseye. He/she managed to keep the shots around the targeted spot, the bullseye, but failed to land them close to each other, i.e., they are scattered apart. Evaluating such a session, we say the shooter was accurate, as he/she stayed close to the bullseye, but not precise, as the shots were not close to each other. In session B, the shooter managed to cluster all shots in one spot, so he/she was precise, but far away from the bullseye, so he/she was not accurate. Accordingly, in session C he/she was both accurate and precise, while in session D the shooter was neither accurate nor precise. To illustrate the concept of bias in measurements, let us analyze sessions B and C. Assuming the two sessions were shot by the same shooter, it is obvious that the shooter performed perfect shots in both sessions, but that the shots in session B were biased due to a mechanical misalignment of the bow or the gun, if a gun was used. Such a misalignment of the bow, the gun barrel, or the sight scope caused the shots to be systematically directed to the wrong position instead of the bullseye, causing a bias in the shots. Once proper calibration corrects these mechanical defects, the bias is removed, and all the shots fall perfectly around the bullseye, as in session C.
To evaluate the shooter's results in probability and density distribution terms, the results of session B are equivalent to distribution 3 of Figure 3 (precise but not accurate), assuming the most probable value, the bullseye, is represented by p on the x-axis. The results of session A, however, resemble distribution 2 of Figure 3 (accurate but not precise). For more information on the subject, please watch this NGS video [145].
To illustrate the different statistical terms we usually run into when discussing data accuracy, let us consider five error values (3 in., 2 in., 1 in., 5 in., and 4 in.) that were calculated on a population of data.
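These five values give concrete numbers for the terms in question. The sketch below uses the population form of the standard deviation; note how the RMSE absorbs both the spread and the nonzero mean:

```python
import math

errors = [3.0, 2.0, 1.0, 5.0, 4.0]  # the five error values, in inches
n = len(errors)

mean = sum(errors) / n  # 3.0 in. -- a nonzero mean signals a bias
std = math.sqrt(sum((e - mean) ** 2 for e in errors) / n)  # ~1.414 in. (spread)
rmse = math.sqrt(sum(e * e for e in errors) / n)           # ~3.317 in.

# RMSE folds spread and bias into one number: rmse**2 == std**2 + mean**2
```

This is why the RMSE, not the standard deviation alone, is the accuracy measure of choice when a dataset may contain a bias.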
Table 1 illustrates the difference between the standard deviation and the RMSE in revealing the presence of biases in measurements. The table represents a vertical accuracy evaluation for a point cloud derived from UAS imagery by comparing it to a higher-accuracy elevation model derived from a mobile lidar mapping system. The UAS-derived elevation model needed to meet a 5-cm (0.164-ft) accuracy. If we used the standard deviation alone, the data would meet the specification with a value of 0.076 ft. However, the high value of the mean, 0.246 ft (7.5 cm), makes it obvious that this dataset contains a bias, and the only way to catch it is by either evaluating the mean or using the RMSE as the accuracy measure. The high RMSE value of 0.257 ft (7.83 cm) flags the data as not meeting specifications. The far-right column contains the error values after removing the bias of 0.246 ft (7.5 cm) from the measurements. Once we remove the bias, the RMSE and the standard deviation are equal, and they both meet the project accuracy specifications. Removing a bias from elevation data can be as simple as shifting the entire dataset up or down by the magnitude of the bias itself; such a practice is called a z-pump.
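The table's numbers obey the identity RMSE² = σ² + mean² (in the population form of σ), which is exactly why the RMSE exposes a bias that the standard deviation alone hides. A quick check with Table 1's values:

```python
import math

std_ft, bias_ft = 0.076, 0.246         # standard deviation and mean from Table 1
rmse_ft = math.hypot(std_ft, bias_ft)  # sqrt(std**2 + bias**2), ~0.257 ft
```

Conversely, subtracting the bias from every measurement leaves the spread untouched, so the new RMSE collapses to the standard deviation, as the far-right column of the table shows.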
In randomly distributed repeated measurements, the measured values vary around the mean, or average, with most values lying close to it. Deviation from this behavior indicates the presence of biases, or perhaps blunders, in the measurements. Figure 4 shows a true random distribution of a set of measurements that contains no biases. For the distribution in Figure 4, notice that 68.2% of the measured values fall within +/- 1 RMSE, or +/- 1 sigma, of the mean value, that is, 34.1% on each side of the mean. Notice also that about 95% of the measurements fall within +/- 2 RMSE, or +/- 2 sigma, of the mean. Understanding this distribution is essential to understanding the map accuracy standards we discuss in the following sections.
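These percentages follow from the normal distribution itself and can be verified with the error function; a quick sketch:

```python
import math

def fraction_within(k: float) -> float:
    """Fraction of a normal distribution lying within +/- k sigma of the mean."""
    return math.erf(k / math.sqrt(2.0))

print(f"within +/- 1 sigma: {fraction_within(1.0):.1%}")   # 68.3%
print(f"within +/- 2 sigma: {fraction_within(2.0):.1%}")   # 95.4%
```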
Table 2 lists the most common terms used to estimate errors in surveying and mapping. Probable error describes the confidence level within which 50% of the errors fall, while the 95% error represents the confidence level under which 95% of the measured error values fall.
Error | Probability (%) | Multiplier of σ
---|---|---
Probable Error | 50 | 0.6745 σ
Standard Error | 68.27 | 1.0000 σ
90% Error | 90 | 1.6449 σ
95% Error | 95 | 1.9599 σ
3σ Error | 99.73 | 3.0000 σ
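The multipliers in Table 2 can be reproduced from the inverse CDF of the standard normal distribution; a sketch using Python's `statistics.NormalDist`:

```python
from statistics import NormalDist

std_normal = NormalDist()  # standard normal: mean 0, sigma 1

# Two-sided confidence level -> multiplier of sigma (the table's constants)
for level in (0.50, 0.6827, 0.90, 0.95, 0.9973):
    k = std_normal.inv_cdf((1.0 + level) / 2.0)
    print(f"{level:.2%} of errors fall within {k:.4f} sigma")
```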
According to the ASPRS Positional Accuracy Standards for Digital Geospatial Data, the terms positional error and absolute and relative accuracy are defined as follows:
In November of 2014, the American Society for Photogrammetry and Remote Sensing (ASPRS) published Edition 1 of the first mapping accuracy standards designed solely for today's digital geospatial data. Edition 2 was published on August 23, 2023 to revise some measures to suit today's technologies and processes and to add six addenda on best practices and guidelines.
Motivation Behind the New Standards:
Legacy map accuracy standards, such as the ASPRS 1990 standard and the National Map Accuracy Standards (NMAS) of 1947, are outdated (over 30 years since ASPRS 1990 was written).
Many of the data acquisition and mapping technologies that these standards were based on are no longer used.
More recent advances in mapping technologies can now produce better quality and higher accuracy geospatial products and maps.
Legacy map accuracy standards were designed to deal with plotted or drawn maps as the only medium to represent geospatial data.
Within the past two decades (during the transition period between the hardcopy and softcopy mapping environments), most standard measures for relating GSD and map scale to the final mapping accuracy were inherited from photogrammetric practices using scanned film.
New mapping processes and methodologies have become much more sophisticated with advances in technology and advances in our knowledge of mapping processes and mathematical modeling.
Mapping accuracy can no longer be associated with camera geometry and flying altitude alone (focal length, xp, yp, B/H ratio, etc.).
These factors can vary widely from project to project, depending on the sensor used and the specific methodology. For these reasons, existing accuracy measures based on map scale, film scale, GSD, c-factor and scanning resolution no longer apply to current geospatial mapping practices.
The six addenda of Edition 2 are:
1) General Best Practices and Guidelines
2) Field Surveying of Ground Control and Checkpoints
3) Mapping with Photogrammetry
4) Mapping with Lidar
5) Mapping with UAS
6) Mapping with Oblique Imagery
Advantage of Specifying the New ASPRS Positional Accuracy Standards for Digital Geospatial Data for a Project
Users of the new standards do not have to specify accuracy details for the intermediate processes in product generation. The user needs to specify only the final deliverable product accuracy, and the new standards set the accuracy specifications for the intermediate processes involved in producing the final product, such as ground survey and aerial triangulation. Figure 5 illustrates this concept.
Horizontal Accuracy Class | Absolute Accuracy RMSEH (cm) | Orthoimagery Mosaic Seamline Mismatch (cm)
---|---|---
#-cm | ≤# | ≤2×#
Table 4 lists common horizontal accuracy classes for geospatial mapping products.
Horizontal Accuracy Class RMSEx and RMSEy (cm) | RMSEr (cm) | Orthoimage Mosaic Seamline Maximum Mismatch (cm) |
---|---|---|
0.63 | 0.9 | 1.3 |
1.25 | 1.8 | 2.5 |
2.50 | 3.5 | 5.0 |
5.00 | 7.1 | 10.0 |
7.50 | 10.6 | 15.0 |
10.00 | 14.1 | 20.0 |
12.50 | 17.7 | 25.0 |
15.00 | 21.2 | 30.0 |
17.50 | 24.7 | 35.0 |
20.00 | 28.3 | 40.0 |
22.50 | 31.8 | 45.0 |
25.00 | 35.4 | 50.0 |
27.50 | 38.9 | 55.0 |
30.00 | 42.4 | 60.0 |
45.00 | 63.6 | 90.0 |
60.00 | 84.9 | 120.0 |
75.00 | 106.1 | 150.0 |
100.00 | 141.4 | 200.0 |
150.00 | 212.1 | 300.0 |
200.00 | 282.8 | 400.0
250.00 | 353.6 | 500.0 |
300.00 | 424.3 | 600.0 |
500.00 | 707.1 | 1000.0 |
1000.00 | 1414.2 | 2000.0 |
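The columns of Table 4 follow a fixed pattern that can be verified from the numbers themselves: with RMSEx = RMSEy, the radial error is √2 times the class value, and the seamline mismatch limit is twice the class value. A short sketch reproducing a few rows:

```python
import math

# For a class where RMSEx = RMSEy, the radial error and the seamline
# mismatch limit follow directly from the class value:
#   RMSEr = sqrt(RMSEx**2 + RMSEy**2) = sqrt(2) * RMSEx
#   mismatch limit = 2 * RMSEx
for rmse_x in (0.63, 1.25, 5.00, 100.00):
    rmse_r = math.sqrt(2.0) * rmse_x
    mismatch = 2.0 * rmse_x
    print(f"RMSEx/RMSEy {rmse_x:6.2f} cm -> RMSEr {rmse_r:6.1f} cm, mismatch {mismatch:6.1f} cm")
```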
Table 6 Vertical Accuracy/Quality Examples for Digital Elevation Data
4. The standards introduced horizontal accuracy estimation for elevation data
Table 7 lists some horizontal accuracy values for lidar data based on the formula given previously (the GNSS horizontal accuracy is assumed to be 0.10 m, and the IMU error is assumed to be 10.0 arc-seconds for roll and pitch and 15.0 arc-seconds for heading).
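The referenced formula is not reproduced in this section; as a sketch, the estimator commonly associated with the standards combines the GNSS positional error in quadrature with the ground displacement caused by the IMU angular error at the flying altitude. The function name, and the use of the roll/pitch error alone, are illustrative assumptions:

```python
import math

ARCSEC = math.pi / (180.0 * 3600.0)  # one arc-second in radians

def lidar_horizontal_error(gnss_error_m: float, imu_error_arcsec: float,
                           altitude_m: float) -> float:
    """Horizontal RMSE of a lidar point: GNSS error combined in quadrature
    with the ground displacement caused by the IMU angular error."""
    angular_term = math.tan(imu_error_arcsec * ARCSEC) * altitude_m
    return math.sqrt(gnss_error_m ** 2 + angular_term ** 2)

# Assumptions from the text: 0.10 m GNSS error, 10 arc-sec roll/pitch error.
for altitude in (500.0, 1000.0, 2000.0):
    err = lidar_horizontal_error(0.10, 10.0, altitude)
    print(f"flying altitude {altitude:6.0f} m -> horizontal error {err:.3f} m")
```

Note how the GNSS term dominates at low altitudes, while the angular term grows linearly with altitude.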
5. The Standards Introduced a Formal Accuracy Testing Statement:
For the first time, the new standards provide formal data evaluation statements to be used by data users and data producers. The following statements are examples of accuracy statements for an elevation dataset:
This type of reporting should only be based on a set of independent checkpoints. The positional accuracy of digital orthoimagery, planimetric data, and elevation data products shall be reported in the metadata in one of the manners listed below. For projects with NVA and VVA requirements, two three-dimensional positional accuracy values should be reported based on the use of NVA and VVA, respectively.
5.1.1 Accuracy Testing Meets ASPRS Standard Requirements
If testing is performed using a minimum of thirty (30) checkpoints, accuracy assessment results should be reported in the form of the following statements:
“This data set was tested to meet ASPRS Positional Accuracy Standards for Digital Geospatial Data, Edition 2 (2023) for a __(cm) RMSEH horizontal positional accuracy class. The tested horizontal positional accuracy was found to be RMSEH = __(cm)”.
“This data set was tested to meet ASPRS Positional Accuracy Standards for Digital Geospatial Data, Edition 2 (2023) for a __(cm) RMSEV vertical accuracy class. The NVA was found to be RMSEV = __(cm). The VVA was found to be RMSEV = __(cm).”
“This data set was tested to meet ASPRS Positional Accuracy Standards for Digital Geospatial Data, Edition 2 (2023) for a ___ (cm) RMSE3D three-dimensional positional accuracy class. The tested three-dimensional accuracy was found to be RMSE3D = ___(cm).”
5.1.2 Accuracy Testing Does Not Meet ASPRS Standard Requirements
If testing is performed using fewer than thirty (30) checkpoints, accuracy assessment results should be reported in the form of the following statements:
“This data set was tested as required by ASPRS Positional Accuracy Standards for Digital Geospatial Data, Edition 2 (2023). Although the Standards call for a minimum of thirty (30) checkpoints, this test was performed using ONLY __ checkpoints. This data set was produced to meet a ___(cm) RMSEH horizontal positional accuracy class. The tested horizontal positional accuracy was found to be RMSEH = ___(cm) using the reduced number of checkpoints.”
“This data set was tested as required by ASPRS Positional Accuracy Standards for Digital Geospatial Data, Edition 2 (2023). Although the Standards call for a minimum of thirty (30) checkpoints, this test was performed using ONLY __ checkpoints. This data set was produced to meet a ___(cm) RMSEV vertical positional accuracy class. The tested vertical positional accuracy was found to be RMSEV = ___(cm) using the reduced number of checkpoints.”
“This data set was tested as required by ASPRS Positional Accuracy Standards for Digital Geospatial Data, Edition 2 (2023). Although the Standards call for a minimum of thirty (30) checkpoints, this test was performed using ONLY __ checkpoints. This data set was produced to meet a ___(cm) RMSE3D three-dimensional positional accuracy class. The tested three-dimensional positional accuracy was found to be RMSE3D = ___(cm) using the reduced number of checkpoints.”
In most cases, data producers do not have access to independent checkpoints to assess product accuracy. If rigorous testing is not performed by the data producer due to the absence of independent checkpoints, accuracy statements should specify that the data was “produced to meet” a stated accuracy. This “produced to meet” statement is equivalent to the “compiled to meet” statement used by prior standards when referring to cartographic maps. The “produced to meet” statement is appropriate for data producers who employ mature technologies and who follow best practices and guidelines through established and documented procedures during project design, data processing, and quality control. However, if enough independent checkpoints are available to the data producer to assess product accuracy, there is no harm in reporting the accuracy using the statements provided in section 5.1.1 above.
If not enough checkpoints are available, but the data producer has demonstrated that they are able to produce repeatable, reliable results and thus able to guarantee the produced-to-meet accuracy, they may report product accuracy in the form of the following statements:
“This data set was produced to meet ASPRS Positional Accuracy Standards for Digital Geospatial Data, Edition 2 (2023) for a __(cm) RMSEH horizontal positional accuracy class.”
“This data set was produced to meet ASPRS Positional Accuracy Standards for Digital Geospatial Data, Edition 2 (2023) for a __(cm) RMSEV vertical accuracy class.”
“This data set was produced to meet ASPRS Positional Accuracy Standards for Digital Geospatial Data, Edition 2 (2023) for a ___(cm) RMSE3D three-dimensional positional accuracy class.”
6. The Standards Introduced a new accuracy term, the Three-Dimensional Positional Accuracy:
The three-dimensional accuracy standard for any three-dimensional digital data combines the horizontal (radial) and vertical components of error. RMSE3D is derived from them according to the following formula: RMSE3D = √(RMSEH² + RMSEV²).
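A minimal sketch of this quadrature combination (the function name is illustrative):

```python
import math

def rmse_3d(rmse_h: float, rmse_v: float) -> float:
    """Three-dimensional error: quadrature sum of the horizontal (radial)
    and vertical RMSE components, in the same units."""
    return math.sqrt(rmse_h ** 2 + rmse_v ** 2)

# A 7.1-cm horizontal class combined with a 5.0-cm vertical class:
print(f"RMSE3D = {rmse_3d(7.1, 5.0):.1f} cm")   # 8.7 cm
```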
7. The Standards Introduced a new approach for assessing product accuracy by factoring in the accuracy of the surveyed check points when computing product accuracy:
As we produce more and more accurate products, the errors in the surveying of the checkpoints used to assess product accuracy, although small, can no longer be neglected; they should be represented when computing product accuracy. Currently, we quantify product accuracy while ignoring the errors in the surveyed checkpoints. In this practice, our surveying techniques only approximate the datum, producing a pseudo-datum; we are therefore evaluating the closeness of the data to the pseudo-datum and not to the true datum. Figure 6 illustrates the current practice and the new one proposed in Edition 2 of the ASPRS standards.
Figure 6 Factoring in the accuracy of the surveyed check points when computing product accuracy
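One way to sketch this idea is to combine the error measured against the checkpoints with the survey error of the checkpoints themselves in quadrature. This is an illustrative formulation, not necessarily the standards' exact wording:

```python
import math

def product_error_true_datum(rmse_tested: float, rmse_checkpoints: float) -> float:
    """Product error relative to the true datum: the error measured against
    the checkpoints combined in quadrature with the survey error of the
    checkpoints themselves (illustrative, not the standards' exact wording)."""
    return math.sqrt(rmse_tested ** 2 + rmse_checkpoints ** 2)

# A product tested at 5.0 cm against checkpoints surveyed to 2.0 cm:
print(f"{product_error_true_datum(5.0, 2.0):.2f} cm")   # 5.39 cm
```

The effect is small when the checkpoints are much more accurate than the product, which is why legacy practice could safely ignore it.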
Best Practices in Determining Product Accuracy*
* according to the ASPRS Positional Accuracy Standards for Digital Geospatial Data, Edition 2 of 2023 (https://publicdocuments.asprs.org/PositionalAccuracyStd-Ed2-V1 [146])
The new standards provide Table 8 with the recommended number of checkpoints required for validating product accuracy. For project areas larger than 10,000 square kilometers, only 120 checkpoints are required.
Find more information about the new ASPRS standards here [147].
Congratulations! You have just completed Lesson 9. You may have noticed from the different sections of the lesson that the UAS market is growing rapidly. There are quite a few manufacturers of civilian UAS, as well as software and sensor producers. User requirements will drive the selection of the UAS and the processing software that is right for the job. The required UAS endurance, range, and payload capacity will differ from one application to another; however, most applications will prefer more endurance, longer range, and heavier payload if the price is right.
In this lesson, you also learned about the value of evaluating data quality and accuracy and how to use the new ASPRS standards to report such quality and accuracy factors.
By now, you should be finishing the generation of the orthophoto and digital elevation model products using Pix4D and the sample imagery. Samples of the products need to be submitted with your project report and presented next week during your presentation.
1. Prepare your presentation for the final project.
2. Final Project Milestone: Submit final project report.
Some American citizens, including lawmakers, believe that before unmanned aerial vehicles (UAVs) start routinely observing Americans from above, the Federal Aviation Administration (FAA) needs to address two key concerns: safety and privacy. The issue of privacy is received differently depending on who is evaluating the UAS's impact. Many of us believe that flying a UAS with a camera is no different from flying a helicopter during a mapping mission; others, however, see it as a clear invasion of their personal privacy and want the FAA to curb the use of UAS in heavily populated areas. During 2013, the Senate Judiciary Committee held a hearing on the UAV/UAS issue, where it was very clear that senators of both parties are worried about the threat to Americans' privacy posed by the increasing use of unmanned aerial systems (UAS). The article "Lawmakers voice concerns on drone privacy questions [149]," published by NBC, details the outcome of the discussions that occurred during that hearing.
Integrating small UAS into the NAS raises concerns, as unmanned airplanes pose risks of mid-air collision with other aircraft and of property damage or loss of life on the ground. Many people and lawmakers also have security concerns, as they believe that drones can be hacked and used for terrorist acts. The report "Unmanned Aerial Vehicles: Examining the Safety, Security, Privacy and Regulatory Issues of Integration into U.S. Airspace [150]" provides fairly good detail on the issues of UAS privacy, security, and safety.
The burden of concerns over privacy was lessened with the issuance of Part 107 by the FAA, as it contained no clause regulating the use of UAS as it relates to privacy. This was the right move by the FAA, as the topic is controversial and there is no easy solution for it. However, things may change in the future: the "FAA Reauthorization Act of 2018 [151]," introduced in the 115th Congress on April 13, 2018 and later enacted, brought the privacy issues back to the table by mandating that the FAA carry out a review to identify any potential reduction of privacy specifically caused by the integration of unmanned aircraft systems into the national airspace system.
A UAS is capable of collecting very high-definition/resolution imagery of people's backyards and perhaps through their windows. The public in the United States has expressed two main opinions about allowing UAS to fly over populated areas, especially when used for surveillance and search-and-rescue missions.
Post to the discussion board and respond to at least one posting from your peers.
Links
[1] http://news.psu.edu/story/300122/2014/01/14/research/programming-drones-fly-birds?utm_source=newswire&utm_medium=email&utm_term=300215&utm_content=01-14-2014-14-59&utm_campaign=engineering%20newswire
[2] http://www.airspacemag.com/flight-today/drones-for-hire-125909361/
[3] http://www.airspacemag.com/flight-today/robot-reporters-125191633/
[4] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/Free-Report-6-Predictions-for-2016.pdf
[5] https://psu.instructure.com/files/138261760/download?download_frd=1
[6] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/images/lesson01/remotesensing-04-01671.pdf
[7] https://psu.instructure.com/files/156251265/download?download_frd=1
[8] http://news.psu.edu/story/300122/2014/01/14/research/programming-drones-fly-birds?utm_source=newswire&utm_medium=email&utm_term=300215&utm_content=01-14-2014-14-59&utm_campaign=engineering newswire
[9] https://en.wikipedia.org/wiki/National_Intelligence_Managers_for_Aviation
[10] https://home.army.mil/rucker/index.php
[11] https://rosap.ntl.bts.gov/view/dot/18249
[12] https://psu.instructure.com/files/158846037/download?download_frd=1
[13] https://en.wikipedia.org/wiki/Bayraktar_Mini_UAV#Specifications
[14] https://www.americanaerospace.com/airanger-uas
[15] https://commons.wikimedia.org/wiki/File:USMC-01522.jpg
[16] http://american-aerospace.net/
[17] http://www.nasa.gov/multimedia/imagegallery/image_feature_2362.html
[18] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/Collier_Crouch_thesis_a435680.pdf
[19] https://www.e-education.psu.edu/geog892/node/5
[20] https://psu.instructure.com/files/156251248/download?download_frd=1
[21] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/US%20Army%20UAS%20RoadMap%202010%202035-1.pdf
[22] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/9548_MS_RemoteSensing.pdf
[23] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/Human_Factor_Implications_200608.pdf
[24] http://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/images/lesson01/20-717_SWOT-analysis.pdf
[25] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/B4842_16MP_CCDCamera_Specs.pdf
[26] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/IXA180_IXA160.pdf
[27] https://geospatial.phaseone.com/drone-payload/p3-payload-for-drones/?utm_source=referral&utm_medium=eblast&utm_campaign=GEO-2021-05-05-Geoconnection-eblast-P3_payload_for_drones&utm_content=CTA
[28] https://geospatial.phaseone.com/drone-payload/
[29] https://www.imperx.com/bobcat-2-0-ccd/
[30] https://www.phaseone.com/
[31] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/PH_Geospatial_Overview_Deck_v5.pdf
[32] https://www.parrot.com/en/support/documentation/sequoia
[33] https://en.wikipedia.org/wiki/Electromagnetic_spectrum
[34] https://www.parrot.com/assets/s3fs-public/2021-09/sequoia_integration_manual_en.pdf
[35] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/flir-a6700sc-mwir-series-infrared-camera-datasheet.pdf
[36] https://www.flir.com/
[37] http://en.wikipedia.org/wiki/LIDAR
[38] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/DataSheet_VUX-1_14-02-2014_PRELIMINARY_4pages.pdf
[39] https://youtu.be/YaGw-dzo9Mc
[40] http://www.riegl.com/nc/
[41] http://velodynelidar.com/
[42] http://en.wikipedia.org/wiki/Global_Positioning_System
[43] http://en.wikipedia.org/wiki/Inertial_Measurement_Unit
[44] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/IG-500E-Leaflet-1.pdf
[45] http://www.sbg-systems.com/
[46] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/UAS-for-ITS-CTS11-06_Risk-and-CONOP.pdf
[47] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/FAA-UAS-Conops-Version-2-0-1.pdf
[48] http://www.fas.org/irp/doddir/usaf/conops_uav/toc.htm
[49] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/NOAA_CONOPS.pdf
[50] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/Mission_planner_v2.pdf
[51] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/images/lesson06/Camera_Calibration-yastikli_naci.pdf
[52] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/images/lesson06/Camera_Calibration_91.pdf
[53] https://store.usgs.gov/maps
[54] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/images/lesson06/PA_State%20College_223993_1962_24000_geo.pdf
[55] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/images/Lesson04/Aeronautical_Chart.pdf
[56] http://www.youtube.com/watch?v=6ITjUfl80bs
[57] http://en.wikipedia.org/wiki/Visual_flight_rules
[58] http://www.faa.gov/air_traffic/flight_info/aeronav/digital_products/vfr/#SecPDFs
[59] https://www.faa.gov/air_traffic/flight_info/aeronav/productcatalog/VFRCharts/
[60] https://app.airmap.com/geo?34.017931,-118.496046,9.417193z
[61] https://faa.maps.arcgis.com/apps/webappviewer/index.html?id=9c2e4406710048e19806ebf6a06754ad
[62] https://www.faa.gov/uas/recreational_fliers/where_can_i_fly/b4ufly/
[63] http://en.wikipedia.org/wiki/Digital_camera
[64] http://en.wikipedia.org/wiki/CCD_camera
[65] https://wingtra.com/best-drones-for-photogrammetry-wingtraone-comparison/phantom-4-rtk-vs-wingtra/
[66] https://www.youtube.com/watch?v=dNVtMmLlnoE&feature=youtu.be
[67] http://en.wikipedia.org/wiki/Shutter_%28photography%29
[68] http://en.wikipedia.org/wiki/Waypoint
[69] https://psu.instructure.com/files/102041812/download?download_frd=1
[70] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/GEOG892_Pix4D_Excercise2.pdf
[71] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/TRB2013_39g4ub.pdf
[72] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/PART107_RIN_2120-AJ60_Clean_Signed.pdf
[73] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/FAA-2013-0061-0104_test-site-program.pdf
[74] https://psu.instructure.com/files/133123121/download?download_frd=1
[75] https://www.faa.gov/uas/programs_partnerships/test_sites/
[76] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/TRB2013_Regulations.pdf
[77] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/images/lesson05/FAA-Part-107-7-Things-to-Know.pdf
[78] http://www.flytandem.com/airspace.htm
[79] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/Unmanned%20Aircraft%20Systems%20%28UAS%29%20Operational%20Approval_n_8900.227.pdf
[80] https://www.commerce.senate.gov/public/index.cfm/hearings?ID=FD65111D-3DD5-472B-8C91-AEED13D00AE0
[81] http://geospatial-solutions.com/faa-commercial-drones-are-illegal-public-so-what/
[82] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/PART107_Federal_Register.pdf
[83] https://www.faa.gov/uas/programs_partnerships/test_sites
[84] http://en.wikipedia.org/wiki/Airspace_class_%28United_States%29#Airspace_classes
[85] http://aspmhelp.faa.gov/index.php/Airspace_Classification
[86] https://www.faa.gov/uas/programs_partnerships/psp/
[87] https://www.rpastraining.com.au/
[88] http://www.uxvuniversity.com/uav-pilot-training-certificate/
[89] https://www.dronepilotgroundschool.com/
[90] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/remotesensing-04-01671.pdf
[91] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/images/lesson06/Detect_Sense_Avoid_ar0841.pdf
[92] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/UAS_Roadmap_2013-1.pdf
[93] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/AC%2091-57A%20Change%201.pdf
[94] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/images/lesson06/2120-AJ60_NPRM_2-15-2015_joint_signature.pdf
[95] http://www.faa.gov/documentlibrary/media/notice/n_8900.227.pdf
[96] https://psu.instructure.com/files/102896912/download?download_frd=1
[97] https://www.faa.gov/uas/resources/public_records/foia_responses/
[98] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/images/lesson06/COA_Application_Components_Template-v2.pdf
[99] https://psu.instructure.com/files/133124382/download?download_frd=1
[100] https://www.faa.gov/uas/advanced_operations/certification/section_44807
[101] https://www.faa.gov/aircraft/air_cert/airworthiness_certification/sp_awcert/
[102] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/images/lesson06/faa_uas_civil_coa_request_v2.pdf
[103] https://caps.faa.gov/
[104] https://www.faa.gov/sites/faa.gov/files/about/office_org/headquarters_offices/ato/COA%2520Sample%2520Application%2520v%25201-1.pdf
[105] https://www.faa.gov/about/office_org/headquarters_offices/ato/service_units/systemops/aaim/organizations/uas/coa/
[106] https://www.faa.gov/uas/commercial_operators/part_107_waivers/
[107] https://www.faa.gov/uas/commercial_operators/part_107_waivers/waiver_safety_explanation_guidelines/
[108] https://www.ecfr.gov/cgi-bin/text-idx?SID=804147500dfd16a3f71bf98f780f06d2&mc=true&node=se14.2.107_141&rgn=div8
[109] https://www.ecfr.gov/current/title-14/chapter-I/subchapter-F/part-107?toc=1
[110] https://psu.instructure.com/courses/2156313/files/folder/Course%20Resources/Sample%20COAs
[111] https://www.faa.gov/uas/commercial_operators/part_107_waivers/waivers_issued
[112] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/images/lesson07/processflow.png
[113] https://www.e-education.psu.edu/geog892/node/707
[114] https://edition.cnn.com/2012/07/26/tech/innovation/technology-fighting-fire/index.html
[115] http://dronenodes.com/commercial-drone-applications/
[116] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/images/lesson08/GEOG892_Digital-image-classification.pdf
[117] https://www.youtube.com/watch?v=OTGQ2fQpytI
[118] https://gisgeography.com/image-classification-techniques-remote-sensing/
[119] https://earth.esa.int/landtraining09/D2L2_Caetano_Classification_Techniques.pdf
[120] http://spie.org/x32288.xml
[121] https://psu.instructure.com/files/135143862/download?download_frd=1
[122] https://psu.instructure.com/files/135144121/download?download_frd=1
[123] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/images/lesson08/ISPRS_102-UAV-based_RemoteSensing.pdf
[124] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/images/lesson09/GIM%20International.pdf
[125] https://www.eu-startups.com/2020/05/drone-pioneer-wingcopter-a-winner-of-german-government-covid-19-hackathon/
[126] http://www.asctec.de/en/uav-uas-drone-applications/
[127] https://www.esri.com/en-us/industries/index
[128] https://www.nasa.gov/directorates/armd/integrated-aviation-systems-program/uas-in-the-nas/uas-integration-in-the-nas-about-us/
[129] https://www.nasa.gov/
[130] http://www.nydailynews.com/news/national/amazon-testing-drones-deliver-packages-article-1.1534548
[131] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/5-things-to-consider-when-adopting-drones-for-your-business.pdf
[132] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/AerialServices_UASGuide_Small.pdf
[133] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/UAS-Suppliers.pdf
[134] https://mags.shephardmedia.com/HB-samples-2017/Commercial_Unmanned_Systems_Handbook_Sample_2017/pubData/mobile/index.htm#/1/
[135] https://geospatial.phaseone.com/cameras/
[136] https://www.imperx.com/ccd-cameras/
[137] https://www.nikonusa.com/en/Nikon-Products/dslr-cameras/index.page
[138] https://micasense.com/dual-camera-system/
[139] https://www.parrot.com/uk/shop/accessories-spare-parts/other-drones/sequoia
[140] https://www.agisoft.com/
[141] https://www.pix4d.com/#products
[142] http://www.menci.com/
[143] https://www.simactive.com/correlator3d-mapping-software-features
[144] https://www.geospatial.trimble.com/products-and-solutions/trimble-inpho-uasmaster?gclid=Cj0KCQiAu62QBhC7ARIsALXijXRQifB6ey8Ds3y6jWhROSjxYhXAqI9Swg4kDxDDrfvTc6OY3VC3eLMaAltuEALw_wcB
[145] https://www.ngs.noaa.gov/corbin/class_description/Precision_Accuracy/
[146] https://publicdocuments.asprs.org/PositionalAccuracyStd-Ed2-V1
[147] http://www.asprs.org/PAD-Division/ASPRS-POSITIONAL-ACCURACY-STANDARDS-FOR-DIGITAL-GEOSPATIAL-DATA.html
[148] http://www.asprs.org/wp-content/uploads/2015/01/ASPRS_Positional_Accuracy_Standards_Edition1_Version100_November2014.pdf
[149] http://nbcpolitics.nbcnews.com/_news/2013/03/20/17389193-lawmakers-voice-concerns-on-drone-privacy-questions?lite
[150] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/sp-Drones-long-paper.pdf
[151] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/images/lesson10/BILLS-115hr4ih.pdf
[152] http://usgovinfo.about.com/od/rightsandfreedoms/a/Unmanned-Aircraft-Used-In-The-United-States.htm
[153] https://aerialservicesinc.com/implications-of-drones-on-american-privacy-and-freedom-2/