Welcome to Lesson 2! In this lesson, you will become familiar with the elements that combine to form an operational Unmanned Aerial System (UAS). Most UASs consist of an Unmanned Aerial Vehicle (UAV), human elements, a payload, control elements (for a larger system, a ground control station (GCS) or mission planning and control station (MPCS)), and a data link communication unit (Figure 2.1). Military versions of the UAS also include a weapons platform and supporting soldiers as part of the human element.
In addition, you will develop knowledge of the different data acquisition and auxiliary sensors that are usually carried on board as the UAS payload. By the end of this lesson, you will have a working knowledge of the different components that form a UAS, how those components relate and interact with one another, the data acquisition sensors, and the auxiliary sensors that accompany a UAS mission, such as GPS and IMU.
At the successful completion of this lesson, you should be able to:
The air vehicle is the airborne part of the system. The air vehicle here means the aircraft that, in conjunction with the payload, forms an Unmanned Aerial System (UAS). In general, the unmanned aircraft is usually called an Unmanned Aerial Vehicle (UAV) and can be either a fixed-wing or rotary-wing aircraft that flies without a human on board.
The UAV is a complicated system including structures, aerodynamic elements such as wings and control surfaces, propulsion systems, control systems, communication elements, and launch and recovery subsystems. Larger UAVs use fuel-powered engines to attain flight, while smaller UAVs typically use either small gasoline engines or electric motors. When the UAV carries sensors and payloads, it is customarily called an Unmanned Aerial System (UAS). In this course, the terms UAS and UAV will be used interchangeably. Due to the inclusion of the word "unmanned," there has been some resistance in recent years to the names Unmanned Aircraft and Unmanned Aerial Vehicle. There is a push to adopt the term Remotely Piloted Aircraft (RPA) or Remotely Piloted Vehicle (RPV) because of the crucial human involvement in the operation of the system. UAVs come in all different sizes and shapes; however, the following are the major factors to consider in designing a UAV:
The term data link describes how commands are communicated back and forth between the ground control station and the autopilot. The data link is a key subsystem for any UAS, as it provides two-way communication to ensure that missions are executed safely and according to plan. A typical data link is illustrated in Figure 2.2:
There are two different modes for operating a UAS:
More details on the two operating modes will be covered in lesson 4.
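To make the two-way data link concrete, here is a minimal sketch of how a telemetry report might be packed and integrity-checked before crossing the radio link. The frame layout and field names are invented for illustration and do not represent any real UAS protocol:

```python
import struct
import zlib

# Hypothetical minimal telemetry frame (an assumption, not a real protocol):
# latitude (float64), longitude (float64), altitude in meters (float32),
# battery percentage (uint8), followed by a CRC32 of the payload.
FRAME_FMT = "<ddfB"

def encode_frame(lat, lon, alt_m, battery_pct):
    """Pack a telemetry report and append a CRC so the ground control
    station can detect corruption introduced on the radio link."""
    payload = struct.pack(FRAME_FMT, lat, lon, alt_m, battery_pct)
    return payload + struct.pack("<I", zlib.crc32(payload))

def decode_frame(frame):
    """Verify the CRC and unpack; return None if the frame is corrupt."""
    payload = frame[:-4]
    (crc,) = struct.unpack("<I", frame[-4:])
    if zlib.crc32(payload) != crc:
        return None  # corrupted in transit -> the ground station discards it
    lat, lon, alt_m, battery_pct = struct.unpack(FRAME_FMT, payload)
    return {"lat": lat, "lon": lon, "alt_m": alt_m, "battery_pct": battery_pct}
```

Real data links add framing, sequence numbers, and acknowledgments on top of this idea, so that lost or garbled packets can be detected and, if needed, retransmitted.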
The command and control element is the nerve center for the UAS operation. It controls the following tasks:
The command and control element utilizes several subsystems to accomplish its missions. They are:
The most important parts of the command and control element are the Autopilot and the ground control station, as described in the following subsections:
The autopilot is the subsystem that enables partial or fully autonomous flight. A UAV can be operated entirely by remote control, with an operator steering the air vehicle at all times, or it can be flown autonomously, with a pre-programmed path executed from takeoff to landing by the autopilot subsystem without any pilot intervention. Small, lightweight autopilots are readily available from several manufacturers. Besides guiding the air vehicle along the pre-set flight path, the autopilot also executes a "lost link" routine if the UAV loses contact with the ground control station. The lost-link procedure guides the UAV to a known waypoint, where contact with the ground control station can be re-established. The following scenario was developed for a typical emergency procedure based on loss of link between the Yamaha RMAX UAS and the ground control station:
The RMAX utilizes a redundant communication system to ensure constant contact between the aircraft and the remote pilot. The ground control station provides real-time data on aircraft location, altitude, and flight characteristics. The pilot constantly monitors the flight information provided to the ground control station and, with the assistance of a trained observer, maintains a visual line of sight to the aircraft. In the event of a loss of link between the aircraft and the ground control station, the following procedures are executed:
Problem | Indication | Monitored? | Solution |
---|---|---|---|
Low Signal | Vehicle is slow to respond to manual or PCC commands. Autopilot terminates steering mode. Audible and warning-light alarms. | Yes; signal strength displayed as a percentage, along with the packet update rate. | Turn the autopilot on and abandon manual flight. Initiate auto-land. |
Loss of Communication | Autopilot terminates manual control or fails to respond to PCC commands. Audible communication alarm and warning light. | Yes. | The vehicle returns to the lost-link waypoint, hovers until the flight timer elapses, then commences the auto-land procedure. |
Loss of GPS | First indication is poor altitude hold; also poor position hold during hover. | Yes; indicated by the number of satellites tracked and the GPS quality (PDOP). | Assume manual control of the aircraft and land. |
Low Power Avionics | Lower than nominal voltage displayed. | Yes. | Land immediately. |
Engine Failure | Noise level or RPM changes; engine loses power. | Yes; monitored via rotor RPM through the RPM sensor. | Return and land immediately. If the engine dies, initiate the autorotation procedure. |
Tail Rotor Failure | Loss of tail control. | No. | Switch to manual control and initiate the autorotation procedure. |
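The lost-link behavior summarized in the table above (return to a waypoint, hover until the flight timer elapses, then auto-land) can be sketched as a small state machine. The class, thresholds, and signal names below are assumptions for illustration, not any vendor's actual implementation:

```python
from enum import Enum, auto

class Mode(Enum):
    NORMAL = auto()
    RETURN_TO_WAYPOINT = auto()
    HOVER = auto()
    AUTO_LAND = auto()

class LostLinkFailsafe:
    """Hypothetical sketch of a lost-link routine: on loss of communication
    the vehicle flies to a pre-set waypoint, hovers until a flight timer
    elapses, then auto-lands. Timeout values are illustrative."""

    def __init__(self, link_timeout_s=5.0, hover_limit_s=60.0):
        self.link_timeout_s = link_timeout_s
        self.hover_limit_s = hover_limit_s
        self.mode = Mode.NORMAL

    def update(self, seconds_since_last_packet, at_waypoint, hover_time_s):
        link_lost = seconds_since_last_packet > self.link_timeout_s
        if self.mode == Mode.NORMAL:
            if link_lost:
                self.mode = Mode.RETURN_TO_WAYPOINT
        elif self.mode == Mode.RETURN_TO_WAYPOINT:
            if not link_lost:
                self.mode = Mode.NORMAL        # link re-established
            elif at_waypoint:
                self.mode = Mode.HOVER
        elif self.mode == Mode.HOVER:
            if not link_lost:
                self.mode = Mode.NORMAL        # link re-established
            elif hover_time_s >= self.hover_limit_s:
                self.mode = Mode.AUTO_LAND     # flight timer elapsed: land
        return self.mode
```

The point of structuring the failsafe as explicit states is that each transition can be tested on the ground before it is ever needed in flight.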
The ground control station (GCS) is the site from which the pilot controls the UAV during flight. The size and sophistication of the GCS depend on the category of the UAS/UAV. Some large UASs require a formal facility with multiple workstations and personnel, while the GCS for a small UAS can be a handheld transmitter. Most UASs used by the geospatial community are small UASs that do not require a dedicated GCS.
Payload refers to the cargo carried by the air vehicle (aircraft); it is also defined as the amount of cargo weight an air vehicle can safely carry. Carrying a payload is the sole purpose of most UASs. Payloads come in a variety of sizes, weights, and functions. In our business of geospatial remote sensing, we focus on remote sensing sensors and the navigation systems that necessarily accompany them. A UAS dedicated to remote sensing and mapping missions is usually equipped with one or more of the following sensors.
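The "weight the aircraft can safely carry" idea above amounts to a simple weight budget. The following sketch makes that arithmetic explicit; the function name and all figures are hypothetical, for illustration only:

```python
def payload_margin_kg(mtow_kg, empty_weight_kg, fuel_kg, sensors_kg):
    """Remaining payload capacity: maximum takeoff weight minus the
    airframe, fuel or battery, and the sensors already installed.
    A negative result means the configuration is over weight and
    unsafe to fly. (Hypothetical illustration, not a vendor formula.)"""
    return mtow_kg - (empty_weight_kg + fuel_kg + sum(sensors_kg))
```

For example, a notional 25 kg MTOW airframe weighing 14 kg empty, carrying 3 kg of fuel plus a 2.5 kg camera and a 1.2 kg lidar, would have 4.3 kg of payload margin left.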
Auxiliary sensors here mean the navigation sensors necessary to determine the location and orientation of the UAS and its remote sensors mentioned earlier in this section. For determining the position of the UAS and its onboard sensors, the Global Positioning System (GPS) is used; for the attitude, or orientation, of the UAS and the onboard sensors, the Inertial Measurement Unit (IMU) is used.
The GPS needs no introduction, as everyone is familiar with it. It is the same GPS that you might use to drive around town. However, GPS data used to determine a remote sensor's position usually undergoes post-processing to enhance the accuracy of the position.
UASs are offered with two grades of GPS accuracy. The most common is the single-frequency GPS receiver, as it is cheaper and does not require post-processing or a real-time correction service. Such receivers provide location accuracy of around 1 to 2 meters. For generating more accurate geospatial products, the more accurate dual-frequency receiver and precise correction services are needed. The latter receivers offer two modes of operation, both of which yield positional accuracy of 1 to 3 cm with little or no ground control required for the project. UAS vendors are fielding systems with two operational modes:
In principle, both RTK and PPK promise positional accuracies at the 1 to 3 cm level. The main purpose of RTK and PPK is to minimize or eliminate the need for ground control points, thereby reducing cost. For more details on GPS, please visit GPS Defined. [23]
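The differential idea shared by RTK and PPK can be sketched in a simplified, position-domain form: a base receiver at a surveyed point measures its own apparent error, and that error is subtracted from the rover's solution, in real time for RTK or after the flight for PPK. Real systems work with carrier-phase observables rather than finished positions, so the function below is a conceptual illustration only:

```python
def differential_fix(rover_raw, base_raw, base_known):
    """Position-domain sketch of differential GPS correction.
    Coordinates are (east, north, up) in meters in a local frame.
    The shared error (atmosphere, satellite clocks) seen by the base
    receiver is assumed identical at the nearby rover, which is why
    subtracting it removes most of the rover's error. Simplified
    illustration, not how production RTK/PPK engines operate."""
    error = tuple(b - k for b, k in zip(base_raw, base_known))
    return tuple(r - e for r, e in zip(rover_raw, error))
```

The key assumption is a short baseline: the closer the rover is to the base station, the more of the error is truly common to both and the better the correction works.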
An inertial measurement unit, or IMU, [24] is an electronic device that measures and reports an aerial vehicle's velocity, orientation, and gravitational forces using accelerometers and gyroscopes. IMUs are typically used to control and maneuver manned aircraft, unmanned aerial vehicles (UAVs), and satellites. Another important use of the IMU is that it helps IMU-enabled GPS devices maintain positioning information when GPS signals are unavailable, such as in tunnels, inside buildings, or when electronic interference is present.
The IMU is the main component of inertial navigation systems (INS) used in aircraft, spacecraft, watercraft, and guided missiles, as well as in geospatial mapping activities. The data collected from the IMU's sensors allow us to determine the orientation of the sensor, which is an important aspect of geolocating each pixel of the sensor's imagery on the ground. The IMU, like the other components necessary for the operation of UASs, has been miniaturized in weight and size to fit on small UASs. An example of these small IMUs, designed mainly for UASs, is the SBG Systems IG-500E, [25] illustrated in Figure 2.12.
For more details on the IMU, you can visit the IMU Wikipedia page [24].
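The orientation-from-gyroscope principle behind the IMU can be sketched for a single axis: integrating the measured angular rate over time yields an angle. A real IMU does this in three dimensions and fuses accelerometer and GPS data to bound drift; the function below is a hypothetical, single-axis illustration:

```python
def integrate_heading(rates_dps, dt_s, heading0_deg=0.0):
    """Integrate gyroscope yaw-rate samples (degrees per second),
    taken at a fixed interval dt_s, to obtain heading in degrees.
    Single-axis sketch for illustration; real IMUs integrate in 3D
    and correct for sensor bias and drift."""
    heading = heading0_deg
    for rate in rates_dps:
        heading = (heading + rate * dt_s) % 360.0
    return heading
```

Running this with a slightly biased gyro (say 10.2 deg/s instead of a true 10 deg/s) shows why fusion with GPS matters: the small bias accumulates into an ever-growing heading error, which is exactly the drift that an INS bounds by blending in absolute GPS positions.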
The launch and recovery element is the area that requires the most human interaction. Some UASs require elaborate launching procedures, while others can be hand-thrown into the sky. Some large UASs require long runways and other field support equipment such as fuel trucks, ground power units, and ground tugs. Similarly, the requirements for recovery procedures vary widely. Most small UASs used for geospatial projects require simple procedures and can be hand-launched or launched with the use of a catapult.
Some UASs, such as target drones, are air-launched from fixed-wing aircraft. Usually, large UASs are equipped with wheels for takeoff and landing and do not need special equipment, while smaller UASs need a variety of launch and recovery strategies depending on the complexity of the system.
A truck driven at a speed of 60 mph can be used to launch a small UAS, assuming that the launch site provides a smooth surface for the truck. In this launching method, the UAS is held in a cradle above the truck cab with its nose pointed up along the launch path (Figure 2.13). Once the speed is sufficient for takeoff, the UAS is released and climbs along its takeoff path.
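The release decision in the truck-launch method above comes down to an airspeed check: the air flowing over the wings (truck ground speed plus any headwind) must comfortably exceed the aircraft's stall speed. The function, the sample stall speed, and the 1.2 safety margin below are assumptions for illustration:

```python
def safe_to_release(truck_speed_mps, headwind_mps, stall_speed_mps, margin=1.2):
    """Hypothetical release check for a truck launch: airspeed over the
    wings is the truck's ground speed plus the headwind component, and it
    must exceed stall speed by a safety margin before the cradle releases.
    The 1.2 margin is an illustrative assumption, not a certified value."""
    airspeed = truck_speed_mps + headwind_mps
    return airspeed >= margin * stall_speed_mps
```

For a notional aircraft with a 20 m/s stall speed, a truck at 60 mph (about 26.8 m/s) with a modest headwind satisfies the check, while 40 mph does not.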
Many small and medium-sized UAS launch systems are required to be mobile, in other words, to be mounted on a truck or a trailer. Such mobile launchers fall within one of the following types:
For more details on these launchers, refer to chapter 17 of the supplemental textbook Introduction to UAV Systems, 4th edition.
Like any other technology that requires human intervention for safe operation, the human element is considered the most important factor in the successful and safe operation of a UAS. Even with autonomous flights using an autopilot, the human role during launch and recovery is crucial to the operation of the UAS. As navigation technology develops further, the human role in operating a UAS will diminish dramatically, but it will not disappear.
The human element is key in almost all operational aspects of any UAS and plays a great role in the success and survival of its operation. Starting with mission planning, humans have to design and arrange a concept of operation in order to guarantee success. Equally important is the human role in the flight control process. Autopilot can do only so much without the guidance and intervention of the operator.
The role of the pilot and the observer cannot be overstated, as without them the flight will not occur. This is true even for the most sophisticated drones, such as the Predator. Even the Predator, with all its built-in sophistication and automation, needs a pilot to fly it. The human element is involved in all of the following aspects of operating a UAS:
Automation in operating a UAS results in less human intervention, but it will never eliminate the role of the human in such an operation. Imagine that an airline invites you to be on board an airplane flown solely by autopilot. There are no pilots on board. Would you accept such an invitation? I am certain your answer would be a big NO. Using the same analogy, could you imagine operating a UAS, which is less sophisticated than a jetliner, without a pilot and without an observer? That is how important the human role is in operating a UAS. That is at least true for the time being. Who knows what the future may bring to this field.
Congratulations! You've finished most of the Lesson 2 material. I hope you learned from this lesson all you need to know about the different elements that form a UAS. The payload section is especially valuable to individuals with a background in geospatial mapping, as it goes through the different sensors utilized by the industry today. Understanding the functionality of each of the UAS elements will help you in the common lessons, where we are going to talk about Concepts of Operation (CONOP), risk assessment, and Certificate of Authorization (COA). Therefore, please make sure that you understand the different topics of this lesson, and do not hesitate to ask questions.
Task | Description |
---|---|
1 | Complete Lessons 1 & 2 Quizzes |
2 | Complete the discussion assignment on SWOT analysis in lesson 2 on CANVAS |
3 | Install Pix4D software. Pix4D is the data processing software you will use to process UAS imagery. Follow the instructions in Canvas. |
Links
[1] https://psu.instructure.com/files/156251248/download?download_frd=1
[2] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/US%20Army%20UAS%20RoadMap%202010%202035-1.pdf
[3] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/9548_MS_RemoteSensing.pdf
[4] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/Human_Factor_Implications_200608.pdf
[5] http://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/images/lesson01/20-717_SWOT-analysis.pdf
[6] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/B4842_16MP_CCDCamera_Specs.pdf
[7] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/IXA180_IXA160.pdf
[8] https://geospatial.phaseone.com/drone-payload/p3-payload-for-drones/?utm_source=referral&utm_medium=eblast&utm_campaign=GEO-2021-05-05-Geoconnection-eblast-P3_payload_for_drones&utm_content=CTA
[9] https://geospatial.phaseone.com/drone-payload/
[10] https://www.imperx.com/bobcat-2-0-ccd/
[11] https://www.phaseone.com/
[12] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/PH_Geospatial_Overview_Deck_v5.pdf
[13] https://www.parrot.com/en/support/documentation/sequoia
[14] https://en.wikipedia.org/wiki/Electromagnetic_spectrum
[15] https://www.parrot.com/assets/s3fs-public/2021-09/sequoia_integration_manual_en.pdf
[16] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/flir-a6700sc-mwir-series-infrared-camera-datasheet.pdf
[17] https://www.flir.com/
[18] http://en.wikipedia.org/wiki/LIDAR
[19] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/DataSheet_VUX-1_14-02-2014_PRELIMINARY_4pages.pdf
[20] https://youtu.be/YaGw-dzo9Mc
[21] http://www.riegl.com/nc/
[22] http://velodynelidar.com/
[23] http://en.wikipedia.org/wiki/Global_Positioning_System
[24] http://en.wikipedia.org/wiki/Inertial_Measurement_Unit
[25] https://www.e-education.psu.edu/geog892/sites/www.e-education.psu.edu.geog892/files/IG-500E-Leaflet-1.pdf
[26] http://www.sbg-systems.com/