Sensing on UAVs for Mapping and Obstacle Avoidance

Lacaze; Alberto Daniel; et al.

Patent Application Summary

U.S. patent application number 15/176229 was filed with the patent office on 2016-06-08 and published on 2017-07-13 as publication number 20170201738, for sensing on UAVs for mapping and obstacle avoidance. The applicants listed for this patent are Alberto Daniel Lacaze, Karl Nicholas Murphy, and Raymond Paul Wilhelm, III. Invention is credited to Alberto Daniel Lacaze, Karl Nicholas Murphy, and Raymond Paul Wilhelm, III.

Publication Number: 20170201738
Application Number: 15/176229
Family ID: 59275062
Publication Date: 2017-07-13

United States Patent Application 20170201738
Kind Code A1
Lacaze; Alberto Daniel; et al. July 13, 2017

SENSING ON UAVS FOR MAPPING AND OBSTACLE AVOIDANCE

Abstract

Structured light approaches utilize a laser to project features, which are then captured with a camera. By knowing the disparity between the laser emitter and the camera, the system can triangulate to find the range. Four 185 degree field-of-view cameras provide overlapping views over nearly the whole unit sphere. The cameras are separated from each other to provide parallax. A near-infrared laser projection unit sends light out into the environment, which is reflected and viewed by the cameras. The laser projection system will create vertical lines, while the cameras will be displaced from each other horizontally. This relative shift of the lines, as viewed by different cameras, enables the lines to be triangulated in 3D space. At each point in time, a vertical stripe of the world will be triangulated. Over time, the laser line will be rotated over all yaw angles to provide a full 360 degree range.


Inventors: Lacaze; Alberto Daniel; (Potomac, MD); Murphy; Karl Nicholas; (Rockville, MD); Wilhelm, III; Raymond Paul; (Gaithersburg, MD)
Applicant:

Name                          City          State  Country  Type
Lacaze; Alberto Daniel        Potomac       MD     US
Murphy; Karl Nicholas         Rockville     MD     US
Wilhelm, III; Raymond Paul    Gaithersburg  MD     US
Family ID: 59275062
Appl. No.: 15/176229
Filed: June 8, 2016

Related U.S. Patent Documents

Application Number  Filing Date   Patent Number
62/175,231          Jun 13, 2015

Current U.S. Class: 1/1
Current CPC Class: G01S 7/4813 20130101; B64C 2201/108 20130101; B64C 2201/14 20130101; H04N 2013/0081 20130101; G01B 11/2545 20130101; G01S 17/42 20130101; G01S 7/4816 20130101; G01S 17/48 20130101; B64C 2201/024 20130101; G01B 11/245 20130101; H04N 13/128 20180501; G01B 11/2518 20130101; H04N 13/254 20180501; G01S 7/4815 20130101; B64C 2201/162 20130101; H04N 13/271 20180501; G01S 17/933 20130101; B64C 39/024 20130101; B64C 2201/123 20130101; G01S 17/89 20130101; B64C 2201/027 20130101; H04N 13/243 20180501; G01S 17/08 20130101
International Class: H04N 13/02 20060101 H04N013/02; G01S 17/93 20060101 G01S017/93; G01S 7/481 20060101 G01S007/481; B64D 47/08 20060101 B64D047/08; H04N 5/232 20060101 H04N005/232; B64C 39/02 20060101 B64C039/02; B64C 27/08 20060101 B64C027/08; G01S 17/89 20060101 G01S017/89; G01S 17/02 20060101 G01S017/02

Claims



1. A sensing device for UAVs, comprising: a UAV; a structured light sensor; the structured light sensor configured to use the size of the quadrotor, in order to provide a disparity requirement; and a computer or microprocessor to process the structured light sensor information; and the computer or microprocessor sending the structured light sensor information to one or more recipients.

2. The sensing device for UAVs of claim 1, wherein the processing is used for obstacle avoidance.

3. The sensing device for UAVs of claim 1, wherein the processing is used for mapping the surroundings.

4. The sensing device for UAVs of claim 1, wherein the UAV is a quadrotor.

5. The sensing device for UAVs of claim 1, wherein the structured light sensor is rotated; and the rotation is accomplished by a mechanism on the vehicle.

6. The sensing device for UAVs of claim 1, wherein the structured light sensor is rotated; and the rotation is accomplished by moving the body of the vehicle.

7. The sensing device for UAVs of claim 1, wherein the structured light sensor is rotated; and the rotation is accomplished by at least one of a mechanism on the vehicle and moving the body of the vehicle, or a combination of the two.

8. The sensing device for UAVs of claim 1, wherein multiple lines are used, one horizontal line and one vertical line, to increase the coverage.

9. The sensing device for UAVs of claim 1, further comprising a time-of-flight sensor.

10. A sensing device for UAVs, comprising a quadrotor; one or more line time-of-flight sensors; a computer or microprocessor to process range information; and the computer or microprocessor sending the range information to one or more recipients.

11. The sensing device for UAVs of claim 10, wherein the processing is used for obstacle avoidance.

12. The sensing device for UAVs of claim 10, wherein the processing is used for mapping the surroundings.

13. The sensing device for UAVs of claim 10, wherein the line time-of-flight sensor is rotated; and the rotation is accomplished by a mechanism on the vehicle.

14. The sensing device for UAVs of claim 10, wherein the line time-of-flight sensor is rotated; and the rotation is accomplished by moving the body of the vehicle.

15. The sensing device for UAVs of claim 10, wherein the line time-of-flight sensor is rotated; and the rotation is accomplished by at least one of a mechanism on the vehicle and moving the body of the vehicle, or a combination of the two.

16. The sensing device for UAVs of claim 10, further comprising a structured light sensor.

17. The sensing device for UAVs of claim 16, wherein multiple lines are used, one horizontal line and one vertical line, to increase the coverage.

18. The sensing device for UAVs of claim 10, wherein the UAV is a quadrotor.

19. A sensing device for UAVs, comprising: a plurality of fisheye cameras; the cameras are separated from each other to provide parallax; four, 185 degree field-of-view cameras provide overlapping views over nearly the whole unit sphere; a plurality of laser line scanners; the near-infrared laser projection unit sends light out into the environment, which is reflected and viewed by the cameras; the laser projection system creates vertical lines, while the cameras will be displaced from each other horizontally; this relative shift (stereo disparity) of the lines, as viewed by different cameras, enables the lines to be triangulated in 3D space; at each point in time, a vertical stripe of the world will be triangulated; over time, the laser line will be rotated over all yaw angles to provide full 360 degree range sensing capabilities; the two laser line projectors are used to create a line that can then be sensed with the omnidirectional cameras; each imager is composed of a camera module, a spectral filter, and a wide-angle compound lens; an optical bandpass filter can be installed to attenuate incoming ambient light; if no filter is installed, the imaging system can be used as a visible light imager to provide full 360 degree RGB imagery in addition to point clouds; a laser projection unit consists of a solid-state laser diode, laser pulsing circuitry, aspheric collimation lens, beam splitter, small rotating mirror, and laser line lens; the laser circuitry pulses the laser while also providing a frame trigger to each imager; the laser light is collimated into a beam using a small aspheric lens directly in front of the laser; the laser beam is then split into an upward and downward beam; each beam is reflected off a small rotating mirror coupled to a laser line lens; the upward beam creates a laser line that extends from horizontal to positive 80 degrees pitch; the downward beam creates a laser line that extends from horizontal to negative 80 degrees pitch; the structured light sensor will be able to measure 360 degrees horizontally and 160 degrees vertically; at each point in time, the sensor will generate approximately 2080 vertical range measurements; each imager capturing approximately 180 images/second, the sensor will be able to generate over 370k points per second; the yaw scan rate can be varied, depending upon the current mission needs; the sensor can be operated with a fine yaw resolution and slow scan rate, providing detailed scans of the environment; or, the sensor can be operated with a faster yaw rate, providing faster updates at a coarser rate; and since this device relies on triangulation, the range accuracy will be dependent on range.

20. The sensing device for UAVs of claim 19, comprising: a UAV; one or more range sensors that are used to sense the surrounding environment; a time-of-flight line sensor to perform the same task as shown with the structured light sensor; and a vertical sensing plane aligned with the direction of travel.
Description



CROSS REFERENCE TO RELATED APPLICATIONS

[0001] This application claims priority from U.S. Patent Application Ser. No. 62/175,231, entitled "SENSING ON UAVS FOR MAPPING AND OBSTACLE AVOIDANCE", filed on 13 Jun. 2015. The benefit under 35 U.S.C. § 119(e) of the United States provisional application is hereby claimed, and the aforementioned application is hereby incorporated herein by reference.

FEDERALLY SPONSORED RESEARCH

[0002] Not Applicable

SEQUENCE LISTING OR PROGRAM

[0003] Not Applicable

TECHNICAL FIELD OF THE INVENTION

[0004] The present invention relates to UAVs. More specifically, the present invention relates to providing structured light and time-of-flight sensors on UAVs for obstacle avoidance and mapping capabilities.

BACKGROUND OF THE INVENTION

[0005] There are few sensors that are well suited for autonomous mobility and mapping functions on small aerial platforms. LADAR choices that can fit the SWAP requirements are severely limited. One option, the single-line sensor, must be configured into an up-down tilt configuration (the so-called "yes-yes" LADAR) or into a side-to-side pan configuration (the so-called "no-no" LADAR) in order to get the coverage needed to traverse a complex environment.

[0006] Some other sensors provide a relatively small vertical field-of-view. Quadrotors of small size and weight pitch significantly when traveling at high speed; this pitch can be as high as 45 degrees at high speed or when quadrotors are used in windy areas.

[0007] Therefore, if a sensor with a relatively small vertical field of view is installed horizontally, the vehicle will be blind in the direction of travel at high speeds. Once again, there is a need for a tilt mechanism.

[0008] The other approach, which better fits the SWAP constraints of a quadrotor, is stereo vision or structure from motion. However, in both cases, the poor lighting of an indoor environment, together with the lower-quality camera optics that quadrotors can carry, makes it a poor choice. Many such attempts have been made in the past few years, with very poor results.

Definitions

[0009] LADAR (also known as LIDAR) is an optical remote sensing technology that can measure the distance to, or other properties of, a target by illuminating the target with light, often using pulses from a laser. LIDAR technology has applications in geomatics, archaeology, geography, geology, geomorphology, seismology, forestry, remote sensing, and atmospheric physics, as well as in airborne laser swath mapping (ALSM), laser altimetry, and LIDAR contour mapping. The acronym LADAR (Laser Detection and Ranging) is often used in military contexts. The term "laser radar" is sometimes used, even though LIDAR does not employ microwaves or radio waves and therefore is not radar in the strict sense of the word.

[0010] In computing, a graphical user interface (GUI, commonly pronounced gooey) is a type of user interface that allows users to interact with electronic devices using images rather than text commands. GUIs can be used in computers, hand-held devices such as MP3 players, portable media players or gaming devices, household appliances and office equipment. A GUI represents the information and actions available to a user through graphical icons and visual indicators such as secondary notation, as opposed to text-based interfaces, typed command labels or text navigation. The actions are usually performed through direct manipulation of the graphical elements.

[0011] MAPHAC is a 3D scanning device for measuring the three-dimensional shape of an object using projected light patterns and a camera system.

[0012] A quadcopter, also called a quadrotor helicopter or quadrotor, is a multirotor helicopter that is lifted and propelled by four rotors. Quadcopters are classified as rotorcraft, as opposed to fixed-wing aircraft, because their lift is generated by a set of rotors (vertically oriented propellers). Unlike most helicopters, quadcopters use two sets of identical fixed-pitch propellers: two clockwise (CW) and two counter-clockwise (CCW). These use variation in RPM to control lift and torque. Control of vehicle motion is achieved by altering the rotation rate of one or more rotor discs, thereby changing their torque load and thrust/lift characteristics.
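
The RPM-based control described above is commonly implemented as a motor mixer that maps a desired thrust and three body torques to four rotor commands. A minimal sketch for a "plus" configuration follows; the sign conventions and scaling are illustrative assumptions, not details taken from the application.

```python
# Sketch of quadcopter motor mixing in a "plus" configuration:
# front/back rotors spin CW, left/right spin CCW. Varying individual
# rotor speeds changes net thrust and the roll/pitch/yaw torques.
# Sign conventions and scaling here are assumptions for illustration.

def mix(thrust, roll, pitch, yaw):
    """Map desired thrust and body torques to four rotor commands."""
    return {
        "front": thrust - pitch + yaw,  # assumed: +pitch lowers the nose
        "back":  thrust + pitch + yaw,
        "left":  thrust + roll - yaw,   # CCW pair balances CW reaction torque
        "right": thrust - roll - yaw,
    }

# Differential front/back thrust creates a pitching moment:
print(mix(thrust=0.6, roll=0.0, pitch=0.1, yaw=0.0))
# {'front': 0.5, 'back': 0.7, 'left': 0.6, 'right': 0.6}
```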

[0013] A Small Unmanned Ground Vehicle (SUGV) is a lightweight, man-portable Unmanned Ground Vehicle (UGV) capable of conducting military operations in urban terrain, tunnels, sewers, and caves. The SUGV aids in the performance of manpower-intensive or high-risk functions (e.g., urban Intelligence, Surveillance, and Reconnaissance (ISR) missions and chemical/Toxic Industrial Chemicals (TIC) and Toxic Industrial Materials (TIM) reconnaissance). To minimize Soldiers' direct exposure to hazards, the SUGV's modular design allows multiple payloads to be integrated in a plug-and-play fashion.

[0014] An Unmanned Ground Vehicle (UGV) is a vehicle that operates while in contact with the ground and without an onboard human presence. UGVs can be used for many applications where it may be inconvenient, dangerous, or impossible to have a human operator present. Generally, the vehicle will have a set of sensors to observe the environment, and will either autonomously make decisions about its behavior or pass the information to a human operator at a different location who will control the vehicle through teleoperation. The UGV is the land-based counterpart to unmanned aerial vehicles and remotely operated underwater vehicles. Unmanned robotics are being actively developed for both civilian and military use to perform a variety of dull, dirty, and dangerous activities.

[0015] SWAP constraints refer to the size, weight, and power of a military platform, as defined by the military for a given platform, and provide a basis on which a platform can utilize components from various manufacturers.

SUMMARY OF THE INVENTION

[0016] Structured light approaches utilize a laser to project features, which are then captured with a camera. By knowing the disparity between the laser emitter and the camera, the system can triangulate to find the range. In order to accommodate these sensors on a quadrotor, modifications will be done to the location of the camera and the laser emitters as taught by the present invention.

[0017] The proposed configuration makes use of multiple fisheye cameras and laser line scanners. Four, wide degree field-of-view cameras provide overlapping views over nearly the whole unit sphere. The cameras are separated from each other to provide parallax. A near-infrared laser projection unit sends light out into the environment. If the light hits objects in the environment it is reflected and viewed by the cameras.

[0018] The laser projection system will create vertical lines, while the cameras will be displaced from each other horizontally. This relative shift (stereo disparity) of the lines, as viewed by different cameras, enables the lines to be triangulated in 3D space. At each point in time, a vertical stripe of the world will be triangulated. Over time, the laser line will be rotated over all yaw angles to provide full 360 degree range sensing capabilities.

BRIEF DESCRIPTION OF THE DRAWINGS

[0019] The accompanying drawings, which are incorporated herein and form a part of the specification, illustrate the present invention and, together with the description, further serve to explain the principles of the invention and to enable a person skilled in the pertinent art to make and use the invention.

[0020] FIG. 1. MAPHAC is a structured light sensor that is designed for SUGVs.

[0021] FIG. 2. Point cloud generated by MAPHAC, color-coded for range. The point cloud shows a leaning ladder and a variety of office clutter.

[0022] FIG. 3a. Quadcopter with four imagers and laser projection system.

[0023] FIG. 3b. Approximate field-of-view of a single imager.

[0024] FIG. 3c. Overhead view of combined field-of-view of all imagers.

[0025] FIG. 3d. Side-view of combined field-of-view of all imagers.

[0026] FIG. 4. Two laser line projectors are used to create a line that can then be sensed with the omnidirectional cameras.

[0027] FIG. 5. Complete field of view showing laser and cameras.

[0028] FIG. 6. Expected range error of structured light sensor.

[0029] FIGS. 7a and 7b. Prototype sensing plane configuration.

DETAILED DESCRIPTION OF THE INVENTION

[0030] In the following detailed description of exemplary embodiments of the invention, reference is made to the accompanying drawings (where like numbers represent like elements), which form a part hereof, and in which is shown by way of illustration specific exemplary embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention, but other embodiments may be utilized, and logical, mechanical, electrical, and other changes may be made without departing from the scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined only by the appended claims.

[0031] In the following description, numerous specific details are set forth to provide a thorough understanding of the invention. However, it is understood that the invention may be practiced without these specific details. In other instances, well-known structures and techniques known to one of ordinary skill in the art have not been shown in detail in order not to obscure the invention. Referring to the figures, it is possible to see the various major elements constituting the apparatus of the present invention.

[0032] Structured light approaches utilize a laser to project features, which are then captured with a camera. By knowing the disparity between the laser emitter and the camera, the system can triangulate to find the range. In sharp contrast to conventional stereo and structure from motion, poor lighting actually improves the range and accuracy of this sensor. There is also no need to have rich features in the environment, since the laser "projects its own features." Therefore, it will even work on featureless walls and floors.
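
As a rough illustration of the disparity-to-range relationship, the sketch below applies the standard pinhole triangulation formula; the focal length and emitter-camera baseline values are assumptions chosen for the example, not parameters from the application.

```python
# Minimal sketch of structured-light triangulation: a projected laser
# feature is observed by a camera displaced from the emitter, and the
# pixel offset (disparity) of the feature yields range by similar
# triangles: range = focal_length * baseline / disparity.

def range_from_disparity(disparity_px: float,
                         focal_length_px: float,
                         baseline_m: float) -> float:
    """Triangulated range in meters; all values here are illustrative."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# Example: assumed 600 px focal length and 20 cm emitter-camera baseline.
print(range_from_disparity(disparity_px=12.0,
                           focal_length_px=600.0,
                           baseline_m=0.20))  # -> 10.0 m
```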

[0033] One such sensor is presented in FIG. 1 and is currently installed on a SUGV (small unmanned ground vehicle). It is designed to create very high density point clouds for mapping applications at two megapixels per second. FIG. 1 illustrates the MAPHAC 100, a structured light sensor that is designed for SUGVs.

[0034] FIG. 2 shows the scan of a typical cluttered room as a point cloud 200, including a ladder 201, a camera with a tripod 202, chairs 203, lamps 204, etc. In FIG. 2 the point cloud 200 generated by the MAPHAC is color-coded for range. The point cloud 200 shows a leaning ladder 201 and a variety of office clutter. The current incarnation of the MAPHAC 100 is designed to become a substitute for a SUGV antenna, where it can serve as both an autonomous mobility sensor and a radio antenna.

[0035] In order to accommodate these sensors on a quadrotor, modifications are made to the locations of the camera and the laser emitters. However, the core electronics and software have already been designed, though never used in this combination. The sensor is designed to meet the unique needs of an autonomous multicopter for indoor and outdoor environments, including: a large field of view for obstacle avoidance and mapping; a lightweight system with minimal moving parts; accurate ranges at short distances, with decreasing accuracy at longer ranges; use of eye-safe lasers, while providing resilience to ambient light; and a predicted weight under 150 grams.

[0036] The proposed configuration makes use of multiple fisheye cameras and laser line scanners. Four 185 degree field-of-view cameras provide overlapping views over nearly the whole unit sphere. The cameras are separated from each other to provide parallax. A near-infrared laser projection unit sends light out into the environment, which is reflected and viewed by the cameras. The laser projection system will create vertical lines, while the cameras will be displaced from each other horizontally. This relative shift (stereo disparity) of the lines, as viewed by different cameras, enables the lines to be triangulated in 3D space.

[0037] At each point in time, a vertical stripe of the world will be triangulated. Over time, the laser line will be rotated over all yaw angles to provide full 360 degree range sensing capabilities as illustrated by FIGS. 3a, 3b, 3c, and 3d.
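
The sketch below illustrates how rotating a triangulated vertical stripe over all yaw angles accumulates a full 360 degree point cloud; the stripe sampling, yaw step, and the flat-wall ranges are assumed values chosen for illustration.

```python
import math

# Sketch: accumulate a 360-degree point cloud from a rotating vertical
# laser stripe. Each stripe sample is (pitch, range); the stripe's yaw
# angle advances a little each frame. All resolutions are assumptions.

def stripe_to_points(yaw_rad, stripe):
    """Convert one vertical stripe of (pitch_rad, range_m) samples
    at a given yaw angle into Cartesian (x, y, z) points."""
    points = []
    for pitch, rng in stripe:
        x = rng * math.cos(pitch) * math.cos(yaw_rad)
        y = rng * math.cos(pitch) * math.sin(yaw_rad)
        z = rng * math.sin(pitch)
        points.append((x, y, z))
    return points

cloud = []
yaw_step = math.radians(2.0)           # assumed yaw increment per frame
for frame in range(180):               # one full revolution at 2 deg/frame
    yaw = frame * yaw_step
    # Illustrative stripe: samples from -80 to +80 degrees pitch, all 5 m.
    stripe = [(math.radians(p), 5.0) for p in range(-80, 81, 10)]
    cloud.extend(stripe_to_points(yaw, stripe))
print(len(cloud))  # 180 stripes x 17 samples = 3060 points
```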

[0038] FIG. 3a illustrates a quadcopter 300 with four imagers 301, 302, 303, and 304 and a laser projection system 305. FIG. 3b illustrates the approximate field-of-view 306 of a single imager 301. FIG. 3c illustrates an overhead view of the combined field-of-view 307 of all imagers 301, 302, 303, and 304. FIG. 3d illustrates a side view of the combined field-of-view 308 of all imagers.

[0039] FIG. 4 illustrates where two laser line projectors 401 and 402 are used to create a line 403 that can then be sensed with the omnidirectional cameras.

[0040] Each imager is composed of a camera module, a spectral filter, and a wide-angle compound lens. The camera must be small in size and weight, while providing high sensitivity and a wide dynamic range. Depending on mission requirements, an optical bandpass filter can be installed to attenuate incoming ambient light. If no filter is installed, the imaging system can be used as a visible light imager to provide full 360 degree RGB imagery in addition to point clouds.
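
Wide lenses of this kind are often described with an equidistant fisheye model, in which image radius grows linearly with the angle off the optical axis. The application does not specify a lens model, so the sketch below, including the focal length and image-center values, is an assumption for illustration.

```python
import math

# Sketch of an equidistant fisheye projection (r = f * theta), a common
# model for very wide lenses. With a 185-degree field of view, rays up
# to 92.5 degrees off the optical axis land on the image.

def project_fisheye(direction, f_px=300.0, cx=640.0, cy=512.0):
    """Map a 3D unit direction (camera frame, +z optical axis) to pixels."""
    x, y, z = direction
    theta = math.acos(max(-1.0, min(1.0, z)))   # angle off optical axis
    if theta > math.radians(92.5):              # outside the 185-deg FOV
        return None
    r = f_px * theta                            # equidistant mapping
    phi = math.atan2(y, x)
    return (cx + r * math.cos(phi), cy + r * math.sin(phi))

# A point 45 degrees off-axis:
d = (math.sin(math.radians(45)), 0.0, math.cos(math.radians(45)))
print(project_fisheye(d))  # ~ (875.6, 512.0)
```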

[0041] A laser projection unit consists of a solid-state laser diode, laser pulsing circuitry, aspheric collimation lens, beam splitter, small rotating mirror, and laser line lens. The laser circuitry pulses the laser while also providing a frame trigger to each imager. The laser light is collimated into a beam 403 and 404 using a small aspheric lens directly in front of the laser. The laser beam is then split into an upward and downward beam 403 and 404. Each beam 403 and 404 is reflected off a small rotating mirror coupled to a laser line lens. The upward beam 403 creates a laser line that extends from horizontal to positive 80 degrees pitch, while the downward beam 404 creates a laser line that extends from horizontal to negative 80 degrees pitch.
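
The pulse-plus-frame-trigger arrangement can be pictured as a scan loop that fires the laser, triggers all imagers, and advances the mirror each frame. The sketch below uses stub device calls and assumed timing values; real hardware would use dedicated trigger lines and a motor controller.

```python
import time

# Timing sketch: pulse the laser and frame-trigger the imagers together,
# then advance the rotating mirror. The frame rate and yaw step are
# assumptions, and the device functions are stubs for illustration.

FRAME_RATE_HZ = 180.0            # assumed imager frame rate
YAW_STEP_DEG = 2.0               # assumed mirror advance per frame

def fire_laser_and_trigger_imagers():
    """Stub: one laser pulse plus a simultaneous frame trigger."""
    pass

def set_mirror_yaw(angle_deg):
    """Stub: command the small rotating mirror to a yaw angle."""
    pass

yaw = 0.0
for _ in range(10):              # a few frames of the scan loop
    fire_laser_and_trigger_imagers()
    set_mirror_yaw(yaw)
    yaw = (yaw + YAW_STEP_DEG) % 360.0
    time.sleep(1.0 / FRAME_RATE_HZ)
```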

[0042] FIG. 4 shows the proposed field-of-view of the projected lines 403 and 404. FIG. 5 shows the combined field-of-view of the cameras 405 and 406 and the laser projectors 308.

[0043] The structured light sensor will be able to measure 360 degrees horizontally and 160 degrees vertically. At each point in time, the sensor will generate approximately 2080 vertical range measurements. With each imager capturing approximately 180 images/second, the sensor will be able to generate over 370k points per second (2080 measurements per stripe × 180 stripes per second = 374,400 points per second).

[0044] The yaw scan rate can be varied, depending upon the current mission needs. The sensor can be operated with a fine yaw resolution and slow scan rate, providing detailed scans of the environment; or, the sensor can be operated with a faster yaw rate, providing faster updates at a coarser resolution.
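
Because the stripe rate is fixed by the imager frame rate, yaw resolution and revolution time trade off directly. A small sketch using the approximately 180 stripes/second figure from above; the example scan rates themselves are assumptions.

```python
# With the frame rate fixed (~180 stripes/second), yaw resolution and
# full-revolution update rate trade off directly:
#   yaw_resolution = yaw_rate / frame_rate
#   revolution_time = 360 / yaw_rate

FRAME_RATE = 180.0  # stripes per second, from the text

for yaw_rate_dps in (90.0, 360.0, 1080.0):   # assumed example scan rates
    resolution = yaw_rate_dps / FRAME_RATE    # degrees between stripes
    rev_time = 360.0 / yaw_rate_dps           # seconds per full scan
    print(f"{yaw_rate_dps:6.0f} deg/s -> {resolution:.2f} deg spacing, "
          f"{rev_time:.2f} s per revolution")
```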

[0045] Since this device relies on triangulation, the range accuracy will be dependent on range. The expected range error 600 is shown in FIG. 6 in graph format.
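
The range dependence follows from differentiating the triangulation formula: with R = fB/d, a disparity error dd produces a range error of roughly R^2 * dd / (f * B), which grows quadratically with range. The sketch below evaluates this under assumed focal length, baseline, and matching accuracy; FIG. 6 shows the application's own expected-error curve.

```python
# Sketch of triangulation range error: from R = f * B / d, a disparity
# error of dd pixels gives |dR| = R**2 / (f * B) * dd, so the error
# grows quadratically with range. All parameter values are assumptions.

F_PX = 600.0       # assumed focal length, pixels
BASELINE_M = 0.20  # assumed camera separation, meters
DISP_NOISE = 0.5   # assumed disparity matching error, pixels

for r in (1.0, 2.0, 5.0, 10.0):
    err = r * r / (F_PX * BASELINE_M) * DISP_NOISE
    print(f"range {r:5.1f} m -> expected error {err * 100:.1f} cm")
```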

[0046] A second approach is to use a time-of-flight line sensor to perform the same task as shown with the structured light sensor. The line sensors can be organized as seen in FIGS. 7a and 7b.

[0047] One more possible configuration is the same as shown in FIGS. 7a and 7b, but with the vertical sensing plane 700 aligned with the direction of travel 701.

[0048] The system is composed of a quadrotor, or other UAV, and one or more range sensors that are used to sense the surrounding environment.

[0049] Thus, it is appreciated that the optimum dimensional relationships for the parts of the invention, including variations in size, materials, shape, form, function, and manner of operation, assembly, and use, are deemed readily apparent and obvious to one of ordinary skill in the art, and all equivalent relationships to those illustrated in the drawings and described in the above description are intended to be encompassed by the present invention.

[0050] Furthermore, other areas of art may benefit from this method and adjustments to the design are anticipated. Thus, the scope of the invention should be determined by the appended claims and their legal equivalents, rather than by the examples given.

* * * * *

