Systems And Methods For Using Radar-adaptive Beam Pattern For Wingtip Protection

Dusik; Matej; et al.

Patent Application Summary

U.S. patent application number 13/889537 was filed with the patent office on 2013-05-08 and published on 2014-03-27 as systems and methods for using radar-adaptive beam pattern for wingtip protection. The applicant listed for this patent is Honeywell International Inc. The invention is credited to Matej Dusik, James C. Kirk, Filip Magula, David C. Vacanti, and Jiri Vasek.

Publication Number: 20140085124
Application Number: 13/889537
Family ID: 49669535
Publication Date: 2014-03-27

United States Patent Application 20140085124
Kind Code A1
Dusik; Matej; et al. March 27, 2014

SYSTEMS AND METHODS FOR USING RADAR-ADAPTIVE BEAM PATTERN FOR WINGTIP PROTECTION

Abstract

Systems and methods for adaptively steering radar beam patterns for coverage during aircraft turns. The radar sensor system is mechanically or electrically steered to alter the radar sensor's beam pattern in order to adapt the radar sensor's field of view (FOV) to cover the area of the anticipated aircraft wingtip trajectory. The anticipated trajectory is derived, for example, from the aircraft groundspeed, acceleration, heading, turn rate, tiller position, attitude, taxi clearance, etc.


Inventors: Dusik; Matej; (Brno, CZ) ; Vasek; Jiri; (Brno, CZ) ; Kirk; James C.; (Clarksville, MD) ; Vacanti; David C.; (Renton, WA) ; Magula; Filip; (Albrechtice, CZ)
Applicant: Honeywell International Inc. (US)
Family ID: 49669535
Appl. No.: 13/889537
Filed: May 8, 2013

Related U.S. Patent Documents

Application Number   Filing Date     Patent Number
61/653,297           May 30, 2012    --
61/706,632           Sep 27, 2012    --

Current U.S. Class: 342/29
Current CPC Class: G01S 13/931 20130101; B60Q 9/008 20130101; G01S 13/934 20200101; G01S 2013/9329 20200101; B64C 25/42 20130101; G01S 13/93 20130101; B64D 43/00 20130101; G08G 5/045 20130101; G08G 5/065 20130101; G01S 7/04 20130101; G01S 13/765 20130101; B64D 45/00 20130101; G08G 5/04 20130101; G01C 23/00 20130101; G01S 13/66 20130101
Class at Publication: 342/29
International Class: G01S 13/93 20060101 G01S013/93; G01S 13/66 20060101 G01S013/66

Claims



1. A device located on a vehicle, the device comprising: at least one electrically or mechanically steerable sensor; a processor in signal communication with the sensor, the processor configured to: receive information regarding at least one of a future position of the vehicle, an estimated trajectory of at least part of the vehicle, or an estimate of location of a tracked target; determine at least one area to sense based on the received information; and generate at least one sensor steering signal based on the determined area, wherein the at least one sensor is steered based on the generated at least one sensor steering signal.

2. The device of claim 1, wherein the vehicle is an aircraft located on the ground.

3. The device of claim 2, wherein the at least part of the aircraft comprises at least one wingtip or engine nacelle.

4. The device of claim 1, wherein the future position is based on taxi clearance information.

5. The device of claim 1, wherein the processor is further configured to estimate trajectory based on received position information.

6. The device of claim 1, wherein a target is tracked based on information actively communicated from the target.

7. The device of claim 1, wherein the processor is further configured to estimate trajectory based on received navigation information.

8. The device of claim 1, wherein the processor is further configured to: generate an image based on information generated by the at least one steerable sensor; and generate an indication that the at least one steerable sensor is being steered; the device further comprising a display device configured to present the generated image and the generated indication.

9. A method performed by a device located on a vehicle, the method comprising: at a processor in signal communication with a sensor, receiving information regarding at least one of a future position of the vehicle, an estimated trajectory of at least part of the vehicle, or an estimate of location of a tracked target; determining at least one area to sense based on the received information; generating at least one sensor steering signal based on the determined area; and, at at least one steerable sensor, steering the sensor based on the generated at least one sensor steering signal.

10. The method of claim 9, wherein the vehicle is an aircraft located on the ground.

11. The method of claim 10, wherein the at least part of the aircraft comprises at least one wingtip or engine nacelle.

12. The method of claim 9, wherein the future position is based on taxi clearance information.

13. The method of claim 9, wherein estimating trajectory is based on received position information.

14. The method of claim 9, wherein a target is tracked based on information actively communicated from the target.

15. The method of claim 9, wherein estimating trajectory is based on received navigation information.

16. The method of claim 9, further comprising, at the processor: generating an image based on information generated by the at least one steerable sensor; and generating an indication that the at least one steerable sensor is being steered; and, at a display device, presenting the generated image and the generated indication.

17. A system located on a vehicle, the system comprising: a means for receiving information regarding at least one of a future position of the vehicle, an estimated trajectory of at least part of the vehicle, or an estimate of location of a tracked target, determining at least one area to sense based on the received information, and generating at least one sensor steering signal based on the determined area; and a means for steering at least one steerable sensor based on the generated at least one sensor steering signal.

18. The system of claim 17, wherein the vehicle is an aircraft located on the ground, wherein the at least part of the aircraft comprises at least one wingtip or engine nacelle.

19. The system of claim 17, wherein estimating trajectory is based on at least one of received position information, received navigation information, or taxi clearance information.

20. The system of claim 17, further comprising: a means for generating an image based on information generated by the at least one steerable sensor; a means for generating an indication that at least one steerable sensor is being steered; and a means for presenting the generated image and the generated indication.
Description



PRIORITY CLAIM

[0001] This application claims the benefit of U.S. Provisional Application Ser. No. 61/653,297, filed May 30, 2012, the contents of which are hereby incorporated by reference. This application also claims the benefit of U.S. Provisional Application Ser. No. 61/706,632, filed Sep. 27, 2012, the contents of which are hereby incorporated by reference.

BACKGROUND OF THE INVENTION

[0002] There currently exists an expensive safety problem of aircraft wingtips clipping obstacles (e.g., the 2011 Paris Air Show, where an A380's wing hit a building; a 2012 Chicago O'Hare accident in which a Boeing 747 cargo aircraft's wing clipped an Embraer 140's rudder; a 2011 incident at Boston Logan International Airport, where a Boeing 767 struck the horizontal stabilizer of a Bombardier CRJ900; etc.). Some solutions focus on object detection by radar sensors placed at the wingtips, with information about potential obstacles presented to the pilot on a human-machine interface (e.g., a head-up, head-down, or head-mounted display). A significant drawback of this approach is that the sensor signal covers only the area directly forward of the wingtip, leaving the angles to the side of the wingtip uncovered by the radar signal, which can be dangerous, especially in turns. Investigations of wingtip collisions found that many accidents occur in turns (e.g., 1995 London Heathrow, an A340 struck a B757's tail; 2006 Melbourne, a B747 hit a B767's horizontal stabilizer; 2010 Charlotte Douglas, an A330 hit an A321's rudder; etc.). Current solutions provide only limited benefit in such cases, as the obstacle would appear in the sensor's field of view (FOV) just before impact, thus not giving the aircrew sufficient time to react to the situation.

SUMMARY OF THE INVENTION

[0003] The present invention provides an enhanced system that uses adaptive steering of a radar beam pattern for coverage during aircraft turns. In an exemplary embodiment, the radar sensor system is installed in an adaptive rack, which mechanically or electrically alters the radar sensor's beam pattern in order to adapt the radar sensor's field of view (FOV) to cover the area of the anticipated aircraft wingtip trajectory. The anticipated trajectory is derived, for example, from the aircraft groundspeed, acceleration, heading, turn rate, tiller position, attitude, taxi clearance, etc. The anticipated trajectory can also be derived from the knowledge of the operator; that is, the radar beam can be steered manually by the aircraft operator.

[0004] In one aspect of the invention, the radar sensor's beam pattern is steered based on the trajectory information and/or on knowledge of the position(s) of obstacles.

BRIEF DESCRIPTION OF THE DRAWINGS

[0005] Preferred and alternative embodiments of the present invention are described in detail below with reference to the following drawings:

[0006] FIG. 1 is a schematic diagram of an aircraft formed in accordance with an embodiment of the present invention;

[0007] FIG. 2 is an exemplary image presented to an ownship operator formed in accordance with an embodiment of the present invention;

[0008] FIG. 3 shows an exemplary sensor's sweep range provided by a sensor on an aircraft formed in accordance with an embodiment of the present invention;

[0009] FIG. 4 shows an image of multiple aircraft taxiing on an airport with icons that represent steerable sensor beam patterns; and

[0010] FIG. 5 shows an image of multiple aircraft taxiing on an airport map with icons that represent nonsteerable sensor beam patterns.

DETAILED DESCRIPTION OF THE INVENTION

[0011] In one embodiment, as shown in FIG. 1, an exemplary airport surface collision-avoidance system (ASCAS) 18 includes an aircraft 20 that includes an electrically and/or mechanically steerable sensor 26 (e.g., an active sensor such as radar, or a passive sensor such as a camera) included within aircraft light modules 30 or located at other positions about the aircraft 20. The light modules 30 also include navigation/position lights 34, a processor 36, and a communication device 38. The sensors 26 are in communication (wired or wireless) via the communication device 38 with a user interface (UI) device 44.

[0012] In one embodiment, the UI device 44 includes an optional processor 50, a wired or wireless communication device 52, and one or more alerting devices 54. The UI device 44 provides audio and/or visual cues (e.g., via headphones, PC tablets, etc.) based on sensor-derived and processed information.

[0013] Based on information from the sensors 26, the UI device 44 provides some or all of the following functions: detecting and tracking intruders, evaluating and prioritizing threats, steering and controlling the sensors, and declaring and determining actions. Once an alert associated with a detection has been produced, a collision-avoidance action (e.g., stopping the aircraft, maneuvering around the intruder, etc.) is performed manually by the operator or automatically by an automated system (e.g., autobrakes, automatic steering).
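As an illustration of this processing chain, the following minimal sketch ranks tracked intruders by time to impact and maps the most urgent one onto an alert level. The track fields, thresholds, and alert strings are assumptions for illustration only, not part of the application.

```python
# Hypothetical sketch of the detect/track/prioritize/declare chain of [0013].
# All names, fields, and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Track:
    track_id: int
    range_m: float             # distance from the wingtip sensor to the intruder
    closing_speed_mps: float   # positive when the intruder is getting closer

def time_to_impact(track: Track) -> float:
    """Seconds until collision if the current geometry persists."""
    if track.closing_speed_mps <= 0.0:
        return float("inf")    # opening or stationary geometry: no conflict
    return track.range_m / track.closing_speed_mps

def declare_action(tracks: list[Track],
                   caution_s: float = 15.0,
                   warning_s: float = 5.0) -> str:
    """Prioritize threats by time to impact and declare an alert level."""
    if not tracks:
        return "CLEAR"
    most_urgent = min(tracks, key=time_to_impact)
    tti = time_to_impact(most_urgent)
    if tti < warning_s:
        return "WARNING: STOP"   # could trigger autobrakes in an automated system
    if tti < caution_s:
        return "CAUTION"
    return "ADVISORY"
```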

[0014] In one embodiment, processing of the sensor information is done by the processor 36 at the sensor level and/or the processor 50 at the UI device 44.

[0015] In one embodiment, situational awareness is improved by integration with automatic dependent surveillance-broadcast/traffic information service-broadcast (ADS-B/TIS-B), airport/airline information on vehicles/aircraft/obstacles (e.g., through WiMax or other wireless communication means), and with synthetic vision system/enhanced vision system/combined vision system (SVS/EVS/CVS) received by the respective devices using the communication device 38.

[0016] In one embodiment, the present invention reduces false alarms by utilizing flight plan and taxi clearance information and airport building/obstacle databases stored in memory 60 or received from a source via the communication device 52.
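One plausible way to use such a database, sketched below, is to gate out radar detections that coincide with charted structures the cleared taxi route already avoids. The coordinate frame, database format, and 2-meter gate are assumptions for illustration, not values from the application.

```python
# Illustrative false-alarm gate for [0016]; the database layout and the
# 2 m gate distance are assumptions, not taken from the application.
def is_charted_structure(detection_xy: tuple[float, float],
                         charted_obstacles: list[tuple[float, float]],
                         gate_m: float = 2.0) -> bool:
    """True when a detection lies within gate_m of a known, charted obstacle,
    so an alert for it can be suppressed or deprioritized."""
    dx, dy = detection_xy
    return any((dx - ox) ** 2 + (dy - oy) ** 2 <= gate_m ** 2
               for ox, oy in charted_obstacles)
```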

[0017] The sensors 26 included in the wing and tail navigation light modules provide near-complete sensor coverage of the aircraft 20. Full coverage can be attained by placing sensors in other lights or at other strategically chosen locations on the aircraft 20.

[0018] The pilot is alerted aurally, visually, and/or tactilely. For example, a visual alert presented on a primary flight display, a navigation display, or an electronic flight bag (EFB) display shows the aircraft wingtips outlined or highlights any obstructions. Aural alerting is provided through existing installed equipment, such as the interphone, other warning electronics, or possibly the enhanced ground proximity warning system (EGPWS) platform.

[0019] FIG. 2 shows a top-down image 120 presented on a display that is part of the alerting device 54. The image 120 includes an ownship aircraft icon 126 with two radar beam coverage areas 124 that project forward from the wingtips of the icon 126. Two range rings 132, 134 are shown on the image 120 at fixed distances in front of the wing; they can be scaled using either an interface on the EFB or iPad, or the cursor control device (CCD) in the aircraft when the image is shown on a navigation display.

[0020] In one embodiment, the processor 36 or 50 determines the direction in which to steer or sweep the sensor(s) 26 based on information from any of a number of different sources: ADS-B, a flight management system (FMS), a global positioning system (GPS), an inertial navigation system (INS), etc. For example, a radar beam pattern produced by a radar sensor is adaptively steered to provide coverage of an expanded area during a sensed aircraft turn.

[0021] In one embodiment, the radar sensor is installed in an adaptive rack, which mechanically moves and/or adjusts the radar sensor's beam pattern in order to adapt the radar sensor's field of view (FOV) to cover the area into which the aircraft is taxiing (the anticipated trajectory), based on the information from the other source(s). The anticipated trajectory is derived, for example, from at least some of the following data: groundspeed, acceleration, heading, turn rate, tiller position, and/or attitude.
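To make the trajectory derivation concrete, the sketch below propagates a constant-speed, constant-turn-rate kinematic model to predict near-term wingtip positions. This simple model and all parameter names are assumptions for illustration; the application does not specify a particular prediction algorithm.

```python
# Illustrative wingtip trajectory prediction for [0021]: a constant-speed,
# constant-turn-rate kinematic model, assumed here for illustration only.
import math

def anticipated_wingtip_path(ground_speed_mps: float,
                             heading_rad: float,
                             turn_rate_rps: float,
                             wingtip_offset_m: float,
                             horizon_s: float = 10.0,
                             dt_s: float = 0.5) -> list[tuple[float, float]]:
    """Return predicted wingtip positions (east, north) over the horizon.
    Positive wingtip_offset_m is the right wingtip; negative is the left."""
    east = north = 0.0
    psi = heading_rad           # heading measured clockwise from north
    path = []
    steps = int(horizon_s / dt_s)
    for _ in range(steps):
        east += ground_speed_mps * math.sin(psi) * dt_s
        north += ground_speed_mps * math.cos(psi) * dt_s
        psi += turn_rate_rps * dt_s
        # Offset from the fuselage reference point, perpendicular to heading.
        tip_east = east + wingtip_offset_m * math.cos(psi)
        tip_north = north - wingtip_offset_m * math.sin(psi)
        path.append((tip_east, tip_north))
    return path
```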

[0022] In one embodiment, a user input device (not shown) is included in the UI device 44. The user input device allows a user to control the steering of the beam(s).

[0023] In one embodiment, the sensor is installed at other fuselage areas, such as above each engine or at the nose of the aircraft. Even though the sensor is not at the wingtip, the scan data is buffered, thus allowing the image 120 to be displayed.

[0024] FIG. 3 shows a top-down view of a taxiing aircraft 100. Three wingtip protection coverage areas 102, 104, 106 are shown. A default coverage area 102 has a centerline that is approximately parallel to the centerline of the aircraft 100. Also shown are a maximum inner sensor deflection coverage area 104 and a maximum outer sensor deflection coverage area 106. The processor(s) 36, 50 provide signals to mechanically or electrically steer the sensors 26. If the sensors are located elsewhere on the aircraft 100, the processor(s) 36, 50 provide signals that ensure the sensors scan the anticipated trajectory or track an identified obstacle/target. For example, if the processor(s) 36, 50 determine a certain radius of turn (trajectory) of the wingtips, the sensors 26 are steered so that the area through which the wingtips will turn is adequately covered.
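The turn-radius geometry described above can be reduced to a few lines. The sketch below uses simple bicycle-model geometry to estimate how far to deflect a wingtip sensor so that its beam follows the arc the outboard wingtip will sweep; the model, parameter names, and look-ahead distance are assumptions, not values taken from the application.

```python
# Illustrative turn geometry for [0024], using an assumed bicycle model.
import math

def wingtip_sensor_deflection(tiller_angle_rad: float,
                              wheelbase_m: float,
                              semispan_m: float,
                              look_ahead_m: float = 30.0) -> float:
    """Azimuth (rad, signed like the tiller input) to steer the outboard
    wingtip radar so its beam points along the wingtip's anticipated arc."""
    if abs(tiller_angle_rad) < 1e-6:
        return 0.0  # straight taxi: keep the default forward-looking beam
    # Turn center lies abeam the main gear, at a distance set by the
    # nose-wheel (tiller) angle and the wheelbase:
    r_main = wheelbase_m / math.tan(abs(tiller_angle_rad))
    # The outboard wingtip sweeps a larger circle than the main gear:
    r_tip = r_main + semispan_m
    # Aiming along the chord to a point look_ahead_m down that arc deflects
    # the beam toward the turn center by half the subtended arc angle:
    deflection = math.asin(min(1.0, look_ahead_m / (2.0 * r_tip)))
    return math.copysign(deflection, tiller_angle_rad)
```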

[0025] When the beam pattern is turned, as shown in FIG. 4, obstacles 200 distributed on the surface of an airport 202 are detectable.

[0026] In this case, the obstacles 200 are detected as the adaptive beam pattern is directed on the basis of the aircraft's anticipated trajectory (turn), as determined by the processor(s) 36, 50 from speed, heading, and position information received from another vehicle/aircraft system, such as a global positioning system (GPS), an inertial navigation system (INS), or a comparable system, or by operator input. The detected obstacles can therefore be presented to the pilot on a display inside the cockpit; see the exemplary display image 120 in FIG. 2.

[0027] The beam coverage areas 124 show any obstacles that are sensed when the radar has been steered. In one embodiment, the beam coverage areas 124 are parallel with the ownship icon 126, and an indicator (e.g., text such as "Steering Left", or an arrow) is presented to show the operator the direction in which the sensor is being steered. In another embodiment, the beam coverage areas 124 are curved (not shown) based on the determined trajectory, radius of turn, or location of a tracked target.

[0028] The beam coverages shown are given as examples. Actual beam coverage may expand into more of a cone, permitting somewhat wider effective coverage at distance. Adaptive beam steering, coupled with an ability to selectively widen the field of view into one that spreads outward with distance, will provide nearly full coverage when properly combined with software masking of the field of view and with the mechanical steering.
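A minimal sketch of how such software masking might combine with a steered, selectable-width beam: a return counts only if it falls inside both the physical beam and a software-defined azimuth window (for example, the anticipated turn sector). The function and its interface are assumptions for illustration.

```python
# Illustrative combination of a steered physical beam with a software
# azimuth mask, per [0028]; the interface is an assumption.
def in_masked_fov(target_az_rad: float,
                  boresight_az_rad: float,
                  beamwidth_rad: float,
                  mask_rad: tuple[float, float]) -> bool:
    """True when a return lies inside the (possibly widened) physical beam
    and inside the software region of interest."""
    inside_beam = abs(target_az_rad - boresight_az_rad) <= beamwidth_rad / 2.0
    lo, hi = mask_rad  # e.g., the sector swept by the anticipated turn
    return inside_beam and lo <= target_az_rad <= hi
```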

[0029] As shown in FIG. 5, the straight, forward-directed beams of the radar sensors installed in the wingtips of the aircraft are unable to detect these obstacles 200.

[0030] While the preferred embodiment of the invention has been illustrated and described, as noted above, many changes can be made without departing from the spirit and scope of the invention. Accordingly, the scope of the invention is not limited by the disclosure of the preferred embodiment. Instead, the invention should be determined entirely by reference to the claims that follow.

* * * * *

