Head-up display and vehicle using the same

Graf, et al. August 2, 2016

Patent Grant 9405120

U.S. patent number 9,405,120 [Application Number 14/548,281] was granted by the patent office on 2016-08-02 for head-up display and vehicle using the same. This patent grant is currently assigned to MAGNA ELECTRONICS SOLUTIONS GMBH. The grantee listed for this patent is MAGNA ELECTRONICS SOLUTIONS GMBH. Invention is credited to Matthias Doepp, Stefan Graf.


United States Patent 9,405,120
Graf, et al. August 2, 2016

Head-up display and vehicle using the same

Abstract

A head-up display (HUD) comprises a detection unit, a picture generating unit and a control unit. The detection unit detects a position of an object and generates a detection signal indicating the position of the object. The picture generating unit comprises a screen and an optical unit. The screen displays a visual image. The optical unit projects the visual image onto the screen. The control unit is coupled to the detection unit and the picture generating unit, and controls the screen to face toward the object in response to the detection signal.


Inventors: Graf; Stefan (Taipei, TW), Doepp; Matthias (Taipei, TW)
Applicant:
Name: MAGNA ELECTRONICS SOLUTIONS GMBH
City: Wetzlar
State: N/A
Country: DE
Assignee: MAGNA ELECTRONICS SOLUTIONS GMBH (Wetzlar, DE)
Family ID: 55961525
Appl. No.: 14/548,281
Filed: November 19, 2014

Prior Publication Data

Document Identifier Publication Date
US 20160139409 A1 May 19, 2016

Current U.S. Class: 1/1
Current CPC Class: G02B 27/0101 (20130101); G02B 27/0149 (20130101); G02B 2027/0154 (20130101); G02B 27/0093 (20130101); G02B 2027/0187 (20130101); G02B 2027/0161 (20130101)
Current International Class: G02B 27/14 (20060101); G02B 27/01 (20060101)
Field of Search: 359/630-639


Primary Examiner: Hasan; Mohammed
Attorney, Agent or Firm: Gardner, Linn, Burkhart & Flory, LLP

Claims



What is claimed is:

1. A head-up display (HUD), comprising: a detection unit for detecting a position of an object and generating a detection signal that indicates the position of the object, the object comprising at least a portion of a head of a person within a vehicle; a picture generating unit comprising: a screen configured to be spaced from a vehicle windshield for displaying a visual image so that the displayed visual image on the screen is viewable by the person within the vehicle; and an optical unit coupled to the screen, the optical unit projecting the visual image onto the screen; and a control unit coupled to the detection unit and the picture generating unit, the control unit, responsive to the detection signal, controlling a position of the screen relative to the vehicle windshield to adjust the screen so that the screen is facing generally toward the object to enhance viewing of the displayed visual image at the screen by the person within the vehicle.

2. The HUD of claim 1, wherein the control unit performs face recognition to determine the position of the object relative to the HUD.

3. The HUD of claim 1, wherein the optical unit comprises: a projector for projecting the visual image; and a reflector for guiding the visual image from the projector to the screen.

4. The HUD according to claim 3, wherein the control unit rotates the picture generating unit around a fulcrum, and the fulcrum is located along a line between the center of the screen and the center of the visual image.

5. The HUD of claim 1, wherein the control unit rotates the body of the HUD to control the screen facing towards the object.

6. The HUD of claim 1, wherein the coverage of a detection area of the detection unit covers a driver's position.

7. The HUD of claim 1, wherein the detection unit is installed in a base of the HUD.

8. The HUD of claim 1, wherein the detection unit is installed in a border of the screen.

9. A vehicle applying a head-up display (HUD), wherein the HUD comprises: a detection unit for detecting a position of an object and generating a detection signal that indicates the position of the object, the object comprising at least a portion of a head of a person within the vehicle; a picture generating unit comprising: a screen spaced from a windshield of the vehicle for displaying a visual image so that the displayed visual image on the screen is viewable by the person within the vehicle; and an optical unit coupled to the screen, the optical unit projecting the visual image onto the screen; and a control unit coupled to the detection unit and the picture generating unit, the control unit, responsive to the detection signal, controlling a position of the screen relative to the vehicle windshield to adjust the screen facing generally toward the object to enhance viewing of the displayed visual image at the screen by the person within the vehicle.

10. The vehicle of claim 9, wherein the control unit performs face recognition to determine the position of the object relative to the HUD.

11. The vehicle of claim 9, wherein the optical unit comprises: a projector for projecting the visual image; and a reflector for guiding the visual image from the projector to the screen.

12. The vehicle according to claim 11, wherein the control unit rotates the picture generating unit around a fulcrum, and the fulcrum is located along a line between the center of the screen and the center of the visual image.

13. The vehicle of claim 9, wherein the control unit rotates the body of the HUD to control the screen facing towards the object.

14. The vehicle of claim 9, wherein the coverage of a detection area of the detection unit covers a driver's position.

15. The vehicle of claim 9, wherein the detection unit is installed in a base of the HUD.

16. The vehicle of claim 9, wherein the detection unit is installed in a border of the screen.
Description



TECHNICAL FIELD

The disclosure relates in general to an optical display device and a vehicle using the same, and in particular to a head-up display (HUD) and a vehicle using the same.

BACKGROUND

Today, head-up displays (HUDs) are commonly used in vehicles. Current HUDs can be categorized into two groups: windscreen HUDs and combiner HUDs. The former use the vehicle's windscreen as a projection surface, while the latter use an additional combiner screen as the projection surface. Combiner HUDs are superior to windscreen HUDs in terms of overall cost, and in addition they can easily be sold as aftermarket products in a one-box design.

However, combiner HUDs have the particular drawback of a smaller projection surface. This may lead to smaller text and symbol sizes, or to less information being displayed on the combiner screen. Furthermore, the display of a combiner HUD usually cannot use its total surface for the projection, because it must compensate for movements of the driver's head.

SUMMARY

The disclosure is directed to a head-up display (HUD) and a vehicle using the same, which are capable of increasing the usable area of the projection screen of the HUD.

According to one embodiment, a HUD is provided. The HUD comprises a detection unit, a picture generating unit and a control unit. The detection unit detects a position of an object and generates a detection signal indicating the position of the object. The picture generating unit comprises a screen and an optical unit. The screen displays a visual image. The optical unit projects the visual image onto the screen. The control unit is coupled to the detection unit and the picture generating unit, and controls the screen to face toward the object in response to the detection signal.

According to another embodiment, a vehicle applying a HUD is provided. The HUD comprises a detection unit, a picture generating unit and a control unit. The detection unit detects a position of an object and generates a detection signal indicating the position of the object. The picture generating unit comprises a screen and an optical unit. The screen displays a visual image. The optical unit projects the visual image onto the screen. The control unit is coupled to the detection unit and the picture generating unit, and controls the screen to face toward the object in response to the detection signal.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 shows a block diagram of a head-up display (HUD) according to an embodiment of the present disclosure.

FIG. 2A shows a top view of a picture generating unit of a HUD according to an embodiment of the present disclosure.

FIG. 2B shows a schematic diagram of the configuration of the picture generating unit according to an embodiment of the present disclosure.

FIG. 2C shows a potential location for disposing the fulcrum of the HUD.

FIG. 3 shows a schematic diagram of a HUD according to an embodiment of the present disclosure.

FIG. 4 shows a schematic diagram of a HUD according to an embodiment of the present disclosure.

FIG. 5 shows a schematic diagram of a vehicle installing a HUD.

FIG. 6A shows an exemplary picture displayed on a screen of a HUD, viewed from a driver's position, before the screen is adjusted to face towards the driver.

FIG. 6B shows an exemplary picture displayed on a screen of a HUD, viewed from a driver's position, after the screen is adjusted to face towards the driver.

In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the disclosed embodiments. It will be apparent, however, that one or more embodiments may be practiced without these specific details. In other instances, well-known structures and devices are schematically shown in order to simplify the drawing.

DETAILED DESCRIPTION

Below, exemplary embodiments will be described in detail with reference to accompanying drawings so as to be easily realized by a person having ordinary knowledge in the art. The inventive concept may be embodied in various forms without being limited to the exemplary embodiments set forth herein. Descriptions of well-known parts are omitted for clarity, and like reference numerals refer to like elements throughout.

FIG. 1 shows a block diagram of a head-up display (HUD) 100 according to an embodiment of the present disclosure. The HUD 100 comprises a detection unit 102, a picture generating unit 104 and a control unit 106.

The detection unit 102 is configured to detect an object and generate a detection signal which indicates the position of the object relative to the HUD 100. For example, the detection unit 102 may detect the driver's head/face in the vehicle from the perspective of the HUD 100 and generate a detection signal which indicates the position of the driver relative to the HUD 100 accordingly. The detection unit 102 can be realized by a camera, an ultrasonic sensor, a radar, an infra-red (IR) based sensor or any other sensor that can perform such detection. The detection signal may be, for example, a digital representation of the detected object. The position of the driver's head/face relative to the HUD 100 can be identified and processed by the control unit 106 based on the detection signal. Details of the operation of the control unit 106 are further exemplified and described in the following detailed description.
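
As a concrete illustration of such a "digital representation," the detection signal could be modeled as a small record carrying the estimated head position. The following is a minimal Python sketch; the field names, coordinate frame and units are assumptions, since the disclosure leaves the signal format open:

    from dataclasses import dataclass

    @dataclass
    class DetectionSignal:
        # Estimated head position relative to the HUD in an assumed
        # HUD-centred frame (x: lateral, y: vertical, z: toward the cabin).
        x_mm: float
        y_mm: float
        z_mm: float
        confidence: float  # detector confidence in [0, 1]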

The picture generating unit 104 comprises a screen 108 and an optical unit 110. The screen 108 is configured to display a visual image. The optical unit 110 is coupled to the screen 108 and configured to project the visual image onto the screen 108. In one embodiment, the visual image may include driving-relevant information such as a navigation map, the vehicle speed, the amount of remaining oil, an overspeed warning, etc. Thus, the driver can keep his/her eyes on the road while checking the driving-relevant information provided by the HUD 100, thereby enhancing safety and security.

The control unit 106 is coupled to the detection unit 102 and the picture generating unit 104, and controls the screen 108 to face toward the object in response to the detection signal. In one embodiment, the control unit 106 is configured to control an angle of inclination and/or rotation of the picture generating unit 104 in response to the detection signal. For example, in response to the detection signal generated by the detection unit 102, the control unit 106 may control the angle of inclination and/or rotation of the picture generating unit 104 to adjust the screen 108 to face towards the object.

The angle of inclination and/or rotation of the picture generating unit 104 controlled by the control unit 106 can be determined from the position of the object. For example, after the detection unit 102 detects an object (e.g., the driver's head/face), the control unit 106 may determine the position of the object relative to the HUD 100 by, for example, performing face recognition, and then control the angle of inclination and/or rotation of the picture generating unit 104 to adjust the screen 108 to face towards that position. Accordingly, even if the driver changes his/her position while driving, the movement of the driver can be compensated for by adjusting the screen 108 to face towards the driver's head/face, and thus the usable area of the screen 108 of the HUD 100 can be increased.
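
The angle determination described above amounts to pointing the screen normal at the detected head position. A minimal sketch of that geometry, assuming a HUD-centred Cartesian frame (the disclosure does not specify a coordinate convention):

    import math

    def screen_angles(x_mm, y_mm, z_mm):
        """Return (yaw, pitch) in degrees that orient the screen toward a
        head detected at (x, y, z) in an assumed HUD-centred frame
        (x: right, y: up, z: toward the cabin). Illustrative geometry only."""
        yaw = math.degrees(math.atan2(x_mm, z_mm))
        pitch = math.degrees(math.atan2(y_mm, math.hypot(x_mm, z_mm)))
        return yaw, pitch

For example, a head offset 100 mm to the right and 50 mm up at a depth of 600 mm yields a yaw of about 9.5 degrees and a pitch of about 4.7 degrees.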

Referring to FIG. 2A and FIG. 2B, FIG. 2A shows an exemplary top view of the HUD 100 according to an embodiment of the present disclosure, and FIG. 2B shows a schematic diagram of the configuration of the picture generating unit 104 of the HUD 100. In the embodiment, the optical unit 110 of the HUD 100 comprises a projector 2102 and a reflector 2104.

The projector 2102 is configured to project the visual image. In one embodiment, the projector 2102 can be a TFT display or another kind of picture-projecting source. The reflector 2104 is configured to guide the visual image from the projector 2102 to the screen 108, so that the visual image can be projected onto the screen 108. In one embodiment, the reflector 2104 can be a mirror.

As shown in FIG. 2B, light projected by the projector 2102 is guided to the screen 108 through the reflector 2104 so that the driver can see the virtual image VI through the screen 108. In one embodiment, the distance (DL) between the driver and the virtual image VI is 190 mm. In one embodiment, the virtual image VI measures 130 mm × 40 mm.
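
For a sense of scale, the apparent angular size of the virtual image follows from the figures just given. The quick check below assumes DL is the viewing distance to the virtual image, as stated above:

    import math

    DL = 190.0                   # stated driver-to-virtual-image distance, mm
    width, height = 130.0, 40.0  # stated virtual image dimensions, mm

    ang_w = 2 * math.degrees(math.atan(width / 2 / DL))   # ~37.8 degrees
    ang_h = 2 * math.degrees(math.atan(height / 2 / DL))  # ~12.0 degrees
    print(f"apparent size: {ang_w:.1f} x {ang_h:.1f} degrees")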

Referring to FIG. 2A again, in one embodiment, the picture generating unit 104 can be controlled by the control unit 106 to rotate around a vertical axis. As shown in FIG. 2A, the HUD 100 further comprises a fulcrum (FP) used for rotating the picture generating unit 104 horizontally.

FIG. 2C shows a potential location for disposing the fulcrum (FP) of the HUD 100. In FIG. 2C, the box EA represents the position of the driver's eyes. The driver may see the virtual image VI through the screen 108. In the embodiment, the fulcrum (FP) is located along a line between the center of the screen 108 and the center of the virtual image VI. However, the present disclosure is not limited thereto. The fulcrum (FP) can also be disposed at other positions of the HUD 100 according to various designs.
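
Since the fulcrum (FP) lies on the segment joining the screen center and the virtual-image center, candidate placements can be expressed as a simple linear interpolation between those two points. A sketch with hypothetical 2-D (x, z) coordinates:

    def fulcrum_candidates(screen_center, image_center, steps=5):
        """Sample candidate fulcrum positions on the segment joining the
        screen centre and the virtual-image centre (points as (x, z)
        tuples). Illustrative only; the text notes other placements are
        also possible."""
        (sx, sz), (ix, iz) = screen_center, image_center
        return [(sx + t * (ix - sx), sz + t * (iz - sz))
                for t in (i / (steps - 1) for i in range(steps))]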

FIG. 3 shows a schematic diagram of a HUD 300 according to an embodiment of the present disclosure. In the example of FIG. 3, the picture generating unit 304 is a movable part of the HUD 300 and can be rotated in at least two directions (a vertical direction and a horizontal direction). For example, in response to a detection signal generated by the detection unit 302, the control unit (not shown in FIG. 3) of the HUD 300 may control an angle of inclination and/or rotation of the picture generating unit 304 to adjust the screen 306 of the HUD 300 to face towards the object 31 (e.g., the driver's head/face).

FIG. 4 shows a schematic diagram of a HUD 400 according to an embodiment of the present disclosure. In the example of FIG. 4, the detection unit 402 captures a picture of an object 41 (e.g., the driver's head/face) and generates a detection signal accordingly. In response to the detection signal of the detection unit 402, the control unit (not shown in FIG. 4) of the HUD 400 may rotate the whole body of the HUD 400 so as to keep the screen 406 of the HUD 400 facing towards the object 41. In other words, compared to the embodiment shown in FIG. 3, the HUD 400 of the embodiment shown in FIG. 4 can be rotated horizontally in response to the detection signal, so that the screen 406 of the HUD 400 can remain facing towards the object 41 even when the object 41 changes position.
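
One plausible way to keep the whole HUD body pointed at a moving object is a small proportional update toward the detected bearing on each control cycle. The sketch below uses made-up gain and rate-limit values for illustration; the disclosure does not describe a particular control law:

    def track_step(current_yaw_deg, target_yaw_deg, gain=0.5, max_step_deg=2.0):
        """One control update rotating the HUD body toward the object's
        bearing. Gain and step limit are illustrative tuning values."""
        error = target_yaw_deg - current_yaw_deg
        step = max(-max_step_deg, min(max_step_deg, gain * error))
        return current_yaw_deg + step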

FIG. 5 shows a schematic diagram of a vehicle 51 applying a HUD 500. In the example of FIG. 5, the detection area (DA) of the detection unit 502 covers the driver's position (DP). The detection unit 502 may detect the driver's head/face and then generate a detection signal. In response to the detection signal generated by the detection unit 502, the control unit (not shown in FIG. 5) of the HUD 500 may determine the driver's position (DP) relative to the HUD 500 by, for example, performing face recognition.
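
The face-recognition step is left open by the disclosure; as one possibility, an off-the-shelf detector such as OpenCV's bundled Haar cascade could locate the driver's face in the detection unit's camera frame. A minimal sketch under that assumption:

    import cv2

    def detect_driver_face(frame):
        """Return (x, y, w, h) face boxes found in a BGR camera frame using
        OpenCV's frontal-face Haar cascade. The detector choice is an
        assumption; the text only says face recognition may be performed."""
        cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)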

The detection unit 502 of the HUD 500 can be located at any position inside the vehicle 51 from which it can detect the position of the driver's eyes. In one embodiment, the detection unit 502 is installed in a position that allows the detection area (DA) to cover the driver's position (DP) in the vehicle 51. As shown in FIG. 5, the detection unit 502 is installed in the base (BS) of the HUD 500 that faces towards the driver. In another example, the detection unit 502 can be installed at the border (BD) of the screen 504 of the HUD 500, where present, as long as the detection unit 502 can detect the position of the driver's eyes.

FIG. 6A shows an exemplary picture displayed on a screen 604 of a HUD 600 as viewed from a driver's position (DP'). In the example of FIG. 6A, some navigation information cannot be seen by the driver when the screen 604 does not face towards the driver, because the viewable area of the screen 604 is limited by the driver's viewing angle.

When the HUD 600 finds that the screen 604 does not face towards the driver, the HUD 600 may perform the above-mentioned adjustment operations to adjust the screen 604 to face towards the driver. After the screen 604 is adjusted to face towards the driver, the viewable area of the screen 604 is maximized and the loss of navigation information is avoided, as shown in FIG. 6B.

According to the above, the HUD of the present disclosure may control its screen to face towards the driver. Accordingly, the part of the screen area reserved for compensating for the movement of the driver can be reduced or omitted, and thus the usable area of the screen of the HUD can be increased.

It will be apparent to those skilled in the art that various modifications and variations can be made to the disclosed embodiments. It is intended that the specification and examples be considered as exemplary only, with a true scope of the disclosure being indicated by the following claims and their equivalents.

* * * * *

