Method, apparatus and system providing an integrated hyperspectral imager

Smith; Scott

Patent Application Summary

U.S. patent application number 11/642867 was filed with the patent office on 2006-12-21 and published on 2007-09-06 as publication number 20070206242 for a method, apparatus and system providing an integrated hyperspectral imager. This patent application is currently assigned to Micron Technology, Inc. The invention is credited to Scott Smith.

Publication Number: 20070206242
Application Number: 11/642867
Family ID: 38193548
Published: 2007-09-06
Filed: 2006-12-21

United States Patent Application 20070206242
Kind Code A1
Smith; Scott September 6, 2007

Method, apparatus and system providing an integrated hyperspectral imager

Abstract

Methods, apparatuses, and systems are disclosed which provide a plurality of pixel arrays on a common substrate each associated with a hyperspectral filter. Images from each of the arrays may be separately analyzed or combined into a composite image.


Inventors: Smith; Scott; (Los Angeles, CA)
Correspondence Address:
    DICKSTEIN SHAPIRO LLP
    1825 EYE STREET, NW
    WASHINGTON
    DC
    20006
    US
Assignee: Micron Technology, Inc.

Family ID: 38193548
Appl. No.: 11/642867
Filed: December 21, 2006

Related U.S. Patent Documents

Application Number    Filing Date
11/367,580            Mar 6, 2006
11/642,867            Dec 21, 2006

Current U.S. Class: 358/505; 348/E3.032; 348/E9.007
Current CPC Class: H04N 5/3415 20130101; H04N 9/093 20130101; H01L 27/14621 20130101; H04N 3/1593 20130101; H01L 27/14623 20130101
Class at Publication: 358/505
International Class: H04N 1/46 20060101 H04N001/46

Claims



1. An imager apparatus comprising: a plurality of pixel arrays on a single die; and a plurality of imaging lenses for focusing an image on the pixel arrays, each of the pixel arrays being associated with a respective hyperspectral bandpass filter.

2. The imager apparatus of claim 1, wherein each of the hyperspectral filters has a spectral bandpass of less than 100 nm.

3. The imager apparatus of claim 2, wherein each of the hyperspectral filters has a spectral bandpass of between 50 nm and 100 nm.

4. The imager apparatus of claim 2, wherein each of the hyperspectral filters has a spectral bandpass of less than 10 nm.

5. The imager apparatus of claim 4, wherein each of the hyperspectral filters has a spectral bandpass of between 5 nm and 10 nm.

6. The imager apparatus of claim 1, wherein each hyperspectral bandpass filter is integrated with its associated imaging lens.

7. The imager apparatus of claim 6, wherein each hyperspectral bandpass filter is a thin film coating on its associated lens.

8. The imager apparatus of claim 1, wherein each hyperspectral bandpass filter is a separate element from its associated imaging lens.

9. The imager apparatus of claim 1, wherein each hyperspectral bandpass filter is a thin film coating on a surface of its associated pixel array.

10. The imager apparatus of claim 1, wherein each hyperspectral bandpass filter has a unique bandpass range.

11. The imager apparatus of claim 1, wherein the hyperspectral bandpass filters collectively cover a spectrum of between about 700 nm and about 1000 nm.

12. The imager apparatus of claim 1, wherein the hyperspectral bandpass filters collectively cover a spectrum of between about 200 nm and about 400 nm.

13. The imager apparatus of claim 1, wherein the plurality of pixel arrays comprise M×N pixel arrays arranged in an M×N pattern of arrays on the die.

14. The imager apparatus of claim 13, wherein M equals N.

15. The imager apparatus of claim 13, wherein M does not equal N.

16. The imager apparatus of claim 1, wherein the pixel arrays are spaced less than 2 mm apart.

17. The imager apparatus of claim 16, wherein the pixel arrays are spaced about 0.5 mm to about 1 mm apart.

18. The imager apparatus of claim 16, wherein the imaging lenses and associated hyperspectral bandpass filters are configured to capture a full image of a scene.

19. The imager apparatus of claim 1, further comprising an optical barrier between adjacent pixel arrays.

20. The imager apparatus of claim 1, further comprising an optical barrier between adjacent imaging lenses.

21. An imager apparatus comprising: a plurality of pixel arrays formed on a single die, wherein the plurality of arrays are spaced apart by less than 2 millimeters; a plurality of hyperspectral bandpass filters respectively associated with the pixel arrays, each of the hyperspectral filters having a spectral bandpass of less than 100 nm; and a plurality of imaging lenses respectively associated with the pixel arrays.

22. The imager apparatus of claim 21, wherein each of the hyperspectral filters has a spectral bandpass of between 50 nm and 100 nm.

23. The imager apparatus of claim 21, wherein each of the hyperspectral filters has a spectral bandpass of less than 10 nm.

24. The imager apparatus of claim 23, wherein each of the hyperspectral filters has a spectral bandpass of between 5 nm and 10 nm.

25. The imager apparatus of claim 21, wherein each imaging lens has a hyperspectral bandpass filter associated with its respective imaging lens.

26. The imager apparatus of claim 25, wherein each hyperspectral bandpass filter is integrated with its associated imaging lens.

27. The imager apparatus of claim 26, wherein each hyperspectral bandpass filter is a thin film coating on its associated lens.

28. The imager apparatus of claim 25, wherein each hyperspectral bandpass filter is a separate element from its associated imaging lens.

29. The imager apparatus of claim 21, wherein each hyperspectral bandpass filter is a thin film coating on a surface of its associated pixel array.

30. The imager apparatus of claim 21, wherein each hyperspectral bandpass filter has a unique bandpass range.

31. The imager apparatus of claim 21, wherein the plurality of pixel arrays comprise M×N pixel arrays arranged in an M×N pattern of arrays on the die.

32. The imager apparatus of claim 31, wherein M equals N.

33. The imager apparatus of claim 31, wherein M does not equal N.

34. The imager apparatus of claim 21, wherein the pixel arrays and the imaging lenses are spaced about 0.5 mm to about 1 mm apart.

35. The imager apparatus of claim 34, wherein the imaging lenses and associated hyperspectral bandpass filters are configured to capture a full image of a scene.

36. An imager apparatus comprising: an image sensor comprising: a plurality of pixel arrays; a plurality of imaging lenses respectively arranged above the pixel arrays; and a hyperspectral bandpass filter associated with each imaging lens, wherein each hyperspectral bandpass filter is unique for each imaging lens; and a readout circuit for reading out respective image signals from each of the arrays.

37. The imager apparatus of claim 36, further comprising: an image combining device for combining the respective image signals from each of the pixel arrays.

38. The imager apparatus of claim 37, wherein the image combining device performs a parallax adjustment calculation.

39. The imager apparatus of claim 37, wherein a common readout circuit is provided for all of the pixel arrays.

40. The imager apparatus of claim 37, wherein separate readout circuits are provided for each of the pixel arrays.

41. The imager apparatus of claim 36, wherein the plurality of pixel arrays are formed on a single die.

42. The imager apparatus of claim 36, wherein each hyperspectral bandpass filter is integrated with its associated imaging lens.

43. The imager apparatus of claim 42, wherein each hyperspectral bandpass filter is a thin film coating on its associated lens.

44. The imager apparatus of claim 36, wherein each hyperspectral bandpass filter is a separate element from its associated imaging lens.

45. The imager apparatus of claim 36, wherein each hyperspectral bandpass filter is a thin film coating on a surface of its associated pixel array.

46. The imager apparatus of claim 36, wherein the pixel arrays are spaced less than 2 mm apart.

47. The imager apparatus of claim 46, wherein the pixel arrays are spaced about 0.5 mm to about 1 mm apart.

48. The imager apparatus of claim 36, wherein the imager apparatus is a component of a digital camera.
Description



CROSS REFERENCE TO RELATED APPLICATION

[0001] This application is a continuation-in-part of U.S. application Ser. No. 11/367,580 filed Mar. 6, 2006, the disclosure of which is incorporated by reference herein.

FIELD OF THE INVENTION

[0002] Embodiments of the invention relate generally to the field of semiconductor devices and more particularly to multi-array image sensors.

BACKGROUND OF THE INVENTION

[0003] Image sensors, such as multispectral image sensors, generally produce images with a few relatively broad wavelength bands spanning wavelengths from about 400 nm to about 700 nm. These bands typically correlate to the red, green and blue (RGB) color filters of a Bayer patterned color filter array (described below) used in the image sensor. Hyperspectral image sensors, on the other hand, simultaneously collect image data in dozens or hundreds of narrow, adjacent hyperspectral bands. Hyperspectral sensors create a larger number of images from contiguous, rather than disjoint, regions of the spectrum, typically with much finer resolution than can be obtained with a multispectral image sensor. Hyperspectral imaging thus involves the acquisition of image data in many contiguous narrow hyperspectral bands, the goal being to produce laboratory-quality reflectance spectra for each pixel in an image.

[0004] The semiconductor industry currently produces different types of semiconductor-based image sensors, such as charge coupled devices (CCDs), CMOS active pixel sensors (APS), photodiode arrays, charge injection devices and hybrid focal plane arrays, among others. These image sensors use imaging lenses to focus electromagnetic radiation onto photo-conversion devices, e.g., photodiodes. Also, these image sensors can use color filters to pass particular wavelengths of electromagnetic radiation to the photo-conversion devices, such that the photo-conversion devices typically are associated with a particular color.

[0005] FIGS. 1A and 1B respectively show a top view and a simplified cross sectional view of a portion of a conventional color image sensor for the visual light spectrum using a Bayer patterned color filter pixel cell array 100. The array 100 includes pixel cells 10, each being formed on a substrate 1. The pixel array 100 is covered by a color filter array 30. The color filter array 30 includes color filters 31r, 31g, 31b, each disposed over a respective pixel cell 10. Each of the filters 31r, 31g, 31b allows only particular wavelengths of light to pass through to a respective photo-conversion device. Typically, the color filter array 30 is arranged in a repeating Bayer pattern that includes two green color filters 31g for every red color filter 31r and blue color filter 31b, arranged as shown in FIG. 1A. As shown in FIG. 1B, each pixel cell 10 includes a photo-conversion device 12r, 12g, for example, a photodiode, having an associated charge collecting well 13r, 13g. The elements 12r, 12g, 13r, 13g are associated with the red and green colors being passed to the pixels of one row of the color filter array 30; however, it should be appreciated that there may also be a photo-conversion device 12 and a charge collecting well 13 associated with the color blue that are not shown in the cross sectional view of FIG. 1B. The illustrated array 100 has imaging lenses 20 that collect and focus light onto respective photo-conversion devices 12r, 12g, which in turn convert the focused light into electrons that are stored in the respective charge collecting wells 13r, 13g.

[0006] Between the color filter array 30 and the pixel cells 10 is a passivation layer 6 which typically covers the gate structure of transistors of the pixels and an overlying interlayer dielectric (ILD) region 3. The ILD region 3 typically includes multiple layers of interlayer dielectrics and conductors that form connections between devices of the pixel cells 10 and from the pixel cells 10 to circuitry 150 peripheral to the array 100. A dielectric layer 5 may also be provided between the color filter array 30 and imaging lenses 20.

[0007] As discussed above, a hyperspectral image sensor or camera system relies on many narrow hyperspectral bandpass filters to capture the hyperspectral image content of a scene. The hyperspectral bandpass filters may be applied to an imaging system of the type illustrated in FIG. 1B. One major disadvantage of this approach is that, because of the large number of narrow band filters required, the hyperspectral image sensor becomes very large and expensive. Further, because the image sensor relies on applying separate hyperspectral bandpass filters to a lens system over one or more pixels in each pixel array, the filters do not capture the full image content and may produce poor quality images.

[0008] Therefore, it would be advantageous to have an integrated hyperspectral image sensor that is compact and better captures the hyperspectral image content of a scene.

BRIEF DESCRIPTION OF THE DRAWINGS

[0009] FIG. 1A is a top plan view of a portion of a conventional Bayer pattern color image sensor.

[0010] FIG. 1B is a cross sectional view of a row portion of a conventional Bayer pattern color image sensor.

[0011] FIG. 2A is a plan view of a multi-array image sensor in accordance with an embodiment described herein.

[0012] FIG. 2B is a cross sectional view along a line A-A of the FIG. 2A image sensor in accordance with an embodiment described herein.

[0013] FIG. 3 is a graph showing the measurements of hyperspectral imaging in accordance with an embodiment described herein.

[0014] FIG. 4 is a graph showing the measurements of a continuous spectrum for each hyperspectral image from each pixel in accordance with an embodiment described herein.

[0015] FIG. 5 is a cross sectional view of a portion of an image sensor in accordance with an embodiment described herein.

[0016] FIG. 6 illustrates a block diagram of a CMOS image sensor constructed in accordance with an embodiment described herein.

[0017] FIG. 7 depicts a processor system constructed in accordance with an embodiment described herein.

DETAILED DESCRIPTION OF THE INVENTION

[0018] In the following detailed description, reference is made to the accompanying drawings, which form a part hereof and in which are shown, by way of illustration, specific embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those of ordinary skill in the art to make and use them, and it is to be understood that structural, logical or procedural changes may be made to the specific embodiments disclosed herein.

[0019] The embodiments described herein relate to methods, apparatuses, and systems for integrating a plurality of pixel arrays onto a single substrate, each array having an associated hyperspectral bandpass filter and imaging lens. Each imaging array, with its associated lens and bandpass filter, forms an image of objects in a scene. If the image sensor arrays and lenses are in close proximity, on the order of less than 2 mm apart, e.g., about 0.5 mm to about 1 mm, then object parallax will be quite small and each array will image the same scene for objects that are farther than 1 m from the arrays. A hyperspectral image may be constructed from the individual images respectively acquired by the pixel arrays such that each pixel in the image contains a hyperspectral representation of the reflectance of that point in the scene.

[0020] FIG. 2A is a plan view of an embodiment of a hyperspectral image sensor 200 on a semiconductor, e.g., silicon, substrate 201 of a single die. In the illustrated embodiment, image sensor 200 is formed as a 4×4 configuration of pixel arrays 203, for a total of 16 pixel arrays 203, each having an associated lens element 204. The imaging lenses 204 are formed by molding or by etching a lens material layer. A hyperspectral bandpass filter 205 (FIG. 2B) may be formed over each optical imaging lens 204. The hyperspectral bandpass filters 205 may be formed separate from the lenses 204, as illustrated in FIG. 2B, and may be located above or below the lenses 204; alternatively, they may be formed as a coating on the lenses 204, or as part of the lenses 204 by using a colored lens material. Each hyperspectral bandpass filter 205 associated with an imaging lens 204 has a different narrow hyperspectral bandpass characteristic. Thus, in the embodiment illustrated in FIGS. 2A-2B, there are sixteen different narrow hyperspectral bandpass filters 205, each passing a narrow wavelength range of light to a respective pixel array 203 below. The hyperspectral bandpass filters 205 may pass bandwidths narrower than those used simply to separate red, green and blue wavelengths, and may have pass bands, and thus provide a spectral resolution, of less than 100 nm per band, e.g., 50-100 nm per band; the covered spectrum could include infrared bands from about 700 nm to about 1000 nm and/or UV bands from about 200 nm to about 400 nm. The filters 205 may also have bandwidths as narrow as less than 10 nm per band, for example, 5-10 nm per band.
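For illustration only, a filter bank of this kind might be specified as a list of contiguous pass bands. The following minimal Python sketch assumes a 4×4 sensor with sixteen contiguous 20 nm bands starting at 400 nm; the band start, width, and count are hypothetical design choices consistent with the ranges above, not values fixed by this application.

    # Hypothetical 4x4 hyperspectral filter bank: sixteen contiguous
    # 20 nm pass bands covering 400-720 nm. Start, width and count are
    # illustrative assumptions, not values from the application.
    NUM_ARRAYS = 16        # 4x4 arrangement of pixel arrays 203
    BAND_START_NM = 400.0  # assumed lower edge of the covered spectrum
    BAND_WIDTH_NM = 20.0   # assumed pass band width (text allows 5-100 nm)

    filter_bank = [
        (BAND_START_NM + i * BAND_WIDTH_NM,        # lower band edge
         BAND_START_NM + (i + 1) * BAND_WIDTH_NM)  # upper band edge
        for i in range(NUM_ARRAYS)
    ]
    # filter_bank[0] == (400.0, 420.0); filter_bank[15] == (700.0, 720.0)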

[0021] Referring to FIG. 2B, image sensor 200 includes a substrate 201 on which a plurality of pixel arrays 203 and associated support circuitry (not shown) for each array 203 are fabricated. The illustrated portion shows one pixel array 203 whose corresponding filter allows only a limited, narrow spectrum of wavelengths to pass through. For simplicity, the embodiment shown in FIG. 2B shows pixel arrays 203 that are four pixels wide, but it should be appreciated that each of the pixel arrays 203 may contain hundreds, thousands or millions of pixels, as desired. Opaque walls 260 separate the individual arrays 203, and opaque walls 270 separate the imaging lenses 204 arranged above each respective pixel array 203. Opaque walls may also optionally separate the narrow hyperspectral filters 205. The hyperspectral bandpass filters 205 are unique for each imaging lens 204, meaning that each imaging lens 204, and the pixel array 203 under it, has a different narrow band filter that allows only a limited spectrum of wavelengths to pass through. Thus, each pixel array 203 captures different narrow band hyperspectral information.

[0022] As noted, the pixel arrays 203 preferably are integrated on a single silicon die substrate with common circuitry. Including a multi-array color image sensor on a single die provides for a reduction of color crosstalk artifacts, especially for compact camera modules with pixel sizes less than 6 microns by 6 microns. Moreover, an imaging lens with a short focal length can minimize parallax effects and allow a camera module employing image sensor 200 to be more compact.

[0023] Although FIG. 2B illustrates a single lens 204 over each array 203, a modified embodiment provides an imaging lens 204 over one or more pixels in each array 203. For example, each individual lens 204 may cover and focus light on a sixteen-pixel section (in a 4×4 pattern) of the pixels in each array 203.

[0024] Because the image sensor 200 employs hyperspectral imaging, the image data is simultaneously collected in dozens or hundreds of narrow, adjacent hyperspectral bands, as illustrated in FIG. 3. For example, a 4×4 arrangement of pixel arrays (as illustrated in FIGS. 2A-2B) has at most 16 hyperspectral bands. For collection of hundreds of hyperspectral bands, the image sensor must have a hundred or more individual pixel arrays, e.g., a 10×10 arrangement of pixel arrays.

[0025] Each image I_1 ... I_n represents a hyperspectral image from a respective pixel array 203 (FIG. 2A). Taken together, the images I_1 ... I_n represent an image with all hyperspectral bands, as illustrated in FIG. 4. The individual images can then be combined such that each pixel in the combined image contains full hyperspectral information. In FIG. 3, β_1 ... β_n represent the bandpass optical filters for the respective images I_1 ... I_n of FIG. 4. Each imaging lens 204 with its unique hyperspectral bandpass filter will image the same scene if the image sensor arrays 203 are in close proximity to one another, e.g., on the order of less than 2 mm apart, with a preferred spacing in the range of about 0.5 mm to about 1 mm; for arrays 203 which have respective lenses 204, the lenses are similarly spaced. Thus, a hyperspectral or full spectral image may be constructed such that each pixel in the produced image contains a full spectral representation of the reflectance of that point in the scene. Further, because the image sensor arrays 203 and imaging lenses 204 are in such close proximity, object parallax is quite small.
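As an illustration of how the individual band images might be combined, the following minimal NumPy sketch stacks n single-band images into an H × W × n cube so that each pixel holds a full spectral signature. The image shapes, and the assumption that the band images are already parallax-registered, are hypothetical.

    import numpy as np

    def assemble_cube(images):
        """Stack n single-band images I_1 ... I_n (each H x W) into an
        H x W x n hyperspectral cube, so cube[y, x, :] is the spectral
        signature of scene point (y, x). Assumes the band images are
        already registered (parallax-corrected)."""
        return np.stack(images, axis=-1)

    # Example: sixteen 600 x 800 band images from a 4x4 sensor
    # (shapes are illustrative).
    bands = [np.zeros((600, 800), dtype=np.uint16) for _ in range(16)]
    cube = assemble_cube(bands)   # shape (600, 800, 16)
    spectrum = cube[300, 400, :]  # per-pixel hyperspectral samples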

[0026] FIG. 5 is a simplified cross sectional view of a portion of an image sensor 200 according to an embodiment of the invention. Sensor 200 includes a substrate 201 on which a plurality of pixel arrays 203 and associated support circuitry (not shown) for each array are fabricated. Each pixel array 203 is associated with a different portion of the wavelength spectrum. As discussed above, each imaging lens 204 has a unique hyperspectral bandpass filter 205, associated with a respective pixel array 203, that is designed to pass a limited spectrum of wavelengths. Each pixel array 203 then captures different hyperspectral information.

[0027] As an object to be imaged moves closer to the array of imaging lenses 204, the individual arrays 203 will exhibit an increase in parallax distance between them. The magnitude of the parallax distance between two adjacent arrays is approximated by the following formula:

P_x = (N / x) * d_x    (1)

where N is the pixel array dimension (in pixels), d_x is the distance between the outer focal points of lenses 204, and x is approximated by the following formula:

x = 2 * O * tan(α / 2)    (2)

where O is the object distance from the camera (that is, the distance between the object to be imaged and the imaging lenses 204) and α is the field of view (FOV) angle. Once the object distance O has been measured or approximated, the parallax distance calculation can be performed. An example of calculating parallax is discussed below. This example also shows that small pixel arrays (on the order of about 1 mm across and spaced less than 2 mm apart, e.g., from about 0.5 mm to about 1 mm) will not produce excessive parallax.

[0028] For an object distance O of about 2 m from the camera and a field of view angle of 60°, the parallax distance P_x at the object plane between adjacent pixel arrays 203 equals 0.78 pixels, using formulas (1) and (2) above. Thus, for an image sensor with 4×4 imaging arrays (i.e., a 4×4 image sensor), the maximum parallax (at 60° FOV) between images for objects at a distance of 2 m or more from the camera equals 2.35 pixels. This example assumes a 4×4 image sensor with 16 hyperspectral bands, a pixel element size of 2.2 μm × 2.2 μm, and a target hyperspectral image size of 800×600 pixels.
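The worked example above can be reproduced with a short Python sketch of formulas (1) and (2). The lens spacing d_x = 2.26 mm used below is an assumed value back-solved to match the quoted 0.78-pixel figure; it is not stated in the application.

    import math

    def parallax_px(N, d_x, O, fov_deg):
        """Parallax in pixels between adjacent arrays per formulas (1)
        and (2): P_x = (N / x) * d_x, with x = 2 * O * tan(alpha / 2)."""
        x = 2.0 * O * math.tan(math.radians(fov_deg) / 2.0)
        return (N / x) * d_x

    # N = 800 pixels (the 800x600 target image), O = 2 m, FOV = 60 deg.
    # d_x = 2.26 mm is an assumed spacing back-solved from the text's
    # 0.78-pixel result; the application does not state d_x directly.
    p = parallax_px(N=800, d_x=2.26e-3, O=2.0, fov_deg=60.0)
    print(round(p, 2))      # 0.78 pixels between adjacent arrays
    print(round(3 * p, 2))  # 2.35 pixels across the 4x4 sensor (3 gaps)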

[0029] Various image sensor configurations allow tradeoffs in performance. For example, a 3×3 image sensor will have less maximum parallax than a 4×4 image sensor, but will have only 9 hyperspectral bands. A larger FOV will have less parallax, but the user must get closer to objects they want to isolate in a scene. Further, smaller pixels may be used to reduce maximum parallax, with a tradeoff of a reduction in camera sensitivity.

[0030] FIG. 6 illustrates a block diagram of a CMOS imaging device 500, which employs a multi-array sensor 200 having a plurality of pixel arrays 203 constructed according to any embodiment described above. Multiple arrays 203 are arranged to form one large pixel array 505. The pixels of each row in array 505 are all turned on at the same time by a row select line, and the pixels of each column are selectively output onto the output lines by respective column select lines. A plurality of row and column select lines are provided for the entire array 505. The row lines are selectively activated in sequence by a row driver 510 in response to row address decoder 515. The column select lines are selectively activated in sequence for each row activation by a column driver 520 in response to column address decoder 525. Thus, a row and column address is provided for each pixel.

[0031] The CMOS image sensor 500 is operated by a control circuit 530, which controls address decoders 515, 525 for selecting the appropriate row and column select lines for pixel readout. Control circuit 530 also controls the row and column driver circuitry 510, 520 so that they apply driving voltages to the drive transistors of the selected row and column select lines. The pixel output signals typically include a pixel reset signal V_rst, read out of a pixel's storage region after it is reset by the reset transistor, and a pixel image signal V_sig, read out of the storage region after photo-generated charges are transferred to the region. The V_rst and V_sig signals are sampled by a sample and hold circuit 535 and subtracted by a differential amplifier 540 to produce a differential signal V_rst - V_sig for each readout pixel, which represents the amount of light impinging on the pixel. This difference signal is digitized by an analog-to-digital converter 545; in some arrangements the differential signal can instead be amplified as a differential signal and directly digitized by a differential analog-to-digital converter. The analog-to-digital converter 545 supplies the digitized pixel signals to an image processor 550, which performs appropriate image processing, which can include combining the outputs of the multiple arrays and performing a parallax adjustment calculation if needed or desired, before providing digital signals defining an image output. The digitizing and image processing circuitry can be located on or off the imaging device chip.
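The following minimal Python sketch illustrates the differential (correlated double sampling) readout just described. The reference voltage and bit depth of the toy ADC standing in for converter 545 are hypothetical.

    def cds(v_rst, v_sig):
        """Correlated double sampling: the difference V_rst - V_sig is
        proportional to the light that struck the pixel."""
        return v_rst - v_sig

    def digitize(diff, v_ref=1.0, bits=10):
        """Toy stand-in for analog-to-digital converter 545: map a
        0..v_ref differential voltage onto a bits-wide code. The
        reference voltage and bit depth are illustrative assumptions."""
        clamped = max(0.0, min(diff, v_ref))
        return int(clamped / v_ref * (2 ** bits - 1))

    # Example: a pixel resets to 1.00 V and integrates down to 0.62 V.
    code = digitize(cds(1.00, 0.62))  # 0.38 V difference
    print(code)                       # 388 on a 10-bit scale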

[0032] It should be noted that FIG. 6 represents one example of a readout circuit for multi-array 505. Another embodiment could employ the readout circuit in FIG. 6 for each individual array 203 with an image processor 550 combining the outputs of the individual arrays 203. Also, FIG. 6 illustrates a readout circuit suitable for CMOS arrays 203, but the invention may be used with other solid state imaging arrays, for example, CCD arrays in which case a readout circuit suitable for reading out CCD arrays could be employed.

[0033] FIG. 7 illustrates a processor system 900 including the image sensor 500 of FIG. 6. The processor system 900 is one example of a system having digital circuits that could include image sensor devices. Without being limiting, such a processor system could include a computer system, camera system, scanner, machine vision system, vehicle navigation system, video phone, surveillance system, auto focus system, star tracker system, motion detection system, image stabilization system, or other processor system.

[0034] The processor system 900, for example a digital camera system, generally comprises a central processing unit (CPU) 995, such as a microprocessor for common operational control, that communicates with an input/output (I/O) device 991 over a bus 993. Image sensor 500 also communicates with the CPU 995 over the bus 993. The processor-based system 900 also includes random access memory (RAM) 992, and can include removable memory 994, such as flash memory, which also communicate with CPU 995 over the bus 993. Image sensor 500 may be combined with a processor, such as a CPU, digital signal processor, or microprocessor, with or without memory storage on a single integrated circuit or on a different chip than the processor. A parallax adjustment calculation may be performed by the image processor 550 in image sensor 500, or by the CPU 995.

[0035] While the embodiments have been described in detail, it should be readily understood that the claimed invention is not limited to the disclosed embodiments; rather, they can be modified to incorporate any number of variations, alterations, substitutions or equivalent arrangements not heretofore described. For example, while embodiments are described in connection with a CMOS pixel image sensor, they can be practiced with any other type of pixel image sensor (e.g., CCD, etc.). Furthermore, the various embodiments could be used in automotive applications and other applications, such as machine vision or industrial imaging, where the object plane is at a constant distance; in such applications, parallax is easily accounted for by a simple linear shift of pixel data from each camera to properly register all images, as sketched below. In addition, various embodiments may be used in low-cost, solid-state hyperspectral scanners, in which multiple hyperspectral cameras in one scanning system may produce high-resolution images very quickly.
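A minimal NumPy sketch of such a linear-shift registration is given below; the per-array offsets are hypothetical and would in practice be derived from each array's position on the die and the object distance. Note that np.roll wraps pixels around the image edges, which a real implementation would crop.

    import numpy as np

    def register_bands(bands, shifts):
        """Register band images by a per-array linear pixel shift.
        bands: list of H x W arrays; shifts: list of (dy, dx) integer
        offsets, one per array. np.roll wraps at the edges; a real
        implementation would crop the wrapped border."""
        return [np.roll(b, shift=(dy, dx), axis=(0, 1))
                for b, (dy, dx) in zip(bands, shifts)]

    # Example: shift each of 16 bands by its (row, column) position in
    # the 4x4 grid, one pixel of parallax per inter-array gap (a
    # hypothetical per-gap offset).
    bands = [np.zeros((600, 800)) for _ in range(16)]
    shifts = [(i // 4, i % 4) for i in range(16)]
    registered = register_bands(bands, shifts)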

* * * * *

