U.S. patent application number 15/700918, titled "Devices and Methods for Locating and Visualizing Underwater Objects," was published by the patent office on 2018-01-11. The applicant listed for this patent is NAVICO HOLDING AS. The invention is credited to Michael Smith and Jung Wook Yeum.

Publication Number: 20180011189
Application Number: 15/700918
Family ID: 57584377
Publication Date: 2018-01-11
United States Patent Application 20180011189
Kind Code: A1
Smith; Michael; et al.
January 11, 2018

DEVICES AND METHODS FOR LOCATING AND VISUALIZING UNDERWATER OBJECTS
Abstract
Sonar devices for detecting underwater objects are provided
whereby a set of angled ultrasound transducers are employed to
sense ultrasound signals from a plurality of different spatial
regions. The angled ultrasound transducers may include a first pair
of side-viewing ultrasound transducers and a second pair of
ultrasound transducers configured for interrogating forward and
reverse directions. The ultrasound signals from the set of angled
ultrasound transducers may be processed to identify the presence of
underwater objects in each spatial region, and the resulting
identified underwater objects may be displayed, on a per-region
basis, on a user interface. The ultrasound signals may additionally or alternatively be processed to generate a three-dimensional topographical model of the bed surface, and to generate a topographical surface image based on the topographical model. The topographical surface image may be displayed as a fly-over animation.
Inventors: Smith; Michael (Toronto, CA); Yeum; Jung Wook (Mississauga, CA)

Applicant:
Name: NAVICO HOLDING AS
City: Egersund
Country: NO
Family ID: 57584377
Appl. No.: 15/700918
Filed: September 11, 2017
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number
15189650 | Jun 22, 2016 | 9759813
15700918 | Sep 11, 2017 |
62182989 | Jun 22, 2015 |
Current U.S. Class: 1/1

Current CPC Class: G01S 15/89 (20130101); G01S 7/6263 (20130101); G01S 7/6218 (20130101); G01S 7/6272 (20130101); G01S 15/86 (20200101); G01S 15/87 (20130101); G01S 7/52003 (20130101); G01S 7/6281 (20130101); G01S 15/04 (20130101); G01S 15/96 (20130101); G01S 7/521 (20130101); G01S 15/8902 (20130101)

International Class: G01S 15/04 (20060101); G01S 7/62 (20060101); G01S 15/89 (20060101); G01S 15/87 (20060101); G01S 15/96 (20060101); G01S 7/521 (20060101)
Claims
1. A sonar device for detecting underwater objects, the sonar
device comprising: a housing; a plurality of angled ultrasound
transducers supported by said housing; processing circuitry
provided within said housing, said processing circuitry comprising
an ultrasound transceiver that is operably connected to said angled
ultrasound transducers, said processing circuitry further
comprising an interface for communicating with a remote computing
device, and wherein said processing circuitry is connected or
connectable to a power source; wherein at least two of said
ultrasound transducers are angled ultrasound transducers having
respective ultrasound beam axes that are directed at an acute angle
relative to a primary axis of said sonar device, such that each
angled transducer is configured to interrogate a different spatial
region; and wherein said sonar device is configured to float in a
stable orientation such that said primary axis is vertically
oriented in still water.
2.-41. (canceled)
Description
CROSS-REFERENCE TO RELATED APPLICATION
[0001] This application claims priority to U.S. Provisional
Application No. 62/182,989, titled "DEVICES AND METHODS FOR
LOCATING AND VISUALIZING UNDERWATER OBJECTS" and filed on Jun. 22,
2015, the entire contents of which is incorporated herein by
reference.
BACKGROUND
[0002] The present disclosure relates to sonar devices and methods for the detection of underwater objects.
[0003] Fish finding sonar devices typically employ a single, dual-frequency transducer for sonar detection, where the higher frequency is used for clarity in shallower waters and the lower frequency is employed to achieve penetration in deeper waters. The lower frequency casts a powerful, wide beam that can more easily detect underwater objects at or near the bottom of a body of water; however, because the beam is relatively wide, the increased depth penetration comes at the expense of lateral spatial resolution. The higher frequency beam, although incapable of deep penetration, provides a much narrower beam to better detect structural changes and suspended fish. A user may switch between the two frequencies depending on the environment they are in. A common frequency pairing for a low-cost fish finder is 50/200 kHz, where the 50 kHz signal would have a relatively wide angle (40+ degrees) capable of deep penetration, with the higher 200 kHz signal having a much narrower beam (~20 degrees) that is only effective in shallower waters.
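The trade-off between penetration and lateral resolution follows from simple cone geometry: the footprint a conical beam covers grows with depth and beam angle. The following Python sketch is not part of the original disclosure, and the 45-degree figure is an assumed value standing in for a wide 50 kHz beam:

    import math

    def footprint_diameter(depth_m: float, beam_angle_deg: float) -> float:
        """Diameter of the circular footprint of a conical beam at a given depth."""
        return 2.0 * depth_m * math.tan(math.radians(beam_angle_deg) / 2.0)

    # A wide, low-frequency beam vs. a narrow, high-frequency beam at 10 m depth:
    print(round(footprint_diameter(10.0, 45.0), 1))  # 8.3 (m): coarse lateral resolution
    print(round(footprint_diameter(10.0, 20.0), 1))  # 3.5 (m): finer lateral resolution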
SUMMARY
[0004] Sonar devices for detecting underwater objects are provided
whereby a set of angled ultrasound transducers are employed to
sense ultrasound signals from a plurality of different spatial
regions. The angled ultrasound transducers may include a first pair
of side-viewing ultrasound transducers and a second pair of
ultrasound transducers configured for interrogating forward and
reverse directions. The ultrasound signals from the set of angled
ultrasound transducers may be processed to identify the presence of
underwater objects in each spatial region, and the resulting
identified underwater objects may be displayed, on a per-region
basis, on a user interface. The ultrasound signals may additionally or alternatively be processed to generate a three-dimensional topographical model of the bed surface, and to generate a topographical surface image based on the topographical model. The topographical surface image may be displayed as a fly-over animation.
[0005] Accordingly, in a first aspect, there is provided a sonar
device for detecting underwater objects, the sonar device
comprising:
[0006] a housing;
[0007] a plurality of angled ultrasound transducers supported by
said housing;
[0008] processing circuitry provided within said housing, said
processing circuitry comprising an ultrasound transceiver that is
operably connected to said angled ultrasound transducers, said
processing circuitry further comprising an interface for
communicating with a remote computing device, and wherein said
processing circuitry is connected or connectable to a power
source;
[0009] wherein at least two of said ultrasound transducers are
angled ultrasound transducers having respective ultrasound beam
axes that are directed at an acute angle relative to a primary axis
of said sonar device, such that each angled transducer is
configured to interrogate a different spatial region; and
[0010] wherein said sonar device is configured to float in a stable
orientation such that said primary axis is vertically oriented in
still water.
[0011] In another aspect, there is provided a system for detecting
and locating underwater objects, the system comprising:
[0012] a remote computing device; and
[0013] a sonar device configured as described above;
[0014] wherein one of said remote computing device and said sonar
device is configured to: [0015] obtain signals from the angled
ultrasound transducers, wherein the signals are received in
response to ultrasound beams emitted from the angled transducers;
and [0016] process the signals to identify, within each spatial
region, the presence of one or more underwater objects; and
[0017] wherein said remote computing device is configured to
display, on a user interface, a visual representation indicating,
on a per-region basis, the presence of the underwater objects
detected within each spatial region.
[0018] In another aspect, there is provided a computer-implemented
method of generating and presenting sonar data, the method
comprising:
[0019] receiving, on the remote computing device, sonar data
transmitted from a sonar device configured according to claim 1,
the sonar data comprising signals obtained from the angled
ultrasound transducers, the signals having been obtained in
response to ultrasound beams emitted from the angled ultrasound
transducers;
[0020] processing the signals to identify, within each spatial
region, the presence of one or more underwater objects;
[0021] displaying, on a user interface associated with the remote computing device, a visual representation indicating, on a per-region basis, the presence of the underwater objects detected within each spatial region.
[0022] In another aspect, there is provided a system for measuring
and displaying a visualization of a bed surface of a body of water,
the system comprising:
[0023] a remote computing device; and
[0024] a sonar device configured as described above;
[0025] wherein said sonar device is configured to obtain signals
from a pair of angled ultrasound transducers and from said central
ultrasound transducer, wherein the signals are received in response
to ultrasound beams emitted by the ultrasound transducers; and
[0026] wherein one of said remote computing device and said sonar
device is configured to: [0027] process the signal from the central ultrasound transducer to determine a central bed depth measure providing an estimate of bed depth within the central region; and
[0028] process the signals from the pair of angled ultrasound
transducers to determine lateral bed depth measures, each lateral
bed depth measure providing an estimate of bed depth within a
respective lateral region; and wherein said remote computing device
is configured to: [0029] process the central bed depth measures and
the lateral bed depth measures associated with a plurality of
locations in a longitudinal direction to generate a
three-dimensional topographical model of the bed surface; and
[0030] render a topographical surface image based on the
three-dimensional topographical model and displaying the
topographical surface image on a user interface.
[0031] In another aspect, there is provided a computer-implemented
method of measuring and displaying a visualization of a bed surface
of a body of water, the method comprising:
receiving, on the remote computing device, signals from a sonar
device configured as described above, the signals having been
obtained from a pair of angled ultrasound transducers and from a
central ultrasound transducer at a plurality of locations in a
longitudinal direction, wherein the signals are received in
response to ultrasound beams emitted by the ultrasound transducers;
and processing the signal from the central ultrasound transducer to
determine, at each location, a central bed depth measure providing
an estimate of bed depth within the central region; and
[0033] processing the signals from the pair of angled ultrasound
transducers to determine, at each longitudinal location, lateral
bed depth measures, each lateral bed depth measure providing an
estimate of bed depth within a respective lateral region;
[0034] processing the central bed depth measures and the lateral
bed depth measures associated with the locations to generate a
three-dimensional topographical model of the bed surface; and
[0035] rendering a topographical surface image based on the
three-dimensional topographical model and displaying the
topographical surface image on a user interface.
[0036] A further understanding of the functional and advantageous
aspects of the disclosure can be realized by reference to the
following detailed description and drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0037] Embodiments will now be described, by way of example only,
with reference to the drawings, in which:
[0038] FIG. 1 shows an example system for identifying underwater
objects with a sonar device.
[0039] FIGS. 2A-F show various example transducer configurations
of a sonar device.
[0040] FIGS. 3A-F show several views of the bottom portion of a
housing of an example sonar device.
[0041] FIG. 4 illustrates an example configuration of an
ultrasound beam employed for detection of underwater objects.
[0042] FIG. 5 is a flow chart describing an example method of
visualizing the spatial locations of underwater objects identified
by a plurality of ultrasound transducers, the transducers being
configured to detect underwater objects in different spatial
directions.
[0043] FIGS. 6A and 6B show example user interface screenshots
illustrating spatially resolved detection and identification of
underwater objects.
[0044] FIG. 7 is a flow chart illustrating an example method for
rendering a topographical surface image of a bed surface on a user
interface.
[0045] FIG. 8 is a screenshot of an example rendering of a
topographical image of a bed surface.
DETAILED DESCRIPTION
[0046] Various embodiments and aspects of the disclosure will be
described with reference to details discussed below. The following
description and drawings are illustrative of the disclosure and are
not to be construed as limiting the disclosure. Numerous specific
details are described to provide a thorough understanding of
various embodiments of the present disclosure. However, in certain
instances, well-known or conventional details are not described in
order to provide a concise discussion of embodiments of the present
disclosure.
[0047] As used herein, the terms "comprises" and "comprising" are
to be construed as being inclusive and open ended, and not
exclusive. Specifically, when used in the specification and claims,
the terms "comprises" and "comprising" and variations thereof mean
the specified features, steps or components are included. These
terms are not to be interpreted to exclude the presence of other
features, steps or components.
[0048] As used herein, the term "exemplary" means "serving as an
example, instance, or illustration," and should not be construed as
preferred or advantageous over other configurations disclosed
herein.
[0049] As used herein, the terms "about" and "approximately" are
meant to cover variations that may exist in the upper and lower
limits of the ranges of values, such as variations in properties,
parameters, and dimensions. Unless otherwise specified, the terms
"about" and "approximately" mean plus or minus 25 percent or
less.
[0050] It is to be understood that unless otherwise specified, any
specified range or group is a shorthand way of referring to each
and every member of a range or group individually, as well as each
and every possible sub-range or sub-group encompassed therein and
similarly with respect to any sub-ranges or sub-groups therein.
Unless otherwise specified, the present disclosure relates to and
explicitly incorporates each and every specific member and
combination of sub-ranges or sub-groups.
[0051] As used herein, the term "on the order of", when used in
conjunction with a quantity or parameter, refers to a range
spanning approximately one tenth to ten times the stated quantity
or parameter.
[0052] In one example embodiment, a sonar device is provided for
locating the presence of underwater objects, such as one or more
fish, in a plurality of spatial regions. FIG. 1 shows an example
implementation of a system including a sonar device 100 for use in
locating the presence of underwater objects. Sonar device 100
includes ultrasound transducers 110A-110C secured in a housing 120,
where the ultrasound transducers 110A-110C are supported and
oriented such that their respective ultrasound beam axes 112A-112C
are directed in different directions for interrogating different
spatial regions 114A-114C. Each ultrasound transducer is connected
to an ultrasound transceiver 130, which provides excitation pulses
to the ultrasound transducers, and receives ultrasound signals
responsively generated by reflections from underwater objects.
[0053] Housing 120 contains and protects internal electronics and
processing circuitry. The shape of the housing 120 need not be
spherical as shown in FIG. 1. The housing 120 may be waterproof and
constructed of plastic or another buoyant material. It will be
understood that there are numerous sizes, shapes, and materials
that could be utilized for the housing in the embodiments of the
present disclosure. The housing 120 can generally take on a variety
of shapes, provided that it floats in a stable and pre-configured
orientation.
[0054] The sonar device 100 is configured, by selection of the
housing shape, and the weight distribution of the housing and its
supported components, such that it floats in a stable and
pre-selected configuration. The sonar device 100 is shown in FIG. 1
as floating in still water 101, whereby a primary axis 105
associated with the sonar device 100 is oriented in the vertical
orientation. The primary axis 105 of the sonar device tilts
relative to the vertical direction when the sonar device is
perturbed.
[0055] The example embodiment illustrated in FIG. 1 shows the
transceiver 130 connected, through bus or electrical path 135, to
processing circuitry that includes processor 140 and memory 150,
and to communications interface 160 and power source 170.
Transceiver 130 controls ultrasound transducers 110A-110C to emit
excitation ultrasound energy therefrom along respective ultrasound
beam axes 112A-112C, and transceiver 130 receives signals from the
ultrasound transducers 110A-110C in response to ultrasound waves
reflected by underwater objects. In one example implementation,
transceiver 130 receives raw sonar electrical signals from
ultrasound transducers 110A-110C.
[0056] Processor 140 is configured, based on executable
instructions stored in memory 150, to control the transmission of
sonar data, via communications interface 160, to the remote
computing device 200. The sonar data that is transmitted to the
remote computing device 200 may be raw sonar data (e.g. digitized
raw data suitable for transmission over a wireless communication
channel) or data that is pre-processed by the processor 140. For
example, processor 140 may be programmed to identify the presence,
and optionally depth, associated with one or more objects in each
spatial region 114A-114C. Alternatively, such processing may be
performed remotely by remote computing device 200, as described
further below.
[0057] As shown in FIG. 1, communications interface 160 may be a wireless interfacing device, which employs antenna 180 to transmit and optionally receive wireless signals 185. For example, communications interface 160 may include a wireless network transceiver (e.g., Wi-Fi™, Bluetooth®, cellular), a wired network interface (e.g., a CAT-type interface), USB, FireWire, or other known interfaces. A wireless connection may be direct (directly between the sonar device 100 and the remote computing device 200) or indirect (e.g. where each device remotely connects to a remote server through a cellular network). A wired connection may be facilitated through a suitable water-tight connector that is externally accessible on the housing.
[0058] As shown in FIG. 1, sonar device 100 includes, or is
connectable to, a power supply 170. A rechargeable or
non-rechargeable battery may be used to provide power. The power
may also be provided by an external power source, such as an AC
adapter or a powered docking cradle that supplements and/or
recharges a battery. Alternatively, power may be provided through a
solar cell device, such as a photovoltaic array.
[0059] Referring again to FIG. 1, ultrasound transducers 110A-110C
may be formed from a wide range of piezoelectric materials. For
example, the ultrasound transducers 110A-110C may be formed from
ceramic piezoelectric materials. In some example embodiments, the
ultrasound transducers 110A-110C may be operable at one or more
frequencies. For example, one or more of ultrasound transducers
110A-110C may be operated at dual frequencies, in order to produce
respective ultrasound beams that have different angular bandwidths
and different penetration depths within the water. In one example
implementation, the sonar device 100 may be configured such that
the ultrasound transducers 110A-110C are operated at a first
frequency of approximately 300 kHz and a second frequency of
approximately 500 kHz.
[0060] As shown in FIG. 1, the housing 120 may be configured with a
tether location 190 that is configured for the tethering of a line
or cable (such as a fishing line) thereto. For example, the tether
location 190 may include an eyelet, hook, clamp, or any other line
tethering mechanism, such as those commonly employed in fishing
lures and bobbers.
[0061] The sonar device 100 may be configured to collect one or
more additional forms of information in addition to signals
associated with ultrasound detection. One of ordinary skill in the
art would appreciate that components of the system could be
configured to collect a variety of different information, and
embodiments of the present invention are contemplated, and may be
adapted, for use with a variety of additional forms of collectable
information. For example, the sonar device 100 may further include
one or more additional sensors for collecting additional signals
and/or sensor data. For example, one or more additional sensors may
be selected from the non-limiting list including a thermometer, a
speed sensor, an accelerometer, and a Global Positioning System
device. One of ordinary skill in the art would appreciate that
there are numerous types of sensors that could be utilized with
embodiments of the present disclosure. The sonar device 100 may
further include additional components, not shown in FIG. 1,
including, but not limited to, one or more of a data storage
device, an indicator light, and an auditory signal generation
device.
[0062] Sonar device 100 may also optionally include a GPS receiver
for determining an absolute or relative location of the sonar
device, and/or a speed or velocity of the sonar device. The GPS
receiver can also employ other geo-positioning mechanisms,
including, but not limited to, triangulation, assisted GPS (AGPS),
E-OTD, CI, SAI, ETA, BSS or the like, to further determine the
physical location of the sonar device 100.
[0063] In the example system shown in FIG. 1, sonar device 100 is connected, or connectable, to a remote computing device 200. The sonar device 100 employs ultrasound transducers 110A-110C to
acoustically interrogate a plurality of underwater spatial regions
114A-114C. Remote computing device 200 communicates with sonar
device 100 in order to receive information collected by the sonar
device, and to present information related to the detected
underwater objects on a display.
[0064] It will be understood that the remote computing device may be any device capable of processing the received signals and displaying the resulting information, or interfacing with an external display, for the presentation of information to a user. Non-limiting examples of remote computing devices include smartphones, tablets, laptop computers, smartwatches, and other portable computing devices. Another example of a remote computing device is a computing system residing on, or integrated with, a vessel. The term "remote" refers to two devices that are physically separated and connect through a wired or wireless interface.
[0065] FIG. 1 illustrates an example embodiment of the computer
hardware associated with remote computing device 200. Remote
computing device 200 includes one or more processors 210 in
communication with memory 220 via a bus 205. Remote computing
device 200 includes a communications interface 230 for communicating with sonar device 100, a display 240, optional internal or external storage media 250, and optional input/output interfaces 260. Remote computing device 200 also
includes, or is connectable to, a power supply. A rechargeable or
non-rechargeable battery may be used to provide power. The power
may also be provided by an external power source, such as an AC
adapter or a powered docking cradle that supplements and/or
recharges a battery.
[0066] The processor 210 may include an arithmetic logic
unit, a microprocessor, a general purpose controller, or some other
processor array to perform computations and/or provide electronic
display signals to a display device. Processor 210 is shown coupled
to the bus 205 for communication with the other components of the
remote computing device 200. Although only a single processor 210
is shown in FIG. 1, multiple processors may be included and each
processor may include a single processing core or multiple
interconnected processing cores. Processor 210 may be capable of
processing sonar data and rendering images displayable on a display
device.
[0067] Memory 220 may include a RAM, a ROM, and other storage
means. Memory 220 illustrates another example of computer storage
media for storage of information such as computer readable
instructions, data structures, program modules or other data.
Memory 220 may store a basic input/output system ("BIOS") or
firmware for controlling low-level operation of remote computing
device 200. The memory 220 may also store an operating system 241
for controlling the operation of remote computing device 200. It
will be appreciated that this component may include a general purpose operating system such as a version of Windows, Mac OS, UNIX, or LINUX™, a specialized mobile client communication operating system such as iOS™, Android™, Windows Mobile™, or the Symbian® operating system, or an embedded operating system such as Windows CE. The operating system may include, or interface with, a Java virtual machine module that enables control of hardware components and/or operating system operations via Java application programs.
[0068] Remote computing device may be configured to execute one or
more applications or "apps". Such applications may include computer
executable instructions stored by memory 220, and which, when
executed by remote computing device 200, perform one or more
algorithms disclosed herein for the generation and presentation of
information on a user interface, where the information pertains to
the detection of one or more underwater objects (optionally
including the bed surface 102).
[0069] A communications interface 230 is provided for communication
with one or more sonar devices 100, as described above in the
context of communications interface 160. Communications interface
230 may include devices for communicating with other electronic
devices.
[0070] Display 240 may be any suitable display device, such as a
liquid crystal display (LCD), gas plasma, light emitting diode
(LED), e-ink, or any other type of display used with a computing
device. Display 240 may also include a touch sensitive screen
arranged to receive input from an object such as a stylus or a
digit from a human hand. In another example implementation, remote
computing device 200 need not include a display, but may be
connected or connectable to an external display device.
[0071] As shown in FIG. 1, remote computing device 200 may also
include an internal or external storage medium 250, such as
removable flash memory, a hard disk drive, or another external
storage device. In one embodiment, a portion of the instructions
executable by the processor 210 may also be located external to
remote computing device 200.
[0072] Remote computing device 200 may also include input/output
interfaces 260 for communicating with external devices, such as a
headset, smartwatch or other input or output devices not shown in
FIG. 1. Remote computing device 200 may also include a GPS
receiver.
[0073] Remote computing device 200 may optionally communicate with
a base station (not shown), or directly with another computing
device. For example, a network interface device (not shown) may be
included that provides circuitry for coupling remote computing
device 200 to one or more networks, and is constructed for use with
one or more communication protocols and technologies including, but
not limited to, global system for mobile communication (GSM), code
division multiple access (CDMA), time division multiple access
(TDMA), user datagram protocol (UDP), transmission control
protocol/Internet protocol (TCP/IP), SMS, general packet radio
service (GPRS), WAP, ultra wide band (UWB), IEEE 802.16 Worldwide
Interoperability for Microwave Access (WiMax), SIP/RTP,
Bluetooth™, infrared, Wi-Fi, Zigbee, or any of a variety of
other wireless communication protocols.
[0074] It is to be understood that the example system shown in FIG.
1 is not intended to be limited to the components that may be
employed in a given implementation. Although only one of each
component is illustrated in FIG. 1, any number of each component
can be included. For example, the sonar device 100 and/or the
remote computing device 200 may include one or more additional
processors. In another example, a remote computing device may
contain a number of different data storage media 250.
[0075] Although FIG. 1 illustrates an example embodiment that
includes a single sonar device, it will be understood that in other
example embodiments, a plurality of sonar devices may be connected
to the remote computing device 200.
[0076] While some embodiments can be implemented in computer
hardware, various embodiments are capable of being distributed as a
computing product in a variety of forms and are capable of being
applied regardless of the particular type of machine or computer
readable media used to actually effect the distribution. At least
some aspects disclosed can be embodied, at least in part, in
software. That is, the techniques may be carried out in a computer
system or other data processing system in response to its
processor, such as a microprocessor, executing sequences of
instructions contained in a memory, such as ROM, volatile RAM,
non-volatile memory, cache or a remote storage device.
[0077] A computer readable storage medium can be used to store
software and data which when executed by a data processing system
causes the system to perform various methods. The executable
software and data may be stored in various places including for
example ROM, volatile RAM, nonvolatile memory and/or cache.
Portions of this software and/or data may be stored in any one of
these storage devices. As used herein, the phrases "computer
readable material" and "computer readable storage medium" refers to
all computer-readable media, except for a transitory propagating
signal per se.
[0078] Referring again to FIG. 1, the sonar device 100 illustrates
a non-limiting example transducer configuration involving three
ultrasound transducers 110A-110C, where each transducer is oriented
with its ultrasound beam axis directed in a unique direction, for
the interrogation of different spatial regions. The example
configuration shown in FIG. 1 illustrates an embodiment in which
the angled ultrasound transducers 110A and 110B form a pair of
angled ultrasound transducers located on opposite sides of the
primary axis 105. The pair of angled ultrasound transducers 110A
and 110B are thus configured to interrogate respective lateral
regions 114A and 114B on either side of the device.
[0079] In various embodiments, the sonar device includes a
plurality of angled ultrasound transducers, such as angled
ultrasound transducers 110A and 110B, and may optionally include a
central ultrasound transducer, such as central ultrasound
transducer 110C. As shown in FIG. 1, the angled ultrasound
transducers are oriented such that their respective ultrasound beam
axes are directed outwardly at an acute angle relative to the
primary axis 105.
[0080] Although the angular bandwidths of the angled transducers 110A and 110B are shown in FIG. 1 as non-overlapping, it will be understood that one or more of the frequencies of the angled ultrasound transducers, and their respective angles relative to the primary axis 105, may be varied in order to achieve spatial overlap. In one non-limiting example implementation, the acute angles of the angled transducers, relative to the primary axis 105, may be selected to lie between 20° and 30°.
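Whether a given tilt produces overlap with the central beam can be checked by comparing beam-edge angles. The following is a minimal sketch under assumed beam widths; the disclosure does not prescribe these values:

    def beams_overlap(tilt_deg: float, angled_width_deg: float,
                      central_width_deg: float) -> bool:
        """True if an angled beam's inner edge crosses the central beam's outer edge.

        tilt_deg is the acute angle between the angled beam axis and the primary axis.
        """
        inner_edge_deg = tilt_deg - angled_width_deg / 2.0   # angled beam's inner edge
        central_edge_deg = central_width_deg / 2.0           # central beam's outer edge
        return inner_edge_deg < central_edge_deg

    # A 25-degree tilt with 20-degree beams leaves a 5-degree gap to the central beam:
    print(beams_overlap(25.0, 20.0, 20.0))  # False (edges at 15 and 10 degrees)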
[0081] As noted above, the sonar device 100 may optionally include
a central transducer 110C that is oriented such that its ultrasound
beam axis 112C is parallel to, or directed along, the primary axis
105. As described below, the central ultrasound transducer 110C may
be employed to detect the presence of underwater objects directly
underneath the sonar device, in addition to the detection in the
lateral spatial regions associated with the angled ultrasound
transducers. The central ultrasound transducer 110C may also
optionally be employed to provide a bed depth measure associated
with the depth of the bed surface 102. The central ultrasound
transducer may be provided between, or may be surrounded by, the
angled transducers. Alternatively, the central ultrasound
transducer may be located adjacent to the angled transducers.
[0082] FIGS. 2A-2F illustrate a number of example configurations of
the angled transducers, where the housing 120 is shown from below,
viewed along the primary axis. FIG. 2A shows an example
configuration in which a triangular array of transducers 110 are
provided. FIG. 2B shows another example configuration involving a
pair of adjacent angled transducers, located on opposing sides of
the primary axis. FIG. 2C shows the angled-transducer pair of FIG.
2B with an additional central transducer centered therebetween.
FIG. 2D shows the triangular array of angled transducers of FIG.
2A, with an additional central transducer centered therebetween.
FIG. 2E shows an example configuration involving a square array of
angled transducers, where each angled transducer is angled to
interrogate a separate spatial quadrant. FIG. 2F shows yet another
example configuration whereby a pentagonal array of transducers is
provided.
[0083] As shown in FIGS. 2A-2F, the arrays of angled transducers
may be centered on the primary axis. Also, as shown in the figures,
the angled transducers may be evenly spaced around the primary
axis. In other example embodiments, the transducers may be unevenly
spaced, for example, in the form of a rectangular array, as opposed
to the square array of FIG. 2E.
[0084] FIGS. 2A-2E show the presence of the tether location 190.
Referring now to FIG. 2E, by way of example, the tether location 190 and the primary axis 105 define a longitudinal plane 195, indicative of the towing direction of the sonar device when the sonar device is towed by a tether attached to tether location 190. As shown in FIG. 2E, a first pair of angled transducers 116A and 116B may be located on opposing sides of the longitudinal plane 195, such that the first pair of transducers 116A and 116B scans the lateral directions (left and right; port and starboard) as the sonar device is towed. A second pair of angled transducers 118A and 118B may be provided such that their respective ultrasound beam axes are directed within the longitudinal plane (or approximately within the plane, for example within ±1°, ±2°, or ±5°), such that the second pair of angled transducers 118A and 118B are configured to scan the longitudinal directions (forward and reverse; bow and stern) when the sonar device is towed.
[0085] As shown in various embodiments illustrated in FIGS. 2A-2F, the angled ultrasound transducers may be provided as a discrete circumferential array. The circumferential array may be located such that it surrounds, and is optionally centered on, the primary axis (and/or a central ultrasound transducer). In some non-limiting example embodiments, the angled ultrasound transducers of the circumferential array may include 3, 4, 5, 6, 7, 8, 9 or 10 evenly spaced transducers. In one example implementation, the array of angled ultrasound transducers may be symmetrically arranged relative to the longitudinal plane 195 within a suitable spatial tolerance (e.g. within ±100 µm, ±200 µm, ±500 µm, ±1 mm, ±2 mm or ±5 mm). In one example implementation, the array of angled transducers may be spatially arranged such that their net center of mass is located on or near (e.g. within ±100 µm, ±200 µm, ±500 µm, ±1 mm, ±2 mm or ±5 mm) the primary axis.
[0086] Referring now to FIGS. 3A-3F, an example implementation of
the bottom portion of an example housing 120 of a sonar device is
shown. Example housing 120 includes receptacles 122 for housing a
set of four angled ultrasound transducers and a central ultrasound
transducer. Housing 120 also includes an external tether location
190. In FIG. 3C, external planar surfaces 124, through which
ultrasound energy is to be emitted and collected by the ultrasound
transducers, are shown in the lower portion of the external surface
of the example housing 120. FIGS. 3D-3F show additional views of
the housing 120, where two electronics platforms 142 and 144 are
shown, with antenna 180 provided in an upper portion of the
housing, above the water line when the sonar device is floated in
water. Lower platform 144 includes through-holes for connecting the
electronics (not shown) to the ultrasound transducers.
[0087] The preceding example embodiments have disclosed sonar
devices, and associated systems, where the sonar device includes a
plurality of angled transducers, and optionally a central
transducer, where the ultrasound transducers are configured to
interrogate different underwater spatial regions.
[0088] Referring now to FIG. 4, an example method of detection of
underwater objects, using a multi-transducer sonar device, is
described. In the example configuration illustrated in FIG. 4, the
example sonar device includes two angled ultrasound transducers
(left and right) and a central transducer. The central transducer
is directed in a downward direction, while the left and right
transducers are at a 30 degree angle from the primary axis. In the
present non-limiting example, each transducer is a dual-frequency
transducer exhibiting a 12-20.degree. beam angle (12.degree. for
high frequency operation and 20.degree. for low frequency
operation). Each transducer generates a receive signal integrating
the response over its entire coverage area. A conventional
pulse-echo depth sounding method may then be employed, using the
shallowest point as the depth, and then detecting the relative
depth of any underwater objects (e.g. fish) within this depth
range.
[0089] FIG. 4 shows the ultrasound beam emitted by the central
transducer (the lateral beams from the angled transducer pair are
not shown), showing the beam angle associated with the 20° coverage area when sounded with the low-frequency central beam.
[0090] As can be seen from the figure, the shallowest detected
point in the actual bottom contour is 21.6 ft, while the deepest
detected point is 24.4 ft. Because the transducer integrates over
its entire coverage area, the calculated depth measure will be 21.6
ft, the shallowest point it detects. Once the depth has been
determined, the underwater object detection algorithm searches for underwater objects up to the processed depth, in this case searching for underwater objects within its detection area that are shallower than the processed depth. In the figure, there is only one fish within the detection area at a depth less than 21.6 ft, namely the fish at 16.5 ft. The fish at 13.5 ft is outside of the ultrasound beam's coverage area and will not be detected. The fish at 22.7 ft is deeper than the processed depth of 21.6 ft and thus will not be detected.
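A minimal Python sketch of this shallowest-point logic, replaying the FIG. 4 scenario, might look as follows; the Echo record and helper name are illustrative assumptions, not structures from the disclosure:

    from dataclasses import dataclass

    @dataclass
    class Echo:
        depth_ft: float   # depth at which the echo return was detected
        is_bottom: bool   # True for a bed-surface return, False for a suspended target

    def process_region(echoes: list) -> tuple:
        """Shallowest-point depth sounding over one transducer's coverage area."""
        bottom = [e.depth_ft for e in echoes if e.is_bottom]
        processed_depth = min(bottom)  # the transducer integrates over its whole
                                       # footprint, so the shallowest return wins
        targets = [e.depth_ft for e in echoes
                   if not e.is_bottom and e.depth_ft < processed_depth]
        return processed_depth, targets

    # FIG. 4: bottom between 21.6 and 24.4 ft; fish at 16.5 and 22.7 ft in the beam
    # (the 13.5 ft fish is outside the footprint and never produces an echo).
    echoes = [Echo(21.6, True), Echo(24.4, True), Echo(16.5, False), Echo(22.7, False)]
    print(process_region(echoes))  # (21.6, [16.5]): only the 16.5 ft fish is reported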
[0091] The same method may then be applied to the signals received
from the lateral pair of angled transducers in order to search for,
and optionally determine the depth of, underwater objects within
the lateral spatial regions associated with the angled transducers.
The left and right transducers may operate under the same
constraints as the central transducer, but are adjusted for their
angle relative to the primary axis. Both angled transducers process
depth as the shallowest point within their respective coverage
areas and find fish up to their processed depths, as in the method
described above.
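The disclosure does not spell out the angular adjustment; one plausible form is projecting the range measured along the tilted beam axis onto the vertical, as in this hypothetical sketch:

    import math

    def vertical_depth(slant_range_ft: float, tilt_deg: float) -> float:
        """Project a range measured along a tilted beam axis onto the vertical."""
        return slant_range_ft * math.cos(math.radians(tilt_deg))

    print(round(vertical_depth(25.0, 30.0), 1))  # 21.7: a 25 ft return on a 30-degree beam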
[0092] The aforementioned example method thus provides information
pertaining to the presence, and optional depth, of underwater
objects in a plurality of spatial regions, with each spatial region
associated with a different ultrasound transducer. This spatial
information may be employed to provide a user with a visualization
of the spatial locations of one or more detected underwater
objects.
[0093] FIG. 5 provides a flow chart illustrating an example
embodiment of processing the data to obtain spatial information
associated with the detection of underwater objects, and presenting
this information to a user on a user interface associated with the
remote computing device. In optional step 300, a data connection is
established with the sonar device (alternatively, a data connection
need not be established if the sonar device is broadcasting data
according to a wireless protocol that does not require a
connection). The signals from the angled transducers, and optionally from a central transducer (if present), are then received in step 305. These signals are processed, in step 310, in
order to identify the presence of, and optionally the depth of,
underwater objects within the spatial region (e.g. field of view)
associated with each transducer (for example, using the example
methods described above). In step 315, the presence, and optional
depth measure, associated with each detected underwater object, is
displayed on a user interface associated with the remote computing
device (e.g. displayed on a display integrated with, or connected
to, the remote computing device), where a visual representation of
the spatial regions is provided, indicating the presence and
optional depth of the detected underwater objects on a per-region
basis.
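Building on the process_region sketch shown earlier, steps 305 to 315 could be organized as a mapping from per-transducer signals to per-region display records; the region labels and record layout here are assumptions for illustration only:

    def build_region_view(signals: dict) -> dict:
        """Steps 305-315: turn per-transducer echo lists into per-region display records.

        signals maps a region label (e.g. "left", "center", "right") to the list of
        Echo records received from the corresponding transducer.
        """
        view = {}
        for region, echoes in signals.items():
            depth, fish = process_region(echoes)    # step 310: identify objects and depths
            view[region] = {"bed_depth_ft": depth,  # step 315: values rendered per region
                            "fish_depths_ft": sorted(fish)}
        return view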
[0094] In one example embodiment, the spatial regions are shown, on
the user interface, in a manner that represents their relative
location. For example, if a sonar device is employed having four
angled transducers, with four quadrants associated therewith, a
graphical display may show four regions positioned according to the
relative positions of the four quadrants, with each region showing
information associated with the presence of underwater objects
detected therein. In other words, the geometrical arrangement of
the regions shown in the visualization on the user interface may
correlate with the geometrical arrangement of the transducers of
the sonar device. In embodiments in which a central transducer is
employed along with angled transducers, the visualization may show
a central region associated with the central transducer, surrounded
by regions associated with the angled transducers.
[0095] The user interface may also be configured to display depth
information associated with one or more of the transducers. For
example, in one example embodiment, depth information may be
provided based on a bed depth measure obtained from the central
transducer. In another example embodiment, depth information
associated with any transducer may be selectively displayed based
on input from a user.
[0096] FIGS. 6A and 6B show two example implementations of a
visualization according to the method described in FIG. 5. In FIG.
6A, a smartphone is shown displaying a visualization showing five
quadrants 400A, 400B, 400C, 400D and 400E associated with
geometrical arrangement of five transducers arranged according to
the configuration shown in FIGS. 3A-F. Fish identified based on the
signals obtained from the various transducers of the sonar device
are shown in their associated regions, along with depth
information. The right image shows a depth-based view of the
highlighted quadrant, with a visualization of the different fish
depths.
[0097] FIG. 6B illustrates an example implementation in which the visualization is configured such that if one or more transducers
locates an underwater object (e.g. a fish), the respective
quadrant(s) and/or center region is identified (e.g. lit up or
coloured), also indicating the detected depth(s). In the screenshot
shown in FIG. 6B, the processed depth is 23.3 ft and the sonar
device detected a fish to its left at 12.7 ft.
[0098] It is noted that in some embodiments, the sonar data
received by the remote computing device is raw sonar data, and thus
step 310 is performed remotely relative to the sonar device, by the
remote computing device. Alternatively, the processing circuitry of
the sonar device may be configured to identify the underwater
objects within each spatial region, and optionally calculate a
depth measure associated with each underwater object, and this
information may be provided to the remote computing device as
pre-processed sonar data.
[0099] FIG. 7 shows a flow chart illustrating a method of
generating and rendering a three-dimensional view of a bed surface
(the floor or bottom surface of a body of water) based on
ultrasound signals collected from a sonar device employing a pair
of angled transducers and a central transducer, where the angled
transducers are positioned and directed to interrogate lateral
regions (left and right; port and starboard) when the sonar device
is towed.
[0100] As shown in step 500, ultrasound signals are obtained from
the pair of angled transducers and the central transducer over a
plurality of longitudinal locations as the sonar device is towed.
In step 505, the signals from the central transducer are processed
to generate a plurality of central depth measures corresponding to
the different longitudinal locations. Similarly, in step 510, the
signals from the angled transducers are processed to generate a
plurality of lateral depth measures corresponding to the different
longitudinal locations.
[0101] These central and lateral depth measures form a grid of discrete estimated depth measures (based on an assumption that the sonar device is towed in a straight line), where the grid has three values in the lateral direction (two lateral measures from the angled transducers, and one central measure from the central transducer), and a plurality of values in the longitudinal direction. This grid is then processed, in step 515, and a fitting algorithm (e.g. a least-squares-based polynomial algorithm) is employed to generate a three-dimensional topographical surface model, as in the sketch below. It will be understood that any known surface fitting model may be employed to produce the topographical model, such as those employed in the cartography arts.
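As one concrete realization of step 515 (an assumption, since the disclosure leaves the fitting algorithm open), a least-squares bivariate polynomial can be fit to the 3-by-N depth grid with numpy; the lane spacing and longitudinal step are hypothetical parameters:

    import numpy as np

    def fit_bed_surface(grid_depths: np.ndarray, lane_spacing_m: float,
                        step_m: float, degree: int = 2) -> np.ndarray:
        """Least-squares polynomial fit of an (N, 3) depth grid: [left, center, right]."""
        n = grid_depths.shape[0]
        xs = np.array([-lane_spacing_m, 0.0, lane_spacing_m])  # lateral sample positions
        ys = np.arange(n) * step_m                             # longitudinal positions
        X, Y = np.meshgrid(xs, ys)                             # (n, 3) sample coordinates
        # Design matrix for a bivariate polynomial z = sum a_ij * x^i * y^j
        terms = [(i, j) for i in range(degree + 1) for j in range(degree + 1)]
        A = np.column_stack([X.ravel()**i * Y.ravel()**j for i, j in terms])
        coef, *_ = np.linalg.lstsq(A, grid_depths.ravel(), rcond=None)
        # Evaluate the fitted surface on a denser mesh for rendering
        fx = np.linspace(xs[0], xs[-1], 20)
        fy = np.linspace(ys[0], ys[-1], 5 * n)
        FX, FY = np.meshgrid(fx, fy)
        Af = np.column_stack([FX.ravel()**i * FY.ravel()**j for i, j in terms])
        return (Af @ coef).reshape(FY.shape)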
[0102] In step 520, the topographical surface model is processed to
render a topographical surface image that is displayable on a user
interface, such that the image shows the three-dimensional features
(contours) of the surface. Any suitable rendering method may be
employed, such as those known in the computer graphics arts (e.g.
perspective-based rendering methods). It will be understood that
even though the actual surface will have more undulations than
those that are rendered, the simplified rendering will provide the
user with useful and simplified structural information, which may
be employed to gauge the suitability of a location for fishing.
[0103] In one example implementation, the topographical surface
image is rendered based on a viewpoint located between the bed
surface and the water surface. This rendering may be animated to
provide a fly-over view of the surface. FIG. 8 shows an example of
such a rendering.
[0104] In some embodiments, the topographical surface image is
rendered in real-time (or near real-time, delayed by processing and
communication time delays).
[0105] The longitudinal direction of the generated image may be
based on inferred longitudinal position (e.g. based on a GPS device
associated with the sonar device or remote computing device), or
scaled based on associated time stamp values.
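For example, under a constant-tow-speed assumption (the speed value below is illustrative), time stamps can be scaled into longitudinal distances:

    def longitudinal_positions(timestamps_s: list, speed_m_per_s: float) -> list:
        """Scale time stamps into distances along the tow path at constant speed."""
        t0 = timestamps_s[0]
        return [(t - t0) * speed_m_per_s for t in timestamps_s]

    print(longitudinal_positions([0.0, 1.5, 3.0, 4.5], speed_m_per_s=0.5))
    # [0.0, 0.75, 1.5, 2.25]: meters along the tow path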
[0106] The specific embodiments described above have been shown by
way of example, and it should be understood that these embodiments
may be susceptible to various modifications and alternative forms.
It should be further understood that the claims are not intended to
be limited to the particular forms disclosed, but rather to cover
all modifications, equivalents, and alternatives falling within the
spirit and scope of this disclosure.
* * * * *