U.S. patent application number 15/234836 was filed with the patent office on 2016-08-11 and published on 2016-12-15 for methods and systems for movement of an automatic cleaning device using video signal.
The applicant listed for this patent is DIVERSEY, INC. The invention is credited to Vinton Coffman, Daniel M. Daly, Stephen D. Herr, Henry L. Hillman, JR., David M. Knuth, JR., Ralph McCann and Kevin L. Thomas.
Application Number: 15/234836
Publication Number: 20160360940
Family ID: 43973219
Publication Date: 2016-12-15

United States Patent Application 20160360940
Kind Code: A1
Hillman, JR.; Henry L.; et al.
December 15, 2016
METHODS AND SYSTEMS FOR MOVEMENT OF AN AUTOMATIC CLEANING DEVICE
USING VIDEO SIGNAL
Abstract
A method of causing a mobile robotic device, such as an
automated cleaning device, to navigate an area includes using a
video camera to collect visual information associated with an area
in which a mobile robotic device is located. A system receives
information from the video camera, wherein the information
comprises a value associated with a detected color in the area. The
system compares the value associated with the detected color to a
value of a requested color, and the system determines whether the
value associated with the detected color is within a tolerance
range of the value of the requested color. If the value associated
with the detected color is within a tolerance range of the value of
the requested color, the system accepts the color. The system uses
the received information to adjust a position of the mobile robotic
device in the area.
Inventors: Hillman, JR.; Henry L. (Vancouver, WA); Knuth, JR.; David M. (East Dubuque, IL); Daly; Daniel M. (Brownsburg, IN); Coffman; Vinton (Pittsburgh, PA); McCann; Ralph (Richmond, VA); Herr; Stephen D. (Richmond, VA); Thomas; Kevin L. (Richmond, VA)
Applicant:
    Name: DIVERSEY, INC.
    City: PORTLAND
    State: OR
    Country: US
Family ID: 43973219
Appl. No.: 15/234836
Filed: August 11, 2016
Related U.S. Patent Documents

Application Number    Filing Date     Patent Number
14176386              Feb 10, 2014                    (parent of 15234836)
12616577              Nov 11, 2009    8679260         (parent of 14176386)
Current U.S. Class: 1/1
Current CPC Class: A47L 11/4011 20130101; A47L 2201/04 20130101; G05D 1/0274 20130101; G05D 1/0246 20130101; A47L 2201/06 20130101; G05D 2201/0203 20130101
International Class: A47L 11/40 20060101 A47L011/40; G05D 1/02 20060101 G05D001/02
Claims
1. A method of causing a mobile robotic device to navigate an area,
the method comprising: by a video camera, collecting visual
information associated with an area in which a mobile robotic
device is located; by a processing device, receiving information
from the video camera, wherein the information comprises a value
associated with a detected color in the area; comparing the value
associated with the detected color to a value of a requested color;
determining whether the value associated with the detected color is
within a tolerance range of the value of the requested color; if
the value associated with the detected color is within a tolerance
range of the value of the requested color, accepting the color; and
by the processing device, using the received information to adjust
a position of the mobile robotic device in the area.
2. The method of claim 1: further comprising receiving, from one or
more sensors, positional information for the mobile robotic device;
and wherein using the received information to adjust a position of
the mobile robotic device in the area comprises comparing the
positional information and the information received from the video
camera to a digital representation of a map.
3. The method of claim 2, wherein using the received information to
adjust a position of the mobile robotic device in the area also
comprises: using the map to determine a current position of the
mobile robotic device; determining which of the received
information has a least amount of error as compared to the current
position; and causing the mobile robotic device to adjust its
position based on the information that has the least amount of
error.
4. The method of claim 1, wherein the received information also
comprises one or more of the following: coordinates associated with
a substantially center portion of the area; coordinates associated
with one or more corners of a rectangle that surrounds the area; or
a number of pixels of the detected color within the area.
5. The method of claim 1, wherein using the received information to
adjust a position of the mobile robotic device comprises adjusting
a path of the mobile robotic device based on the received
information in response to an amount of error associated with the
received information as compared to the position being less than an
amount of error associated with the received position data as
compared to the position.
6. The method of claim 1, wherein using the received information to
adjust a position of the mobile robotic device comprises:
determining an estimated location of the mobile robotic device;
using the received information to determine an actual location of
the mobile robotic device; and if the estimated location differs
from the actual location, moving the mobile robotic device to the
estimated location.
7. The method of claim 1, wherein: the mobile robotic device
comprises an automated cleaning device; and the method further
comprises, by the processing device, causing the automated cleaning
device to clean a surface of the area as the automated cleaning
device adjusts its position.
8. The method of claim 7, wherein causing the automated cleaning
device to clean the surface comprises, by the processing device:
determining one or more characteristics associated with the
surface; determining a cleaning method based on the determined one
or more characteristics; and causing the automated cleaning device
to apply the cleaning method to the surface.
9. The method of claim 8, wherein determining one or more
characteristics comprises determining that the surface comprises
one or more of the following: hard wood flooring; tile flooring;
carpeting; a high-traffic area; or a low-traffic area.
10. The method of claim 8, wherein causing the automated cleaning
device to apply the cleaning method comprises causing the automated
cleaning device to perform one or more of the following: vacuum the
surface along a path; scrub the surface along a path; or wax the
surface along a path.
11. A mobile robotic system, comprising: a video camera; a mobile
robotic device; a processing device; and a non-transitory,
computer-readable medium containing programming instructions
configured to cause the processing device to: receive, from the
video camera, visual information associated with an area in which a
mobile robotic device is located, wherein the information comprises
a value associated with a detected color in the area, compare the
value associated with the detected color to a value of a requested
color, determine whether the value associated with the detected
color is within a tolerance range of the value of the requested
color, in response to determining that the detected color is within
a tolerance range of the value of the requested color, accept the
color, and use the received information to adjust a position of the
mobile robotic device in the area.
12. The system of claim 11, further comprising: one or more
sensors; and additional programming instructions configured to
cause the processing device to: receive, from the one or more
sensors, positional information for the mobile robotic device; and
when using the received information to adjust a position of the
mobile robotic device in the area, comparing the positional
information and the information received from the video camera to a
digital representation of a map.
13. The system of claim 12, wherein the instructions to use the
received information to adjust a position of the mobile robotic
device in the area also comprise instructions to: use the map to
determine a current position of the mobile robotic device;
determine which of the received information has a least amount of
error as compared to the current position; and cause the mobile
robotic device to adjust its position based on the information that
has the least amount of error.
14. The system of claim 11, wherein the received information also
comprises one or more of the following: coordinates associated with
a substantially center portion of the area; coordinates associated
with one or more corners of a rectangle that surrounds the area; or
a number of pixels of the detected color within the area.
15. The system of claim 11, wherein the instructions to use the
received information to adjust a position of the mobile robotic
device comprise instructions to adjust a path of the mobile robotic
device based on the received information in response to an amount
of error associated with the received information as compared to
the position being less than an amount of error associated with the
received position data as compared to the position.
16. The system of claim 11, wherein the instructions to use the
received information to adjust a position of the mobile robotic
device comprise instructions to: determine an estimated location of
the mobile robotic device; use the received information to
determine an actual location of the mobile robotic device; and if
the estimated location differs from the actual location, move the
mobile robotic device to the estimated location.
17. The system of claim 11, wherein: the mobile robotic device
comprises an automated cleaning device; and the instructions are
further configured to cause the automated cleaning device to clean
a surface of the area as the automated cleaning device adjusts its
position.
18. The system of claim 17, wherein the instructions to cause the
automated cleaning device to clean the surface comprise
instructions to: determine one or more characteristics associated
with the surface; determine a cleaning method based on the
determined one or more characteristics; and cause the automated
cleaning device to apply the cleaning method to the surface.
19. The system of claim 18, wherein the instructions to determine
one or more characteristics comprise instructions to determine that
the surface comprises one or more of the following: hard wood
flooring; tile flooring; carpeting; a high-traffic area; or a
low-traffic area.
20. The system of claim 18, wherein the instructions to cause the
automated cleaning device to apply the cleaning method comprise
instructions to cause the automated cleaning device to perform one
or more of the following: vacuum the surface along a path; scrub
the surface along a path; or wax the surface along a path.
Description
RELATED APPLICATIONS AND CLAIM OF PRIORITY
[0001] This application claims priority to and is a continuation of
U.S. patent application Ser. No. 14/176,386, filed Feb. 10, 2014,
which is a divisional application that claims priority to U.S.
patent application Ser. No. 12/616,577, filed Nov. 11, 2009, now
U.S. Pat. No. 8,679,260. The disclosures of each priority
application are fully incorporated into this document. This
application is also related to U.S. patent application Ser. No.
12/616,452, filed Nov. 11, 2009, now U.S. Pat. No. 8,423,225, the
disclosure of which is fully incorporated by reference into this
document.
BACKGROUND
[0002] Mobile robotic devices have minimized the human effort
involved in performing everyday tasks. For example, automatic
cleaning devices help maintain and clean surfaces, such as
hardwood floors, carpet and the like. Mobile robotic devices are
useful, but location detection can be a challenge for the operation
of such devices.
SUMMARY
[0003] Before the present methods are described, it is to be
understood that this invention is not limited to the particular
systems, methodologies or protocols described, as these may vary.
It is also to be understood that the terminology used herein is for
the purpose of describing particular embodiments only, and is not
intended to limit the scope of the present disclosure which will be
limited only by the appended claims.
[0004] It must be noted that as used herein and in the appended
claims, the singular forms "a," "an," and "the" include plural
reference unless the context clearly dictates otherwise. Unless
defined otherwise, all technical and scientific terms used herein
have the same meanings as commonly understood by one of ordinary
skill in the art. As used herein, the term "comprising" means
"including, but not limited to."
[0005] In an embodiment, a method of cleaning an area using an
automatic cleaning device may include: by a video camera,
collecting visual information associated with an area in which a
mobile robotic device is located; receiving information from the
video camera, wherein the information comprises a value associated
with a detected color in the area; comparing the value associated
with the detected color to a value of a requested color;
determining whether the value associated with the detected color is
within a tolerance range of the value of the requested color; if
the value associated with the detected color is within a tolerance
range of the value of the requested color, accepting the color; and
using the received information to adjust a position of the mobile
robotic device in the area.
[0006] In an embodiment, the method also includes using one or more
sensors to receive positional information for the mobile robotic
device. In this embodiment, using the received information to
adjust a position of the mobile robotic device in the area
comprises comparing the positional information and the information
received from the video camera to a digital representation of a
map. The map may be used to determine a current position of the
mobile robotic device. The method may then include determining
which of the received information has a least amount of error as
compared to the current position, and causing the mobile robotic
device to adjust its position based on the information that has the
least amount of error.
[0007] The received information also may include one or more of the
following: coordinates associated with a substantially center
portion of the area; coordinates associated with one or more
corners of a rectangle that surrounds the area; or a number of
pixels of the detected color within the area.
[0008] In some embodiments, using the received information to
adjust a position of the mobile robotic device may include
adjusting a path of the mobile robotic device based on the received
information in response to an amount of error associated with the
received information as compared to the position being less than an
amount of error associated with the received position data as
compared to the position. In other embodiments, using the received
information to adjust a position of the mobile robotic device may
include: determining an estimated location of the mobile robotic
device; using the received information to determine an actual
location of the mobile robotic device; and if the estimated
location differs from the actual location, moving the mobile
robotic device to the estimated location.
[0009] In some embodiments, the mobile robotic device may include
an automated cleaning device. If so, the cleaning device may be
caused to clean a surface of the area as the automated cleaning
device adjusts its position. This may include determining one or
more characteristics associated with the surface, determining a
cleaning method based on the determined one or more
characteristics, and causing the automated cleaning device to apply
the cleaning method to the surface.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] Aspects, features, benefits and advantages of the present
invention will be apparent with regard to the following description
and accompanying drawings, of which:
[0011] FIG. 1 illustrates an exemplary robotic system according to
an embodiment.
[0012] FIG. 2 illustrates an exemplary method of navigating a
mobile robotic device according to an embodiment.
[0013] FIG. 3 illustrates an exemplary method of cleaning an area
with an automatic cleaning device according to an embodiment.
[0014] FIG. 4 illustrates an exemplary diagram of a mobile robotic
device's movement according to an embodiment.
[0015] FIG. 5 illustrates an exemplary method of navigating a
mobile robotic device according to an embodiment.
[0016] FIG. 6 illustrates an exemplary method of navigating an
automatic cleaning device according to an embodiment.
DETAILED DESCRIPTION
[0017] FIG. 1 illustrates an exemplary robotic system according to
an embodiment. As illustrated by FIG. 1, a robotic system may
include a mobile robotic device 100 and a video camera 105. In an
embodiment, the mobile robotic device 100 may be an autonomous
device that is capable of automatically navigating its environment.
A mobile robotic device 100 may include a processing device, a
computer-readable storage medium and/or the like. In an embodiment,
a mobile robotic device 100 may be in communication with one or
more external processing devices, computer-readable storage mediums
and/or the like.
[0018] In an embodiment, the video camera 105 may be a color video
camera, a black and white video camera and/or the like. In an
embodiment, the video camera 105 may be mounted on a front portion
of a mobile robotic device 100. In an alternate embodiment, the
video camera 105 may be mounted on a rear portion, a side portion
or other portion of the mobile robotic device 100. The video camera
105 may include a processing device, a computer-readable storage
medium and/or the like. In an embodiment, the video camera 105 may
be in communication with the mobile robotic device 100 and/or one
or more external processing devices, computer-readable storage
mediums and/or the like. For example, the video camera 105 may
collect information about the mobile robotic device's surroundings
such as coordinates, navigational information, visual information
and/or the like. The video camera 105 may provide this information
to the mobile robotic device 100.
[0019] In an embodiment, the video camera 105 may be a CMUcam3. The
CMUcam3 is an ARM7TDMI-based programmable embedded computer vision
sensor developed by Carnegie Mellon University. Additional and/or
alternate video cameras may be used within the scope of this
disclosure.
[0020] FIG. 2 illustrates an exemplary method of navigating a
mobile robotic device according to an embodiment. As illustrated by
FIG. 2, a mobile robotic device may receive 200 information
regarding its surrounding environment from a video camera. For
example, in an embodiment, a video camera may perform image
processing on a video frame to locate one or more substantially
straight lines in the image. In an embodiment, a substantially
straight line may be representative of a boundary or edge between
rooms, floor types and/or the like. For example, a video camera may
be able to detect an edge between two adjoining rooms. Similarly, a
video camera may be able to detect an edge between a carpeted floor
and a tile floor. In an embodiment, an edge may be detected based
on color contrast and/or the like.
[0021] In an embodiment, the mobile robotic device may receive 200
information regarding an edge from the video camera. The
information may include values associated with the slope, the
intercept and/or the error of a line detected by the video camera.
For example, the slope, intercept and/or error associated with a
substantially straight line from an image of a surface, such as a
ground surface, a floor surface or other surface over which the
mobile robotic device is travelling may be provided to the mobile
robotic device. In an embodiment, the video camera may repetitively
send updated information to the mobile robotic device. For example,
the video camera may send the mobile robotic device updated slope
values, intercept values and/or error values associated with a
detected line several times per second.
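[0021.1] The slope, intercept, and error values described above can be produced by an ordinary least-squares fit over the edge pixels found in a video frame. The sketch below is a minimal, hypothetical illustration; the patent does not specify which fitting algorithm the camera uses.

```python
# Hypothetical sketch: fit a straight line to detected edge pixels and
# report the slope, intercept, and error values the camera might send
# to the robot several times per second. No specific camera API is implied.

def fit_edge_line(points):
    """Least-squares fit of y = slope*x + intercept to (x, y) edge pixels.

    Returns (slope, intercept, error), where error is the mean absolute
    residual of the fit (a rough confidence measure for the line).
    """
    n = len(points)
    mean_x = sum(x for x, _ in points) / n
    mean_y = sum(y for _, y in points) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in points)
    var = sum((x - mean_x) ** 2 for x, _ in points)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    error = sum(abs(y - (slope * x + intercept)) for x, y in points) / n
    return slope, intercept, error

# A perfectly straight edge yields zero residual error.
pts = [(0, 1.0), (1, 3.0), (2, 5.0), (3, 7.0)]
print(fit_edge_line(pts))  # (2.0, 1.0, 0.0)
```

A real detector would first isolate candidate edge pixels (e.g., by color contrast, as the text suggests) before fitting.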
[0022] In an embodiment, a mobile robotic device may use this
information to determine its position relative to the detected
edge. For example, a mobile robotic device may compare information
about its current position, such as its coordinates, heading and/or
the like, with information received from the video camera regarding
the edge. The mobile robotic device may use this comparison to
determine its position relative to the edge, its distance away from
the edge and/or the like.
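[0022.1] In the simplest case, determining the distance from the edge reduces to a point-to-line distance. A minimal sketch, assuming the edge is reported as a line y = slope*x + intercept in the robot's own coordinate frame (an assumption for illustration):

```python
import math

def distance_to_edge(x, y, slope, intercept):
    """Perpendicular distance from the robot position (x, y) to a
    detected edge reported as the line y = slope*x + intercept."""
    return abs(slope * x - y + intercept) / math.sqrt(slope ** 2 + 1)

# Robot at (0, 5), horizontal edge at y = 2: distance is 3 units.
print(distance_to_edge(0.0, 5.0, 0.0, 2.0))  # 3.0
```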
[0023] In an embodiment, a mobile robotic device may detect an edge
located in the direction of the mobile robotic device's movement.
For example, a mobile robotic device may detect an edge in front of
the mobile robotic device when the mobile robotic device is moving
forward. A mobile robotic device may detect an edge that is
substantially perpendicular to its movement. For example, a mobile
robotic device may detect an edge associated with a wall or other
barrier that the mobile robotic device is approaching. In an
embodiment, a mobile robotic device may adjust its motion based on
its proximity to a detected edge. For example, a mobile robotic
device may turn itself around or otherwise change its course when
it comes within two feet of the detected edge. Additional and/or
alternate proximities may be used within the scope of this
disclosure.
[0024] In an embodiment, the mobile robotic device may determine
205 whether to use any of the information received from the video
camera to adjust its position. For example, the mobile robotic
device may receive 210 positional information from one or more
sensors, such as sonar, radar and/or the like. The mobile robotic
device may compare 215 the information that it receives from its
sensors and the information that it receives from the video camera
to information from a map of an area or environment in which the
mobile robotic device is navigating.
[0025] In an embodiment, a map may be generated by the mobile
robotic device and/or another computing device during the mobile
robotic device's first navigation of the area. For example, the
mobile robotic device may be manually controlled through an area
during which time the mobile robotic device may generate and store
a map of the navigated area. In an embodiment, the mobile robotic
device may compare 215 the information it receives 210 from its
sensors and the information it receives from the video camera to
the digital representation of the map to determine which
information to use. In an embodiment, the mobile robotic device may
use the information that has the least amount of error as compared
to the mobile robotic device's current position. For example, a
mobile robotic device may determine a current position using a map.
It may compare the information received from its sensor and the
information it receives from its video camera to determine which
information has the least amount of error relative to its current
position. The mobile robotic device may adjust its position based
on the information associated with the least amount of error.
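[0025.1] The least-error selection described above can be sketched as choosing, among the candidate position fixes, the one closest to the map-derived current position. The coordinate values below are hypothetical, not from the patent:

```python
def select_position_fix(map_position, candidates):
    """Choose, among candidate (x, y) position estimates (e.g., one from
    sonar/radar sensors and one from the video camera), the estimate with
    the least error relative to the map-derived current position."""
    def err(p):
        return ((p[0] - map_position[0]) ** 2 + (p[1] - map_position[1]) ** 2) ** 0.5
    return min(candidates, key=err)

map_pos = (10.0, 5.0)
sensor_est = (10.8, 5.9)   # hypothetical sonar-derived fix
camera_est = (10.1, 5.2)   # hypothetical camera-derived fix
print(select_position_fix(map_pos, [sensor_est, camera_est]))  # (10.1, 5.2)
```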
[0026] In an embodiment, the mobile robotic device may use the
information associated with an edge and its determined position to
navigate 220 a path. In an embodiment, the mobile robotic device
may navigate 220 a path relative to the edge. For example, the
mobile robotic device may navigate 220 a path that begins at the
determined position and that runs parallel to the edge.
Alternatively, the mobile robotic device may navigate 220 a path
that begins at the determined position and that is not parallel to
the edge.
[0027] In an embodiment, the mobile robotic device may navigate 220
a path such that the edge is located a certain distance from a
reference point on the mobile robotic device. The reference point
may be a side portion, a front portion, a back portion and/or the
like of the mobile robotic device. For example, a mobile robotic
device may use a detected line as a side registration by tracking
the line on the right or left side of the mobile robotic
device.
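[0027.1] Side registration of this kind is commonly implemented as a simple proportional controller on the lateral offset between the robot's reference point and the tracked line. The control law below is an illustrative assumption; the patent does not name one:

```python
def steering_correction(measured_offset, target_offset, gain=0.5):
    """Proportional steering command for side registration: a positive
    value steers toward the tracked edge (the lateral offset is too
    large), a negative value steers away from it (too close)."""
    return gain * (measured_offset - target_offset)

# Edge is 0.8 m away but the robot wants to track it at 0.5 m:
print(steering_correction(0.8, 0.5))  # positive: steer toward the edge
```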
[0028] In an embodiment, a mobile robotic device may include an
automatic cleaning device. An automatic cleaning device may be a
mobile robotic device that can automatically navigate and clean
surfaces, such as floors. FIG. 3 illustrates an exemplary method of
cleaning an area with an automatic cleaning device according to an
embodiment.
[0029] As described above with respect to a mobile robotic device,
an automatic cleaning device may receive 300 information regarding
an edge from the video camera. The information may include values
associated with the slope, the intercept and/or the error of a line
detected by the video camera.
[0030] In an embodiment, an automatic cleaning device may use this
received information to determine its position relative to the
detected edge. For example, an automatic cleaning device may
compare information about its current position, such as its
coordinates, heading and/or the like, with information received
from the video camera regarding the edge.
[0031] In an embodiment, the automatic cleaning device may
determine 305 whether to use any of the information received from
the video camera to adjust its position. For example, the automatic
cleaning device may receive 310 positional information from one or
more sensors, such as sonar, radar and/or the like. The automatic
device may compare 315 the information that it receives from its
sensors and the information that it receives from the video camera
to information from a map of an area in which the automatic
cleaning device is navigating.
[0032] In an embodiment, the automatic cleaning device may compare
315 the information it receives 310 from its sensors and the
information it receives from the video camera to the digital
representation of the map to determine which information to use. In
an embodiment, the automatic cleaning device may use the
information that has the least amount of error as compared to the
automatic cleaning device's current position. For example, an
automatic cleaning device may determine a current position using a
map. It may compare the information received from its sensor and
the information it receives from its video camera to determine
which information has the least amount of error relative to its
current position. The automatic cleaning device may adjust its
position based on the information associated with the least amount
of error.
[0033] In an embodiment, the automatic cleaning device may use the
information associated with an edge and its determined position to
navigate 320 a path. The automatic cleaning device may navigate 320
a path relative to the edge. For example, the automatic cleaning
device may navigate 320 a path that begins at the determined
position and that runs parallel to the edge. In an embodiment, the
automatic cleaning device may navigate 320 the path such that the
edge is located a certain distance from a reference point on the
automatic cleaning device. The reference point may be a side
portion, a front portion, a back portion and/or the like of the
automatic cleaning device. For example, an automatic cleaning
device may use a detected line as a side registration by tracking
the line on the right or left side of the automatic cleaning
device.
[0034] In an embodiment, the automatic cleaning device may clean
325 at least a portion of its navigated path. For example, an
automatic cleaning device may be able to determine one or more
characteristics corresponding to its determined position. A
characteristic may include, for example, a floor type, such as hard
wood, tile, carpet and/or the like. A characteristic may include
whether the position is in a high-traffic area, a low-traffic area
and/or the like. In an embodiment, an automatic cleaning device may
determine an appropriate cleaning method based on the determined
characteristics. For example, the automatic cleaning device may
vacuum a carpeted area, scrub and/or wax a hardwood or tile area
and/or the like.
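[0034.1] The characteristic-to-cleaning-method decision can be sketched as a lookup table. The specific mapping below, and the extra pass for high-traffic areas, are assumptions for illustration only:

```python
# Hypothetical mapping from floor type to cleaning action, mirroring the
# vacuum / scrub / wax examples in the text. The names are illustrative.
CLEANING_METHODS = {
    "carpet": "vacuum",
    "hardwood": "scrub_and_wax",
    "tile": "scrub_and_wax",
}

def choose_cleaning_method(floor_type, high_traffic=False):
    """Pick a cleaning method from the surface characteristics."""
    method = CLEANING_METHODS.get(floor_type, "vacuum")
    # Assumption: a high-traffic area warrants an extra cleaning pass.
    passes = 2 if high_traffic else 1
    return method, passes

print(choose_cleaning_method("carpet"))      # ('vacuum', 1)
print(choose_cleaning_method("tile", True))  # ('scrub_and_wax', 2)
```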
[0035] FIG. 4 illustrates an exemplary diagram of a mobile robotic
device's movement according to an embodiment. As illustrated by
FIG. 4, a mobile robotic device 400 may track an edge 405 between a
carpeted portion of a floor 410 and a tiled portion of a floor 415.
In an embodiment, the mobile robotic device 400 may move relative
to the edge 405. For example, as illustrated in FIG. 4, the mobile
robotic device 400 may move along the illustrated path 420, which
may run parallel to the edge 405.
[0036] In an embodiment, a mobile robotic device may navigate its
surroundings based on its location relative to an area of a
particular color. An area may be a surface area, an item, a
landmark and/or the like. For example, a mobile robotic device may
navigate an area based on its relative position to a green chair
located in a room.
[0037] FIG. 5 illustrates an exemplary method of navigating a
mobile robotic device according to an embodiment. As illustrated by
FIG. 5, a mobile robotic device may send 500 an information request
to the video camera. The information request may include an
indication of a color and a tolerance associated with the color. In
an embodiment, an indication of a color may be a color name, a
number associated with a color and/or the like. For example, a
mobile robotic device may send 500 an instruction to the video
camera asking it to search for the color green. The instruction may
also include a tolerance value which may instruct the video camera
as to a range of colors that may be accepted.
[0038] In an embodiment, the mobile robotic device may receive 505
information from the video camera. For example, the mobile robotic
device may receive 505 from the video camera coordinates associated
with a substantially central portion of an area corresponding to
the specified color. The mobile robotic device may also receive 505
coordinates of one or more corners of a rectangle that encompasses
the detected area. For example, the video camera may send the
mobile robotic device coordinates of the corners of a bounding box
that is just large enough to completely encompass the detected
area. In an embodiment, the mobile robotic device may receive a
number of pixels of the color that were detected in the area.
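[0038.1] The three quantities described in this paragraph — center coordinates, bounding-rectangle corners, and pixel count — can all be computed from the set of pixels that matched the specified color. A minimal sketch:

```python
def blob_statistics(matching_pixels):
    """Summarize a detected color area as the text describes: the
    coordinates of its approximate center, the corners of the smallest
    rectangle that encompasses it, and its pixel count."""
    xs = [x for x, _ in matching_pixels]
    ys = [y for _, y in matching_pixels]
    centroid = (sum(xs) / len(xs), sum(ys) / len(ys))
    bounding_box = ((min(xs), min(ys)), (max(xs), max(ys)))
    return centroid, bounding_box, len(matching_pixels)

# A 2x2 block of matching pixels:
pixels = [(2, 3), (3, 3), (2, 4), (3, 4)]
print(blob_statistics(pixels))  # ((2.5, 3.5), ((2, 3), (3, 4)), 4)
```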
[0039] In an embodiment, the mobile robotic device may receive 505
from the video camera a numerical value associated with a detected
color and/or an associated tolerance range. For example, each color
that is detected by the video camera may be assigned a numerical
value. The numerical value associated with a detected color may be
compared to a numerical value associated with the requested color.
If the numerical value associated with the detected color is within
a tolerance range (i.e., the numerical value associated with the
requested color +/- the tolerance value), the detected color may be
accepted. The mobile robotic device may receive from the video
camera the value of the detected color and/or the tolerance
range.
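[0039.1] The acceptance test described here — the requested value plus or minus the tolerance — is a one-line comparison. A minimal sketch, treating each color as a single numerical value as the text does:

```python
def accept_color(detected_value, requested_value, tolerance):
    """Accept the detected color if its numerical value falls within
    requested_value +/- tolerance, as described in the text."""
    return abs(detected_value - requested_value) <= tolerance

print(accept_color(118, 120, 5))  # True: within 120 +/- 5
print(accept_color(130, 120, 5))  # False: outside the tolerance range
```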
[0040] In an embodiment, the mobile robotic device may determine
510 whether to use any of the information received from the video
camera in navigating its path. For example, the mobile robotic
device may receive 515 positional information from one or more
sensors, such as sonar, radar and/or the like. In an embodiment,
the mobile robotic device may compare 520 the information that it
receives from its sensors and the information that it receives from
the video camera to information from a map of an area in which the
mobile robotic device is navigating.
[0041] In an embodiment, a map may be generated by the mobile
robotic device and/or another computing device during the mobile
robotic device's first navigation of the area. For example, the
mobile robotic device may be manually controlled through an area
during which time the mobile robotic device may generate and store
a map of the navigated area. In an alternate embodiment, a map of
an area may be generated manually. For example, a map may be
generated by a user and a digital representation of the map may be
stored electronically.
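Storing a digital representation of such a map might look like the following sketch. The pose format and file layout are assumptions for illustration; the disclosure does not specify a map format.

```python
import json

# Hypothetical sketch of recording a map while the device is driven
# manually: each pose the device reports is appended to a list that is
# then stored electronically as a digital representation of the area.

def record_map(poses, path="area_map.json"):
    """Store a sequence of (x, y, heading) poses as a JSON map file."""
    with open(path, "w") as f:
        json.dump([{"x": x, "y": y, "heading": h} for x, y, h in poses], f)

def load_map(path="area_map.json"):
    """Load the stored digital representation of the map."""
    with open(path) as f:
        return json.load(f)
```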
[0042] In an embodiment, the mobile robotic device may compare 520
the information it receives 515 from its sensors and the
information it receives from the video camera to the digital
representation of the map to determine which information to use. In
an embodiment, the mobile robotic device may use the information
that has the least amount of error as compared to the mobile
robotic device's current position. For example, a mobile robotic
device may determine a current position using a map. It may compare
the information received from its sensors and the information it
receives from its video camera to determine which information has
the least amount of error relative to its current position. The
mobile robotic device may adjust its position based on the
information associated with the least amount of error.
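The least-error selection described above can be sketched as follows. The function name and the dictionary of estimates are assumptions for illustration: the source (sensors or video camera) whose position estimate lies closest to the map-derived current position is the one used.

```python
import math

# Illustrative sketch of choosing between sensor-derived and
# camera-derived position estimates: the estimate with the least
# Euclidean error relative to the current position is selected.

def least_error_source(current_pos, estimates):
    """Return the (source, position) pair whose position has the
    smallest error relative to current_pos.

    estimates maps a source name (e.g. "sonar", "camera") to an
    (x, y) position estimate."""
    def error(pos):
        return math.hypot(pos[0] - current_pos[0], pos[1] - current_pos[1])
    source = min(estimates, key=lambda s: error(estimates[s]))
    return source, estimates[source]
```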
[0043] In an embodiment, the mobile robotic device may adjust 525
its position based on the most accurate information. The mobile
robotic device may use the information to adjust 525 its position,
heading and/or the like. For example, an estimated position of the
mobile robotic device may be determined. The estimated position may
include coordinates, a heading and/or the like. In an embodiment,
an actual position may be determined based on the information
associated with the detected area that is received by the mobile
robotic device. The estimated position may be compared to the
actual position, and the mobile robotic device may adjust 525 its
position to compensate for any positional error that may exist. For
example, the mobile robotic device may navigate from its actual
position to the estimated position.
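The compensation step above can be sketched as computing the difference between the estimated and actual poses. The pose representation (x, y, heading in degrees) and function name are assumptions for illustration.

```python
# Sketch of compensating for positional error: the difference between
# the estimated pose and the actual pose is the correction the device
# applies to navigate from its actual position to the estimated one.

def position_correction(estimated, actual):
    """Return the (dx, dy, dheading) needed to move from the actual
    pose to the estimated pose. Poses are (x, y, heading_degrees)."""
    dx = estimated[0] - actual[0]
    dy = estimated[1] - actual[1]
    dheading = (estimated[2] - actual[2]) % 360.0
    return dx, dy, dheading
```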
[0044] FIG. 6 illustrates an exemplary method of navigating an
automatic cleaning device according to an embodiment. As
illustrated by FIG. 6, an automatic cleaning device may send 600 an
information request to the video camera. The information request
may include an indication of a color and an associated tolerance
value, which may instruct the video camera as to a range of colors
that may be accepted.
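A possible shape for such an information request is sketched below. The field names and the dictionary representation are assumptions for illustration; the disclosure does not specify a message format.

```python
# Hypothetical sketch of the information request sent to the video
# camera: a requested color value plus a tolerance that tells the
# camera which range of color values to accept.

def build_information_request(color_value, tolerance):
    """Build a request for detections of a color within a tolerance."""
    return {
        "requested_color": color_value,
        "tolerance": tolerance,
        # The camera would accept detections in [color - tol, color + tol].
        "accepted_range": (color_value - tolerance, color_value + tolerance),
    }
```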
[0045] In an embodiment, the automatic cleaning device may receive
605 information from the video camera. For example, the automatic
cleaning device may receive 605 from the video camera coordinates
associated with a substantially central portion of an area
corresponding to the specified color. The automatic cleaning device
may also receive 605 coordinates of one or more corners of a
rectangle that encompasses the detected area. In an embodiment, the
automatic cleaning device may receive a number of pixels of the
color that were detected in the area.
[0046] In an embodiment, the automatic cleaning device may receive
605 from the video camera a numerical value associated with a
detected color and/or an associated tolerance range. For example,
each color that is detected by the video camera may be assigned a
numerical value. The numerical value associated with a detected
color may be compared to a numerical value associated with the
requested color. If the numerical value associated with the
detected color is within a tolerance range (i.e., the numerical
value associated with the requested color +/- the tolerance value),
the detected color may be accepted. The automatic cleaning device
may receive from the video camera the value of the detected color
and/or the tolerance range.
[0047] In an embodiment, the automatic cleaning device may
determine 610 whether to use any of the information received from
the video camera in navigating its path. For example, the automatic
cleaning device may receive 615 positional information from one or
more sensors, such as sonar, radar and/or the like. In an
embodiment, the automatic cleaning device may compare 620 the
information that it receives from its sensors and the information
that it receives from the video camera to information from a map of
an area in which the automatic cleaning device is navigating.
[0048] In an embodiment, a map may be generated by the automatic
cleaning device and/or another computing device during the
automatic cleaning device's first navigation of the area. For
example, the automatic cleaning device may be manually controlled
through an area, during which time the automatic cleaning device may
generate and store a map of the navigated area. In an embodiment,
the automatic cleaning device may compare 620 the information it
receives 615 from its sensors and the information it receives from
the video camera to the digital representation of the map to
determine which information to use.
[0049] In an embodiment, the automatic cleaning device may use the
information that has the least amount of error as compared to the
automatic cleaning device's current position. For example, an
automatic cleaning device may determine a current position using a
map. It may compare the information received from its sensors and
the information it receives from its video camera to determine
which information has the least amount of error relative to its
current position. The automatic cleaning device may adjust its
position based on the information associated with the least amount
of error.
[0050] In an embodiment, the automatic cleaning device may adjust
625 its position based on the most accurate information. The
automatic cleaning device may use the information to adjust 625 its
position, heading and/or the like. For example, an estimated
position of the automatic cleaning device may be determined. The
estimated position may include coordinates, a heading and/or the
like. In an embodiment, an actual position may be determined based
on the information associated with the detected area that is
received by the automatic cleaning device. The estimated position
may be compared to the actual position, and the automatic cleaning
device may adjust 625 its position to compensate for any positional
error that may exist. For example, the automatic cleaning device
may navigate from its actual position to the estimated
position.
[0051] In an embodiment, the automatic cleaning device may clean
630 at least a portion of its navigated path. For example, an
automatic cleaning device may be able to determine one or more
characteristics corresponding to its determined position. A
characteristic may include, for example, a floor type, such as hard
wood, tile, carpet and/or the like. A characteristic may include
whether the position is in a high-traffic area, a low-traffic area
and/or the like. In an embodiment, an automatic cleaning device may
determine an appropriate cleaning method based on the determined
characteristics. For example, the automatic cleaning device may
vacuum a carpeted area, scrub and/or wax a hardwood or tile area
and/or the like.
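The selection of a cleaning method from a position's characteristics can be sketched as a lookup. The table entries, default method, and function name are assumptions for illustration drawn from the examples above.

```python
# Illustrative mapping from a position's floor-type characteristic to
# a cleaning method, following the examples in the text: vacuum a
# carpeted area, scrub and/or wax a hardwood or tile area.

CLEANING_METHODS = {
    "carpet": "vacuum",
    "hardwood": "scrub_and_wax",
    "tile": "scrub_and_wax",
}

def cleaning_method(floor_type):
    """Choose a cleaning method for the floor type, falling back to a
    dry sweep (an assumed default) when the type is unknown."""
    return CLEANING_METHODS.get(floor_type, "sweep")
```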
[0052] It will be appreciated that various of the above-disclosed
and other features and functions, or alternatives thereof, may be
desirably combined into many other different systems or
applications. Also, various presently unforeseen or unanticipated
alternatives, modifications, variations or improvements therein may
be subsequently made by those skilled in the art, which are also
intended to be encompassed by the following claims.
* * * * *