U.S. patent application number 12/240,596 was filed with the patent office on 2008-09-29 and published on 2009-03-26 as publication number 20090080800, for multiple index mixed media reality recognition using unequal priority indexes.
The invention is credited to Berna Erol, Jonathan J. Hull, and Jorge Moraleda.
Application Number: 12/240596
Publication Number: 20090080800
Family ID: 40471708
Filed: 2008-09-29
Published: 2009-03-26

United States Patent Application 20090080800
Kind Code: A1
Moraleda, Jorge; et al.
March 26, 2009
Multiple Index Mixed Media Reality Recognition Using Unequal Priority Indexes
Abstract
An MMR system for processing image queries across index tables
with unequal priority comprises a plurality of mobile devices, a
pre-processing server or MMR gateway, and an MMR matching unit, and
may include an MMR publisher. The MMR matching unit receives an
image query from the pre-processing server or MMR gateway and sends
it to one or more of the recognition units to identify a result
including a document, the page, and the location on the page. The
MMR matching unit includes a dispatcher, a plurality of recognition
units, and index tables, as well as an image registration unit. In
one embodiment, the system includes an MMR matching plug-in
installed on the mobile device. The present invention also includes
methods for processing image queries across index tables of unequal
priority and updating a high priority index based on received or
projected image queries.
Inventors: Moraleda, Jorge (Menlo Park, CA); Erol, Berna (San Jose, CA); Hull, Jonathan J. (San Carlos, CA)
Correspondence Address: RICOH/FENWICK, SILICON VALLEY CENTER, 801 CALIFORNIA STREET, MOUNTAIN VIEW, CA 94041, US
Family ID: 40471708
Appl. No.: 12/240596
Filed: September 29, 2008
Related U.S. Patent Documents

Application Number | Filing Date | Related Application Number
11461017 | Jul 31, 2006 | 12240596
11461279 | Jul 31, 2006 | 11461017
11461286 | Jul 31, 2006 | 11461279
11461294 | Jul 31, 2006 | 11461286
11461300 | Jul 31, 2006 | 11461294
11461126 | Jul 31, 2006 | 11461300
11461143 | Jul 31, 2006 | 11461126
11461268 | Jul 31, 2006 | 11461143
11461272 | Jul 31, 2006 | 11461268
11461064 | Jul 31, 2006 | 11461272
11461075 | Jul 31, 2006 | 11461064
11461090 | Jul 31, 2006 | 11461075
11461037 | Jul 31, 2006 | 11461090
11461085 | Jul 31, 2006 | 11461037
11461091 | Jul 31, 2006 | 11461085
11461095 | Jul 31, 2006 | 11461091
11466414 | Aug 22, 2006 | 11461095
11461147 | Jul 31, 2006 | 11466414
11461164 | Jul 31, 2006 | 11461147
11461024 | Jul 31, 2006 | 11461164
11461032 | Jul 31, 2006 | 11461024
11461049 | Jul 31, 2006 | 11461032
11461109 | Jul 31, 2006 | 11461049
11827530 | Jul 11, 2007 | 11461109
12060194 | Mar 31, 2008 | 11827530
12059583 | Mar 31, 2008 | 12060194
12060198 | Mar 31, 2008 | 12059583
12060200 | Mar 31, 2008 | 12060198
12060206 | Mar 31, 2008 | 12060200
12121275 | May 15, 2008 | 12060206
11776510 | Jul 11, 2007 | 12121275
11776520 | Jul 11, 2007 | 11776510
11776530 | Jul 11, 2007 | 11776520
11777142 | Jul 12, 2007 | 11776530
12210511 | Sep 15, 2008 | 11777142
12210519 | Sep 15, 2008 | 12210511
12210532 | Sep 15, 2008 | 12210519
12210540 | Sep 15, 2008 | 12210532
Current U.S. Class: 382/276
Current CPC Class: G06F 21/78 20130101; G06F 21/6218 20130101; G06K 9/00463 20130101; G06F 2221/2115 20130101; G06F 16/583 20190101; G06F 16/955 20190101; G06K 9/6262 20130101; G06K 9/00993 20130101; G06F 21/80 20130101
Class at Publication: 382/276
International Class: G06K 9/36 20060101 G06K009/36
Claims
1. A method of processing image queries across index tables with
unequal priority, comprising: receiving an image query and an index
priority; submitting the image query to a high priority index for
recognition according to the index priority; responsive to
unsuccessful recognition at the high priority index, submitting the
image query to one or more lower priority indexes for recognition
according to the index priority; receiving recognition results; and
transmitting the recognition results.
2. The method of claim 1, wherein the high priority index is
specific to a user of a mobile device issuing the received image
query.
3. The method of claim 1, wherein the high priority index is on a
mobile device.
4. The method of claim 1, wherein the high priority index is
selected based on a geographical location of a user of a mobile
device issuing the received image query.
5. The method of claim 4, wherein the geographical location of the
high priority index is separate from a location associated with the
one or more lower priority indexes.
6. A method of updating a high priority index based on received
image queries, comprising: receiving an image query and associated
query receipt information; determining whether a document page
corresponding to the image query exists in the high priority index;
responsive to the document page not existing in the high priority
index, determining whether a document page count is greater than a
threshold for the high priority index; and responsive to the
document page existing in the high priority index or the document
page count exceeding the threshold for the high priority index,
updating the high priority index with the associated query receipt
information for the received image query.
7. The method of claim 6, wherein content of the high priority
index is based upon popularity of image queries across a set of
users.
8. The method of claim 6, wherein content of the high priority
index is based upon popularity of image queries for a particular
user.
9. The method of claim 6, wherein updating the high priority index
occurs after each image query is received.
10. The method of claim 6, wherein updating the high priority index
occurs as a batch update at the end of a selected time
interval.
11. The method of claim 6, wherein updating the high priority index
further comprises: responsive to a determination that the high
priority index is full, selecting a document page for removal from
the high priority index; upon removal of the selected document page
from the high priority index, adding the received image query to
the high priority index.
12. The method of claim 11, wherein the selected document page is
selected based upon one from the group of oldest timestamp, lowest
count, and lowest weight.
13. A method of updating a high priority index based on image
queries likely to be received, comprising: predicting an image
query expected to be submitted to the high priority index during a
selected future interval; determining whether a document page
associated with the image query exists in the high priority index;
responsive to the document page not existing in the high priority
index, adding the document page to the high priority index.
14. The method of claim 13, wherein content of the high priority
index is based upon popularity of document pages across a set of
users.
15. The method of claim 13, wherein content of the high priority
index is based upon popularity of document pages for a particular
user.
16. The method of claim 13, wherein predicting an image query
expected to be submitted to the high priority index during the
selected future interval is based upon specificity of a given
document page to the selected future interval.
17. The method of claim 13, wherein predicting an image query
expected to be submitted to the high priority index during the
selected future interval is based upon similarity of a given query
to recently received image queries.
18. The method of claim 13, wherein updating the high priority
index occurs as a batch update at the end of a selected time
interval.
19. The method of claim 13, wherein updating the high priority
index further comprises: responsive to a determination that the
high priority index is full, selecting a document page for removal
from the high priority index; upon removal of the selected document
page from the high priority index, adding the received image query
to the high priority index.
20. The method of claim 19, wherein the selected document page is
selected based upon one from the group of oldest timestamp, lowest
count, and lowest weight.
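As an informal illustration of the update policy recited in claims 6-12 (and mirrored for projected queries in claims 13-20), the following Python sketch maintains a bounded high priority index, admits a document page only once its query count exceeds a threshold, and evicts pages by oldest timestamp, lowest count, or lowest weight when the index is full. Every class, method, and field name here is hypothetical; this is a sketch of the claimed policy, not the patented implementation.

```python
import time


class HighPriorityIndex:
    """Toy model of a high priority index holding a bounded set of document pages."""

    def __init__(self, capacity, count_threshold):
        self.capacity = capacity                  # maximum number of document pages kept
        self.count_threshold = count_threshold    # admission threshold on the page count
        self.pages = {}                           # page_id -> query receipt information
        self.pending_counts = {}                  # counts for pages not yet in the index

    def update(self, page_id, weight=1.0):
        """Apply the update policy to one received image query for page_id."""
        now = time.time()
        if page_id in self.pages:
            # Document page already exists in the index: refresh its receipt information.
            record = self.pages[page_id]
            record["count"] += 1
            record["timestamp"] = now
            record["weight"] += weight
            return
        # Page not yet in the index: only admit it once its count exceeds the threshold.
        self.pending_counts[page_id] = self.pending_counts.get(page_id, 0) + 1
        if self.pending_counts[page_id] <= self.count_threshold:
            return
        if len(self.pages) >= self.capacity:
            self.evict_one()                      # index full: remove a page first
        self.pages[page_id] = {"count": self.pending_counts.pop(page_id),
                               "timestamp": now, "weight": weight}

    def evict_one(self):
        """Remove the page with the oldest timestamp, breaking ties by lowest count and weight."""
        victim = min(self.pages,
                     key=lambda p: (self.pages[p]["timestamp"],
                                    self.pages[p]["count"],
                                    self.pages[p]["weight"]))
        del self.pages[victim]
```

Such an update could run after every received query (as in claim 9) or be applied as a batch at the end of a selected time interval (as in claim 10) by replaying logged queries through the update routine.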
Description
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] This application is a continuation-in-part of U.S. patent
application Ser. No. 11/461,017, titled "System And Methods For
Creation And Use Of A Mixed Media Environment," filed Jul. 31,
2006, attorney docket #20412-11713; U.S. patent application Ser.
No. 11/461,279, titled "Method And System For Image Matching In A
Mixed Media Environment," filed Jul. 31, 2006, attorney docket
#20412-11714; U.S. patent application Ser. No. 11/461,286, titled
"Method And System For Document Fingerprinting Matching In A Mixed
Media Environment," filed Jul. 31, 2006, attorney docket
#20412-11715; U.S. patent application Ser. No. 11/461,294, titled
"Method And System For Position-Based Image Matching In A Mixed
Media Environment," filed Jul. 31, 2006, attorney docket
#20412-11716; U.S. patent application Ser. No. 11/461,300, titled
"Method And System For Multi-Tier Image Matching In A Mixed Media
Environment," filed Jul. 31, 2006, attorney docket #20412-11717;
U.S. patent application Ser. No. 11/461,126, titled "Integration
And Use Of Mixed Media Documents," filed Jul. 31, 2006, attorney
docket #20412-11718; U.S. patent application Ser. No. 11/461,143,
titled "User Interface For Mixed Media Reality," filed Jul. 31,
2006, attorney docket #20412-11719; U.S. patent application Ser.
No. 11/461,268, titled "Authoring Tools Using A Mixed Media
Environment," filed Jul. 31, 2006, attorney docket #20412-11720;
U.S. patent application Ser. No. 11/461,272, titled "System And
Methods For Creation And Use Of A Mixed Media Environment With
Geographic Location Information," filed Jul. 31, 2006, attorney
docket #20412-11721; U.S. patent application Ser. No. 11/461,064,
titled "System And Methods For Portable Device For Mixed Media
System," filed Jul. 31, 2006, attorney docket #20412-11722; U.S.
patent application Ser. No. 11/461,075, titled "System And Methods
For Use Of Voice Mail And Email In A Mixed Media Environment,"
filed Jul. 31, 2006, attorney docket #20412-11723; U.S. patent
application Ser. No. 11/461,090, titled "System And Method For
Using Individualized Mixed Document," filed Jul. 31, 2006, attorney
docket #20412-11724; U.S. patent application Ser. No. 11/461,037,
titled "Embedding Hot Spots In Electronic Documents," filed Jul.
31, 2006, attorney docket #20412-11725; U.S. patent application
Ser. No. 11/461,085, titled "Embedding Hot Spots In Imaged
Documents," filed Jul. 31, 2006, attorney docket #20412-11726; U.S.
patent application Ser. No. 11/461,091, titled "Shared Document
Annotation," filed Jul. 31, 2006, attorney docket #20412-11727;
U.S. patent application Ser. No. 11/461,095, titled
"Visibly-Perceptible Hot Spots In Documents," filed Jul. 31, 2006,
attorney docket #20412-11728; U.S. patent application Ser. No.
11/466,414, titled "Mixed Media Reality Brokerage Network and
Methods of Use," filed Aug. 22, 2006, attorney docket #20412-11729;
U.S. patent application Ser. No. 11/461,147, titled "Data
Organization and Access for Mixed Media Document System," filed
Jul. 31, 2006, attorney docket #20412-11730; U.S. patent
application Ser. No. 11/461,164, titled "Database for Mixed Media
Document System," filed Jul. 31, 2006, attorney docket
#20412-11731; U.S. patent application Ser. No. 11/461,024, titled
"Triggering Actions With Captured Input In A Mixed Media
Envornment," filed Jul. 31, 2006, attorney docket #20412-11732;
U.S. patent application Ser. No. 11/461,032, titled "Triggering
Applications Based On A Captured Text In A Mixed Media
Environment," filed Jul. 31, 2006, attorney docket #20412-11733;
U.S. patent application Ser. No. 11/461,049, titled "Triggering
Applications For Distributed Action Execution And Use Of Mixed
Media Recognition As A Control Input," filed Jul. 31, 2006,
attorney docket #20412-11734; U.S. patent application Ser. No.
11/461,109, titled "Searching Media Content For Objects Specified
Using Identifiers," filed Jul. 31, 2006, attorney docket
#20412-11735; U.S. patent application Ser. No. 11/827,530, titled
"User Interface For Three-Dimensional Navigation," filed Jul. 11,
2007, attorney docket #20412-13180; U.S. patent application Ser.
No. 12/060,194, titled "Document-Based Networking With Mixed Media
Reality," filed Mar. 31, 2008, attorney docket #20412-13396; U.S.
patent application Ser. No. 12/059,583, titled "Invisible Junction
Feature Recognition For Document Security Or Annotation," filed
Mar. 31, 2008, attorney docket #20412-13397; U.S. patent
application Ser. No. 12/060,198, titled "Document Annotation
Sharing," filed Mar. 31, 2008, attorney docket #20412-13901; U.S.
patent application Ser. No. 12/060,200, titled "Ad Hoc Paper-Based
Networking With Mixed Media Reality," filed Mar. 31, 2008, attorney
docket #20412-13902; U.S. patent application Ser. No. 12/060,206,
titled "Indexed Document Modification Sharing With Mixed Media
Reality," filed Mar. 31, 2008, attorney docket #20412-13903; U.S.
patent application Ser. No. 12/121,275, titled "Web-Based Content
Detection In Images, Extraction And Recognition," filed May 15,
2008, attorney docket #20412-14041; U.S. patent application Ser.
No. 11/776,510, titled "Invisible Junction Features For Patch
Recognition," filed Jul. 11, 2007, attorney docket #20412-12829;
U.S. patent application Ser. No. 11/776,520, titled "Information
Retrieval Using Invisible Junctions and Geometric Constraints,"
filed Jul. 11, 2007, attorney docket #20412-13136; U.S. patent
application Ser. No. 11/776,530, titled "Recognition And Tracking
Using Invisible Junctions," filed Jul. 11, 2007, attorney docket
#20412-13137; and U.S. patent application Ser. No. 11/777,142,
titled "Retrieving Documents By Converting Them to Synthetic Text,"
filed Jul. 12, 2007, attorney docket #20412-12590; U.S. patent
application Ser. No. 12/210,511, titled "Architecture For Mixed
Media Reality Retrieval Of Locations And Registration Of Images,"
filed Sep. 15, 2008; U.S. patent application Ser. No. 12/210,519,
titled "Automatic Adaption Of An Image Recognition System To Image
Capture Devices," filed Sep. 15, 2008; U.S. patent application Ser.
No. 12/210,532, titled "Computation Of A Recognizability Score
(Quality Predictor) For Image Retrieval," filed Sep. 15, 2008; U.S.
patent application Ser. No. 12/210,540, titled "Combining Results
Of Image Retrieval Processes," filed Sep. 15, 2008; and is related
to U.S. patent application Ser. No. ______, titled "Mixed Media
Reality Recognition Using Multiple Specialized Indexes," filed Sep.
29, 2008; all of which are incorporated by reference herein in
their entirety.
BACKGROUND OF THE INVENTION
[0002] 1. Field of the Invention
[0003] The invention relates to techniques for indexing and
searching for mixed media documents formed from at least two media
types, and more particularly, to recognizing images and other data
using multiple-index Mixed Media Reality (MMR) recognition that
uses printed media in combination with electronic media to retrieve
mixed media documents.
[0004] 2. Background of the Invention
[0005] Document printing and copying technology has been used for
many years in many contexts. By way of example, printers and
copiers are used in commercial office environments, in home
environments with personal computers, and in document printing and
publishing service environments. However, printing and copying
technology has not been thought of previously as a means to bridge
the gap between static printed media (i.e., paper documents), and
the "virtual world" of interactivity that includes the likes of
digital communication, networking, information provision,
advertising, entertainment and electronic commerce.
[0006] Printed media has been the primary source of communicating
information, such as newspapers and advertising information, for
centuries. The advent and ever-increasing popularity of personal
computers and personal electronic devices, such as personal digital
assistant (PDA) devices and cellular telephones (e.g., cellular
camera phones), over the past few years has expanded the concept of
printed media by making it available in an electronically readable
and searchable form and by introducing interactive multimedia
capabilities, which are unparalleled by traditional printed
media.
[0007] Unfortunately, a gap exists between the electronic
multimedia-based world that is accessible electronically and the
physical world of print media. For example, although almost
everyone in the developed world has access to printed media and to
electronic information on a daily basis, users of printed media and
of personal electronic devices do not possess the tools and
technology required to form a link between the two (i.e., for
facilitating a mixed media document).
[0008] Moreover, there are particular advantageous attributes that
conventional printed media provides such as tactile feel, no power
requirements, and permanency for organization and storage, which
are not provided with virtual or digital media. Likewise, there are
particular advantageous attributes that conventional digital media
provides such as portability (e.g., carried in storage of cell
phone or laptop) and ease of transmission (e.g., email).
[0009] One particular problem is that a publisher cannot allow
access to electronic versions of content using printed versions of
the content. For example, for the publisher of a newspaper there is
no mechanism that allows its users who receive the printed
newspaper on a daily basis to use images of the newspaper to access
the same online electronic content as well as augmented content.
Moreover, while the publisher typically has the content for the
daily newspaper in electronic form prior to printing, there
currently does not exist a mechanism to easily migrate that content
into an electronic form with augmented content.
[0010] A second problem in the prior art is that the image capture
devices that are most prevalent and common as part of mobile
computing devices (e.g., cell phones) produce low-quality images.
In attempting to compare the low-quality images to pristine
versions of printed documents, recognition is very difficult if not
impossible. Thus there is a need for a method for recognizing
low-quality images.
[0011] A third problem in the prior art is that the image
recognition process is computationally very expensive and can
require seconds if not minutes to accurately recognize the page and
location of a pristine document from an input query image. This can
especially be a problem with a large data set, for example,
millions of pages of documents. Thus, there is a need for
mechanisms to improve the speed in which recognition can be
performed.
[0012] A fourth problem in the prior art is that comparing low-quality
images to a database of pristine images often produces a number of
possible matches. Furthermore, when low-quality images are used as
the query image, multiple different recognition algorithms may be
required in order to produce any match. Currently the prior art
does not have a mechanism to combine the recognition results into a
single result that can be presented to the user.
SUMMARY OF THE INVENTION
[0013] The present invention overcomes the deficiencies of the
prior art with an MMR system for use in processing image queries
across index tables with unequal index priority. The system is
particularly advantageous because it provides smaller, more
specialized indexes that provide faster and/or more accurate search
results. The system is also advantageous because its unique
architecture can be easily adapted and updated.
[0014] In one embodiment, the MMR system comprises a plurality of
mobile devices, a computer, a pre-processing server or MMR gateway,
and an MMR matching unit. Some embodiments also include an MMR
publisher. The mobile devices are communicatively coupled to the
pre-processing server or MMR gateway to send retrieval requests
including image queries and other contextual information. The
pre-processing server or MMR gateway processes the retrieval
request and generates an image query that is passed on to the MMR
matching unit. The MMR matching unit includes a dispatcher, a
plurality of recognition units, and index tables, as well as an
image registration unit. The MMR matching unit receives the image
query and identifies a result including a document, the page, and
the location on the page corresponding to the image query. The MMR
matching unit includes a segmenter for segmenting received images
by content type, a distributor for distributing the images to
corresponding content type index tables, and an integrator for
integrating recognition results according to one embodiment. The
result is returned to the mobile device via the pre-processing
server or MMR gateway. In one embodiment, the system includes an
MMR matching plug-in installed on the mobile device to filter,
pre-process, or search for images on the device before they are
included as part of a retrieval request.
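As a rough sketch of the segment-distribute-integrate flow described above (and not the actual MMR matching unit), the routine below assumes a hypothetical `segmenter` callable and a dictionary of per-content-type recognizers, each returning a (document, page, location) tuple or None:

```python
from collections import Counter


def match_image_query(image_query, segmenter, recognizers):
    """Segment a query by content type, distribute each segment to the recognizer
    (and index table) for that content type, and integrate the per-segment results."""
    candidates = []
    for content_type, patch in segmenter(image_query):
        recognizer = recognizers.get(content_type)
        if recognizer is None:
            continue                              # no index table for this content type
        result = recognizer.recognize(patch)      # (document, page, location) or None
        if result is not None:
            candidates.append(result)
    if not candidates:
        return None                               # recognition failed for every segment
    # Integrator: return the result agreed upon by the most segments.
    return Counter(candidates).most_common(1)[0][0]
```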
[0015] The features and advantages described herein are not
all-inclusive and many additional features and advantages will be
apparent to one of ordinary skill in the art in view of the figures
and description. Moreover, it should be noted that the language
used in the specification has been principally selected for
readability and instructional purposes, and not to limit the scope
of the inventive subject matter.
BRIEF DESCRIPTION OF THE DRAWINGS
[0016] The invention is illustrated by way of example, and not by
way of limitation in the figures of the accompanying drawings in
which like reference numerals are used to refer to similar
elements.
[0017] FIG. 1A is a block diagram of one embodiment of a system of
mixed media reality using multiple indexes in accordance with the
present invention.
[0018] FIG. 1B is a block diagram of another embodiment of a system
of mixed media reality using multiple indexes in accordance with
the present invention.
[0019] FIG. 2A is a block diagram of a first embodiment of a mobile
device, network, and pre-processing server or MMR gateway
configured in accordance with the present invention.
[0020] FIG. 2B is a block diagram of a second embodiment of a
mobile device, network, and pre-processing server or MMR gateway
configured in accordance with the present invention.
[0021] FIGS. 2C-2H are block diagrams of various embodiments of a
mobile device plug-in, pre-processing server or MMR gateway, and
MMR matching unit showing various possible configurations in
accordance with the present invention.
[0022] FIG. 3A is a block diagram of an embodiment of a
pre-processing server in accordance with the present invention.
[0023] FIG. 3B is a block diagram of an embodiment of an MMR
gateway in accordance with the present invention.
[0024] FIG. 4A is a block diagram of a first embodiment of a MMR
matching unit in accordance with the present invention.
[0025] FIG. 4B is a block diagram of a second embodiment of the MMR
matching unit in accordance with the present invention.
[0026] FIG. 4C is a block diagram of a third embodiment of the MMR
matching unit in accordance with the present invention.
[0027] FIG. 5 is a block diagram of an embodiment of a dispatcher
in accordance with the present invention.
[0028] FIGS. 6A-6F are block diagrams showing several
configurations of an image retrieval unit in accordance with
various embodiments of the present invention.
[0029] FIG. 7 is a block diagram of an embodiment of a registration
unit in accordance with the present invention.
[0030] FIG. 8 is a block diagram of an embodiment of a quality
predictor in accordance with the present invention.
[0031] FIG. 9 is a flowchart of an embodiment of a method for
retrieving a document and location from an input image in
accordance with the present invention.
[0032] FIG. 10A is a flowchart of a method of updating a high
priority index using actual image queries received in accordance
with one embodiment of the present invention.
[0033] FIG. 10B is a flowchart of a method of updating a high
priority index using image query projections in accordance with one
embodiment of the present invention.
[0034] FIG. 11 is a flowchart of a method for updating a high
priority index in accordance with an embodiment of the present
invention.
[0035] FIG. 12 is a flowchart of a method for image-feature-based
ordering in accordance with an embodiment of the present
invention.
[0036] FIG. 13 is a flowchart of a method for processing image
queries across multiple index tables in accordance with an
embodiment of the present invention.
[0037] FIG. 14 is a flowchart of a method for segmenting and
processing image queries in accordance with an embodiment of the
present invention.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
[0038] An architecture for a mixed media reality (MMR) system 100
capable of receiving the query images and returning document pages
and location as well as receiving images, hot spots, and other data
and adding such information to the MMR system is described. In the
following description, for purposes of explanation, numerous
specific details are set forth in order to provide a thorough
understanding of the invention. It will be apparent, however, to
one skilled in the art that the invention can be practiced without
these specific details. In other instances, structures and devices
are shown in block diagram form in order to avoid obscuring the
invention. For example, the present invention is described in one
embodiment below with reference to use with a conventional mass
media publisher, in particular a newspaper publisher. However, the
present invention applies to any type of computing systems and data
processing in which multiple types of media including electronic
media and print media are used.
[0039] Reference in the specification to "one embodiment" or "an
embodiment" means that a particular feature, structure, or
characteristic described in connection with the embodiment is
included in at least one embodiment of the invention. The
appearances of the phrase "in one embodiment" in various places in
the specification are not necessarily all referring to the same
embodiment. In particular the present invention is described below
in the context of two distinct architectures and some of the
components are operable in both architectures while others are
not.
[0040] Some portions of the detailed descriptions that follow are
presented in terms of algorithms and symbolic representations of
operations on data bits within a computer memory. These algorithmic
descriptions and representations are the means used by those
skilled in the data processing arts to most effectively convey the
substance of their work to others skilled in the art. An algorithm
is here, and generally, conceived to be a self-consistent sequence
of steps leading to a desired result. The steps are those requiring
physical manipulations of physical quantities. Usually, though not
necessarily, these quantities take the form of electrical or
magnetic signals capable of being stored, transferred, combined,
compared, and otherwise manipulated. It has proven convenient at
times, principally for reasons of common usage, to refer to these
signals as bits, values, elements, symbols, characters, terms,
numbers, or the like.
[0041] It should be borne in mind, however, that all of these and
similar terms are to be associated with the appropriate physical
quantities and are merely convenient labels applied to these
quantities. Unless specifically stated otherwise as apparent from
the following discussion, it is appreciated that throughout the
description, discussions utilizing terms such as "processing" or
"computing" or "calculating" or "determining" or "displaying" or
the like, refer to the action and processes of a computer system,
or similar electronic computing device, that manipulates and
transforms data represented as physical (electronic) quantities
within the computer system's registers and memories into other data
similarly represented as physical quantities within the computer
system memories or registers or other such information storage,
transmission or display devices.
[0042] The present invention also relates to an apparatus for
performing the operations herein. This apparatus may be specially
constructed for the required purposes, or it may comprise a
general-purpose computer selectively activated or reconfigured by a
computer program stored in the computer. Such a computer program
may be stored in a computer readable storage medium, such as, but
not limited to, any type of disk including floppy disks, optical
disks, CD-ROMs, and magnetic-optical disks, read-only memories
(ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or
optical cards, or any type of media suitable for storing electronic
instructions, each coupled to a computer system bus.
[0043] Finally, the algorithms and displays presented herein are
not inherently related to any particular computer or other
apparatus. Various general-purpose systems may be used with
programs in accordance with the teachings herein, or it may prove
convenient to construct more specialized apparatuses to perform the
required method steps. The required structure for a variety of
these systems will appear from the description below. In addition,
the present invention is described without reference to any
particular programming language. It will be appreciated that a
variety of programming languages may be used to implement the
teachings of the invention as described herein.
System Overview
[0044] FIG. 1A shows an embodiment of an MMR system 100a in
accordance with the present invention. The MMR system 100a
comprises a plurality of mobile devices 102a-102n, a pre-processing
server 103, and an MMR matching unit 106. In an alternative
embodiment, the pre-processing server 103 and its functionality are
integrated into the MMR matching unit 106. The present invention
provides an MMR system 100a for processing image queries across
multiple indexes, including high priority indexes, and updating the
same. The MMR system 100a is particularly advantageous because it
provides smaller, more specialized indexes that provide faster
and/or more accurate search results. The MMR system 100a is also
advantageous because its unique architecture can be easily adapted
and updated.
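A minimal sketch of processing an image query across index tables of unequal priority follows, assuming hypothetical index objects exposing a `lookup` method; higher priority indexes are consulted first and lower priority indexes only when recognition fails:

```python
def process_image_query(image_query, prioritized_indexes):
    """prioritized_indexes: iterable of (priority, index) pairs, where a smaller
    priority value means the index is consulted earlier."""
    for _, index in sorted(prioritized_indexes, key=lambda pair: pair[0]):
        result = index.lookup(image_query)        # recognition result or None
        if result is not None:
            return result                         # recognized; stop cascading
    return None                                   # unsuccessful recognition at every index
```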
[0045] The mobile devices 102a-102n are communicatively coupled by
signal lines 132a-132n, respectively, to the pre-processing server
103 to send a "retrieval request." A retrieval request includes one
or more of "image queries," other contextual information, and
metadata. In one embodiment, an image query is an image in any
format, or one or more features of an image. Examples of image
queries include still images, video frames and sequences of video
frames. The mobile devices 102a-102n are mobile computing devices
such as mobile phones, which include a camera to capture images. It
should be understood that the MMR system 100a will be utilized by
thousands or even millions of users. Thus, even though only two
mobile devices 102a, 102n are shown, those skilled in the art will
appreciate that the pre-processing server 103 may be simultaneously
coupled to, receive and respond to retrieval requests from numerous
mobile devices 102a-102n. Alternate embodiments for the mobile
devices 102a-102n are described in more detail below with reference
to FIGS. 2A and 2B.
[0046] As noted above, the pre-processing server 103 is able to
couple to thousands if not millions of mobile computing devices
102a-102n and service their retrieval requests. The pre-processing
server 103 also may be communicatively coupled to the computer 110
by signal line 130 for administration and maintenance of the
pre-processing server 103. The computer 110 can be any conventional
computing device such as a personal computer. The main function of
the pre-processing server 103 is processing retrieval requests from
the mobile devices 102a-102n and returning recognition results back
to the mobile devices 102a-102n. In one embodiment, the recognition
results include one or more of a Boolean value (true/false) and, if
true, a page ID and a location on the page. In other embodiments,
the recognition results also include one or more from the group of
actions, a message acknowledging that the recognition was
successful (or not) and consequences of that decision, such as the
sending of an email message, a document, actions defined within a
portable document file, addresses such as URLs, binary data such as
video, information capable of being rendered on the mobile device
102, menus with additional actions, raster images, image features,
etc. The pre-processing server 103 generates an image query and
recognition parameters from the retrieval request according to one
embodiment, and passes them on to the MMR matching unit 106 via
signal line 134. Embodiments and operation of the pre-processing
server 103 are described in greater detail below with reference to
FIG. 3A.
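The recognition results enumerated above can be pictured as a small record; the field names below are illustrative only and are not defined by the specification:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple


@dataclass
class RecognitionResult:
    recognized: bool                                   # Boolean value: true/false
    page_id: Optional[str] = None                      # page ID, set when recognized is True
    location: Optional[Tuple[float, float]] = None     # location on the page
    message: Optional[str] = None                      # acknowledgement of success or failure
    actions: List[str] = field(default_factory=list)   # e.g. URLs, menu entries, documents
```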
[0047] The MMR matching unit 106 receives the image query from the
pre-processing server 103 on signal line 134 and sends it to one or
more of recognition units to identify a result including a
document, the page and the location on the page corresponding to
the image query, referred to generally throughout this application
as the "retrieval process." The result is returned from the MMR
matching unit 106 to the pre-processing server 103 on signal line
134. In addition to the result, the MMR matching unit 106 may also
return other related information such as hotspot data. The MMR
matching unit 106 also includes components for receiving new
content and updating and reorganizing index tables used in the
retrieval process. The process of adding new content to the MMR
matching unit 106 is referred to generally throughout this
application as the "registration process." Various embodiments of
the MMR matching unit 106 and its components are described in more
detail below with reference to FIGS. 4A-8.
[0048] FIG. 1B shows an embodiment of a MMR system 100b in
accordance with the present invention. The MMR system 100b
comprises a plurality of mobile devices 102a-102n, an MMR gateway
104, an MMR matching unit 106, an MMR publisher 108 and a computer
110. The present invention provides, in one aspect, an MMR system
100b for use in newspaper publishing. The MMR system 100b for
newspaper publishing is particularly advantageous because it provides
an automatic mechanism for a newspaper publisher to register images
and content with the MMR system 100b. The MMR system 100b for
newspaper publishing is also advantageous because it has a unique
architecture adapted to respond to image queries formed of image
portions or pages of a printed newspaper. The MMR system 100b is
also advantageous because it provides smaller, more specialized
indexes that provide faster and/or more accurate search results,
and its unique architecture can be easily adapted and updated.
[0049] The mobile devices 102a-102n are similar to those described
above, except that they are communicatively coupled by signal lines
132a-132n, respectively, to the MMR gateway 104 to send a
"retrieval request," rather than to the pre-processing server 103.
It should be understood that the MMR system 100b will be utilized
by thousands or even millions of users that receive a traditional
publication such as a daily newspaper.
[0050] As noted above, the MMR gateway 104 is able to couple to
hundreds if not thousands of mobile computing devices 102a-102n and
service their retrieval requests. The MMR gateway 104 is also
communicatively coupled to the computer 110 by signal line 130 for
administration and maintenance of the MMR gateway 104 and running
business applications. In one embodiment, the MMR gateway 104
creates and presents a web portal for access by the computer 110 to
run business applications as well as access logs of use of the MMR
system 100b. The computer 110 is any conventional computing device
such as a personal computer. The main function of the MMR gateway
104 is processing retrieval requests from the mobile devices
102a-102n and returning recognition results back to the mobile devices
102a-102n. The types of recognition results produced by the MMR
gateway 104 are similar to those described above in conjunction
with pre-processing server 103. The MMR gateway 104 processes
received retrieval requests by performing user authentication,
accounting, analytics and other communication. The MMR gateway 104
also generates an image query and recognition parameters from the
retrieval request, and passes them on to the MMR matching unit 106
via signal line 134. Embodiments and operation of the MMR gateway
104 are described in greater detail below with reference to FIG.
3B.
[0051] The MMR matching unit 106 is similar to that described above
in conjunction with FIG. 1A, except that the MMR matching unit 106
receives the image query from the MMR gateway 104 on signal line
134 as part of the "retrieval process." The result is returned from
the MMR matching unit 106 to the MMR gateway 104 on signal line
134. In one embodiment, the MMR matching unit 106 is coupled to the
output of the MMR publisher 108 via signal lines 138 and 140 to
provide new content used to update index tables of the MMR matching
unit 106. In an alternate embodiment, the MMR publisher 108 is
coupled to the MMR gateway 104 by signal line 138 and the MMR
gateway 104 is in turn coupled by signal line 136 to the MMR
matching unit 106. In this alternate embodiment, the MMR gateway 104
extracts augmented data such as hotspot information, stores it, and
passes the images, page references, and other information to the MMR
matching unit 106 for updating of the index tables.
[0052] The MMR publisher 108 includes a conventional publishing
system used to generate newspapers or other types of periodicals.
In one embodiment, the MMR publisher 108 also includes components
for generating additional information needed to register images of
printed documents with the MMR system 100. The information provided
by the MMR publisher 108 to the MMR matching unit 106 includes an
image file, bounding box data, hotspot data, and a unique page
identification number. In one embodiment, this is a document in
portable document format by Adobe Corp. of San Jose Calif. and
bounding box information.
Mobile Device 102
[0053] Referring now to FIGS. 2A and 2B, the first and second
embodiment for the mobile device 102 will be described.
[0054] FIG. 2A shows a first embodiment of the coupling 132 between
the mobile device 102 and the pre-processing server 103 or MMR
gateway 104, according to the above-described embodiments of system
100a, 100b. In the embodiment of FIG. 2A, the mobile device 102 is
any mobile phone (or other portable computing device with
communication capability) that includes a camera. For example, the
mobile device 102 may be a smart phone such as the Blackberry.RTM.
manufactured and sold by Research In Motion. The mobile device 102
is adapted for wireless communication with the network 202 by a
communication channel 230. The network 202 is a conventional type
such as a cellular network maintained by a wireless carrier and may
include a server. In this embodiment, the mobile device 102
captures an image and sends the image to the network 202 over
communications channel 230 such as by using a multimedia messaging
service (MMS). The network 202 can also use the communication
channel 230 to return results such as using MMS or using a short
message service (SMS). As illustrated, the network 202 is in turn
coupled to the pre-processing server 103 or MMR gateway 104 by
signal lines 232. Signal lines 232 represent a channel for sending
MMS or SMS messages as well as a channel for receiving hypertext
transfer protocol (HTTP) requests and sending HTTP responses. Those
skilled in the art will recognize that this is just one example of
the coupling between the mobile device 102 and the pre-processing
server 103 or MMR gateway 104. In an alternate embodiment for
example, Bluetooth.RTM., WiFi, or any other wireless communication
protocol may be used as part of communication coupling between the
mobile device 102 and the pre-processing server 103 or MMR gateway
104. The mobile device 102 and the pre-processing server 103 or MMR
gateway 104 could be coupled in any other ways understood by those
skilled in the art (e.g., direct data connection, SMS, WAP, email)
so long as the mobile device 102 is able to transmit images to the
pre-processing server 103 or MMR gateway 104 and the pre-processing
server 103 or MMR gateway 104 is able to respond by sending
document identification, page number, and location information.
[0055] Referring now to FIG. 2B, a second embodiment of the mobile
device 102 is shown. In this second embodiment, the mobile device
102 is a smart phone such as the iPhone.TM. manufactured and sold
by Apple Computer Inc. of Cupertino Calif. The second embodiment
has a number of components similar to those of the first
embodiment, and therefore, like reference numbers are used to
reference like components with the same or similar functionality.
Notable differences between the first embodiment and the second
embodiment include an MMR matching plug-in 205 that is installed on
the mobile device 102, and a Web server 206 coupled by signal line
234 to the network 202. The MMR matching plug-in 205 analyzes the
images captured by the mobile device 102, acting similar to
dispatcher 402 as discussed in conjunction with FIG. 4A. The MMR
matching plug-in 205 provides additional information produced by
its analysis and includes that information as part of the retrieval
request sent to the pre-processing server 103 or MMR gateway 104 to
improve the accuracy of recognition. In an alternate embodiment,
the output of the MMR matching plug-in 205 is used to select which
images are transmitted from the mobile device 102 to the
pre-processing server 103 or MMR gateway 104. For example, only
those images that have a predicted quality above a predetermined
threshold (e.g., images capable of being recognized) are
transmitted from the mobile device 102 to the pre-processing server
103 or MMR gateway 104. Since transmission of images requires
significant bandwidth and the communication channel 230 between the
mobile device 102 and the network 202 may have limited bandwidth,
using the MMR matching plug-in 205 to select which images to
transmit is particularly advantageous. In addition, the MMR
matching plug-in 205 may allow for recognition on the mobile device
102 itself, e.g., using a device HPI 411' such as will be
discussed in conjunction with FIG. 6F. Thus, in one embodiment, the
MMR matching plug-in 205 acts as a mini-MMR matching unit 106.
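The image-selection behavior of the MMR matching plug-in 205 can be sketched as a simple threshold filter; `predict_quality` and the threshold value are placeholders for whatever quality predictor the device actually runs:

```python
def select_images_for_transmission(captured_images, predict_quality, threshold=0.5):
    """Return only the captures whose predicted recognizability exceeds the threshold,
    so that low-quality images never consume the limited channel to the server."""
    return [image for image in captured_images if predict_quality(image) > threshold]
```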
[0056] The second embodiment shown in FIG. 2B also illustrates how
the results returned from the pre-processing server 103 or MMR
gateway 104, or other information provided by the MMR matching
plug-in 205, can be used by the mobile device 102 to access hotspot
or augmented information available on a web server 206. In such a
case, the results from the pre-processing server 103 or MMR gateway
104 or output of the MMR matching plug-in 205 would include
information that can be used to access Web server 206 such as with
a conventional HTTP request and using web access capabilities of
the mobile device 102.
[0057] It should be noted that regardless of whether the first
embodiment or the second embodiment of the mobile device 102 is
used, the mobile device 102 generates the retrieval request that
may include: a query image, a user or device ID, a command, and
other contextual information such as device type, software, plug-ins,
location (for example if the mobile device includes a GPS
capability), device and status information (e.g., device model,
macro lens on/off status, autofocus on/off, vibration on/off, tilt
angle, etc.), context-related information (weather at the phone's
location, time, date, applications currently running on the phone),
user-related information (e.g., id number, preferences, user
subscriptions, user groups and social structures, action and
action-related meta data such as email actions and emails waiting
to be sent), quality predictor results, image features, etc.
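One way to picture such a retrieval request is as a nested dictionary; every key and value below is illustrative and is not a field defined by the specification:

```python
retrieval_request = {
    "image_query": b"\xff\xd8...",                 # captured image bytes or extracted features
    "user_id": "user-1234",
    "device_id": "device-5678",
    "command": "retrieve",
    "device_status": {"model": "example-phone", "macro_lens": True,
                      "autofocus": True, "vibration": False, "tilt_angle": 12.5},
    "context": {"location": (37.42, -122.08),      # if the device has GPS capability
                "time": "2008-09-29T10:15:00Z",
                "weather": "clear",
                "running_apps": ["camera", "mail"]},
    "user_info": {"preferences": {}, "subscriptions": [], "groups": []},
    "quality_predictor_result": 0.87,
    "image_features": [],
}
```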
[0058] Referring now to FIGS. 2C-2H, various embodiments are shown
of a plug-in (client 250) for the mobile device 102, the
pre-processing server 103 or MMR gateway 104 (referred to as just
MMR gateway for FIGS. 2C-2H), and MMR matching unit 106 represented
generally as including a server 252 that has various possible
configurations in accordance with the present invention. More
particularly, FIGS. 2C-2H illustrate how the components of the
plug-in or client 250 can have varying levels of functionality and
the server 252 can also have varying levels of functionality that
parallel or match with the functionality of the client 250. In the
various embodiments of FIGS. 2C-2H, either the client 250 or the
server 252 includes: an MMR database 254; a capture module 260 for
capturing an image or video; a preprocessing module 262 for
processing the image before feature extraction for improved
recognition such as quality prediction; a feature extraction module
264 for extracting image features; a retrieval module 266 for using
features to retrieve information from the MMR database 254; a send
message module 268 for sending messages from the server 252 to the
client 250; an action module 270 for performing an action; a
preprocessing and prediction module 272 for processing the image
prior to feature extraction; a feedback module 274 for presenting
information to the user and receiving input; a sending module 276
for sending information from the client 250 to the server 252; and
a streaming module 278 for streaming video from the client 250 to
the server 252.
[0059] FIG. 2C illustrates one embodiment for the client 250 and
the server 252 in which the client 250 sends an image or video
and/or metadata to the server 252 for processing. In this
embodiment, the client 250 includes the capture module 260. The
server 252 includes: the MMR database 254, the preprocessing module
262, the feature extraction module 264, the retrieval module 266,
the send message module 268 and the action module 270.
[0060] FIG. 2D illustrates another embodiment for the client 250
and the server 252 in which the client 250 captures an image or
video, runs quality prediction, and sends an image or video and/or
metadata to the server 252 for processing. In this embodiment, the
client 250 includes: the capture module 260, the preprocessing and
prediction module 272, the feedback module 274 and the sending
module 276. The server 252 includes: the MMR database 254, the
preprocessing module 262, the feature extraction module 264, the
retrieval module 266, the send message module 268 and the action
module 270. It should be noted that in this embodiment the image
sent to the server 252 may be different than the captured image.
For example, it may be digitally enhanced, sharpened, or may be
just binary data.
[0061] FIG. 2E illustrates another embodiment for the client 250
and the server 252 in which the client 250 captures an image or
video, performs feature extraction and sends image features to the
server 252 for processing. In this embodiment, the client 250
includes: the capture module 260, the feature extraction module
264, the preprocessing and prediction module 272, the feedback
module 274 and the sending module 276. The server 252 includes: the
MMR database 254, the retrieval module 266, the send message module
268 and the action module 270. It should be noted that in this
embodiment feature extraction may include preprocessing. After
features are extracted, the preprocessing and prediction module 272
may run on these features and if the quality of the features is not
satisfactory, the user may be asked to capture another image.
[0062] FIG. 2F illustrates another embodiment for the client 250
and the server 252 in which the entire retrieval process is
performed at the client 250. In this embodiment, the client 250
includes: the capture module 260, the feature extraction module
264, the preprocessing and prediction module 272, the feedback
module 274 and the sending module 276, the MMR database 254, and
the retrieval module 266. The server 252 need only have the action
module 270.
[0063] FIG. 2G illustrates another embodiment for the client 250
and the server 252 in which the client 250 streams video to the
server 252. In this embodiment, the client 250 includes the capture
module 260 and a streaming module 278. The server 252 includes the
MMR database 254, the preprocessing module 262, the feature
extraction module 264, the retrieval module 266, the send message
module 268 and the action module 270. Although not shown, the
client 250 can run a predictor in the captured video stream and
provide user feedback on where to point the camera or how to
capture better video for retrieval. In a modification of this
embodiment, the server 252 streams back information related to the
captured video and the client 250 can overlay that information on a
video preview screen.
[0064] FIG. 2H illustrates another embodiment for the client 250
and the server 252 in which the client 250 runs a recognizer and
the server 252 streams MMR database information to a local database
operable with the client 250 based upon a first recognition result.
This embodiment is similar to that described above with reference
to FIG. 2F. For example, the entire retrieval process for one
recognition algorithm is run at the client 250. If the recognition
algorithm fails, the query is handed to the server 252 for running
a more complex retrieval algorithm. In this embodiment, the client
250 includes: the capture module 260, the feature extraction module
264, the preprocessing and prediction module 272, the feedback
module 274, the sending module 276, the MMR database 254 (a local
version), and the retrieval module 266. The server 252 includes
another retrieval module 266, the action module 270 and the MMR
database 254 (a complete and more complex version). In one
embodiment, if the query image cannot be recognized with the local
MMR database 254, the client 250 sends an image for retrieval to
the server 252 and that initiates an update of the local MMR
database 254. Alternatively, the client 250 may contain an updated
version of a database for one recognizer, but if the query image
cannot be retrieved from the local MMR database 254, then a
database for another retrieval algorithm may be streamed to the
local MMR database 254.
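The local-first behavior of FIG. 2H can be sketched as a fallback loop; `local_index`, `server`, and the streaming call are hypothetical stand-ins for the local MMR database 254, the server 252, and its database-streaming step:

```python
def recognize_with_fallback(image, local_index, server):
    """Run the lightweight recognizer on the client first; on failure, hand the
    query to the server and stream back index data for future local retrieval."""
    result = local_index.retrieve(image)
    if result is not None:
        return result                              # recognized entirely on the client
    result = server.retrieve(image)                # more complex retrieval algorithm
    if result is not None:
        # Merge streamed entries so that similar queries can succeed locally next time.
        local_index.merge(server.stream_database_for(result))
    return result
```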
Pre-Processing Server 103
[0065] Referring now to FIG. 3A, one embodiment of the
pre-processing server 103 is shown. This embodiment of the
pre-processing server 103 comprises an operating system (OS) 301, a
controller 303, a communicator 305, a request processor 307, and
applications 312, connected to system bus 325. Optionally, the
pre-processing server 103 also may include a web server 304, a
database 306, and/or a hotspot database 404.
[0066] As noted above, one of the primary functions of the
pre-processing server 103 is to communicate with many mobile
devices 102 to receive retrieval requests and send responses
including a status indicator (true=recognized/false=not
recognized), a page identification number, a location on the page
and other information, such as hotspot data. A single
pre-processing server 103 can respond to thousands or millions of
retrieval requests. For convenience and ease of understanding only
a single pre-processing server 103 is shown in FIGS. 1A and 3A,
however, those skilled in the art will recognize that in other
embodiments any number of pre-processing servers 103 may be
utilized to service the needs of a multitude of mobile devices 102.
More particularly, the pre-processing server 103 system bus 325 is
coupled to signal lines 132a-132n for communication with various
mobile devices 102. The pre-processing server 103 receives
retrieval requests from the mobile devices 102 via signal lines
132a-132n and sends responses back to the mobile devices 102 using
the same signal lines 132a-132n. In one embodiment, the retrieval
request includes: a command, a user identification number, an image
and other context information. For example, other context
information may include: device information such as the make, model
or manufacture of the mobile device 102; location information such
as provided by a GPS system that is part of the mobile device or by
triangulation; environmental information such as time of day,
temperature, weather conditions, lighting, shadows, object
information; and placement information such as distance, location,
tilt and jitter.
[0067] The pre-processing server 103 also is coupled to signal line
130 for communication with the computer 110. Again, for convenience
and ease of understanding only a single computer 110 and signal
line 130 are shown in FIGS. 1A and 3A, but any number of computing
devices may be adapted for communication with the pre-processing
server 103. The pre-processing server 103 facilitates communication
between the computer 110 and the operating system (OS) 301, a
controller 303, a communicator 305, a request processor 307, and
applications 312. The OS 301, controller 303, communicator 305,
request processor 307, and applications 312 are coupled to system
bus 325 by signal line 330.
[0068] The pre-processing server 103 processes the retrieval
request and generates an image query and recognition parameters
that are sent via signal line 134, which also is coupled to system
bus 325, to the MMR matching unit 106 for recognition. The
pre-processing server 103 also receives recognition responses from
the MMR matching unit 106 via signal line 134. More specifically,
the request processor 307 processes the retrieval request and sends
information via signal line 330 to the other components of the
pre-processing server 103 as will be described below.
[0069] The operating system 301 is preferably a custom operating
system that is accessible to computer 110, and otherwise configured
for use of the pre-processing server 103 in conjunction with the
MMR matching unit 106. In an alternate embodiment, the operating
system 301 is one of a conventional type such as, WINDOWS.RTM., Mac
OS X.RTM., SOLARIS.RTM., or LINUX.RTM. based operating systems. The
operating system 301 is connected to system bus 325 via signal line
330.
[0070] The controller 303 is used to control the other modules 305,
307, 312, per the description of each below. While the controller
303 is shown as a separate module, those skilled in the art will
recognize that the controller 303 in another embodiment may be
distributed as routines in other modules. The controller 303 is
connected to system bus 325 via signal line 330.
[0071] The communicator 305 is software and routines for sending
data and commands among the pre-processing server 103, mobile
devices 102, and MMR matching unit 106. The communicator 305 is
coupled to signal line 330 to send and receive communications via
system bus 325. The communicator 305 communicates with the request
processor 307 to issue image queries and receive results.
[0072] The request processor 307 processes the retrieval request
received via signal line 330, performing preprocessing and issuing
image queries for sending to MMR matching unit 106 via signal line
134. In some embodiments, the preprocessing may include feature
extraction and recognition parameter definition; in other
embodiments these parameters are obtained from the mobile device
102 and are passed on to the MMR matching unit 106. The request
processor 307 also sends information via signal line 330 to the
other components of the pre-processing server 103. The request
processor 307 is connected to system bus 325 via signal line
330.
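A rough illustration of the request processor's role follows, with hypothetical helper names standing in for the feature extraction step and the call across signal line 134:

```python
def process_retrieval_request(retrieval_request, extract_features, matching_unit):
    """Turn a retrieval request into an image query plus recognition parameters
    and forward them to the MMR matching unit for recognition."""
    image_query = extract_features(retrieval_request["image_query"])   # optional preprocessing
    recognition_parameters = {
        "user_id": retrieval_request.get("user_id"),
        "context": retrieval_request.get("context", {}),
    }
    return matching_unit.recognize(image_query, recognition_parameters)
```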
[0073] The one or more applications 312 are software and routines
for providing functionality related to the processing of MMR
documents. The applications 312 can be any of a variety of types,
including without limitation, drawing applications, word processing
applications, electronic mail applications, search applications,
financial applications, and business applications adapted to
utilize information related to the processing of retrieval requests
and delivery of recognition responses such as but not limited to
accounting, groupware, customer relationship management, human
resources, outsourcing, loan origination, customer care, service
relationships, etc. In addition, applications 312 may be used to
allow for annotation, linking additional information, audio or
video clips, building e-communities or social networks around the
documents, and associating educational multimedia with recognized
documents.
[0074] System bus 325 represents a shared bus for communicating
information and data throughout pre-processing server 103. System
bus 325 may represent one or more buses including an industry
standard architecture (ISA) bus, a peripheral component
interconnect (PCI) bus, a universal serial bus (USB), or some other
bus known in the art to provide similar functionality. Additional
components may be coupled to pre-processing server 103 through
system bus 325 according to various embodiments.
[0075] The pre-processing server 103 optionally also includes a web
server 304, a database 306, and/or a hotspot database 404 according
to one embodiment.
[0076] The web server 304 is a conventional type and is responsible
for accepting requests from clients and sending responses along
with data contents, such as web pages, documents, and linked
objects (images, etc.). The Web server 304 is coupled to data store
306 such as a conventional database. The Web server 304 is adapted
for communication via signal line 234 to receive HTTP requests from
any communication device, e.g., mobile devices 102, across a
network such as the Internet. The Web server 304 also is coupled to
signal line 330 as described above to receive Web content
associated with hotspots for storage in the data store 306 and then
for later retrieval and transmission in response to HTTP requests.
Those skilled in the art will understand that inclusion of the Web
server 304 and data store 306 as part of the pre-processing server
103 is merely one embodiment and that the Web server 304 and the
data store 306 may be operational in any number of alternate
locations or configurations so long as the Web server 304 is
accessible to mobile devices 102 and computers 110 via the
Internet.
[0077] In one embodiment, the pre-processing server 103 also
includes a hotspot database 404. The hotspot database 404 is shown
in FIG. 3A with dashed lines to reflect that inclusion in the
pre-processing server 103 is an alternate embodiment. The hotspot
database 404 is coupled by signal line 436 to receive the
recognition responses via line 134. The hotspot database 404 uses
these recognition responses to query the database and output via
line 432 and system bus 325 the hotspot content corresponding to
the recognition responses. This hotspot content is included with
the recognition responses sent to the requesting mobile device
102.
MMR Gateway 104
[0078] Referring now to FIG. 3B, one embodiment of the MMR gateway
104 is shown. This embodiment of the MMR gateway 104 comprises a
server 302, a Web server 304, a data store 306, a portal module
308, a log 310, one or more applications 312, an authentication
module 314, an accounting module 316, a mail module 318, and an
analytics module 320.
[0079] As noted above, one of the primary functions of the MMR
gateway 104 is to communicate with many mobile devices 102 to
receive retrieval requests and send responses including a status
indicator (true=recognized/false=not recognized), a page
identification number, a location on the page and other information
such as hotspot data. A single MMR gateway 104 can respond to
thousands or millions of retrieval requests. For convenience and
ease of understanding only a single MMR gateway 104 is shown in
FIGS. 1B and 3B, however, those skilled in the art will recognize
that in other embodiments any number of MMR gateways 104 may be
utilized to service the needs of a multitude of mobile devices 102.
More particularly, the server 302 of the MMR gateway 104 is coupled
to signal lines 132a-132n for communication with various mobile
devices 102. The server 302 receives retrieval requests from the
mobile devices 102 via signal lines 132a-132n and sends responses
back to the mobile devices 102 using the same signal lines
132a-132n. In one embodiment, the retrieval request includes: a
command, a user identification number, an image and other context
information. For example, other context information may include:
device information such as the make, model, or manufacturer of the
mobile device 102; location information such as provided by a GPS
system that is part of the mobile device or by triangulation;
environmental information such as time of day, temperature, weather
conditions, lighting, shadows, object information; and placement
information such as distance, location, tilt, and jitter.
[0080] The server 302 is also coupled to signal line 130 for
communication with the computer 110. Again, for convenience and
ease of understanding only a single computer 110 and signal line
130 are shown in FIGS. 1B and 3B, but any number of computing
devices may be adapted for communication with the server 302. The
server 302 facilitates communication between the computer 110 and
the portal module 308, the log module 310 and the applications 312.
The server 302 is coupled to the portal module 308, the log module
310 and the applications 312 by signal line 330. As will be
described in more detail below, these modules cooperate with the
server 302 to present a web portal that provides a user experience
for exchanging information. The Web portal 308 can also be used for
system monitoring, maintenance and administration.
[0081] The server 302 processes the retrieval request and generates
an image query and recognition parameters that are sent via signal
line 134 to the MMR matching unit 106 for recognition. The server
302 also receives recognition responses from the MMR matching unit
106 via signal line 134. The server 302 also processes the
retrieval request and sends information via signal line 330 to the
other components of the MMR gateway 104 as will be described below.
The server 302 is also adapted for communication with the MMR
publisher 108 by signal line 138 and the MMR matching unit 106 via
signal line 136. The signal line 138 provides a path for the MMR
publisher 108 to send Web content for hotspots to the Web server
304 and to provide other information to the server 302. In one
embodiment, the server 302 receives information from the MMR
publisher 108 and sends that information via signal line 136 for
registration with the MMR matching unit 106.
[0082] The web server 304 is a conventional type and is responsible
for accepting requests from clients and sending responses along
with data contents, such as web pages, documents, and linked
objects (images, etc.). The Web server 304 is coupled to data store
306 such as a conventional database. The Web server 304 is adapted
for communication via signal line 234 to receive HTTP requests from
any communication device across a network such as the Internet. The
Web server 304 is also coupled to signal line 138 as described
above to receive web content associated with hotspots for storage
in the data store 306 and then for later retrieval and transmission
in response to HTTP requests. Those skilled in the art will
understand that inclusion of the Web server 304 and data store 306
as part of the MMR gateway 104 is merely one embodiment and that
the Web server 304 and the data store 306 may be operational in any
number of alternate locations or configurations so long as the Web
server 304 is accessible to mobile devices 102 and computers 110
via the Internet.
[0083] In one embodiment, the portal module 308 is software or
routines operational on the server 302 for creation and
presentation of the web portal. The portal module 308 is coupled to
signal line 330 for communication with the server 302. In one
embodiment, the web portal provides an access point for
functionality including administration and maintenance of other
components of the MMR gateway 104. In another embodiment, the web
portal provides an area where users can share experiences related
to MMR documents. In yet another embodiment, the web portal is an
area where users can access business applications and the log 310
of usage.
[0084] The log 310 is a memory or storage area for storing a list
of the retrieval requests received by the server 302 from mobile
devices 102 and all corresponding responses sent by the server 302
to the mobile device. In another embodiment, the log 310 also
stores a list of the image queries generated and sent to the MMR
matching unit 106 and the recognition responses received from the
MMR matching unit 106. The log 310 is communicatively coupled to
the server 302 by signal line 330.
[0085] The one or more business applications 312 are software and
routines for providing functionality related to the processing of
MMR documents. In one embodiment the one or more business
applications 312 are executable on the server 302. The business
applications 312 can be any one of a variety of types of business
applications adapted to utilize information related to the
processing of retrieval requests and delivery of recognition
responses such as but not limited to accounting, groupware,
customer relationship management, human resources, outsourcing,
loan origination, customer care, service relationships, etc.
[0086] The authentication module 314 is software and routines for
maintaining a list of authorized users and granting access to the
MMR system 100. In one embodiment, the authentication module 314
maintains a list of user IDs and passwords corresponding to
individuals who have created an account in the system 100, and
therefore, are authorized to use MMR gateway 104 and the MMR
matching unit 106 to process retrieval requests. The authentication
module 314 is communicatively coupled by signal line 330 to the
server 302. As the server 302 receives retrieval requests, they can
be processed and compared against information in the authentication
module 314 before the corresponding image query is generated and
sent on signal line 134. In one embodiment, the authentication
module 314 also generates messages for the server 302 to return to
the mobile device 102 in instances when the
mobile device is not authorized, the mobile device has not
established an account, or the account for the mobile device 102 is
locked such as due to abuse or lack of payment.
[0087] The accounting module 316 is software and routines for
performing accounting related to user accounts and use of the MMR
system 100. In one embodiment, the retrieval services are provided
under a variety of different economic models such as but not
limited to use of the MMR system 100 under a subscription model, a
charge per retrieval request model or various other pricing models.
In one embodiment, the MMR system 100 provides a variety of
different pricing models similar to those currently offered
for cell phones and data networks. The accounting module 316 is
coupled to the server 302 by signal line 330 to receive an
indication of any retrieval request received by the server 302. In
one embodiment, the accounting module 316 maintains a record of
transactions (retrieval request/recognition responses) processed by
the server 302 for each mobile device 102. Although not shown, the
accounting module 316 can be coupled to a traditional billing
system for the generation of an electronic or paper bill.
[0088] The mail module 318 is software and routines for generating
e-mail and other types of communication. The mail module 318 is
coupled by signal line 330 to the server 302. In one embodiment, the
mobile device 102 can issue retrieval requests that include a
command to deliver a document or a portion of a document or other
information via e-mail, facsimile or other traditional electronic
communication means. The mail module 318 is adapted to generate and
send such information from the MMR gateway 104 to an addressee as
prescribed by the user. In one embodiment, each user profile has
associated addressees which are potential recipients of information
retrieved.
[0089] The analytics module 320 is software and routines for
measuring the behavior of users of the MMR system 100. The
analytics module 320 is also software and routines for measuring
the effectiveness and accuracy of feature extractors and
recognition performed by the MMR matching unit 106. The analytics
module 320 measures use of the MMR system 100 including which
images are most frequently included as part of retrieval requests,
which hotspot data is most often accessed, the order in which
images are retrieved, the first image in the retrieval process, and
other key performance indicators used to improve the MMR experience
and/or a marketing campaign's audience response. In one embodiment,
the analytics module 320 measures metrics of the MMR system 100 and
analyzes the metrics used to measure the effectiveness of hotspots
and hotspot data. The analytics module 320 is coupled to the server
302, the authentication module 314 and the accounting module 316 by
signal line 330. The analytics module 320 is also coupled through
the server 302 to signal line 134 and thus can access the components
of the MMR matching unit 106 to retrieve recognition parameters,
image features, quality recognition scores, and any other
information generated or used by the MMR matching unit 106. The analytics module
320 can also perform a variety of data retrieval and segmentation
based upon parameters or criteria of users, mobile devices 102,
page IDs, locations, etc.
[0090] In one embodiment, the MMR gateway 104 also includes a
hotspot database 404. The hotspot database 404 is shown in FIG. 3B
with dashed lines to reflect that inclusion in the MMR gateway 104
is an alternate embodiment. The hotspot database 404 is coupled by
signal line 436 to receive the recognition responses via line 134.
The hotspot database 404 uses these recognition responses to query
the database and output via line 432 the hotspot content
corresponding to the recognition responses. This hotspot content is
sent to the server 302 so that it can be included with the
recognition responses and sent to the requesting mobile device
102.
MMR Matching Unit 106
[0091] Referring now to FIGS. 4A-4C, three embodiments for the MMR
matching unit 106 will be described. The basic function of the MMR
matching unit 106 is to receive an image query, send the image
query for recognition, perform recognition on the images in the
image query, retrieve hotspot information, combine the recognition
result with hotspot information, and send it back to the
pre-processing server 103 or MMR gateway 104.
[0092] FIG. 4A illustrates a first embodiment of the MMR matching
unit 106. The first embodiment of the MMR matching unit 106
comprises a dispatcher 402, a hotspot database 404, an acquisition
unit 406, an image registration unit 408, and a dynamic load
balancer 418. The acquisition unit 406 further comprises a
plurality of the recognition units 410a-410n and a plurality of
index tables 412a-412n. The image registration unit 408 further
comprises an indexing unit 414 and a master index table 416.
[0093] The dispatcher 402 is coupled to signal line 134 for
receiving an image query from and sending recognition results to
the pre-processing server 103 or MMR gateway 104. The dispatcher
402 is responsible for assigning and sending an image query to
respective recognition units 410a-410n. In one embodiment, the
dispatcher 402 receives an image query, generates a recognition
unit identification number and sends the recognition unit
identification number and the image query to the acquisition unit
406 for further processing. The dispatcher 402 is coupled to signal
line 430 to send the recognition unit identification number and the
image query to the recognition units 410a-410n. The dispatcher 402
also receives the recognition results from the acquisition unit 406
via signal line 430. One embodiment for the dispatcher 402 will be
described in more detail below with reference to FIG. 5.
[0094] An alternate embodiment for the hotspot database 404 has
been described above with reference to FIGS. 3A-3B wherein the
hotspot database is part of the pre-processing server 103 or MMR
gateway 104. However, in the preferred embodiment, the hotspot
database 404 is part of the MMR matching unit 106 as shown in FIG.
4A. Regardless of the embodiment, the hotspot database 404 has a
similar functionality. The hotspot database 404 is used to store
hotspot information. Once an image query has been recognized and
recognition results are produced, these recognition results are
used as part of a query of the hotspot database 404 to retrieve
hotspot information associated with the recognition results. The
retrieved hotspot information is then output on signal line 134 to
the pre-processing server 103 or MMR gateway 104 for packaging and
delivery to the mobile device 102. As shown in FIG. 4A, the hotspot
database 404 is coupled to the dispatcher 402 by signal line 436 to
receive queries including recognition results. The hotspot database
404 is also coupled by signal line 432 and signal line 134 to the
pre-processing server 103 or MMR gateway 104 for delivery of query
results. The hotspot database 404 is also coupled to signal line
136 to receive new hotspot information for storage from the MMR
publisher 108, according to one embodiment.
[0095] The acquisition unit 406 comprises the plurality of the
recognition units 410a-410n and a plurality of index tables
412a-412n. Each of the recognition units 410a-410n has, and is
coupled to, a corresponding index table 412a-412n. In one
embodiment, each recognition unit 410/index table 412 pair is on
the same server. The dispatcher 402 sends the image query to one or
more recognition units 410a-410n. In one embodiment that includes
redundancy, the image query is sent from the dispatcher 402 to a
plurality of recognition units 410 for recognition and retrieval
and the index tables 412a-n index the same data. In the serial
embodiment, the image query is sent from the dispatcher 402 to a
first recognition unit 410a. If recognition is not successful on
the first recognition unit 410a, the image query is passed on to a
second recognition unit 410b, and so on. In yet another embodiment,
the dispatcher 402 performs some preliminary analysis of the image
query and then selects a recognition unit 410a-410n best adapted
and most likely to be successful at recognizing the image query.
Those skilled in the art will understand that there are a variety
of configurations for the plurality of recognition units 410a-410n
and the plurality of index tables 412a-412n. Example embodiments
for the acquisition unit 406 will be described in more detail below
with reference to FIGS. 6A-6F. It should be understood that the
index tables 412a-412n can be updated at various times as depicted
by the dashed lines 434 from the master index table 416.
[0096] The image registration unit 408 comprises the indexing unit
414 and the master index table 416. The image registration unit 408
has an input coupled to signal line 136 to receive updated
information from the MMR publisher 108, according to one
embodiment, and an input coupled to signal line 438 to receive
updated information from the dynamic load balancer 418. The image
registration unit 408 is responsible for maintaining the master
index table 416 and migrating all or portions of the master index
table 416 to the index tables 412a-412n (slave tables) of the
acquisition unit 406. In one embodiment, the indexing unit 414
receives images, unique page IDs, and other information; and
converts them into index table information that is stored in the
master index table 416. In one embodiment, the master index table
416 also stores the record of what is migrated to the index table
412. The indexing unit 414 also cooperates with the MMR publisher
108 according to one embodiment to maintain a unique page
identification numbering system that is consistent across image
pages generated by the MMR publisher 108, the image pages stored in
the master index table 416, and the page numbers used in
referencing data in the hotspot database 404.
[0097] One embodiment for the image registration unit 408 is shown
and described in more detail below with reference to FIG. 7.
[0098] The dynamic load balancer 418 has an input coupled to signal
line 430 to receive the query image from the dispatcher 402 and the
corresponding recognition results from the acquisition unit 406.
The output of the dynamic load balancer 418 is coupled by signal
line 438 to an input of the image registration unit 408. The
dynamic load balancer 418 provides input to the image registration
unit 408 that is used to dynamically adjust the index tables
412a-412n of the acquisition unit 406. In particular, the dynamic
load balancer 418 monitors and evaluates the image queries that are
sent from the dispatcher 402 to the acquisition unit 406 for a
given period of time. Based on the usage, the dynamic load balancer
418 provides input to adjust the index tables 412a-412n. For
example, the dynamic load balancer 418 may measure the image
queries for a day. Based on the measured usage for that day, the
index tables may be modified and configured in the acquisition unit
406 to match the usage measured by the dynamic load balancer
418.
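For illustration only, the following Python sketch shows one way the usage measured by the dynamic load balancer 418 could be turned into an index table allocation for the acquisition unit 406. The UsageMonitor class, the proportional rounding rule, and the partition names are assumptions introduced for the example and are not taken from the description above.

```python
from collections import Counter

class UsageMonitor:
    """Tallies which content partition each recognized image query hit."""

    def __init__(self):
        self.hits = Counter()

    def record(self, partition):
        # Called once per recognition result, e.g., "current_day" or "past_week".
        self.hits[partition] += 1

    def recommend_allocation(self, total_units):
        """Split the recognition unit/index table pairs roughly in proportion
        to the usage measured over the period (naive rounding)."""
        total = sum(self.hits.values()) or 1
        return {p: max(1, round(total_units * n / total))
                for p, n in self.hits.items()}

monitor = UsageMonitor()
for partition in ["current_day"] * 70 + ["past_week"] * 25 + ["past_year"] * 5:
    monitor.record(partition)
print(monitor.recommend_allocation(total_units=8))
# e.g., {'current_day': 6, 'past_week': 2, 'past_year': 1}
```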
[0099] FIG. 4B illustrates a second embodiment of the MMR matching
unit 106. In the second embodiment, many of the components of the
MMR matching unit 106 have the same or a similar function to
corresponding elements of the first embodiment. Thus, like
reference numbers have been used to refer to like components with
the same or similar functionality. The second embodiment of the MMR
matching unit 106 includes the dispatcher 402, the hotspot database
404, and the dynamic load balancer 418 similar to the first
embodiment of the MMR matching unit 106. However, the acquisition
unit 406 and the image registration unit 408 are different than
that described above with reference to FIG. 4A. In particular, the
acquisition unit 406 and the image registration unit 408 utilize a
shared SQL database for the index tables and the master table. More
specifically, there is the master index table 416 and a mirrored
database 418 that includes the local index tables 412a-n. Moreover,
conventional functionality of SQL database replication is used to
generate the mirror images of the master index table 416 stored in
the index tables 412a-n for use in recognition. The image
registration unit 408 is configured so that when new images are
added to the master index table 416 they are immediately available
to all the recognition units 410. This is done by mirroring the
master index table 416 across all the local index tables 412a-n
using large RAM (not shown) and database mirroring technology.
[0100] FIG. 4C illustrates a third embodiment of the MMR matching
unit 106. In the third embodiment, many of the components of the
MMR matching unit 106 have the same or a similar function to
corresponding elements of the first and second embodiments. Thus,
like reference numbers have been used to refer to like components
with the same or similar functionality. The third embodiment of the
MMR matching unit 106 includes the dispatcher 402, the hotspot
database 404, and the dynamic load balancer 418 similar to the
first and second embodiments of the MMR matching unit 106. However,
the acquisition unit 406 and the image registration unit 408 are
different than that described above with reference to FIGS. 4A and
4B. In particular, the acquisition unit 406 and the image
registration unit 408 utilize a single shared SQL database (master
index table 416). The image registration unit 408 is configured so
that when new images are added to the master index table 416 they
are immediately available to all the recognition units 410. The
recognition units 410a-410n and the index unit 414 are
communicatively coupled to the master index table 416. This
embodiment is particularly advantageous because it uses a
simplified version of the recognition servers, as each of them need
not maintain a separate index table that needs updating. In
addition, consistency among multiple databases is not a
concern.
Dispatcher 402
[0101] Referring now to FIG. 5, an embodiment of the dispatcher 402
is shown. The dispatcher 402 comprises a quality predictor 502, an
image feature order unit 504, a distributor 506, a segmenter 505,
and an integrator 509. The quality predictor 502, the image feature
order unit 504, the segmenter 505, and the distributor 506 are
coupled to signal line 532 to receive image queries from the
pre-processing server 103 or MMR gateway 104. The distributor 506
is also coupled to receive the output of the quality predictor 502,
image feature order unit 504, and segmenter 505. The distributor
506 includes a FIFO queue 508 and a controller 510. The distributor
506 generates an output on signal line 534 that includes the image
query and a recognition unit identification number (RUID). Those
skilled in the art will understand that in other embodiments the
image query may be directed to any particular recognition unit
using a variety of means other than the RUID. As image queries are
received on the signal line 532, the distributor 506 receives the
image queries and places them in the order in which they are
received into the FIFO queue 508. The controller 510 receives a
recognizability score for each image query from the quality
predictor 502 and also receives an ordering signal from the image
feature order unit 504. In some embodiments, the segmenter 505
determines image content type and segments the image query into
content-type specific image queries. Using this information from
the quality predictor 502, the image feature order unit 504, and
the segmenter 505, the controller 510 selects image queries from
the FIFO queue 508, assigns them to particular recognition units
410 and sends the image query to the assigned recognition unit 410
for processing. The controller 510 maintains a list of image
queries assigned to each recognition unit 410 and the expected time
to completion for each image (as predicted by the image feature
order unit 504). The total expected time to empty the queue for
each recognition unit 410 is the sum of the expected times for the
images assigned to it. The controller 510 can execute several queue
management strategies. In a simple assignment strategy, image
queries are removed from the FIFO queue 508 in the order they
arrived and assigned to the first available recognition unit 410.
In a balanced response strategy, the total expected response time
to each query is maintained at a uniform level and query images are
removed from the FIFO queue 508 in the order they arrived, and
assigned to the FIFO queue 508 for a recognition unit so that its
total expected response time is as close as possible to the other
recognition units. In an easy-first strategy, images are removed
from the FIFO queue 508 in an order determined by their expected
completion times--images with the smallest expected completion
times are assigned to the first available recognition unit. In this
way, users are rewarded with faster response time when they submit
an image that's easy to recognize. This could incentivize users to
carefully select the images they submit. Other queue management
strategies are possible.
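As a sketch only, the following Python fragment illustrates the queue management strategies described above. The QueueController class, the dictionary-based backlog bookkeeping, and the query format are assumptions, and the balanced response strategy is reduced here to assigning each query to the least-loaded recognition unit.

```python
from collections import deque

class QueueController:
    """Sketch of controller 510: tracks per-unit expected backlogs and
    dispatches image queries from FIFO queue 508 to recognition units 410."""

    def __init__(self, recognition_units):
        self.backlog = {ru: 0.0 for ru in recognition_units}  # expected seconds to empty

    def _assign(self, query):
        # Balanced response: add to the unit whose total expected response
        # time stays closest to the others, i.e., the least-loaded unit.
        ru = min(self.backlog, key=self.backlog.get)
        self.backlog[ru] += query["expected_time"]
        return query["id"], ru

    def dispatch_fifo(self, queue):
        """Simple assignment / balanced response: take queries in arrival order."""
        return self._assign(queue.popleft())

    def dispatch_easy_first(self, queue):
        """Easy-first: take the query with the smallest expected completion time."""
        easiest = min(queue, key=lambda q: q["expected_time"])
        queue.remove(easiest)
        return self._assign(easiest)

queue = deque([{"id": "q1", "expected_time": 2.0}, {"id": "q2", "expected_time": 0.5}])
controller = QueueController(["410a", "410b"])
print(controller.dispatch_easy_first(queue))   # ('q2', '410a'): the easy image goes first
```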
[0102] The dispatcher 402 also receives the recognition results
from the recognition units 410 on signal line 530. The recognition
results include a Boolean value (true/false) and if true, a page
ID, and a location on the page. In one embodiment, the dispatcher
402 merely receives and retransmits the data to the pre-processing
server 103 or MMR gateway 104. In another embodiment, the
dispatcher 402 includes an integrator 509 for integrating the
received recognition results into an integrated result, similar to
the functionality described for result combiner 610, as discussed
in conjunction with FIG. 6B.
[0103] The quality predictor 502 receives image queries and
generates a recognizability score used by the dispatcher 402 to
route the image query to one of the plurality of recognition units
410. In one embodiment, the quality predictor 502 also receives as
inputs context information and device parameters. In one
embodiment, the recognizability score includes information
specifying the type of recognition algorithm most likely to produce
a valid recognition result.
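A minimal sketch of a recognizability score follows, assuming simple image statistics such as sharpness and contrast and a coarse choice between two of the algorithm types named later in this description. The weights and the decision rule are illustrative assumptions rather than the quality predictor 502 itself.

```python
def recognizability_score(sharpness, contrast, has_dense_text):
    """Return a 0..1 recognizability score and the algorithm type assumed
    most likely to produce a valid recognition result."""
    score = max(0.0, min(1.0, 0.6 * sharpness + 0.4 * contrast))
    algorithm = "invisible_junction" if has_dense_text else "brick_wall_coding"
    return score, algorithm

print(recognizability_score(sharpness=0.8, contrast=0.7, has_dense_text=True))
# (0.76, 'invisible_junction')
```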
[0104] The image feature order unit 504 receives image queries and
outputs an ordering signal. The image feature order unit 504
analyzes an input image query and predicts the time required to
recognize an image by analyzing the image features it contains. The
difference between the actual recognition time and the predicted
time is used to adjust future predictions thereby improving
accuracy, as further described in conjunction with FIG. 12. In the
simplest of embodiments, simple images with few features are
assigned to lightly loaded recognition units 410 so that they will
be recognized quickly and the user will see the answer immediately.
In one embodiment, the features used by the image feature order
unit 504 to predict the time are different from the features used
by recognition units 410 for actual recognition. For example, the
number of corners detected in an image is used to predict the time
required to analyze the image. The feature set used for prediction
need only be correlated with the actual recognition time. In one
embodiment, several different feature sets are used and the
correlations to recognition time measured over some period.
Eventually, the feature set that is the best predictor and lowest
cost (most efficient) would be determined and the other feature
sets could be discarded. The operation of the image feature order
unit 504 is described in more detail below and can be better
understood with reference to FIG. 12.
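The predict-then-correct behavior of the image feature order unit 504 can be sketched as follows. The linear corner-count model and the learning rate are assumptions; only the idea that the gap between predicted and actual recognition time is fed back into future predictions comes from the text above.

```python
class ImageFeatureOrderUnit:
    """Sketch: predict recognition time from a cheap feature (corner count)
    and correct the model using the measured recognition time."""

    def __init__(self, seconds_per_corner=0.001, learning_rate=0.1):
        self.seconds_per_corner = seconds_per_corner
        self.learning_rate = learning_rate

    def predict_time(self, corner_count):
        return corner_count * self.seconds_per_corner

    def update(self, corner_count, actual_time):
        # Shrink the gap between prediction and measurement for next time.
        error = actual_time - self.predict_time(corner_count)
        self.seconds_per_corner += self.learning_rate * error / max(corner_count, 1)

unit = ImageFeatureOrderUnit()
print(unit.predict_time(500))                      # initial prediction: 0.5
unit.update(corner_count=500, actual_time=0.8)
print(unit.predict_time(500))                      # moves toward the observed time
```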
Acquisition Unit 406
[0105] Referring now to FIGS. 6A-6F, embodiments of the acquisition
unit 406 will be described.
[0106] FIG. 6A illustrates one embodiment for the acquisition unit
406 where the recognition unit 410 and index table 412 pairs are
partitioned based on the content or images that they index. This
configuration is particularly advantageous for mass media
publishers that provide content on a periodic basis. The
organization of the content in the index tables 412 can be
partitioned such that the content most likely to be accessed will
be available on the greatest number of recognition unit 410 and
index table 412 pairs. Those skilled in the art will recognize that
the partition described below is merely one example and that
various other partitions based on actual usage statistics measured
over time can be employed. As shown in FIG. 6A, the acquisition
unit 406 comprises a plurality of recognition units 410a-h and a
plurality of index tables 412a-h. The plurality of recognition
units 410a-h is coupled to signal line 430 to receive image queries
from the dispatcher 402. Each of the plurality of recognition units
410a-h is coupled to a corresponding index table 412a-h. The
recognition units 410 extract features from the image query and
compare those image features to the features stored in the index
table to identify a matching page and location on that page.
Example recognition and retrieval systems and methods are disclosed
in U.S. patent application Ser. No. 11/461,017, titled "System And
Methods For Creation And Use Of A Mixed Media Environment," filed
Jul. 31, 2006, attorney docket no. 20412-11713; U.S. patent
application Ser. No. 11/461,279, titled "Method And System For
Image Matching In A Mixed Media Environment," filed Jul. 31, 2006,
attorney docket no. 20412-11714; U.S. patent application Ser. No.
11/461,286, titled "Method And System For Document Fingerprinting
Matching In A Mixed Media Environment," filed Jul. 31, 2006,
attorney docket no. 20412-11715; U.S. patent application Ser. No.
11/461,294, titled "Method And System For Position-Based Image
Matching In A Mixed Media Environment," filed Jul. 31, 2006,
attorney docket no. 20412-11716; U.S. patent application Ser. No.
11/461,300, titled "Method And System For Multi-Tier Image Matching
In A Mixed Media Environment," filed Jul. 31, 2006, attorney docket
no. 20412-11717; U.S. patent application Ser. No. 11/461,147,
titled "Data Organization and Access for Mixed Media Document
System," filed Jul. 31, 2006, attorney docket no. 20412-11730; U.S.
patent application Ser. No. 11/461,164, titled "Database for Mixed
Media Document System," filed Jul. 31, 2006, attorney docket no.
20412-11731; U.S. patent application Ser. No. 11/461,109, titled
"Searching Media Content For Objects Specified Using Identifiers,"
filed Jul. 31, 2006, attorney docket no. 20412-11735; U.S. patent
application Ser. No. 12/059,583, titled "Invisible Junction Feature
Recognition For Document Security Or Annotation," filed Mar. 31,
2008, attorney docket no. 20412-13397; U.S. patent application Ser.
No. 12/121,275, titled "Web-Based Content Detection In Images,
Extraction And Recognition," filed May 15, 2008, attorney docket
no. 20412-14041; U.S. patent application Ser. No. 11/776,510,
titled "Invisible Junction Features For Patch Recognition," filed
Jul. 11, 2007, attorney docket no. 20412-12829; U.S. patent
application Ser. No. 11/776,520, titled "Information Retrieval
Using Invisible Junctions and Geometric Constraints," filed Jul.
11, 2007, attorney docket no. 20412-13136; U.S. patent application
Ser. No. 11/776,530, titled "Recognition And Tracking Using
Invisible Junctions," filed Jul. 11, 2007, attorney docket no.
20412-13137; and U.S. patent application Ser. No. 11/777,142,
titled "Retrieving Documents By Converting Them to Synthetic Text,"
filed Jul. 12, 2007, attorney docket no. 20412-12590; and U.S.
patent application Ser. No. 11/624,466, titled "Synthetic Image and
Video Generation From Ground Truth Data," filed Jan. 18, 2007,
attorney docket no. 20412-12219; which are incorporated by
reference in their entirety.
[0107] As shown in FIG. 6A, the recognition unit 410/index table
412 pairs are grouped according to the content that is in the index
tables 412. In particular, the first group 612 of recognition units
410a-d and index tables 412a-d is used to index the pages of a
publication such as a newspaper for a current day according to one
embodiment. For example, four of the eight recognition units 410
are used to index content from the current day's newspaper because
most of the retrieval requests are likely to be related to the
newspaper that was published in the last 24 hours. A second group
614 of recognition units 410e-g and corresponding index tables
412e-g are used to store pages of the newspaper from recent past
days, for example the past week. A third group 616 of recognition
unit 410h and index table 412h is used to store pages of the
newspaper from older past days, for example for the past year. This
allows the organizational structure of the acquisition unit 406 to
be optimized to match the profile of retrieval requests received.
Moreover, the operation of the acquisition unit 406 can be modified
such that a given image query is first sent to the first group 612
for recognition, and if the first group 612 is unable to recognize
the image query, it is sent to the second group 614 for recognition
and so on.
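A simplified sketch of the serial, group-by-group lookup described above follows. The RecognitionGroup class, the fingerprint keys, and the page results are illustrative stand-ins for the recognition unit 410/index table 412 pairs.

```python
class RecognitionGroup:
    """Stands in for one group of recognition unit/index table pairs."""

    def __init__(self, indexed_pages):
        self.indexed_pages = indexed_pages   # {fingerprint: (page_id, x, y)}

    def search(self, fingerprint):
        return self.indexed_pages.get(fingerprint)

groups = [
    ("current_day", RecognitionGroup({"fp-123": ("page-7", 120, 340)})),
    ("past_week",   RecognitionGroup({"fp-456": ("page-2", 80, 95)})),
    ("past_year",   RecognitionGroup({})),
]

def recognize_serially(fingerprint, groups):
    """Try the current-day group first, then recent days, then older pages."""
    for name, group in groups:
        result = group.search(fingerprint)
        if result is not None:
            return name, result
    return None, None

print(recognize_serially("fp-456", groups))   # found in the past_week group
```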
[0108] It should be noted that the use of four recognition units
410 and index tables 412 as the first group 612 is merely by way of
example and used to demonstrate a relative proportion as compared
with the number of recognition units 410 and index tables 412 in
the second group 614 and the third group 616. The number of
recognition units 410 and index tables 412 in any particular group
612, 614 and 616 may be modified based on the total number of
recognition units 410 and index tables 412. Furthermore, the number
of recognition units 410 and index tables 412 in any particular
group 612, 614 and 616 may be adapted so that it matches the profile
of all users sending retrieval requests to the acquisition unit 406
for a given publication.
[0109] Alternatively, the recognition unit 410 and index tables 412
pairs may be partitioned such that there is overlap in the
documents they index, e.g., such as segments of a single image
according to content type, such as discussed in conjunction with
FIG. 14. In this example, image queries are sent to index tables
412 in parallel rather than serially.
[0110] FIG. 6B illustrates a second embodiment for the acquisition
unit 406 where the recognition units 410 and index tables 412 are
partitioned based upon the type of recognition algorithm they
implement. In the second embodiment, the recognition units 410 are
also coupled such that the failure of a particular recognition unit
to generate a recognition result causes the input image query to
be sent to another recognition unit for processing. Furthermore, in
the second embodiment, the index tables 412 include feature sets
that are varied according to different device and environmental
factors of image capture devices (e.g., blur, etc.).
[0111] The second embodiment of the acquisition unit 406 includes a
plurality of recognition units 410a-410e, a plurality of the index
tables 412a-412e and a result combiner 610. In this embodiment, the
recognition units 410a-410e each utilizes a different type of
recognition algorithm. For example, recognition units 410a, 410b
and 410c use a first recognition algorithm; recognition unit 410d
uses a second recognition algorithm; and recognition unit 410e uses
a third recognition algorithm for recognition and retrieval of page
numbers and locations. Recognition units 410a, 410d, and 410e each
have an input coupled to signal line 430 by signal line 630 for
receiving the image query. The recognition results from each of the
plurality of recognition units 410a-410e are sent via signal lines
636, 638, 640, 642 and 644 to the result combiner 610. The output
of the result combiner 610 is coupled to signal line 430.
[0112] In one embodiment, the recognition units 410a, 410b, and
410c cooperate together with index tables 1, 2, and 3 412a-412c
each storing image features corresponding to the same pages but
with various modifications, e.g., due to different device and
environmental factors. For example, index table 1 412a may store
image features for pristine images of pages such as from a PDF
document, while index table 2 412b stores images of the same pages
but with a first level of modification and index table 3 412c
stores images of the same pages but with a second level of
modification. In one embodiment, the index tables 1, 2, and 3
412a-412c are quantization trees. The first recognition unit 410a
receives the image query via signal line 630. The first recognition
unit 410a comprises a first type of feature extractor 602 and a
retriever 604a. The first type of feature extractor 602 receives
the image query, extracts the Type 1 features, and provides them to
the retriever 604a. The retriever 604a uses the extracted Type 1
features and compares them to the index table 1 412a. If the
retriever 604a identifies a match, the retriever 604a sends the
recognition results via signal line 636 to the result combiner 610.
If however, the retriever 604a was unable to identify a match or
identifies a match with low confidence, the retriever 604a sends
the extracted Type 1 features to the retriever 604b of the second
recognition unit 410b via signal line 632. It should be noted that
since the Type 1 features already have been extracted, the second
recognition unit 410b does not require a feature extractor 602. The
second recognition unit 410b performs retrieval functions similar
to the first recognition unit 410a, but cooperates with index table
2 412b that has Type 1 features for modified images. If the
retriever 604b identifies a match, the retriever 604b sends the
recognition results via signal line 638 to the result combiner 610.
If the retriever 604b of the second recognition unit 410b is unable
to identify a match or identifies a match with low confidence, the
retriever 604b sends the extracted features to the retriever 604c
of the third recognition unit 410c via signal line 634. The
retriever 604c then performs a similar retrieval function but on
index table 3 412c. Those skilled in the art will understand that
while one pristine set of images and two levels of modification are
provided, this is only by way of example and that any number of
additional levels of modification from 0 to n may be used.
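The cascade over index tables 1-3 can be sketched as below, where the Type 1 features are extracted once and handed along when a match is missing or has low confidence. The confidence threshold, the string signatures, and the lookup interface are assumptions for illustration only.

```python
CONFIDENCE_THRESHOLD = 0.75   # assumed cutoff for a "low confidence" match

class IndexTable:
    """Stand-in for index tables 1-3 412a-c (e.g., quantization trees)."""

    def __init__(self, entries):
        self.entries = entries            # {feature signature: (page_id, confidence)}

    def lookup(self, signature):
        return self.entries.get(signature, (None, 0.0))

def extract_type1_features(image_query):
    # Placeholder for feature extractor 602; a real extractor computes
    # image features, not a string signature.
    return f"type1({image_query})"

def cascade_retrieve(image_query, index_tables):
    signature = extract_type1_features(image_query)   # extracted only once
    for table in index_tables:                        # pristine, then modified levels
        page, confidence = table.lookup(signature)
        if page is not None and confidence >= CONFIDENCE_THRESHOLD:
            return page, confidence
    return None, 0.0

tables = [
    IndexTable({}),                                        # pristine images
    IndexTable({"type1(img-42)": ("page-9", 0.88)}),       # first modification level
    IndexTable({"type1(img-42)": ("page-9", 0.90)}),       # second modification level
]
print(cascade_retrieve("img-42", tables))   # resolved by the second index table
```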
[0113] The recognition units 410d and 410e operate in parallel with
the other recognition units 410a-c. The fourth recognition unit
410d comprises a second type of feature extractor 606 and a
retriever 604d. The Type 2 feature extractor 606 receives the image
query, possibly with other image information, parses the image, and
generates Type 2 features. These Type 2 features are provided to
the retriever 604d and the retriever 604d compares them to the
features stored in index table 4 412d. In one embodiment, index
table 4 412d is a hash table. The retriever 604d identifies any
matching pages and returns the recognition results to the result
combiner 610 via signal line 642. The fifth recognition unit 410e
operates in a similar manner but for a third type of feature
extraction. The fifth recognition unit 410e comprises a Type 3
feature extractor 608 and a retriever 604e. The Type 3 feature
extractor 608 receives the image query, parses the image and
generates Type 3 features and the features are provided to the
retriever 604e and the retriever 604e compares them to features
stored in the index table 5 412e. In one embodiment, the index
table 5 412e is a SQL database of character strings. The retriever
604e identifies any matching strings and returns the recognition
results to the result combiner 610 via signal line 644.
[0114] In one exemplary embodiment, the three types of feature
extraction include an invisible junction recognition algorithm,
brick wall coding, and path coding.
[0115] The result combiner 610 receives recognition results from
the plurality of recognition units 410a-e and produces one or a
small list of matching results. In one embodiment, each of the
recognition results includes an associated confidence factor. In
another embodiment, context information such as date, time,
location, personal profile, or retrieval history is provided to the
result combiner 610. These confidence factors along with other
information are used by the result combiner 610 to select the
recognition results most likely to match the input image query.
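A minimal sketch of the result combiner 610 follows, assuming each recognition result carries a confidence factor and that context (here, the user's recent retrieval history) is applied as a simple additive boost. The boost rule and field names are assumptions.

```python
def combine_results(results, recent_page_ids=(), boost=0.1, top_k=3):
    """results: list of dicts like {"page_id": ..., "confidence": ...}.
    Returns the top_k results ranked by confidence plus a context boost."""
    def score(result):
        bonus = boost if result["page_id"] in recent_page_ids else 0.0
        return result["confidence"] + bonus
    return sorted(results, key=score, reverse=True)[:top_k]

results = [
    {"page_id": "p1", "confidence": 0.62},
    {"page_id": "p2", "confidence": 0.58},
    {"page_id": "p3", "confidence": 0.31},
]
print(combine_results(results, recent_page_ids={"p2"}, top_k=2))
# p2 edges ahead of p1 once the retrieval-history boost is applied
```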
[0116] FIGS. 6C-6F are block diagrams of additional embodiments
including a high priority index 411 and one or more lower priority
indexes (e.g., general index 413).
[0117] Referring now to FIG. 6C, another embodiment of the
acquisition unit 406 is described. FIG. 6C illustrates one
embodiment of the acquisition unit 406 where the recognition unit
410 and index table pairs are partitioned using a high priority
index (HPI) 411 and a general index 413. The recognition unit 410
is as described above. Similar to index tables 412, the HPI 411 and
general index 413 may be storage devices storing data and
instructions, e.g., a hard disk drive, a floppy disk drive, a
CD-ROM device, a DVD-ROM device, a DVD-RAM device, a DVD-RW device,
a flash memory device, or some other mass storage device known in
the art, or may be conventional-type databases that store indices,
electronic documents, and other electronic content, feature
descriptions, and other information used in the content type
comparison and retrieval process.
[0118] The HPI 411 stores, in addition to document pages to be
searched according to received image queries, for each image, a
timestamp corresponding to the most recently received image query
associated with that document page, a count of the total image
queries matching that document page, and a weight corresponding to
that document page, if any. Weights are used, e.g., as a means of
maintaining an image in the HPI 411, even if it would be selected
for removal based on its timestamp and count. Creation and updating
of timestamp, counts, and weights, in conjunction with building of
the HPI 411, are discussed in greater detail in conjunction with
FIGS. 10A, 10B, and 11.
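The per-page bookkeeping described above can be sketched as follows. The field names, the scoring formula, and the eviction rule are assumptions; only the idea that a timestamp, a count, and a weight are kept per document page, with the weight able to keep a page in the HPI 411, comes from the text.

```python
import time
from dataclasses import dataclass

@dataclass
class HPIEntry:
    page_id: str
    last_query_time: float = 0.0   # timestamp of the most recent matching query
    match_count: int = 0           # total image queries that matched this page
    weight: float = 0.0            # a large weight keeps the page in the HPI

    def record_match(self):
        self.last_query_time = time.time()
        self.match_count += 1

def select_for_removal(entries, now=None):
    """Evict the entry with the lowest combined recency/count/weight score."""
    now = now if now is not None else time.time()
    def score(entry):
        recency = 1.0 / (1.0 + now - entry.last_query_time)
        return entry.weight + entry.match_count * recency
    return min(entries, key=score)

entries = [HPIEntry("p1"), HPIEntry("p2", weight=5.0), HPIEntry("p3")]
entries[0].record_match()
print(select_for_removal(entries).page_id)   # "p3": never matched and unweighted
```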
[0119] The general index 413 stores all document pages to be
searched according to the received image queries, and may be a
duplicate of the master index table 416, or any of index tables 412
according to various embodiments. Similar to the HPI 411,
timestamps, counts, and weights are stored in the general index
413. The timestamps, counts, and weights in the general index 413
reflect all image queries received, whereas the corresponding
timestamps, counts, and weights in the HPI 411 correspond only to
the document pages stored by the HPI 411.
[0120] The partitioning of the recognition unit 410 and index table
pairs using an HPI 411 provides faster and/or more accurate results
for frequent and/or probable queries due to the relatively small
size of the HPI 411. When new image queries arrive they are
searched first against the HPI 411. The HPI 411 can be built
according to various factors depending upon the search priority.
For example, in one embodiment, the HPI 411 could be time-based,
such as prioritizing a current time interval periodical, such as
today's newspaper, by putting it in the HPI 411, while newspapers
for past days are placed in the general index 413. In another
embodiment, the HPI 411 is built based on the popularity of
received image queries, either based on popularity of the image
query by the individual user, across all users, or across some set
(or subset) of users (e.g., users within a geographical area). In
yet another embodiment, the HPI 411 is built based on self-reported
user preferences, such as age, gender, magazine one subscribes to,
and preference in music. Those skilled in the art will recognize
that these partitioning options are merely examples and that
various other partitions based on actual user statistics measured
over time, or based on other factors, can be employed. The process
used to build the HPI 411 is described in greater detail in
conjunction with FIGS. 10A, 10B, and 11.
[0121] As shown in FIG. 6C, the acquisition unit 406 comprises
recognition unit 410, HPI 411, and general index 413. The
recognition unit 410 is coupled to signal line 430 to receive image
queries from the dispatcher 402. The recognition unit 410 is
coupled to corresponding HPI 411, and via HPI 411 to general index
413. The example recognition and retrieval systems and methods
referenced in conjunction with FIGS. 6A and 6B also apply to FIG.
6C.
[0122] As shown in FIG. 6C, the HPI 411 is smaller than the general
index 413, and contains a subset of the data within general index
413. The operation of acquisition unit 406 shown in FIG. 6C is such
that a given image query is first sent to the HPI 411 for
recognition, and if the HPI 411 is unable to recognize the image
query, it is sent to the general index 413 for recognition.
[0123] The general index 413 according to various embodiments may
be a complete index, e.g., identical to master index table 416 as
shown in FIG. 4A, or may include any subset of the data from master
index table 416, as distinguished from the subset within the HPI
411.
[0124] It should be noted that the use of a single HPI 411 and a
single general index 413 is merely by way of example and is used to
demonstrate the priority of the HPI(s) 411 over the general index
413. The number of HPIs 411 and general indexes 413 in any
particular configuration may vary. For example, in some embodiments
multiple HPIs 411 may be used in sequence. In this example, each
HPI 411 would provide a greater level of generalization over the
previous HPI 411. For example, there could be an HPI 411 for the
most common image queries of the last 24 hours that would be
searched first, and if no match is found, a second HPI 411 of the
most common image queries of all time that would be searched next,
and if no match is found, then the general index 413 would be
searched. This example advantageously allows for greater degrees of
granularity between the HPIs 411.
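The sequential search order described above (one or more HPIs 411 followed by the general index 413) can be sketched as a simple ordered lookup; the index contents and signature strings are illustrative assumptions.

```python
def cascaded_search(image_signature, indexes):
    """indexes is an ordered list of (name, dict) pairs, highest priority first."""
    for name, index in indexes:
        if image_signature in index:
            return name, index[image_signature]
    return "not_found", None

indexes = [
    ("hpi_last_24_hours", {"sig-a": "page-12"}),
    ("hpi_all_time",      {"sig-a": "page-12", "sig-b": "page-3"}),
    ("general_index_413", {"sig-a": "page-12", "sig-b": "page-3", "sig-c": "page-99"}),
]
print(cascaded_search("sig-c", indexes))   # only the general index resolves it
```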
[0125] Referring now to FIG. 6D, another embodiment of the
acquisition unit 406 is described. FIG. 6D illustrates one
embodiment of the acquisition unit 406 in which the recognition
unit 410 and index table pairs are partitioned into multiple HPIs
411 by mobile device 102 user. This configuration is particularly
advantageous for personalizing each HPI 411 according to an
individual user or user group. A user group may include users with
shared preferences, with the same imaging device, with shared
demographics, etc.
[0126] The organization of the contents in the user recognition
units 410 and HPIs 411 can be partitioned such that the content
most likely to be accessed by each user or user group will be
available using a recognition unit 410 and an HPI 411 specific to
the user or user group. Those skilled in the art will recognize
that the partition described below is merely one example and that
various other partition configurations may be employed.
[0127] As shown in FIG. 6D, the acquisition unit 406 comprises a
plurality of recognition units 410a-d, a plurality of HPIs 411a-d,
and a general index 413. As in the above described embodiments, the
recognition units 410a-d are coupled to signal line 430 to receive
image queries from the dispatcher 402, are coupled to a
corresponding HPI 411a-d, and function as described above. The
operation of acquisition unit 406 is such that for a given User A,
an image query first is sent to HPI 411a, and if no match is found,
then is sent to general index 413. By establishing an HPI 411 on
the individual level, a smaller index specific to the user or user
group, and populated with data more likely to be contained in the
image query from that user or user group, can be searched first, at
savings of time and computation.
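A sketch of the per-user routing follows, assuming a mapping from a user (or user group) to its HPI 411 and a fall-through to the general index 413; the mapping and the dictionary-based indexes are assumptions for illustration.

```python
def recognize_for_user(user_id, image_signature, user_hpis, general_index):
    """Check the user's HPI first, then fall back to the general index."""
    hpi = user_hpis.get(user_id, {})
    if image_signature in hpi:
        return "hpi", hpi[image_signature]
    if image_signature in general_index:
        return "general_index", general_index[image_signature]
    return "not_found", None

user_hpis = {"user_a": {"sig-1": "page-4"}, "user_b": {"sig-9": "page-8"}}
general_index = {"sig-1": "page-4", "sig-9": "page-8", "sig-5": "page-6"}
print(recognize_for_user("user_a", "sig-5", user_hpis, general_index))
# ('general_index', 'page-6'): not in user_a's HPI, found in the general index
```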
[0128] Referring now to FIG. 6E, another embodiment of the
acquisition unit 406 is described. FIG. 6E illustrates one
embodiment of the acquisition unit 406 in which the recognition
unit 410 and index pairs are partitioned by geographical location.
This configuration is particularly advantageous for image queries
with a geographical aspect involved. For example, using information
about the location of the user of the mobile device 102, e.g.,
received as part of the retrieval request using GPS, image queries
received from that user can first be submitted to an HPI 411
specific to the location of the user at the time that the image
query is received. The recognition unit 410 and HPI 411 may be
physically located within the geographical location of the user
according to one embodiment. This is advantageous with respect to
bandwidth and distance considerations when using mobile device
communication. For example, a geographical area may be the campus
of a university. In this example, the HPI 411 for that location may
be located at the university, and may contain images most accessed
by other users within the geographical area contained by the
university.
[0129] The organization of the contents in the user recognition
units 410 and HPIs 411 can be partitioned such that the content
most likely to be accessed by a user according to his or her
location is available using a recognition unit 410 and an HPI 411
pair specific to that location. Those skilled in the art will
recognize that the partition described below is merely one example
and that various other partition configurations may be
employed.
[0130] As shown in FIG. 6E, the acquisition unit 406 comprises a
plurality of recognition units 410a-b, a plurality of HPIs 411a-b,
and a general index 413. As in the above described embodiments, the
recognition units 410a-b are coupled to signal line 430 to receive
image queries from the dispatcher 402, are coupled to a
corresponding HPI 411a-b, and function as described above. For the
embodiment shown in FIG. 6E, the recognition unit 410 and HPI 411
may not be located proximate to the primary acquisition unit 406. In
this example, additional signal lines, such as signal line 132
connecting mobile device 102 and pre-processing server 103/MMR
gateway 104, signal line 134 connecting pre-processing server
103/MMR gateway 104 and MMR matching unit 106, or other additional
communication means between dispatcher 402 and recognition units
410, also may be used. The operation of
acquisition unit 406 is modified such that an image query received
from a user within Geographical Area A first is submitted to HPI A
411a, and if no match is found then is submitted to the general
index 413.
[0131] Referring now to FIG. 6F, an alternative embodiment of the
acquisition unit 406 is described. FIG. 6F illustrates an
embodiment for the acquisition unit 406 where the acquisition unit
is split into two parts: the acquisition unit 406 within MMR
matching unit 106 and a device acquisition unit 406' on mobile
device 102, which is coupled via signal line 650 to device
recognition unit 410' and device HPI 411'. In this example,
device acquisition unit 406' may be integrated within MMR matching
plug-in 205 as discussed in conjunction with FIG. 2B. This
configuration is particularly advantageous because it provides a
fast response for the user of mobile device 102, because no
communication to the pre-processing server 103 or MMR gateway 104
is required if the match is found in device HPI 411'. The device
HPI 411', and successive HPIs 411 and general indexes 413 may be
adjusted in size to account for storage capacity and expected
communication delay of mobile device 102. In addition, this
embodiment allows for customization specific to the mobile device
102 characteristics, the location of the mobile device 102, and
known imaging variations, e.g., blur, typical distance to paper,
jitter, specific to the mobile device 102 or device user. In
addition, advantages inherent in distributed computing also would
be realized with this embodiment.
[0132] Acquisition unit 406 includes dispatcher 402 connected via
signal line 430 to recognition unit 410, one or more HPIs 411, and
one or more general indexes 413. Device acquisition unit 406'
includes device dispatcher 402', device recognition unit 410', and
device HPI 411'. In one embodiment, device acquisition unit 406'
and its functionality correspond with MMR matching plug-in 205.
Like numerals have been used for acquisition unit 406', dispatcher
402', recognition unit 410', and HPI 411' to denote like
functionality. In this example, device dispatcher 402' is connected
to device recognition unit 410' via signal line 650. Device HPI
411' may be connected to dispatcher 402 within acquisition unit 406
via the typical path between mobile device 102 and acquisition unit
406, i.e., signal lines 132, 134, and 430 (represented by a dashed
line). Those skilled in the art will recognize that the
partitioning described in this embodiment is merely one example and
the various other partitioning schemes may be employed.
[0133] In operation, device acquisition unit 406' operates directly
on images captured by mobile device 102. Similar to dispatcher 402,
device dispatcher 402', operating in conjunction with other
portions of MMR matching plug-in 205, sends image queries to device
recognition unit 410'. Received image queries first are submitted to
device HPI 411', then to the indexes on acquisition unit 406, e.g.,
411, 413. For example, a received image query from device dispatcher
402' first would be submitted to device HPI 411', then to HPI 411,
then to general index 413.
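The device-first lookup path can be sketched as below: the device HPI 411' is consulted on the mobile device 102, and only a miss is forwarded to the server-side indexes. The query_server function stands in for the network path over signal lines 132 and 134 and is an assumption, as are the index contents.

```python
def device_first_recognition(image_signature, device_hpi, query_server):
    if image_signature in device_hpi:        # no network round trip needed
        return "device_hpi", device_hpi[image_signature]
    return query_server(image_signature)     # falls through to HPI 411, then 413

def query_server(image_signature):
    """Mock of the server-side search against HPI 411 and general index 413."""
    server_hpi = {"sig-2": "page-10"}
    general_index = {"sig-2": "page-10", "sig-7": "page-42"}
    if image_signature in server_hpi:
        return "server_hpi", server_hpi[image_signature]
    return "general_index", general_index.get(image_signature)

device_hpi = {"sig-1": "page-5"}
print(device_first_recognition("sig-7", device_hpi, query_server))
# ('general_index', 'page-42'): the device HPI missed, so the server answered
```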
[0134] The above-described embodiments in FIGS. 6A-6F are not meant
to be exclusive or limiting, and may be combined according to other
embodiments. For example, an HPI 411 may be based on multiple
factors such as age and popularity, and as discussed above in
conjunction with FIG. 6C, multiple HPIs 411 may be utilized within
a single system 100. In addition, the HPI 411 may be combined with
indexes segmented by other aspects such as quality (e.g., blur),
content type, or other document imaging-based parameters, such as
described in conjunction with FIG. 14.
Image Registration Unit 408
[0135] FIG. 7 shows an embodiment of the image registration unit
408. The image registration unit 408 comprises an image alteration
generator 703, a plurality of Type 1 feature extractors 704a-c, a
plurality of Type 1 index table updaters 706a-c, a Type 2 feature
extractor 708, a Type 2 index table updater 710, a Type 3 feature
extractor 712, a Type 3 index table updater 714 and a plurality of
master index tables 416a-e. The image registration unit 408 also
includes other control logic (not shown) that controls the updating
of the working index tables 411-413 from the master index table
416. The image registration unit 408 can update the index tables
411-413 of the acquisition unit 406 in a variety of different ways
based on various criteria such as performing updates on a periodic
basis, performing updates when new content is added, performing
updates based on usage, performing updates for storage efficiency,
etc.
[0136] The image alteration generator 703 has an input coupled to
signal line 730 to receive an image and a page identification
number. The image alteration generator 703 has a plurality of
outputs and each output is coupled by signal lines 732, 734, and
736 to Type 1 extractors 704a-c, respectively. The image alteration
generator 703 passes a pristine image and the page identification
number to its output on signal line 732. The image alteration
generator 703 then generates a first altered image and outputs it
and the page identification number on signal line 734 to Type 1
feature extractor 704b, and generates a second altered image,
altered differently than the first altered image, and outputs it and
the page identification number on signal line 736 to Type 1 feature
extractor 704c.
[0137] The Type 1 feature extractors 704 receive the image and page
ID, extract the Type 1 features from the image and send them along
with the page ID to a respective Type 1 index table updater 706.
The outputs of the plurality of Type 1 feature extractors 704a-c
are coupled to inputs of the plurality of Type 1 index table
updaters 706a-c. For example, the output of Type 1 feature
extractor 704a is coupled to an input of Type 1 index table updater
706a. The remaining Type 1 feature extractors 704b-c similarly are
coupled to respective Type 1 index table updaters 706b-c. The Type
1 index table updaters 706 are responsible for formatting the
extracted features and storing them in a corresponding master index
table 416. While the master index table 416 is shown as five
separate master index tables 416a-e, those skilled in the art will
recognize that all the master index tables could be combined into a
single master index table or into a few master index tables. In the
embodiment including the MMR publisher 108, once the Type 1 index
table updaters 706 have stored the extracted features in the index
table 416, they issue a confirmation signal that is sent via signal
lines 740 and 136 back to the MMR publisher 108.
[0138] The Type 2 feature extractor 708 and the Type 3 feature
extractor 712 operate in a similar fashion and are coupled to
signal line 738 to receive the image, a page identification number,
and possibly other image information. The Type 2 feature extractor
708 extracts information from the input needed to update its
associated index table 416d. The Type 2 index table updater 710
receives the extracted information from the Type 2 feature
extractor 708 and stores it in the index table 416d. The Type 3
feature extractor 712 and the Type 3 index table updater 714
operate in a like manner but for Type 3's feature extraction
algorithm. The Type 3 feature extractor 712 also receives the
image, a page number, and possibly other image information via
signal line 738. The Type 3 feature extractor 712 extracts Type 3
information and passes it to the Type 3 index table updater 714.
The Type 3 index table updater 714 stores the information in index
table 416e. The architecture of the registration unit 408 is
particularly advantageous because it provides an environment in
which the index tables can be automatically updated, simply by
providing images and page numbers to the image registration unit
408. According to one embodiment, Type 1 feature extraction is
invisible junction recognition, Type 2 feature extraction is brick
wall coding, and Type 3 feature extraction is path coding.
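The registration flow of FIG. 7 can be outlined as a small pipeline in which each altered copy of a page feeds its own extractor and master table. The sketch below is illustrative only; the register_page() function, the callable extractors, and the alteration functions are assumptions, and the actual feature extraction algorithms (invisible junctions, brick wall coding, path coding) are not implemented here.

```python
from typing import Callable, Dict, List

def register_page(image, page_id: str,
                  alterations: List[Callable],       # e.g., identity plus two blur variants
                  type1_extractors: List[Callable],  # one per altered image (704a-c)
                  type2_extractor: Callable,         # 708
                  type3_extractor: Callable,         # 712
                  master_tables: Dict[str, dict]) -> None:
    # Type 1: each altered version of the page feeds its own extractor and
    # is stored in its own master table (416a-c).
    for alter, extract, table_name in zip(alterations, type1_extractors,
                                          ("416a", "416b", "416c")):
        altered = alter(image)
        master_tables[table_name][page_id] = extract(altered)
    # Type 2 and Type 3 operate on the original image and update their own tables.
    master_tables["416d"][page_id] = type2_extractor(image)
    master_tables["416e"][page_id] = type3_extractor(image)
```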
Quality Predictor 502
[0139] Referring now to FIG. 8, an embodiment of the quality
predictor 502 and its operation will be described in more detail.
The quality predictor 502 produces a recognizability score (aka
Quality Predictor) that can be used for predicting whether or not
an image is a good candidate for a particular available image
recognition algorithm. An image may not be recognizable for many
reasons, such as motion blur, focus blur, poor lighting,
and/or lack of sufficient content. The goal of computing a
recognizability score is to be able to label the non-recognizable
images as "poor quality," and label the recognizable images as
"good quality." Besides this binary classification, the present
invention also outputs a "recognizability score," in which images
are assigned a score based on the probability of their
recognition.
[0140] The quality predictor 502 will now be described with
reference to an embodiment in which the quality predictor 502 is
part of the dispatcher 402 as has been described above and as
depicted in FIG. 5. In this embodiment, the quality predictor 502
provides a recognizability score as input to the distributor 506
that decides which recognition unit 410 (and thus which recognition
algorithm) to run. However, those skilled in the art will realize
that there are numerous system configurations in which the quality
predictor 502 and the recognizability score are useful and
advantageous. In a second embodiment, the quality predictor 502 is
run on a capture device (mobile device 102 phone, digital camera,
computer 110) to determine if the quality of the captured image is
sufficient to be recognized by one of the recognition units 410 of
the MMR matching unit 106, or device recognition unit 410' on the
mobile device, e.g., as part of the functionality of device
acquisition unit 406'. If the quality of the captured image is
sufficient, it is sent to the MMR matching unit 106 or processed
within device acquisition unit 406'; if not, the user is simply
asked to capture another image. Alternatively, the captured image
and the quality predictor score are shown to the user and he/she
decides whether it should be submitted to the MMR matching unit
106. In a third embodiment, the quality predictor 502 is part of
the result combiner 610, in which there are multiple recognition
units 410 and the recognizability score determines how the
recognition results are evaluated. In a fourth embodiment, the
quality predictor 502 is part of the indexing unit 414 and
computation of a recognizability score precedes the indexing
process, and the score is used in deciding which indexer/indexers
need to be used for indexing the input document page. For example,
if the recognizability score is low for the image to be indexed
using the brick wall coding (BWC) algorithm, then the image may be
indexed using only the invisible junction (IJ) algorithm. Further,
the same quality predictor can be used for both indexing and
recognition. In a fifth embodiment, the quality predictor 502 is
used before the "image capture" process on a mobile device 102. The
recognizability score is computed prior to capturing the image, and
the mobile device 102 captures an image only if the recognizability
score is higher than a threshold. The quality predictor 502 can be
embedded in a camera chip according to one embodiment, and can be
used to control the mobile device 102 camera's hardware or
software. For example, camera aperture, exposure time, flash, macro
mode, stabilization, etc. can be turned on based on the
requirements of recognition unit 410 and based on the captured
image. For example, BWC can recognize blurry text images and
capturing blurry images can be achieved by vibrating the mobile
device 102.
[0141] As shown in FIG. 8, one embodiment of the quality predictor
502 comprises recognition algorithm parameters 802, a vector
calculator 804, a score generator 806, and a scoring module 808.
The quality predictor 502 has inputs coupled to signal line 532
to receive an image query, context and metadata, and device
parameters. The image query may be video frames, a single frame, or
image features. The context and metadata includes time, date,
location, environmental conditions, etc. The device parameters
include brand, type, macro block on/off, gyro or accelerometer
reading, aperture, time, exposure, flash, etc. Additionally, the
quality predictor 502 uses certain recognition algorithm parameters
802. These recognition algorithm parameters 802 can be provided to
the quality predictor 502 from the acquisition unit 406 or the
image registration unit 408. For example, the recognition
algorithms 802 may recognize different content types associated
with the received image query. The vector calculator 804 computes
quality feature vectors from the image to measure its content and
distortion, such as its blurriness, existence and amount of
recognizable features, luminosity, etc. The vector calculator 804
computes any number of quality feature vectors, from one to n. In
some cases, the vector calculator 804 requires knowledge of the
recognition algorithm(s) to be used, and the vector calculator 804
is coupled by signal line 820 to the recognition algorithm parameters
802. For example, if an Invisible Junctions algorithm is employed,
the vector calculator 804 computes how many junction points are
present in the image as a measure of its recognizability.
Continuing the above example of recognition of content types within
the received image query, in addition the vector calculator 804 may
perform a segmenting function on the image query, segmenting it by
content type. All or some of these computed features are then input
to score generator 806 via signal line 824. The score generator 806
is also coupled by signal line 822 to receive recognition
parameters from the recognition algorithm parameters 802. The output
of the score generator 806 is provided to the scoring module 808.
The scoring module 808 generates a score using the recognition
scores provided by the score generator 806 and by applying weights
to those scores. In one embodiment, the result is a single
recognizability score. In another embodiment, the result is a
plurality of recognizability scores ranked from highest to lowest.
In some embodiments, the recognizability scores are associated with
particular index tables 412.
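One way to picture the score generator 806 and scoring module 808 is as a weighted combination of quality feature values, optionally computed per recognition algorithm. The sketch below is a hedged illustration; the feature names, the linear weighting, and the clamping to [0, 1] are assumptions rather than the patented scoring method.

```python
def recognizability_score(features: dict, weights: dict) -> float:
    # features might hold, e.g., {"junction_count": 120, "blur": 0.3, "luminosity": 0.8};
    # weights captures the relative importance of each feature for a given
    # recognition algorithm (the recognition algorithm parameters 802).
    score = sum(weights.get(name, 0.0) * value for name, value in features.items())
    # Clamp to [0, 1] so the score can serve as a rough probability of recognition.
    return max(0.0, min(1.0, score))

def rank_algorithms(features: dict, per_algorithm_weights: dict) -> list:
    # Produce (algorithm, score) pairs ranked highest to lowest, mirroring the
    # embodiment that outputs a plurality of recognizability scores.
    scored = [(name, recognizability_score(features, w))
              for name, w in per_algorithm_weights.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)
```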
Methods
[0142] FIG. 9 is a flowchart of a general method for generating and
sending a retrieval request and processing the retrieval request
with an MMR system 100. The method begins with the mobile device
102 capturing 902 an image. A retrieval request that includes the
image, a user identifier, and other context information is
generated by the mobile device 102 and sent 904 to the
pre-processing server 103 or MMR gateway 104. The pre-processing
server 103 or MMR gateway 104 processes 906 the retrieval request
by extracting the user identifier from the retrieval request and
verifying that it is associated with a valid user. The MMR gateway
104 also performs other processing such as recording the retrieval
request in the log 310, performing any necessary accounting
associated with the retrieval request and analyzing any MMR
analytics metrics. Next, the pre-processing server 103 or MMR
gateway 104 generates 908 an image query and sends it to the
dispatcher 402. The dispatcher 402 performs load-balancing and
sends the image query to the acquisition unit 406. In one
embodiment, the dispatcher 402 specifies the particular recognition
unit 410 of the acquisition unit 406 that should process the image
query. Then the acquisition unit 406 performs 912 image recognition
to produce recognition results. The recognition results are
returned 914 to the dispatcher 402 and in turn the pre-processing
server 103 or MMR gateway 104. The recognition results are also
used to retrieve 916 hotspot data corresponding to the page and
location identified in the recognition results. Finally, the
hotspot data and the recognition results are sent 918 from the
pre-processing server 103 or MMR gateway 104 to the mobile device
102.
[0143] FIG. 10A is a flowchart showing a method of updating an HPI
411 using the actual image queries received by an MMR matching unit
106 according to one embodiment of the present invention. The
method begins with the MMR matching unit 106 receiving 1002 an
image query. In addition to the responsive retrieval process
described herein, the image registration unit 408 queries 1004
whether the received image matches a document page in the current
HPI 411. For purposes of this method the HPI 411 can be any of the
HPIs 411 described herein, individually or in combination. The
method may update one HPI 411 at a time, or several serially or in
unison. If the image matches a document page in the current HPI
411, the image registration unit 408 updates 1006 the timestamp,
count, and weight associated with the matching document page. The
process then ends for the yes branch. If the matching document page
does not exist in the current HPI 411, the image registration unit
408 queries 1008 whether the document page count for the received
image exceeds a threshold for addition to the HPI 411. The
threshold for inclusion in the HPI 411 varies according to
different embodiments. According to some embodiments, the threshold
may be as low as receiving two image queries that match the same
document page. For the initial build of the HPI 411, e.g., every
document page receiving at least one matching image query may be
added to the HPI 411, and a higher threshold may apply after the HPI
411 has
been established for a longer period of time. If the count does not
exceed the threshold the process ends. If the count does exceed the
threshold, the image registration unit 408 selects 1010 the image
for addition to the HPI 411. The remainder of the method proceeds
according to FIG. 11.
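A compact sketch of the decision in steps 1004-1010 follows, assuming a dict-based HPI whose entries carry timestamp, count, and weight fields and an illustrative addition threshold of two matching queries; these data structures and defaults are assumptions, not requirements of the method.

```python
import time

def update_hpi_on_query(hpi: dict, counts: dict, matched_page: str,
                        add_threshold: int = 2) -> bool:
    """Returns True if the matched page is selected for addition to the HPI."""
    counts[matched_page] = counts.get(matched_page, 0) + 1
    if matched_page in hpi:
        # Step 1006: refresh the timestamp, count, and weight of the existing entry.
        entry = hpi[matched_page]
        entry["timestamp"] = time.time()
        entry["count"] += 1
        entry["weight"] = entry["count"]   # placeholder weighting scheme
        return False
    # Step 1008: only pages queried often enough are promoted to the HPI.
    return counts[matched_page] >= add_threshold
```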
[0144] The method of FIG. 10A thus is based on actual image queries
received. Building the HPI 411 according to the method of FIG. 10A
may be done on an individual user basis, or may be based on the
overall "popularity" of image queries received, to update a
popularity-based HPI 411. In addition, the process of FIG. 10A may
proceed on a real-time or batch update basis. For example, the
steps of the method may occur as each image query is received, such
that the HPI 411 is updated on a rolling, real-time basis.
Alternatively, the steps of the method may occur at the end of the
time interval according to all image queries received during that
time, such that the HPI 411 is updated on a batch basis, e.g., once
a day. While real-time updates will provide the most accurate HPI
411, continuously updating the HPI is computation intensive. Batch
updates are less computationally intensive, but are delayed
according to the selected time interval for the updates. In some
embodiments, a combination of real-time and batch updates may be
used. For example, real-time updates may be used for individual
HPIs 411, while batch updates may be used for popularity-based HPIs
411.
[0145] FIG. 10B is a flowchart showing a method of updating an HPI
411 using image query projections according to one embodiment of
the present invention. The method begins with receiving 1003 a set
of document pages for which image queries are projected to be
received during a selected next interval. The time interval varies
according to different embodiments, but may be selected according to
the time required to update the HPI 411, in conjunction with the
accuracy of searching and the number of updates required.
[0146] The data for the image query projections may be determined
by the image registration unit 408 and/or may be based on data
received from third-party content providers, e.g. publisher 108
according to the embodiment shown in FIG. 1B. For example,
tomorrow's newspaper may be projected to be a likely subject of
image queries for the interval tomorrow. The image query
projections may be based on other data as well, e.g., document
pages belonging to the same document as document pages matching
received image queries, similarity to recently matched document
pages, specificity of a document page to a selected future time
interval. For example, assuming a real-time update of the HPI 411,
an image query recently received may be more likely to be received
again in the near future, and thus in one embodiment real-time
updates to the HPI 411 consistently cycle through image queries
recently received. In addition, additional data received from the
dynamic load balancer 418 may be used to establish the image query
projections.
[0147] Once the projected document pages are received, the image
registration unit 408 queries 1004 whether the received document
page exists in the current HPI 411. If the image exists in the
current HPI 411, the image registration unit 408 updates 1006 the
timestamp, count, and weight associated with the predicted image
query. The process then ends for the yes branch. If the image does
not exist in the current HPI 411, the image registration unit 408
queues 1010 the image for addition to the HPI 411. The remainder of
the method proceeds according to FIG. 11.
[0148] The method of FIG. 10B thus is based on projected document
pages for which matching image queries are expected to be received
during an upcoming time interval. Building the HPI 411 according to
the method of FIG. 10B may be based on individual user image query
projections, or image query projections for a larger population.
Typically, the process of FIG. 10B occurs on a batch update basis,
e.g., once a day.
[0149] FIG. 11 is a flowchart of a method for updating a high
priority index in accordance with an embodiment of the present
invention.
[0150] FIG. 11 is a flowchart depicting a method for updating a HPI
411, using received or projected document pages, according to one
embodiment of the present invention. The method considers image
queries corresponding to document pages selected for inclusion in
the HPI 411, and implements a removal policy for document pages in
the HPI, e.g., when the HPI 411 is too full to receive the image
queries selected for inclusion. The method begins by receiving a
document page selected for inclusion in step 1010 of FIG. 10A or
10B. The image registration unit 408 determines 1102 whether the
HPI 411 is full. This step is similar to the process described in
conjunction with the description of indexing unit 414 and master
index table 416 in conjunction with FIG. 4A. As discussed in
conjunction with FIG. 10A, the method may be implemented using
various HPIs 411. If the HPI 411 is not full, the document page is
added 1104 to the HPI 411, and a timestamp, count, and weight is
logged. According to one embodiment, features extracted from the
image associated with the image query received are added to the HPI
411. In other embodiments, features extracted from the entire
document page that matched the received image query are stored, or
features extracted from all the pages belonging to the same
document that matched the image query are stored in the HPI 411.
Storing data in addition to the specific image queried can be
advantageous in the context of a popularity based HPI 411, e.g.,
because if one portion of a document is popular, other parts of the
document may be popular as well. In addition, adding 1104 the
document page to the HPI 411 may include sending the document page
to an HPI 411 remote from the image registration unit 408, e.g., a
device HPI 411', as described in conjunction with FIG. 6F. This
ends the process for the no branch.
[0151] If the HPI 411 is full, the image registration unit 408
determines 1106 a document page for removal from the HPI 411.
Document pages may be selected for removal according to various
methods, e.g., using the oldest timestamp, lowest count, lowest
weight, and/or some combination thereof. In one embodiment, the
number of document pages selected for removal from the HPI 411 is
equal to the number selected/queued 1010 to be added to the HPI
411. Once selected, the document page(s) is removed 1108 from the
HPI. The process then can proceed to step 1104, as described above,
to allow the document page to be added to the HPI 411, now that the
HPI 411 has space available. As discussed above for FIGS. 10A and
10B, adding document pages to the HPI 411 may occur on a real-time
or batch update basis. For example, if the updates occur in
real-time, document pages may be selected for addition to, and
removal from, the HPI 411 as each image query is received.
Alternatively, document pages may be removed from, and added to, the
HPI 411 as a group at the end of the time interval, e.g., a day.
This ends the process for the yes branch. In
one embodiment, the dynamic load balancer 418 operates in
conjunction with the image registration unit 408 for propagating
updates to the master index table 416 and/or the index tables
411, 412, 413.
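The capacity check and removal policy of steps 1102-1108 might be sketched as below, reusing the same assumed dict-based HPI entries; the specific victim-selection rule (oldest timestamp, then lowest count) is just one of the removal policies the text contemplates.

```python
import time

def add_with_eviction(hpi: dict, page_id: str, features, capacity: int) -> None:
    if len(hpi) >= capacity:
        # Step 1106: pick a victim, here the entry with the oldest timestamp
        # and, as a tie-breaker, the lowest count.
        victim = min(hpi, key=lambda p: (hpi[p]["timestamp"], hpi[p]["count"]))
        del hpi[victim]                      # step 1108
    # Step 1104: add the page with fresh bookkeeping fields.
    hpi[page_id] = {"features": features, "timestamp": time.time(),
                    "count": 1, "weight": 1.0}
```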
[0152] Referring now to FIG. 12, one embodiment of a method for
performing image feature-based ordering will be described. This
functionality of this method is generally implemented by the image
feature order unit 504 of the dispatcher 402. Feature-based
ordering is a mechanism for organizing the priorities of the image
queries waiting to be recognized. The default is FIFO (First-In,
First-Out), that is, servicing the image queries in the order they
were received. In the case of feature-based ordering, image queries
instead are processed based on an estimate of the speed of
recognition, e.g., image queries expected to be recognized in a
short amount of time are processed earlier, with the goal of
minimizing the response time for the largest number of users. The
speed of recognition would be estimated in one embodiment by
counting the number of features in each received image query; image
queries with fewer features would be processed first.
[0153] The method begins by receiving 1202 an image query. Next,
the image feature order unit 504 of the dispatcher 402 analyzes
1204 the image features in the image query. It should be noted that
the image features used in the analysis of step 1204 need not be
the same image features used by the recognition units 410. It is
only necessary to correlate the image features to recognition. In
yet another embodiment, several different feature sets are used and
correlations are measured over time. Eventually, the feature set
that provides the best predictor and has the lowest computational
cost is determined and the other feature sets are discarded. The
image feature order unit 504 measures 1206 the time required to
recognize the image features and thereby generates a predicted
time. Next, the method creates 1208 correlations between features
and predicted times. Next, the method measures 1210 the time actually
required by the acquisition unit 406 to recognize the image query.
This time required by the acquisition unit 406 is referred to as an
actual time. Then the image feature order unit 504 adjusts 1212 the
correlations generated in step 1208 by the actual time. The
adjusted correlations are then used 1214 to reorder and assign
image queries to recognition units. For example, simple images with
few features are assigned to lightly loaded servers (recognition
units 410 and index table 412 pairs) so that they will be recognized
quickly and the user will receive the answer quickly. While the
method shown in FIG. 12 illustrates the process for an image or a
small set of images, those skilled in the art will recognize that
once many images have been processed with the above method, a
number of correlations will be created, the image feature order
unit 504 essentially learns the distribution of image features
against processing time, and the controller 501 of the distributor
506 can then use the distribution to load balance and redirect image
queries with particular image features accordingly.
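As an illustration of feature-based ordering, the sketch below keeps a priority queue keyed by a predicted recognition time that is linear in the feature count, and nudges the per-feature cost toward observed times (step 1212). The linear model, its default coefficient, and the class name are assumptions for illustration only.

```python
import heapq

class FeatureOrderQueue:
    def __init__(self, seconds_per_feature: float = 0.002):
        self.seconds_per_feature = seconds_per_feature  # refined as actual times arrive
        self._heap, self._seq = [], 0

    def predict_time(self, feature_count: int) -> float:
        return feature_count * self.seconds_per_feature

    def push(self, query, feature_count: int) -> None:
        # Fewer features -> smaller predicted time -> served earlier (not FIFO).
        heapq.heappush(self._heap, (self.predict_time(feature_count), self._seq, query))
        self._seq += 1

    def pop(self):
        return heapq.heappop(self._heap)[2]

    def adjust(self, feature_count: int, actual_seconds: float, rate: float = 0.1) -> None:
        # Step 1212: adjust the correlation toward the observed per-feature cost.
        observed = actual_seconds / max(feature_count, 1)
        self.seconds_per_feature += rate * (observed - self.seconds_per_feature)
```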
[0154] FIG. 13 is a flowchart showing a method for processing image
queries across multiple index tables 412 according to one
embodiment of the present invention. The image queries are
processed according to different image processing techniques and
recognition parameters. For example, the method may be used for the
embodiment described in conjunction with FIG. 6B in which the same
image or page is placed in multiple index tables 412 having been
processed according to different image processing techniques using
various recognition parameters, e.g., such as being blurred by
different amounts, before computing features of the image to be
indexed. In one embodiment, quality feature vectors may be
calculated by the vector calculator 804 as described in conjunction
with the quality predictor 502 of FIG. 8. This embodiment is
advantageous because more accurate results are produced and
recognition is faster using index tables 412 that are tailored to
specific recognition parameters.
[0155] The method begins with the dispatcher 402 receiving 1302 an
image along with recognition parameters corresponding to the image.
Continuing the example from above, the image query may be received
with recognition parameters corresponding to the level of blur for
the received image. In one embodiment, the dispatcher 402 also may
receive computation drivers with the image query. Computation
drivers include expected recognition speed, maximum recognition
accuracy, or perceived recognition speed, and are received from
dynamic load balancer 418 according to one embodiment, and may be
used e.g., according to the method discussed in conjunction with
FIG. 12. Next, the recognition parameters associated with the
respective index tables 412 are analyzed 1304. In one embodiment,
this information may be received from the dynamic load balancer 418.
Based on the received information, the dispatcher 402 formulates
1306 an index table 412 submission strategy and index priority. In
one embodiment, the output of step 1306 includes a recognition unit
identification number (RUID) as discussed in conjunction with the
dispatcher 402 of FIG. 5.
[0156] One exemplary submission strategy is parallel submission to
multiple index tables 412. In this example, the index queries are
sent in parallel, and results are received 1308 from each of the
index tables 412, e.g. Index 1-n, as shown in FIG. 13. The results
from the multiple index tables 412 are then integrated 1310, and
the process ends. According to one embodiment, the integration is
performed by the integrator 509 of dispatcher 402. According to
another embodiment, steps 1308 and 1310 are equivalent to those
discussed in FIG. 6B, and the integration is performed by other
units, such as the result combiner 610 shown in FIG. 6B. According
to yet another embodiment, the dispatcher 402 merely receives and
retransmits the data to the pre-processing server 103 or MMR
gateway 104, for integration therein.
[0157] A second exemplary submission strategy is serial submission
to multiple index tables 412. In this example, the index queries
are sent to multiple index tables 412 according to the priority
established in step 1306. E.g., a query is first submitted 1312a to
Index A, and if a result is found, the process ends. If the result
is not found, the query is next submitted 1312b to each index table
412 in turn, through Index m, until a result is found and the
process ends. The final result is provided via signal line 134 to
the pre-processing server 103 or MMR gateway 104.
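The two submission strategies can be contrasted in a few lines, assuming duck-typed index tables with a lookup() method as in the earlier sketch; the threading approach used for the parallel case is an implementation choice, not part of the claimed method.

```python
from concurrent.futures import ThreadPoolExecutor
from typing import Optional, Sequence

def submit_serial(feature_key: str, tables: Sequence) -> Optional[tuple]:
    # Steps 1312a..1312m: try each table in the priority order set in step 1306.
    for table in tables:
        result = table.lookup(feature_key)
        if result is not None:
            return result
    return None

def submit_parallel(feature_key: str, tables: Sequence) -> list:
    # Step 1308: query every table at once, then hand all results to the integrator.
    with ThreadPoolExecutor(max_workers=max(len(tables), 1)) as pool:
        results = list(pool.map(lambda t: t.lookup(feature_key), tables))
    return [r for r in results if r is not None]
```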
[0158] FIG. 14 is a flowchart showing a method for segmenting
received image queries and processing the segmented queries
according to one embodiment of the present invention. In this embodiment,
the multiple index tables 412 correspond to different content
types. This embodiment is advantageous because more accurate
results are produced and recognition is faster using index tables
412 that are tailored to specific image content types. The method
begins with the dispatcher 402 receiving 1402 an image query, and
other receipt information, for processing. The dispatcher 402
segments 1404 the image query into image segments corresponding to
various content types contained within the image. In one embodiment,
the segmenter 505 detects content of various types within the
received image query and segments the content accordingly. In
another embodiment, this function is performed by the quality
predictor 502, e.g., using vector calculator 804. Content types may
include, among others, black text on white background, black and
white natural images, color natural images, tables, black and white
bar charts, color bar charts, black and white pie charts, color pie
charts, black and white diagrams, color diagrams, headings, and
color text. This list of content types is exemplary and not meant
to be limiting; other content types may be used. In an alternative
embodiment, the dispatcher 402 receives a pre-segmented image
query, e.g., the segmenting may be performed by the pre-processing
server 103 or MMR gateway 104. Next, the dispatcher 402 submits
1406 the segmented queries to one or more corresponding content
type index tables 412. In one embodiment, the step includes
analysis of the indexed content types, similar to the analysis of
step 1304 of FIG. 13. In one embodiment, the output of step 1406
includes an RUID along with the appropriate image query. The
dispatcher 402 then receives 1408 results and image-associated
metrics from each of the index tables 412. These metrics may
include, among others, a confidence factor associated with the
results, context information such as date, time, location, personal
profile, retrieval history, and surface area of the image segment.
This information, along with prior probability of correctness for
the index tables 412 and level of agreement between results from
various index tables, along with other factors, can be used by the
dispatcher 402, e.g., at integrator 509, to integrate 1410 the
received results. For example, given four index tables 412
corresponding to image types header, black and white text, image,
and color text, integrating the results may include ascertaining
whether the same result image was produced by each of the index
tables 412, at what level of confidence for each, and the
probability of correctness associated with each of the index tables
412. In an alternative embodiment, the result integration is
performed outside of the dispatcher 402, e.g., by a result combiner
610 as discussed in conjunction with FIG. 6B. The final result is
provided via signal line 134 to the pre-processing server 103 or
MMR gateway 104.
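Result integration across content-type index tables might, for illustration, weight each table's answer by its confidence and by the table's prior probability of correctness and then take the highest-scoring page. The voting scheme, data shapes, and default prior below are assumptions only.

```python
from collections import defaultdict
from typing import Dict, Optional, Tuple

def integrate_segment_results(
        results: Dict[str, Tuple[str, float]],   # content type -> (page, confidence)
        priors: Dict[str, float]) -> Optional[str]:
    votes = defaultdict(float)
    for content_type, (page, confidence) in results.items():
        # Weight each table's answer by its confidence and by that table's
        # prior probability of correctness; agreement across tables accumulates.
        votes[page] += confidence * priors.get(content_type, 0.5)
    return max(votes, key=votes.get) if votes else None
```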
[0159] The foregoing description of the embodiments of the present
invention has been presented for the purposes of illustration and
description. It is not intended to be exhaustive or to limit the
present invention to the precise form disclosed. Many modifications
and variations are possible in light of the above teaching. It is
intended that the scope of the present invention be limited not by
this detailed description, but rather by the claims of this
application. As will be understood by those familiar with the art,
the present invention may be embodied in other specific forms
without departing from the spirit or essential characteristics
thereof. Likewise, the particular naming and division of the
modules, routines, features, attributes, methodologies and other
aspects are not mandatory or significant, and the mechanisms that
implement the present invention or its features may have different
names, divisions and/or formats. Furthermore, as will be apparent
to one of ordinary skill in the relevant art, the modules,
routines, features, attributes, methodologies and other aspects of
the present invention can be implemented as software, hardware,
firmware or any combination of the three. Also, wherever a
component, an example of which is a module, of the present
invention is implemented as software, the component can be
implemented as a standalone program, as part of a larger program,
as a plurality of separate programs, as a statically or dynamically
linked library, as a kernel loadable module, as a device driver,
and/or in every and any other way known now or in the future to
those of ordinary skill in the art of computer programming.
Additionally, the present invention is in no way limited to
implementation in any specific programming language, or for any
specific operating system or environment. Accordingly, the
disclosure of the present invention is intended to be illustrative,
but not limiting, of the scope of the present invention, which is
set forth in the following claims.
* * * * *