U.S. patent application number 14/459115 was published by the patent office on 2015-10-01 for data mesh based zero effort shopping.
The applicants listed for this patent are Robert Lee, Ryan Melcher, and John Tapley. Invention is credited to Robert Lee, Ryan Melcher, and John Tapley.

Application Number: 14/459115
Publication Number: 20150278912
Family ID: 54191029
Publication Date: 2015-10-01

United States Patent Application 20150278912
Kind Code: A1
Melcher; Ryan; et al.
October 1, 2015
DATA MESH BASED ZERO EFFORT SHOPPING
Abstract
A system and method for data mesh-based zero effort shopping is
provided. In example embodiments, attribute data associated with a
user is received from a plurality of attribute sources. Demand
indications are extracted from the attribute data. The demand
indications may be indicative of anticipatory demand by the user
for a particular item. An item is identified from the attribute
data based on the extracted demand indications. User
characteristics pertaining to the user are inferred from the
attribute data. A purchase associated with the identified item is
facilitated based, at least in part, on the user
characteristics.
Inventors: Melcher; Ryan (Ben Lomond, CA); Tapley; John (San Jose, CA); Lee; Robert (Burlingame, CA)

Applicant:
Name            City        State  Country  Type
Melcher; Ryan   Ben Lomond  CA     US
Tapley; John    San Jose    CA     US
Lee; Robert     Burlingame  CA     US

Family ID: 54191029
Appl. No.: 14/459115
Filed: August 13, 2014
Related U.S. Patent Documents

Application Number  Filing Date   Patent Number
61970263            Mar 25, 2014
Current U.S. Class: 705/26.7
Current CPC Class: G06Q 30/0633 20130101; H04L 43/04 20130101; H04L 61/609 20130101; H04L 67/10 20130101; A61B 5/04 20130101; H04L 67/1042 20130101; G06T 11/206 20130101; G06Q 30/0631 20130101; H04L 63/107 20130101; H04L 63/08 20130101; H04W 84/18 20130101; H04W 76/14 20180201; H04W 4/80 20180201
International Class: G06Q 30/06 20060101 G06Q 30/06
Claims
1. A system comprising: an attribute module to receive attribute
data associated with a user from a plurality of attribute sources;
an item module to extract demand indications from the attribute
data, the demand indications being indicative of anticipatory
demand by the user for a particular item; an analysis module,
implemented by a hardware processor of a machine, to identify a
pertinent item from the attribute data based on the extracted
demand indications; a characteristic module to infer user
characteristics pertaining to the user from the attribute data; and
an order module to determine transaction parameters for a suggested
transaction associated with the pertinent item based, at least in
part, on the user characteristics and facilitate the suggested
transaction according to the determined transaction parameters.
2. The system of claim 1, wherein the transaction parameters
include at least one of a quantity, a delivery time, a payment
time, a delivery method, a delivery destination, a merchant, or a
product.
3. A method comprising: receiving attribute data associated with a
user from a plurality of attribute sources; extracting demand
indications from the attribute data, the demand indications being
indicative of anticipatory demand by the user for a particular
item; identifying, using a hardware processor of a machine, a
commerce item from the attribute data based on the extracted demand
indications; inferring user characteristics pertaining to the user
from the attribute data; determining order parameters for a user
purchase associated with the commerce item based, at least in part,
on the inferred user characteristics; and facilitating the user
purchase according to the determined order parameters.
4. The method of claim 3, wherein the order parameters
include at least one of a quantity, a delivery time, a payment
time, a delivery method, a delivery destination, a merchant, or a
product.
5. The method of claim 3, further comprising: extracting a current
inventory level of the commerce item from the attribute data;
determining an inventory threshold for the commerce item by
modeling usage of the commerce item based on the extracted current
inventory level and the inferred user characteristics; identifying
a mismatch between the inventory threshold and the current
inventory level; and based on the mismatch, automatically
performing the user purchase on behalf of the user.
6. The method of claim 3, further comprising: identifying a
purchase motive of the user for the commerce item by analyzing the
inferred user characteristics, the purchase motive corresponding to
a motive time; determining temporal order parameters, included in
the order parameters, based on the motive time; and facilitating
the user purchase according to the determined temporal order
parameters.
7. The method of claim 3, further comprising: identifying similar
users, from among a plurality of other users, that are similar to
the user based on the inferred user characteristics and respective
user characteristics of the plurality of other users; and
determining the order parameters based on the user characteristics
of the identified similar users.
8. The method of claim 3, further comprising: accessing purchase
criteria corresponding to the user; and automatically purchasing
the commerce item on behalf of the user according to the purchase
criteria.
9. The method of claim 8, wherein the purchase criteria include at
least one criterion corresponding to a budget; and wherein the
automatically purchasing the commerce item on behalf of the user is
based, at least in part, on the budget.
10. The method of claim 8, further comprising: determining an item
category for the commerce item, the purchase criteria including
criteria corresponding to the item category; and facilitating the
user purchase of the commerce item according to the purchase
criteria corresponding to the determined item category.
11. The method of claim 3, further comprising: generating a
notification that includes an option to make the user purchase, the
notification including the determined order parameters; causing
presentation of the notification to the user; receiving a user
selection of the option to make the user purchase; and responsive
to receiving the user selection, performing the user purchase
according to the determined order parameters.
12. The method of claim 11, further comprising: identifying
presentation parameters for presentation of the notification to the
user based on the inferred user characteristics, the presentation
parameters including a presentation time and a presentation device;
and causing presentation of the notification according to the
presentation parameters.
13. The method of claim 11, further comprising: adapting the
presentation of the notification to the user based, at least in
part, on the inferred user characteristics.
14. The method of claim 11, further comprising: detecting a trigger
action of the user based on real-time data included in the
attribute data; and based on the detected trigger action, causing
presentation of the notification to the user.
15. The method of claim 3, further comprising: calculating a demand
metric for the commerce item based on the demand indications
corresponding to the commerce item; and facilitating the user
purchase associated with the commerce item based, at least in part,
on the demand metric.
16. The method of claim 15, further comprising: automatically
performing the user purchase on behalf of the user based on the
demand metric exceeding a threshold.
17. The method of claim 15, further comprising: based on the demand
metric exceeding a threshold, generating a notification providing
the user an option to purchase the commerce item, the notification
including the determined order parameters; and causing presentation
of the notification to the user.
18. A machine readable medium having no transitory signals and
storing instructions that, when executed by at least one processor
of a machine, cause the machine to perform operations comprising:
receiving attribute data associated with a user from a plurality of
attribute sources; extracting demand indications from the attribute
data, the demand indications being indicative of anticipatory
demand by the user for a particular item; identifying a commerce item from
the attribute data based on the extracted demand indications;
inferring user characteristics pertaining to the user from the
attribute data; determining order parameters for a user purchase
associated with the commerce item based, at least in part, on the
inferred user characteristics; and facilitating the user purchase
according to the determined order parameters.
19. The machine-readable medium of claim 18, wherein the order
parameters include at least one of a quantity, a delivery
time, a payment time, a delivery method, a delivery destination, a
merchant, or a product.
20. The machine-readable medium of claim 18, wherein the operations
further comprise: extracting a current inventory level of the
commerce item from the attribute data; determining an inventory
threshold for the commerce item by modeling usage of the commerce
item based on the extracted current inventory level and the
inferred user characteristics; identifying a mismatch between the
inventory threshold and the current inventory level; and based on
the mismatch, automatically performing the user purchase on behalf
of the user.
Description
RELATED APPLICATIONS
[0001] This application claims the priority benefit of U.S.
Provisional Application No. 61/970,263, entitled "PORTABLE PROFILE
PLATFORM," filed Mar. 25, 2014, which is hereby incorporated by
reference in its entirety.
TECHNICAL FIELD
[0002] Embodiments of the present disclosure relate generally to
e-commerce and, more particularly, but not by way of limitation, to
data mesh based zero effort shopping.
BACKGROUND
[0003] In recent years mobile devices, wearable devices, smart
devices, and the like have pervaded nearly every aspect of modern
life. Such devices are increasingly incorporating sensors to
monitor everything from the moisture level of houseplants to the
dribbling of a basketball. Network connected devices like these are
capable of providing a near real-time and constant data feed. These
trends have provided a vast amount of rich, constantly updated
data.
BRIEF DESCRIPTION OF THE DRAWINGS
[0004] Various ones of the appended drawings merely illustrate
example embodiments of the present disclosure and cannot be
considered as limiting its scope.
[0005] FIG. 1A is a block diagram illustrating a networked system,
according to some example embodiments.
[0006] FIG. 1B illustrates a block diagram showing components
provided within the system of FIG. 1A, according to some example
embodiments.
[0007] FIG. 2A is a block diagram illustrating an example
embodiment of a data mesh system, according to some example
embodiments.
[0008] FIG. 2B is a block diagram illustrating an example
embodiment of a user analytics system, according to some example
embodiments.
[0009] FIG. 3 is a flow diagram illustrating an example method for
identifying an item and facilitating a purchase associated with the
identified item, according to some example embodiments.
[0010] FIG. 4 is a flow diagram illustrating further operations for
facilitating the purchase based, at least in part, on an evaluation
of an inventory level, according to some example embodiments.
[0011] FIG. 5 is a flow diagram illustrating further operations for
facilitating the purchase including operations to determine
parameters for the purchase, according to some example
embodiments.
[0012] FIG. 6 is a flow diagram illustrating further operations for
determining order parameters including operations to determine a
temporal parameter associated with the purchase, according to some
example embodiments.
[0013] FIG. 7 is a flow diagram illustrating further operations to
facilitate the purchase based, at least in part, on purchase
criteria, according to some example embodiments.
[0014] FIG. 8 is a flow diagram illustrating a further example
method for identifying an item and facilitating a purchase,
according to some example embodiments.
[0015] FIG. 9 is a flow diagram illustrating an alternative example
method for identifying an item and facilitating a purchase,
according to some example embodiments.
[0016] FIG. 10 is a flow diagram illustrating further operations to
facilitate the purchase based, at least in part, on a demand
metric, according to some example embodiments.
[0017] FIG. 11 is a flow diagram illustrating further operations to
facilitate the purchase using a notification, according to some
example embodiments.
[0018] FIGS. 12 and 13 are flow diagrams illustrating further
operations for presenting a notification, according to some example
embodiments.
[0019] FIG. 14 is a flow diagram illustrating communication between
various devices in relation to presenting a notification to the
user, according to some example embodiments.
[0020] FIG. 15 depicts an example user interface to facilitate the
purchase, according to some example embodiments.
[0021] FIGS. 16 and 17 illustrate examples of identifying an item
and facilitating a purchase associated with the identified item,
according to some example embodiments.
[0022] FIGS. 18A and 18B depict example configurations for
communicatively coupling attribute sources, according to some
example embodiments.
[0023] FIG. 19 depicts various example attribute sources, according
to some example embodiments.
[0024] FIG. 20 depicts various components that provide attribute
data, according to some example embodiments.
[0025] FIG. 21 is a block diagram of an example data structure for
example attribute data associated with a user, according to some
example embodiments.
[0026] FIG. 22 is a block diagram of an example data structure for
example attribute data associated with a device, according to some
example embodiments.
[0027] FIG. 23 depicts an example mobile device and mobile
operating system interface, according to some example
embodiments.
[0028] FIG. 24 is a block diagram illustrating an example of a
software architecture that may be installed on a machine, according
to some example embodiments.
[0029] FIG. 25 illustrates a diagrammatic representation of a
machine in the form of a computer system within which a set of
instructions may be executed for causing the machine to perform any
one or more of the methodologies discussed herein, according to an
example embodiment.
[0030] The headings provided herein are merely for convenience and
do not necessarily affect the scope or meaning of the terms
used.
DETAILED DESCRIPTION
[0031] The description that follows includes systems, methods,
techniques, instruction sequences, and computing machine program
products that embody illustrative embodiments of the disclosure.
In the following description, for the purposes of explanation,
numerous specific details are set forth in order to provide an
understanding of various embodiments of the inventive subject
matter. It will be evident, however, to those skilled in the art,
that embodiments of the inventive subject matter may be practiced
without these specific details. In general, well-known instruction
instances, protocols, structures, and techniques are not
necessarily shown in detail.
[0032] The objective of zero effort shopping is to reduce or
eliminate effort by a consumer user to purchase various products.
To this end, the systems and methods described herein may, among
other functions, access a wealth of attribute data associated with
a user, analyze the attribute data to identify items that the user
may have demand for, and facilitate a purchase associated with the
identified items. For instance, the user may be characterized based
on an analysis of the attribute data and the user characterization
may be used as a basis for identifying items and order parameters
for a purchase associated with identified items. The collective,
aggregated attribute data may be referred to as a "data mesh."
[0033] In various example embodiments, the attribute data is
received or accessed from a broad gamut of attribute sources such
as, for example, from mobile devices, smart devices, smart homes,
social network services, user profiles, browsing histories,
purchase histories, and so forth. Demand indications that are
indicative of anticipatory demand by the user for a particular item
are extracted from the attribute data. For example, purchase
histories may indicate prior purchases for coffee products,
location data (e.g., as determined by a Global Positioning System
(GPS) component of a mobile device, beacon detections, or other
location services) may indicate frequent trips to coffee shops, or
social media data such as check-ins or user postings may indicate
an affinity for coffee. Subsequent to extracting the demand
indications, a commerce item may be identified from the attribute
data based on the extracted demand indications. In continuing with
the example above, the identified commerce item may comprise coffee
beans, coffee filters, or other coffee related items.
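The extraction step described above can be illustrated with a brief sketch. The following is not taken from the disclosure; the data shapes, field names, and keyword-matching approach are hypothetical, chosen only to show how demand indications for an item category might be gathered from heterogeneous attribute sources such as purchase histories, location data, and social postings.

```python
# Illustrative sketch only: extract demand indications for an item
# category from heterogeneous attribute data. All field names and the
# simple keyword test are assumptions, not part of the disclosure.

def extract_demand_indications(attribute_data, category_keywords):
    """Scan purchase history, location visits, and social postings for
    signals suggesting anticipatory demand for an item category."""
    indications = []
    for purchase in attribute_data.get("purchase_history", []):
        if any(kw in purchase["description"].lower() for kw in category_keywords):
            indications.append(("purchase", purchase))
    for visit in attribute_data.get("location_visits", []):
        if any(kw in visit["venue"].lower() for kw in category_keywords):
            indications.append(("location", visit))
    for post in attribute_data.get("social_posts", []):
        if any(kw in post["text"].lower() for kw in category_keywords):
            indications.append(("social", post))
    return indications

# Example mirroring the coffee scenario in the text above.
attribute_data = {
    "purchase_history": [{"description": "Whole-bean coffee, 1 lb"}],
    "location_visits": [{"venue": "Corner Coffee Shop"}],
    "social_posts": [{"text": "Best espresso in town!"}],
}
coffee_signals = extract_demand_indications(attribute_data, ["coffee", "espresso"])
```

In this sketch each indication is tagged with its source, so a downstream analysis step can weight, for example, purchase-history evidence differently from social-media evidence.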
[0034] In further example embodiments, user characteristics
pertaining to the user are inferred based on an analysis of a
portion of the attribute data. The user characteristics include,
for instance, a trait, quality, action, activity, attitude, health
condition, habit, behavior, and the like. For example, the user
characteristics may include a particular medical condition of that
user associated with diet restrictions. The systems and methods
described herein may facilitate a purchase associated with the
commerce item based, at least in part, on the user characteristics.
In an example embodiment, a notification, including an option to
make the purchase, is presented to the user. In some instances, the
notification is personalized to the user based on the user
characteristics (e.g., the notification may be presented on a
preferred device of the user at a time of day corresponding to user
availability, such as after the user is done working). In some example
embodiments, the purchase is made automatically on behalf of the
user. For instance, a demand metric is calculated based on the
demand indications and if the demand metric exceeds a threshold,
the purchase may be performed automatically. Thus, the systems and
methods described herein may facilitate commerce on behalf of the
user to increase convenience and reduce time and effort of the user
to conduct commerce. In some cases, the systems and methods analyze
the attribute data to simulate decisions of the user regarding
various shopping or purchasing related activities.
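The demand-metric decision described above can be sketched as follows. The per-source weights and the threshold value are illustrative assumptions (the disclosure does not specify them); the sketch only shows the control flow of aggregating demand indications into a metric and either purchasing automatically or notifying the user.

```python
# Hypothetical sketch of the demand-metric decision: weights and the
# threshold are illustrative assumptions, not values from the disclosure.

SOURCE_WEIGHTS = {"purchase": 0.5, "location": 0.3, "social": 0.2}

def demand_metric(indications):
    """Sum per-source weights over the extracted demand indications."""
    return sum(SOURCE_WEIGHTS.get(source, 0.1) for source, _ in indications)

def facilitate_purchase(indications, threshold=1.0):
    """Return the action taken: an automatic purchase when the metric
    meets the threshold, otherwise a notification offering the option."""
    metric = demand_metric(indications)
    if metric >= threshold:
        return "auto_purchase"
    return "notify_user"

# One strong set of indications triggers an automatic purchase; a
# single weak indication only produces a notification.
signals = [("purchase", {}), ("location", {}), ("social", {})]
action = facilitate_purchase(signals)
```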
[0035] With reference to FIG. 1A, an example embodiment of a
high-level client-server-based network architecture 100 is shown. A
networked system 102 provides server-side functionality via a
network 104 (e.g., the Internet or wide area network (WAN)) to a
client device 110. In some implementations, a user (e.g., user 106)
interacts with the networked system 102 using the client device
110. FIG. 1A illustrates, for example, a web client 112 (e.g., a
browser, such as the Internet Explorer.RTM. browser developed by
Microsoft.RTM. Corporation of Redmond, Wash. State), client
application(s) 114, and a programmatic client 116 executing on the
client device 110. The client device 110 includes the web client
112, the client application(s) 114, and the programmatic client 116
alone, together, or in any suitable combination. Although FIG. 1A
shows one client device 110, multiple client devices may be
included in the network architecture 100.
[0036] In various implementations, the client device 110 comprises
a computing device that includes at least a display and
communication capabilities that provide access to the networked
system 102 via the network 104. The client device 110 may comprise,
but is not limited to, a remote device, work station, computer,
general purpose computer, Internet appliance, hand-held device,
wireless device, portable device, wearable computer, cellular or
mobile phone, personal digital assistant (PDA), smart phone,
tablet, ultrabook, netbook, laptop, desktop, multi-processor
system, microprocessor-based or programmable consumer electronic,
game consoles, set-top box, network PC, mini-computer, and the
like. In further example embodiments, the client device 110 may
comprise one or more of a touch screen, accelerometer, gyroscope,
biometric sensor, camera, microphone, global positioning system
(GPS) device, and the like.
[0037] The client device 110 communicates with the network 104 via
a wired or wireless connection. For example, one or more portions
of the network 104 comprises an ad hoc network, an intranet, an
extranet, a Virtual Private Network (VPN), a Local Area Network
(LAN), a wireless LAN (WLAN), a Wide Area Network (WAN), a wireless
WAN (WWAN), a Metropolitan Area Network (MAN), a portion of the
Internet, a portion of the Public Switched Telephone Network
(PSTN), a cellular telephone network, a wireless network, a
Wireless Fidelity (Wi-Fi.RTM.) network, a Worldwide
Interoperability for Microwave Access (WiMax) network, another type
of network, or any suitable combination thereof.
[0038] In some example embodiments, the client device 110 includes
one or more of the applications (also referred to as "apps") such
as, but not limited to, web browsers, book reader apps (operable to
read e-books), media apps (operable to present various media forms
including audio and video), fitness apps, biometric monitoring
apps, messaging apps, electronic mail (email) apps, and e-commerce
site apps (also referred to as "marketplace apps"). In some
implementations, the client application(s) 114 include various
components operable to present information to the user and
communicate with networked system 102. In some embodiments, if the
e-commerce site application is included in the client device 110,
then this application may be configured to locally provide the user
interface and at least some of the functionalities with the
application configured to communicate with the networked system
102, on an as needed basis, for data or processing capabilities not
locally available (e.g., access to a database of items available
for sale, to authenticate a user, to verify a method of payment).
Conversely, if the e-commerce site application is not included in
the client device 110, the client device 110 can use its web
browser to access the e-commerce site (or a variant thereof) hosted
on the networked system 102.
[0039] In various example embodiments, the user (e.g., the user
106) comprises a person, a machine, or other means of interacting
with the client device 110. In some example embodiments, the user
is not part of the network architecture 100, but interacts with
the network architecture 100 via the client device 110 or another
means. For instance, the user provides input (e.g., touch screen
input or alphanumeric input) to the client device 110 and the input
is communicated to the networked system 102 via the network 104. In
this instance, the networked system 102, in response to receiving
the input from the user, communicates information to the client
device 110 via the network 104 to be presented to the user. In
this way, the user can interact with the networked system 102 using
the client device 110.
[0040] An Application Program Interface (API) server 120 and a web
server 122 may be coupled to, and provide programmatic and web
interfaces respectively to, one or more application server(s) 140.
The application server(s) 140 may host one or more publication
system(s) 142, payment system(s) 144, and a data mesh system 150,
each of which may comprise one or more modules or applications and
each of which may be embodied as hardware, software, firmware, or
any combination thereof. The application server(s) 140 are, in
turn, shown to be coupled to one or more database server(s) 124
that facilitate access to one or more information storage
repositories or database(s) 126. In an example embodiment, the
database(s) 126 are storage devices that store information to be
posted (e.g., publications or listings) to the publication
system(s) 142. The database(s) 126 may also store digital goods
information in accordance with some example embodiments.
[0041] Additionally, a third party application 132, executing on a
third party server 130, is shown as having programmatic access to
the networked system 102 via the programmatic interface provided by
the API server 120. For example, the third party application 132,
utilizing information retrieved from the networked system 102, may
support one or more features or functions on a website hosted by
the third party. The third party website may, for example, provide
one or more promotional, marketplace, or payment functions that are
supported by the relevant applications of the networked system
102.
[0042] The publication system(s) 142 may provide a number of
publication functions and services to the users that access the
networked system 102. The payment system(s) 144 may likewise
provide a number of functions to perform or facilitate payments and
transactions. While the publication system(s) 142 and payment
system(s) 144 are shown in FIG. 1A to both form part of the
networked system 102, it will be appreciated that, in alternative
embodiments, each system 142 and 144 may form part of a payment
service that is separate and distinct from the networked system
102. In some example embodiments, the payment system(s) 144 may
form part of the publication system(s) 142.
[0043] In some implementations, the data mesh system 150 provides
functionality to receive, retrieve, or store a broad spectrum of
data associated with the user. It will be noted that the
collective, aggregated attribute data may be referred to as a "data
mesh." In various implementations, the data mesh system 150 stores
received data in storage devices such as the database(s) 126. In an
example embodiment, the data mesh system 150 includes a user
analytics system 152 that identifies items the user may have demand
for and facilitate a purchase associated with the identified items.
In some example embodiments, the data mesh system 150 communicates
with the client device 110, the third party server(s) 130, the
publication system(s) 142 (e.g., retrieving listings), and the
payment system(s) 144 (e.g., purchasing a listing). In an
alternative example embodiment, the data mesh system 150 is a part
of the publication system(s) 142.
[0044] Further, while the client-server-based network architecture
100 shown in FIG. 1A employs a client-server architecture, the
present inventive subject matter is, of course, not limited to such
an architecture, and may equally well find application in a
distributed, or peer-to-peer, architecture system, for example. The
various systems of the applications server(s) 140 (e.g., the
publication system(s) 142 and the payment system(s) 144) may also
be implemented as standalone software programs, which do not
necessarily have networking capabilities.
[0045] The web client 112 may access the various systems of the
networked system 102 (e.g., the publication system(s) 142) via the
web interface supported by the web server 122. Similarly, the
programmatic client 116 and client application(s) 114 may access
the various services and functions provided by the networked system
102 via the programmatic interface provided by the API server 120.
The programmatic client 116 may, for example, be a seller
application (e.g., the Turbo Lister application developed by
eBay.RTM. Inc., of San Jose, Calif.) to enable sellers to author
and manage listings on the networked system 102 in an off-line
manner, and to perform batch-mode communications between the
programmatic client 116 and the networked system 102.
[0046] FIG. 1B illustrates a block diagram showing components
provided within the publication system(s) 142, according to some
embodiments. In various example embodiments, the publication
system(s) 142 may comprise a marketplace system to provide
marketplace functionality (e.g., facilitating the purchase of items
associated with item listings on an e-commerce website). The
networked system 102 may be hosted on dedicated or shared server
machines that are communicatively coupled to enable communications
between server machines. The components themselves are
communicatively coupled (e.g., via appropriate interfaces) to each
other and to various data sources, so as to allow information to be
passed between the applications or so as to allow the applications
to share and access common data. Furthermore, the components may
access one or more database(s) 126 via the database server(s)
124.
[0047] The networked system 102 may provide a number of publishing,
listing, and price-setting mechanisms whereby a seller (also
referred to as a "first user") may list or publish information
concerning goods or services for sale or barter, a buyer (also
referred to as a "second user") can express interest in or indicate
a desire to purchase or barter such goods or services, and a
transaction (such as a trade) may be completed pertaining to the
goods or services. To this end, the networked system 102 may
comprise a publication engine 160 and a selling engine 162. The
publication engine 160 may publish information, such as item
listings or product description pages, on the networked system 102.
In some embodiments, the selling engine 162 may comprise one or
more fixed-price engines that support fixed-price listing and price
setting mechanisms and one or more auction engines that support
auction-format listing and price setting mechanisms (e.g., English,
Dutch, Chinese, Double, Reverse auctions, etc.). The various
auction engines may also provide a number of features in support of
these auction-format listings, such as a reserve price feature
whereby a seller may specify a reserve price in connection with a
listing and a proxy-bidding feature whereby a bidder may invoke
automated proxy bidding. The selling engine 162 may further
comprise one or more deal engines that support merchant-generated
offers for products and services.
[0048] A listing engine 164 allows sellers to conveniently author
listings of items or authors to author publications. In one
embodiment, the listings pertain to goods or services that a user
(e.g., a seller) wishes to transact via the networked system 102.
In some embodiments, the listings may be an offer, deal, coupon, or
discount for the good or service. Each good or service is
associated with a particular category. The listing engine 164 may
receive listing data such as title, description, and aspect
name/value pairs. Furthermore, each listing for a good or service
may be assigned an item identifier. In other embodiments, a user
may create a listing that is an advertisement or other form of
information publication. The listing information may then be stored
to one or more storage devices coupled to the networked system 102
(e.g., database(s) 126). Listings also may comprise product
description pages that display a product and information (e.g.,
product title, specifications, and reviews) associated with the
product. In some embodiments, the product description page may
include an aggregation of item listings that correspond to the
product described on the product description page.
[0049] The listing engine 164 also may allow buyers to conveniently
author listings or requests for items desired to be purchased. In
some embodiments, the listings may pertain to goods or services
that a user (e.g., a buyer) wishes to transact via the networked
system 102. Each good or service is associated with a particular
category. The listing engine 164 may receive as much or as little
listing data, such as title, description, and aspect name/value
pairs, as the buyer is aware of about the requested item. In some
embodiments, the listing engine 164 may parse the buyer's submitted
item information and may complete incomplete portions of the
listing. For example, if the buyer provides a brief description of
a requested item, the listing engine 164 may parse the description,
extract key terms and use those terms to make a determination of
the identity of the item. Using the determined item identity, the
listing engine 164 may retrieve additional item details for
inclusion in the buyer item request. In some embodiments, the
listing engine 164 may assign an item identifier to each listing
for a good or service.
[0050] In some embodiments, the listing engine 164 allows sellers
to generate offers for discounts on products or services. The
listing engine 164 may receive listing data, such as the product or
service being offered, a price or discount for the product or
service, a time period for which the offer is valid, and so forth.
In some embodiments, the listing engine 164 permits sellers to
generate offers from sellers' mobile devices. The generated offers
may be uploaded to the networked system 102 for storage and
tracking.
[0051] Searching the networked system 102 is facilitated by a
searching engine 166. For example, the searching engine 166 enables
keyword queries of listings published via the networked system 102.
In example embodiments, the searching engine 166 receives the
keyword queries from a device of a user and conducts a review of
the storage device storing the listing information. The review will
enable compilation of a result set of listings that may be sorted
and returned to the client device 110 of the user. The searching
engine 166 may record the query (e.g., keywords) and any subsequent
user actions and behaviors (e.g., navigations, selections, or
click-throughs).
[0052] The searching engine 166 also may perform a search based on
a location of the user. A user may access the searching engine 166
via a mobile device and generate a search query. Using the search
query and the user's location, the searching engine 166 may return
relevant search results for products, services, offers, auctions,
and so forth to the user. The searching engine 166 may identify
relevant search results both in a list form and graphically on a
map. Selection of a graphical indicator on the map may provide
additional details regarding the selected search result. In some
embodiments, the user may specify, as part of the search query, a
radius or distance from the user's current location to limit search
results.
[0053] In a further example, a navigation engine 168 allows users
to navigate through various categories, catalogs, or inventory data
structures according to which listings may be classified within the
networked system 102. For example, the navigation engine 168 allows
a user to successively navigate down a category tree comprising a
hierarchy of categories (e.g., the category tree structure) until a
particular set of listings is reached. Various other navigation
applications within the navigation engine 168 may be provided to
supplement the searching and browsing applications. The navigation
engine 168 may record the various user actions (e.g., clicks)
performed by the user in order to navigate down the category
tree.
[0054] In some example embodiments, a personalization engine 170
provides functionality to personalize various aspects of user
interactions with the networked system 102. For instance, the user
can define, provide, or otherwise communicate personalization
settings that the personalization engine 170 uses to determine
interactions with the networked system 102. In further example
embodiments, the personalization engine 170 determines
personalization settings automatically and personalizes
interactions based on the automatically determined settings. For
example, the personalization engine 170 determines a native
language of the user and automatically presents information in the
native language.
[0055] FIG. 2A is a block diagram of the data mesh system 150,
which may provide functionality to receive, retrieve, or access
attribute data from attribute sources, analyze the attribute data,
manage the attribute data, and so forth. In an example embodiment,
the data mesh system 150 may include a presentation module 210, a
communication module 215, an attribute module 220, a characteristic
module 225, a management module 230, and the user analytics system
152. FIG. 2B is a block diagram of the user analytics system 152,
which may provide functionality to identify items that the user has
demand for and facilitate a purchase associated with the identified
items. The user analytics system 152 may include an item module
250, an analysis module 255, and an order module 260. All, or some,
of the modules 210-260 of FIGS. 2A and 2B, may communicate with
each other, for example, via a network coupling, shared memory, and
the like. It will be appreciated that each module of modules
210-260 may be implemented as a single module, combined into other
modules, or further subdivided into multiple modules. It will
further be appreciated that the modules or functionality of the
user analytics system 152 may be implemented in the data mesh
system 150 and the modules or functionality of the data mesh system
150 may be implemented in the user analytics system 152. Other
modules not pertinent to example embodiments may also be included,
but are not shown.
[0056] Referring to FIG. 2A, the presentation module 210 may
provide various presentation and user interface functionality
operable to interactively present and receive information from
users. For example, the presentation module 210 may cause
presentation of various notifications or user interfaces that
provide the user an option to make a purchase associated with the
identified items. In various implementations, the presentation
module 210 presents or causes presentation of information (e.g.,
visually displaying information on a screen, acoustic output,
haptic feedback). Interactively presenting is intended to include
the exchange of information between a particular device and the
user. The user provides input to interact with the user interface
in many possible manners such as alphanumeric input, cursor input,
tactile input, or other input (e.g., touch screen, tactile sensor,
light sensor, infrared sensor, biometric sensor, microphone,
gyroscope, accelerometer, or other sensors). It will be appreciated
that the presentation module 210 provides many other user
interfaces to facilitate functionality described herein. Further,
it will be appreciated that "presenting" as used herein is intended
to include communicating information or instructions to a
particular device that is operable to perform presentation based on
the communicated information or instructions.
[0057] The communication module 215 provides various communications
functionality and web services. For example, the communication
module 215 provides network communication such as communicating
with the networked system 102, the client device 110, and the third
party server(s) 130. In various example embodiments, the network
communication may operate over wired or wireless modalities. Web
services are intended to include retrieving information from the
third party server(s) 130, the database(s) 126, and the application
server(s) 140. In some implementations, information retrieved by
the communication module 215 comprises data associated with the
user (e.g., user profile information from an online account, social
network service data associated with the user), data associated
with one or more items listed on an e-commerce website (e.g.,
images of the item, reviews of the item, item price), or other data
to facilitate the functionality described herein.
[0058] The attribute module 220 may receive, access, or retrieve a
wide variety of attribute data from many different attribute
sources. For example, the attribute module 220 may receive,
retrieve, or access the attribute data from user devices or
machines (e.g., the client device 110), social network services,
the third party server(s) 130, the publication system(s) 142, the
payment system(s) 144, other applications servers, or other
attribute sources. The attribute data, as used herein, is intended
to include raw data such as sensor data, profile data, social
network content, and so on.
[0059] In some example embodiments, the attribute module 220
extracts the attribute data from various sources. For instance, a
payment history log of the user may include a tremendous amount of
extraneous data. In this instance, the attribute module 220
extracts purchase information such as item purchased, time,
purchase price, seller, location, brand, and so forth from the
payment history log of the user.
[0060] In further example embodiments, the attribute module 220
performs various functions to prepare or condition the attribute
data for analysis. For instance, the attribute module 220
standardizes the attribute data to facilitate analysis of the
attribute data (e.g., determine a normal form for the data to allow
for comparison and other mathematical analysis). The attribute
module 220 may perform many other functions to prepare the
attribute data for analysis.
[0061] In various example embodiments, the attribute module 220
stores the attribute data in association with the user for
subsequent analysis. In some implementations, the attribute module
220 stores the attribute data in the database(s) 126. The attribute
data may be stored in conjunction with a user identifier such that
the attribute module 220 may subsequently use the user identifier
to access the attribute data corresponding to a particular user. In
other implementations, the attribute module 220 accesses the stored
attribute data using other schemes. For instance, the attribute
module 220 may access a portion of the attribute data associated
with a time, an item, a user, a type of user, a particular
attribute source, and so forth. In this way, the attribute module
220 may access a portion of attribute data according to various
parameters from among a large quantity of the attribute data to
access, identify, or find pertinent and relevant data.
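As a rough, non-limiting illustration of this parameterized access, the sketch below filters stored attribute records by user, attribute source, and time. The record shape (dicts with "user_id", "source", and "timestamp" keys) is an assumption made for illustration, not a structure defined by the application.

```python
from datetime import datetime

def select_attribute_data(records, user_id=None, source=None, since=None):
    """Return the stored attribute records matching any combination
    of user identifier, attribute source, and start time.

    records: iterable of dicts with "user_id", "source", and
    "timestamp" keys (a hypothetical storage shape).
    """
    return [
        r for r in records
        if (user_id is None or r["user_id"] == user_id)
        and (source is None or r["source"] == source)
        and (since is None or r["timestamp"] >= since)
    ]
```

Any unsupplied parameter simply does not constrain the result, so the same helper serves queries by user, by source, by time window, or by any combination.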
[0062] The characteristic module 225 infers a user characteristic
or multiple user characteristics corresponding to the user based on
an analysis of at least a portion of the attribute data. Many
schemes and techniques may be employed to infer the characteristic
from the attribute data. For example, a particular user
characteristic may be a work location of the user. The attribute
data can include a plurality of locations (e.g., as determined by a
GPS component of a user device used by the user) that include time
stamps. The work location of the user may be inferred based on the
consistency and timing of the locations included in the attribute
data (e.g., the user is typically at a particular office building
during normal working hours). Many different portions of attribute
data and combinations of portions of attribute data may be analyzed
to infer a wide variety of characteristics.
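One possible sketch of the work-location inference described above: treat the place most frequently observed during weekday working hours as the work location. The (timestamp, place) pair format and the fixed working-hour window are illustrative assumptions, not requirements of the characteristic module 225.

```python
from collections import Counter
from datetime import datetime

def infer_work_location(stamped_locations, start_hour=9, end_hour=17):
    """Infer a work location as the place observed most often
    during normal working hours on weekdays.

    stamped_locations: iterable of (datetime, place_id) pairs,
    e.g. as determined by a GPS component of a user device.
    Returns the most frequent place_id, or None if nothing qualifies.
    """
    counts = Counter(
        place for ts, place in stamped_locations
        if ts.weekday() < 5 and start_hour <= ts.hour < end_hour
    )
    return counts.most_common(1)[0][0] if counts else None
```

The same frequency-and-timing pattern generalizes to other inferred characteristics (e.g., a home location inferred from overnight samples).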
[0063] In various example embodiments, characteristics (e.g., the
user characteristics), as used herein, are intended to include
traits, qualities, actions, activities, attitudes, habits,
behaviors, and the like pertaining to a person or people. Inasmuch
as the attribute data may not necessarily pertain to a person
(e.g., raw data such as coordinates of a particular location), a
characteristic (e.g., current location of the user, disliking spicy
food, having young children, being a Star Trek fanatic) may be
distinct from the attribute data.
[0064] The management module 230 may provide management functions
associated with the attribute data. For example, the management
module 230 may provide the user with functionality to edit, modify,
update, or otherwise control the attribute data. For instance, the
user may remove undesired attribute data via the functionality
provided by the management module 230. In a further instance, the
user may specify permissions for portions of the attribute data
using the functionality provided by the management module 230. The
permissions may allow or prohibit certain access or uses for the
attribute data (e.g., the permission may prohibit access to the
attribute data by third parties). The user may grant various levels
of access and abilities. In some example embodiments, the
permissions may persist for a period of time, and after expiration
of the time period, the management module 230 may revoke the
permissions.
[0065] In further example embodiments, the management module 230
requests consent from the user to access portions of the attribute
data or to request permission for certain uses of the attribute
data. For example, the management module 230 requests consent from
the user to allow third parties to access portions of the attribute
data. The management module 230 may request a variety of other
consents associated with various actions corresponding to the
attribute data.
[0066] In still further example embodiments, the management module
230 provides functionality to allow third parties to access the
attribute data or the user characteristics. For example, the
management module 230 provides a set of APIs that may be invoked by
third parties to access the attribute data or the user
characteristics. As discussed above, in some example embodiments,
the management module 230 determines permission or consent of the
user prior to providing access to the attribute data.
[0067] Referring now to FIG. 2B, the item module 250 in the user
analytics system 152 may provide functionality to facilitate
identifying items from the attribute data. For example, the item
module 250 extracts demand indications, from the attribute data,
that may indicate anticipatory demand by the user for a particular
item. In a specific example, the demand indications may indicate
use of a particular item by the user, a user supply of a particular
item, a user activity indicative of a particular item (e.g.,
frequent visits to a beach may be indicative of demand for
sunscreen products and other beach related products), and other
indications of demand for various items. The item module 250
extracts the demand indications from many different portions of
attribute data and combinations of portions of attribute data using
a variety of schemes and techniques.
[0068] The analysis module 255 provides functionality to identify
items from the attribute data. For example, the analysis module 255
identifies the commerce item or a pertinent item based on the
demand indications, the user characteristics, the attribute data,
or any suitable combination thereof. In further example
embodiments, the analysis module 255 calculates a demand metric
based on the demand indications. In some implementations, the user
analytics system 152 performs a variety of tasks and functions
based on the demand metric, such as, various aspects of
facilitating the purchase associated with the commerce item.
[0069] The order module 260 provides functionality to facilitate
the user purchase associated with the commerce item. For example,
the order module 260 determines order parameters or transaction
parameters for the user purchase based on the user characteristics,
the attribute data, the demand indications, or other data. In
some example embodiments, the order module 260 automatically (e.g.,
without intervention or action of the user) performs the user
purchase on behalf of the user based on various triggers or
analyses.
[0070] FIG. 3 is a flow diagram illustrating an example method 300
for identifying the commerce item from the attribute data and
facilitating the user purchase associated with the commerce item.
At operation 310, the attribute module 220 receives attribute data
associated with the user from a plurality of attribute sources. In
various example embodiments, at least a portion of the attribute
data includes real-time data or near real-time data. The term
"real-time data," as used herein, is intended to include data
associated with an event currently happening. For example, the
real-time data may include user input data or sensor data
communicated to the attribute module 220 after a delay interval
(e.g., due to transmission delay or other delays such as being
temporarily stored at an intermediate device) between capturing the
data and the attribute module 220 receiving the data.
[0071] As will be discussed in connection with FIGS. 19 and 20, the
attribute data is received from a broad spectrum of attribute
sources (e.g., devices, sensors, servers, databases, and other
sources). Additionally, the attribute module 220 may receive the
attribute data via many pathways resulting from an assortment of
configurations of the attribute sources as further discussed in
connection with FIGS. 18A and 18B. In an example embodiment, the
attribute module 220 receives the attribute data directly from the
attribute sources. In other example embodiments, the attribute
module 220 receives the attribute data from a central device that
receives attribute data from a plurality of user devices. In still
other example embodiments, various user devices are communicatively
coupled in a decentralized device-to-device mesh, and the attribute
module 220 receives the attribute data corresponding to a
particular device in the mesh from any of the devices in the mesh.
In still other examples, the attribute module 220 receives the
attribute data from the attribute sources in many other
configurations including various suitable combinations of
configurations.
[0072] In various example embodiments, the attribute module 220
stores the attribute data in association with the user (e.g.,
indexed based on a user identifier) for subsequent analysis. The
attribute module 220 may store the attribute data in a storage
device such as the database(s) 126, for example. In some
implementations, the attribute module 220 accesses the stored
attribute data using a variety of search or find schemes. For
instance, the attribute data associated with a particular user is
accessed using a user identifier that corresponds to the particular
user. It will be noted that the collective, aggregated attribute
data may be referred to as a "data mesh."
[0073] At operation 320, the item module 250 extracts demand
indications from the attribute data. In some example embodiments,
the demand indications are indicative of anticipatory demand by the
user for a particular item. For instance, a particular demand
indication may indicate that the user may want, desire, or have an
affinity for a particular product or commerce item. It is noted
that the terms "item," "product," "commerce item," and the like are
intended to include a wide variety of products (e.g., items
corresponding to item listings published on an e-commerce website)
and services (e.g., a particular activity such as going to a
restaurant). It is also noted that the terms "anticipatory" and
"predictive" as used herein are intended to pertain to future
events, or activity, including events that are in the immediate
future (e.g., events within a short time period, such as minutes or
seconds, of the present) as well as events further in the future
(e.g., months or years from the present).
[0074] The item module 250 extracts the demand indications from a
wide variety of data included in the attribute data, such as,
purchase histories, location data (e.g., as determined by a GPS
component of a mobile device, beacon detections, or other location
services), social media data (e.g., check-ins or postings by the
user), as well as other data included in the attribute data as
discussed herein. The demand indications include, for example,
inventory level indications (e.g., a food supply of the user as
indicated by a smart refrigerator), item usage indications (e.g.,
user purchase history may indicate certain item usage patterns),
item activity indications, activity related to an item (e.g., the
user spending time on the ski slopes may indicate a demand for ski
equipment), user engagement data (e.g., the user clicking on
particular links associated with various products or activities),
and so forth. For instance, location data included in the attribute
data may indicate frequent trips to coffee shops. In this instance,
the item module 250 extracts the location data from the attribute
data since it may be indicative of demand for coffee or coffee
related products by the user. In another example, social media
data, such as check-ins to a gym or postings about fitness
activities, may indicate demand for fitness related items or
activities.
[0075] In a specific example, the user may currently possess a
sufficient supply of bottled water, but based on indications of
consumption rates of bottled water by the user, the user may have
future demand for the bottled water. Thus, in this example, the
item module 250 extracts supply indications for the bottled water
(e.g., purchase history data of the user) or consumption
indications for the bottled water (e.g., inventory activity data as
retrieved or accessed from a smart refrigerator) from the attribute
data. That is to say, the demand indications, extracted by the item
module 250, may include the supply indications, inventory level
indications, or inventory activity indications for the bottled
water.
[0076] At operation 330, the analysis module 255 identifies the
commerce item, product, or pertinent item from the attribute data
based on the extracted demand indications. For instance, the
analysis module 255 determines that there may be a high likelihood
that the user is interested or has demand for a particular item. In
other words, the analysis module 255 identifies the commerce item
or multiple commerce items from among a plurality of commerce items
associated with the demand indications based on user demand for
respective commerce items included in the plurality of commerce
items.
[0077] The analysis module 255 identifies the commerce item based
on the demand indications using a variety of schemes and
techniques. For example, the analysis module 255 may calculate the
demand metric based on the demand indications. The demand metric
may indicate a likelihood that the user has a demand for a
particular item. For instance, the demand metric may be based on an
occurrence count of demand indications that correspond to a
particular item (e.g., a particular item with multiple
corresponding demand indications may be associated with a higher
demand metric than a particular item with a single corresponding
demand indication). In some example embodiments, the analysis
module 255 may identify the commerce item based on the calculated
demand metric exceeding a threshold (e.g., a predefined or
dynamically determined value).
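The occurrence-count form of the demand metric might be sketched as follows; the per-item list of indications and the fixed threshold value are illustrative assumptions.

```python
from collections import Counter

def identify_commerce_items(demand_indications, threshold=2):
    """Compute a demand metric as the occurrence count of demand
    indications per item, and identify the items whose metric
    exceeds the threshold."""
    metrics = Counter(demand_indications)
    return {item: count for item, count in metrics.items() if count > threshold}
```

An item mentioned by several independent demand indications thus outranks an item with a single corresponding indication, matching the occurrence-count example above.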
[0078] In another example embodiment, the analysis module 255
ranks, sorts, or otherwise orders at least a portion of the
plurality of commerce items associated with the demand indications
based on the demand metric. In this example embodiment, the
analysis module 255 identifies the first, a predefined number, or a
dynamically determined number of the highest-ranking commerce items
included in the plurality of commerce items associated with the
demand indications, either alone or in any suitable combination.
For instance, the analysis module 255 identifies the commerce item
from among the plurality of commerce items associated with the
demand indications based on a statistical analysis such as a
percentage (e.g., top ten percent of the ranked plurality of
commerce items), analysis based on standard deviations away from a
mean, or other statistical methods.
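The ranking-based selection might look like the following sketch, which keeps the top fraction of the ranked list (the ten-percent default mirrors the example above and is otherwise an arbitrary choice) while guaranteeing at least a minimum number of items.

```python
def top_demanded_items(demand_metrics, top_fraction=0.10, min_items=1):
    """Rank items by demand metric, descending, and keep the top
    fraction of the ranked list (but always at least min_items).

    demand_metrics: dict mapping item identifier to demand metric.
    """
    ranked = sorted(demand_metrics, key=demand_metrics.get, reverse=True)
    keep = max(min_items, int(len(ranked) * top_fraction))
    return ranked[:keep]
```

Substituting a standard-deviation cutoff for the fixed fraction yields the alternative statistical selection mentioned above.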
[0079] In further example embodiments, the demand indications are
weighted such that higher weighted demand indications may be more
influential in the analysis module 255 identifying the commerce
item based on the demand indications. The weighting can be
pre-defined or dynamically determined based on a user feedback data
(e.g., data that indicates whether the user actually had demand for
the commerce item identified by the analysis module 255). In some
implementations, the feedback data is included in the attribute
data subsequent to the analysis module 255 identifying the commerce
item. In this way, the analysis module can adapt, learn, or evolve
as more of the attribute data is received. In some example
embodiments, the analysis module 255 employs various
machine-learning techniques to enhance identifying the commerce
item based on the demand indications. Similar techniques may be
applied by the item module 250 to extract the demand indications in
the previous operation.
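A minimal sketch of the weighting and feedback adaptation: each indication type carries a weight, the demand metric is the weighted sum, and feedback nudges a type's weight up or down. The simple additive update rule here stands in for whatever machine-learning technique an implementation actually employs.

```python
def weighted_demand_metric(indication_types, weights, default_weight=1.0):
    """Weighted demand metric: sum the weight of each demand
    indication observed for one item."""
    return sum(weights.get(kind, default_weight) for kind in indication_types)

def apply_feedback(weights, kind, had_demand, step=0.1):
    """Raise an indication type's weight when feedback confirms the
    user actually had demand; lower it (never below zero) when the
    feedback disconfirms it."""
    current = weights.get(kind, 1.0)
    weights[kind] = max(0.0, current + step if had_demand else current - step)
    return weights
```

Over repeated feedback cycles, indication types that reliably predict demand accumulate influence, which is the adaptive behavior described above.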
[0080] At operation 340, the characteristic module 225 infers or
directly measures user characteristics pertaining to the user from
the attribute data. In some example embodiments, the characteristic
module 225 may store the inferred user characteristics for
subsequent analysis, for example, in a storage device such as
database(s) 126. The characteristic module 225 may infer a vast
spectrum of the user characteristics from the attribute data. A few
specific examples of user characteristics include demographic data
(e.g., age, gender, marital status, number of children), user
preferences (e.g., being a morning person, favorite locations,
enjoying spicy food), idiosyncrasy (e.g., being forgetful, such as
draining the battery on a mobile device; or being impatient, such
as a line breaker that will leave a store if the line is too long),
qualities (e.g., being athletic, being tall, having a large
vocabulary), personality traits (e.g., being a risk taker),
actions, activities (e.g., working for a nonprofit), attitudes,
habits (e.g., being a coffee drinker), behaviors, beliefs, biases,
demeanor, and physical characteristics of the user (e.g., height,
weight, garment sizes, eye color, hair color). The specificity of
the characteristics ranges from very narrow (e.g., drinks a
particular brand of soda) to very broad (e.g., being generally
philanthropic). To illustrate inferring the user characteristic
from the attribute data, the attribute data may include user
location data that may indicate frequent visits to a local school,
local soccer fields, and the like. In this example, the
characteristic module 225 infers that the user has children based
on the types of locations the user may be frequently visiting.
[0081] In some instances, the characteristic module 225 performs
varying degrees of inferential analysis of the attribute data to
derive the user characteristics. For example, the characteristic
module 225 infers the user's wake-up time based on user device
activity or other activity (e.g., connected alarm clock settings,
logins to accounts, and various other user activities that may
indicate a wake-up time). In this example, the characteristic
module 225 infers a particular user characteristic that may be of a
larger inferential jump such as the user being a morning person or
a person that likes to sleep in. The degree of inferential jump may
be configurable. In some example embodiments, the characteristic
module 225 employs various techniques to minimize or otherwise
control incorrect inferences (e.g., machine-learning, other
learning algorithms).
[0082] In further example embodiments, the characteristic module
225 may learn, adapt, or evolve as more of the attribute data is
received (e.g., via machine learning techniques or other learning
algorithms). For example, the attribute data may include location
data of the user. The characteristic module 225 may infer a
favorite location of the user based on a pattern (e.g., frequently
visited locations) in the location data. However, the
characteristic module 225 may subsequently receive employment data
of the user that may indicate a current employer including an
employer location. The characteristic module 225 may learn, update,
or otherwise adapt to account for the new attribute data. Thus, in
this example, the characteristic module 225 does not infer a
favorite location of the user if the location is a work location of
the user. In some instances, the user provides input directly
(e.g., via a user interface configured to receive inferential
guidance from the user) to facilitate the characteristic module 225
in inferring characteristics from the attribute data (e.g., user
input indicating that a particular inferred characteristic is
incorrect or providing input to be used as a basis for future
inferences).
[0083] In other instances, the characteristic module 225 performs
very little or no analysis to derive the user characteristics from
the attribute data. For example, if the attribute data includes an
alarm time setting from a connected alarm clock (e.g., a smart
phone with an alarm clock app), the alarm time setting directly
indicates a wake-up time. Since the attribute data directly relates
to a particular user characteristic, the characteristic module 225
need not perform analysis to derive the user characteristic.
[0084] In some example embodiments, the user characteristic
comprises predefined characteristics or dynamically determined
characteristics. For instance, a particular set of characteristics
can be predefined (e.g., work location, home location, marital
status, socio-economic level). In these example embodiments, the
characteristic module 225 determines that particular predefined
characteristics are associated with the user based on an analysis
of the attribute data. In other instances, the characteristic
module 225 dynamically determines characteristics based on the
attribute data. For example, the attribute data may indicate that
the user owns a particular exotic pet. Although there may not be a
predefined characteristic associated with the particular exotic
pet, the characteristic module 225 determines the user
characteristic of owning an exotic pet from the attribute data, in
this example.
[0085] At operation 350, the order module 260 facilitates the user
purchase or suggested transaction, for the user, associated with
the commerce item based, at least in part, on the user
characteristics. Facilitating a particular purchase is intended to
include actions such as automatically (e.g., without intervention
or action of the user) performing the particular purchase on behalf
of the user, causing presentation of a notification that includes
the option to make the particular purchase, or other actions
associated with facilitating the particular purchase (e.g., causing
presentation of an advertisement, adjusting item listing search
results of the user to emphasize item listings associated with a
particular item). In further example embodiments, the order module
260 determines various parameters associated with the user purchase
(e.g., order parameters or transaction parameters) based on the
attribute data, the user characteristics, the demand indications,
or other data. In the discussion below, additional aspects of
facilitating the user purchase are described.
[0086] FIG. 4 is a flow diagram illustrating further operations for
facilitating the user purchase based, at least in part, on an
evaluation of an inventory level, according to some example
embodiments. At operation 410, the item module 250 extracts a
current inventory level of the commerce item from the attribute
data. For example, the item module 250 extracts a quantity of the
commerce item from the attribute data that includes inventory data
received from a smart refrigerator. In other examples, the
attribute data includes inventory indications from sensors that
directly monitor or measure the commerce item. In a specific
example, brake pads in an automobile of the user can be monitored
via a sensor operable to indicate a condition of the brake pads
(e.g., needing replacement). In another example, another user that
may be associated with the user may purchase a particular item for
which the user has demand. In this example, the user may no longer
have demand for the particular item based on the purchase by the
other user (e.g., a shared shopping list between family
members).
[0087] At operation 420, the analysis module 255 determines an
inventory threshold for the commerce item by modeling usage of the
commerce item based on the extracted current inventory level and
the inferred user characteristics. In an example embodiment, the
analysis module 255 determines the inventory threshold such that
when the current inventory level may be below the threshold, the
inventory may need to be reordered to avoid depletion. For
instance, the analysis module 255 calculates a usage rate
corresponding to the commerce item based on the attribute data, the
user characteristics, or other data and applies the usage rate to
determine the inventory threshold for the commerce item to avoid
depletion of a supply of the commerce item. For example, the
analysis module 255 determines the usage rate for a particular item
based on historical purchase history data and infers usage rate
based on a frequency of purchase of the particular item. In a
specific example, the commerce item may be coffee beans and the
analysis module 255 determines that based on the user's current
supply of coffee beans and the user's rate of consumption (e.g.,
the usage rate) corresponding to coffee beans, the user may run out
in fourteen days. In this specific example, the analysis module 255
determines the inventory threshold to be a value, such as a
quantity of the coffee beans, that may be the current inventory
level several days prior to depletion of the supply of coffee
beans.
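The usage-rate model of paragraph [0087] can be sketched as follows. The disclosure provides no code, so the function names, the gram units, and the three-day lead time are illustrative assumptions only; the numbers reproduce the coffee-bean example in which the supply runs out in fourteen days.

```python
def days_until_depletion(current_level, usage_per_day):
    """Days remaining before the current supply is exhausted."""
    return current_level / usage_per_day

def inventory_threshold(usage_per_day, lead_time_days=3):
    """Quantity at which a reorder should trigger so a replacement
    order arrives several days prior to depletion of the supply."""
    return usage_per_day * lead_time_days

# Coffee-bean example from the text (units assumed for illustration).
level_grams = 700.0          # current supply of coffee beans
usage_grams_per_day = 50.0   # inferred rate of consumption
days_left = days_until_depletion(level_grams, usage_grams_per_day)
threshold = inventory_threshold(usage_grams_per_day, lead_time_days=3)
```

With these assumed values, the user would run out in fourteen days, and a reorder would trigger once the supply falls to three days' worth of beans.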
[0088] At operation 430, the analysis module 255 identifies a
mismatch between the inventory threshold and the current inventory
level. For example, if the analysis module 255 determines that the
current inventory level is below the inventory threshold, the
analysis module 255 identifies the mismatch on that basis.
[0089] At operation 440, the order module 260 automatically (e.g.,
without intervention or action of the user) performs the user
purchase on behalf of the user based on the mismatch, according to
some implementations. In some example embodiments, the order module
260 accounts for shipping delays and other delays so as to avoid
the current inventory level of the commerce item falling below the
inventory threshold. For instance, the analysis module 255
increases the inventory threshold to account for delays in
receiving an order for a particular item.
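The mismatch test of operations 430 and 440, including the delay padding described above, might look like the following sketch; the padding scheme (extra days of usage added to the threshold) is one plausible reading of "increases the inventory threshold to account for delays," not a definitive implementation.

```python
def should_reorder(current_level, threshold,
                   shipping_delay_days=0, usage_per_day=0.0):
    """Identify a mismatch between the current inventory level and
    the inventory threshold, padding the threshold so that known
    shipping delays do not let the supply fall below it."""
    padded_threshold = threshold + shipping_delay_days * usage_per_day
    return current_level < padded_threshold
```

A level of 120 against a threshold of 150 yields a mismatch, and a one-day shipping delay at 50 units per day raises the effective threshold to 200.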
[0090] FIG. 5 is a flow diagram illustrating further operations for
facilitating the purchase including operations to determine
parameters for the purchase, according to some example embodiments.
At operation 510, the order module 260 determines at least one
order parameter based, at least in part, on the user
characteristics. The order parameter may include at least one of a
quantity, a delivery time, a payment time, a delivery method, a
delivery destination, a merchant, a brand, a price, an item color,
an item style, and so on. For example, the user characteristics may
indicate that the user may wear a certain garment size. In this
example, the order module 260 specifies order parameters for
clothing or apparel according to the garment size. Similarly, the
order module 260 may specify a brand for the user purchase based on
the user characteristics (e.g., historical brand purchases by the
user or an analysis of the user's style and the brands that conform
to that style). In another example, the user characteristics may
indicate that the user may be minimizing cost above other
considerations. In this example, the order module 260 identifies a
lower-cost option as opposed to a speedier option (e.g., wait for a
sale for a particular item or use the cheapest shipping). In
another example, the user characteristics may indicate that the
shipping speed may be important for certain items (e.g., a trendy
new mobile device that the user may want right away). In this
example, the order module 260 determines the delivery method
parameter based on how fast the order may be delivered. In another
example embodiment, the order module 260 specifies the delivery
location for the user purchase based on the user characteristics
(e.g., if the user purchase relates to items for the user's work,
the delivery location may be the user's work location rather than a
home address). The order module 260 may determine many other order
parameters based on the user characteristics.
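The mapping from user characteristics to order parameters in operation 510 can be sketched as a simple rule table. Every key name below is a hypothetical stand-in for inferred characteristics (garment size, brand preference, cost sensitivity, work-related purchases) described in paragraph [0090].

```python
def determine_order_parameters(user_characteristics):
    """Map inferred user characteristics to order parameters such as
    size, brand, delivery method, and delivery destination."""
    params = {}
    if "garment_size" in user_characteristics:
        params["size"] = user_characteristics["garment_size"]
    if "preferred_brand" in user_characteristics:
        params["brand"] = user_characteristics["preferred_brand"]
    if user_characteristics.get("minimizes_cost"):
        params["shipping"] = "economy"      # lower-cost over speed
    elif user_characteristics.get("wants_item_quickly"):
        params["shipping"] = "expedited"    # e.g., a trendy new device
    params["delivery_destination"] = (
        "work" if user_characteristics.get("work_related_purchase")
        else "home")
    return params
```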
[0091] At operation 520, the order module 260 facilitates the
purchase according to the determined order parameters. For example,
the order module 260 may recommend the user purchase to the user,
with the recommendation including the determined order parameters
(e.g., providing the user with a notification including the option
to make the user purchase). In another example, the order module
260 automatically makes the user purchase on behalf of the user.
For instance, if the order module 260 determines that the user
purchase may be urgent (e.g., to avoid depletion of a supply of a
certain item), routine (e.g., purchasing bottled water), or
specified by the user as a permissible automatic purchase in
advance, the user purchase may be made automatically to avoid
burdening the user with the decision to make the user purchase.
[0092] FIG. 6 is a flow diagram illustrating further operations for
determining order parameters including operations to determine a
temporal parameter associated with the purchase, according to some
example embodiments. At operation 610, the analysis module 255
identifies a purchase motive of the user for the commerce item by
analyzing the user characteristics. In various example embodiments,
the purchase motive corresponds to a motive time. For example, the
user may be planning a vacation that includes a beach destination.
The vacation may be indicative of the purchase motive of the user
for items associated with the vacation (e.g., sunscreen for a beach
type vacation, snacks for a road trip type vacation, Broadway
tickets for a New York City trip). In this example, the motive time
corresponds to the beginning of the vacation (e.g., as determined
by user calendar information included in the attribute data or
purchase history data such as plane ticket information).
[0093] At operation 620, the order module 260 determines temporal
order parameters based on the motive time. In the example above
where the user may be planning a vacation, the order module 260
determines the temporal order parameters so that the items
corresponding to the user purchase arrive prior to the vacation. In
another example, if the purchase motive is associated with an
event, such as a graduation party, the order module 260 may
determine the temporal order parameters such that the items
associated with the user purchase are delivered prior to the
graduation party, since the user may no longer have demand for the
particular item after a certain time.
[0094] At operation 630, the order module 260 facilitates the
purchase according to the determined temporal order parameters. For
example, the order module 260 schedules the items corresponding to
the user purchase to arrive at a certain time.
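Working backward from the motive time, as operations 620 and 630 describe, amounts to subtracting a delivery buffer and the transit time. The two-day buffer and five-day transit time below are assumed values for illustration, not figures from the disclosure.

```python
from datetime import date, timedelta

def temporal_order_parameters(motive_time, transit_days, buffer_days=2):
    """Schedule the order so items arrive before the motive time
    (e.g., the start of a vacation or a graduation party)."""
    deliver_by = motive_time - timedelta(days=buffer_days)
    order_by = deliver_by - timedelta(days=transit_days)
    return {"order_by": order_by, "deliver_by": deliver_by}

vacation_start = date(2015, 6, 15)   # motive time (assumed date)
params = temporal_order_parameters(vacation_start, transit_days=5)
```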
[0095] FIG. 7 is a flow diagram illustrating further operations to
facilitate the purchase based, at least in part, on purchase
criteria, according to some example embodiments. At operation 710,
the analysis module 255 accesses purchase criteria corresponding to
the user. The purchase criteria, for example, include predefined
criteria, user specified criteria, dynamically determined criteria,
alone or in any suitable combination. For example, the
purchase criteria may include temporal based criteria (e.g.,
criteria that specifies making the user purchase within certain time
periods), budget criteria (e.g., spending limits associated with
particular items or categories of items), context based criteria
(e.g., adjusting the budgeting criteria based on the user's current
location), among other purchase criteria. In a specific example,
the user specifies a budget for a particular category of goods
(e.g., transportation, food, utilities, housing, entertainment,
travel, health), a total budget, or a monthly budget. In addition,
the user can specify rules-based criteria, such as a particular time
to make certain purchases (e.g., after a paycheck is deposited).
[0096] In further example embodiments, the analysis module 255
identifies, and includes, in the purchase criteria, patterns in the
user's purchasing habits, objectives, goals, or dynamically
generated criteria. In a specific example, the analysis module 255
determines that the user may be afflicted with a medical condition,
such as a peanut allergy. In this scenario, the analysis module 255
includes a criterion in the purchase criteria to avoid items that
contain peanuts. In another example, the analysis module 255
determines that the user may be attempting to maintain a vegan diet
and the analysis module 255 may avoid food items that are contrary
to the goal of maintaining a vegan diet.
[0097] At operation 720, the order module 260 automatically
purchases the commerce item on behalf of the user according to the
purchase criteria. In an example embodiment, the order module 260
determines satisfaction of the purchase criteria prior to
facilitating the user purchase for the user. For example, the order
module 260 determines that a particular budget criterion included
in the purchase criteria has been exceeded and the order module 260
may not perform the user purchase on that basis. In other words,
the order module 260 facilitates the user purchase based on the
determined satisfaction of the purchase criteria.
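The satisfaction check of operation 720 can be sketched as a predicate evaluated before any automatic order is placed. The budget and ingredient-avoidance rules below correspond to the examples in paragraphs [0095] and [0096]; the dictionary keys are illustrative assumptions.

```python
def criteria_satisfied(purchase, criteria):
    """Return True only if a candidate purchase meets the user's
    purchase criteria; an automatic order proceeds only on True."""
    budget = criteria.get("monthly_budget", float("inf"))
    if criteria.get("spent_this_month", 0) + purchase["price"] > budget:
        return False  # budget criterion exceeded
    avoided = set(criteria.get("avoid_ingredients", []))
    if avoided & set(purchase.get("contains", [])):
        return False  # e.g., a peanut allergy or a vegan diet
    return True
```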
[0098] Turning now to FIG. 8, the data mesh system 150 may
implement various suitable combinations of the operations discussed
above to identify the commerce item and facilitate the user
purchase. This applies equally to the operations discussed above as
well as the operations in the following discussion. FIG. 8 is a
flow diagram illustrating an example method 800 of one such
combination of the operations, although many other suitable
combinations may be employed. At the operation 310, the attribute
module 220 receives the attribute data associated with the user. At
the operation 320, the item module 250 extracts the demand
indications from the attribute data. At the operation 330, the
analysis module 255 identifies the commerce item from the attribute
data based on the demand indications. At the operation 340, the
characteristic module 225 infers the user characteristics from the
attribute data. As shown in FIG. 8, various combinations of the
above operations may be employed to facilitate the user purchase at
the operation 350.
[0099] In the example method 800, to facilitate the user purchase,
the item module 250 extracts the current inventory level of the
commerce item at the operation 410. As described above, at the
operation 420, the analysis module 255 determines the inventory
threshold for the commerce item. Subsequently, at the operation
430, the analysis module 255 identifies the mismatch between the
inventory threshold and the current inventory level. At the
operation 440, if the analysis module 255 identifies the mismatch,
the user analysis system 152 may proceed to the operation 510.
Alternatively, if the analysis module 255 does not identify the
mismatch, no subsequent operation may be performed.
[0100] Subsequent to determining the mismatch, at the operation
510, the order module 260 determines the order parameters of the
user purchase based on the user characteristics, in an example
embodiment. In some example embodiments, this may involve the
operations 610, 620, and 630, respectively, to determine the
temporal order parameters that may be included in the order
parameters. At the operation 520, the order module 260 facilitates
the user purchase according to the order parameters.
[0101] Finally, at the operation 710, the analysis module 255
accesses the purchase criteria and at the operation 720, the order
module 260 facilitates the user purchase according to the purchase
criteria. Thus, FIG. 8 shows an example embodiment where various
ones of the above operations may be employed in conjunction with
each other to facilitate the user purchase.
[0102] FIG. 9 is a flow diagram illustrating an alternative example
method 900 for identifying the commerce item and facilitating the
user purchase, according to some example embodiments. The example
method 900 may involve similar operations as those described above.
At operation 910, similar to the operation 310, the attribute
module 220 receives or accesses the attribute data associated with
the user. At operation 920, similar to the operation 340, the
characteristic module 225 infers the user characteristics
pertaining to the user from the attribute data.
[0103] At operation 930, the analysis module 255 identifies similar
users that are similar to the user based on the inferred user
characteristics and respective user characteristics of a plurality
of other users. The analysis module 255 identifies the similar
users based on a variety of factors. In some
example embodiments, the analysis module 255 accesses the attribute
data or stored user characteristics corresponding to the plurality
of other users. For example, the analysis module 255 identifies the
similar users from among the plurality of other users that are
similar to the user based on the inferred user characteristics of
the user and respective user characteristics of the plurality of
other users. The analysis module 255 may correlate, match, or
otherwise compare the inferred user characteristics with respective
user characteristics of the plurality of other users to identify
the similar users. In various example embodiments, the analysis
module 255 identifies the similar users based on same or similar
demographic data (e.g., same or similar age, marital status,
gender, geographic location, etc.), same or similar user
characteristics (e.g., same or similar brand purchases), same or
similar attribute data, and so on.
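One simple way to realize the correlate-match-compare step of paragraph [0103] is to score users by the fraction of shared characteristic values they hold in common. This is a minimal sketch under assumed characteristic names; the disclosure does not specify a particular similarity measure.

```python
def similarity(user_a, user_b):
    """Fraction of characteristics, present in both profiles, whose
    values match (e.g., same age, location, brand purchases)."""
    shared = set(user_a) & set(user_b)
    if not shared:
        return 0.0
    return sum(user_a[k] == user_b[k] for k in shared) / len(shared)

def similar_users(user, others, cutoff=0.5):
    """Select the users whose similarity score meets the cutoff."""
    return [u for u in others if similarity(user, u) >= cutoff]
```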
[0104] At operation 940, the analysis module 255 identifies the
commerce item from the attribute data based on the user
characteristics of the similar users and the demand indications.
For example, the demand indications may indicate a particular item
that may not be particularly significant based on the demand
indications (e.g., the demand metric may be particularly low for
the particular item). However, the analysis module 255 may identify
this particular item based on the user characteristics of the
similar users indicating that the particular item may be of
significance. In other words, although the demand indications did
not show a strong demand for the particular item, the user
characteristics of the similar users indicated that the user may
have strong demand for the particular item.
[0105] In a specific example, the demand indications may indicate
the user has demand for a pair of sunglasses. The demand
indications in this example may further indicate the user may be
interested in brands X, Y, and Z with a particular emphasis on
brand X. The user characteristics of the similar users (e.g., users
of the same or similar age, location, gender, other demographic
information, or similar purchasing preferences) may indicate that
brand Z may be in high demand for the users similar to the user. On
that basis, the analysis module 255 may identify brand Z sunglasses
as the commerce item.
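The sunglasses example of paragraph [0105] suggests blending the user's own demand signal with the aggregate demand of the similar users, so an item with a weak personal signal can still be surfaced. The equal weighting and the scores below are illustrative assumptions.

```python
def blended_demand(own_demand, similar_user_demand, weight=0.5):
    """Combine the user's own demand metric with the similar users'
    aggregate demand for the same item; weight=0.5 is assumed."""
    return (1 - weight) * own_demand + weight * similar_user_demand

# Brand X: strong personal signal, weak among similar users.
# Brand Z: weak personal signal, high demand among similar users.
brand_scores = {"X": blended_demand(0.8, 0.2),
                "Z": blended_demand(0.3, 0.9)}
top_brand = max(brand_scores, key=brand_scores.get)
```

With these assumed scores, brand Z edges out brand X, mirroring the outcome described above.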
[0106] At operation 950, the order module 260 determines the order
parameters or transaction parameters based on the user
characteristics of the similar users. For example, the delivery
method may be determined based on the user characteristics of the
similar users. For instance, if the similar users frequently choose
a speedy delivery method for a particular item (e.g., new
electronics), the order module 260 may determine a speedy delivery
method corresponding to the user purchase for the commerce item
that may be the same or similar to the particular item.
[0107] Similarly, the purchase criteria may include dynamically
determined criterion based on the user characteristics of the
similar users. That is to say, the analysis module 255 may
dynamically generate a portion of the purchase criteria based on
the similar users. For example, a default budget for particular
categories of items may be determined based on an analysis of the
user characteristics of the similar users (e.g., other users with
similar demographic information as the user may on average spend a
certain amount per category of good).
[0108] At operation 960, the order module 260 facilitates the user
purchase associated with the commerce item according to the
determined order parameters. As discussed above, the order module
260 facilitates the user purchase in a variety of manners including
automatically performing the user purchase on behalf of the user or
causing presentation of a notification to the user that includes
the option to make the user purchase according to the order
parameters.
[0109] FIG. 10 is a flow diagram illustrating further operations to
facilitate the purchase based, at least in part, on the demand
metric, according to some example embodiments. At operation 1010,
as discussed above in connection with the operation 330 of FIG. 3,
the analysis module 255 calculates the demand metric of the
identified item based on the demand indications corresponding to
the identified item.
[0110] At operation 1020, the order module 260 facilitates the user
purchase based on the demand metric. For example, if the order
module 260 determines that the demand metric is high (e.g., exceeds
a predefined or dynamically determined threshold) then the order
module 260 may facilitate the user purchase with more urgency than
for a lower demand metric. For instance, the order module 260
automatically performs the user purchase for the user based on a
high demand metric or more frequently causes presentation of the
notification that includes the option to make the user purchase to
the user (or with more emphasis such as a more conspicuous
notifications such as a larger user interface presentation to the
user). In some instance, the order module 260 determines the order
parameters based on the demand metric. For instance, if the order
module 260 determines that the demand metric is high, then the
order module 260 may subsequently determine a speedier delivery
option for the commerce item.
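The escalation in paragraph [0110] — automatic purchase for a high demand metric, a more prominent notification for an intermediate one — can be sketched as a pair of thresholds. The threshold values and tier names are illustrative assumptions; the disclosure allows both predefined and dynamically determined thresholds.

```python
def purchase_action(demand_metric, auto_threshold=0.9,
                    notify_threshold=0.5):
    """Escalate how the purchase is facilitated as the demand
    metric rises; thresholds are assumed for illustration."""
    if demand_metric >= auto_threshold:
        return "automatic_purchase"
    if demand_metric >= notify_threshold:
        return "prominent_notification"
    return "standard_notification"
```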
[0111] FIG. 11 is a flow diagram illustrating further operations to
facilitate the purchase using a notification, according to some
example embodiments. At operation 1110, the presentation module 210
generates the notification that includes an option to make the user
purchase. The notification may include a user interface, a text
message (Short Message Service (SMS), Multimedia Messaging Service
(MMS), Enhanced Messaging Service (EMS), or other messaging
modalities), and so on. In some example embodiments, the
presentation module 210 may generate the content of the
notification based on the commerce item, the user characteristics,
the user characteristics of the similar users, the attribute data,
and so forth. In various example embodiments, the notification may
include the order parameters for the user purchase.
[0112] At operation 1120, the presentation module 210 causes
presentation of the generated notification to the user. For
example, the presentation module 210 may communicate instructions
to present a user interface that includes the notification to a
device of the user. In some example embodiments, the presentation
module 210 may determine the device of the user on which to present
the notification based on the user characteristics. For
example, if the user has a preference for a particular device
(e.g., a mobile device of the user), the presentation module 210
may cause presentation of the notification to that device. In
further example embodiments, the notification may provide the user
the option to specify or modify the order parameters of the user
purchase.
[0113] At operation 1130, the presentation module 210 receives a
user selection of the option to make the purchase. For example, if
the user chooses to make the user purchase, the presentation module
210 may receive the user selection of the option to make the
purchase and communicate the selection to the order module 260 to
perform the user purchase.
[0114] At operation 1140, the order module 260 performs the
purchase, according to some example embodiments. For example, the
order module 260 may make the user purchase according to the order
parameters on behalf of the user.
[0115] FIG. 12 is a flow diagram illustrating further operations
for presenting a notification, according to some example
embodiments. At operation 1210, the presentation module 210
identifies presentation parameters for presentation of the
notification. For example, the presentation module 210 may identify
the presentation parameters based on the user characteristics, the
user characteristics of the similar users, the attribute data, the
demand indications, or other data. The presentation parameters may
include a preferred device of the user to present the notification,
a preferred time of day to present the notification, content
preferences (e.g., do not present notifications regarding
particular item categories), and so on. In a specific example, the
user characteristics may indicate a work time period for the user.
In this example, the notification may not be presented to the user
during the work time period as the user may not respond. In another
example, the analysis module 255 may identify a device status of a
particular user device and, based on the device status, the
presentation module 210 may route the notification to another
device. For instance, if the device status indicates that the
device is inactive (e.g., being charged), the presentation module
210 may cause presentation of the notification to another device
(e.g., an active device as determined by device sensors).
[0116] At operation 1220, the presentation module 210 causes
presentation of the notification according to the presentation
parameters. For instance, the presentation module 210 may cause
presentation of the notification to a preferred device of the user
at a time of day that the user is likely to respond to the
notification as determined based on an analysis of the user
characteristics.
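The routing logic of paragraphs [0115] and [0116] — suppress notifications during the user's work time, skip inactive devices, prefer the user's preferred device — might be sketched as below. The device records, the work-hours window, and the default hour are all assumptions for illustration.

```python
def route_notification(devices, work_hours=(9, 17), hour_of_day=20):
    """Pick a target device for the notification, or None when no
    presentation should be made (e.g., during work hours)."""
    if work_hours[0] <= hour_of_day < work_hours[1]:
        return None  # user is unlikely to respond while at work
    active = [d for d in devices if d.get("active")]
    if not active:
        return None
    # Prefer the user's preferred device; else any active device.
    preferred = [d for d in active if d.get("preferred")]
    return (preferred or active)[0]["name"]
```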
[0117] FIG. 13 is a flow diagram illustrating further operations
for presenting a notification, according to some example
embodiments. At operation 1310, the analysis module 255 detects a
trigger action of the user based on the real-time data included in
the attribute data. For example, the analysis module 255 may
determine that the user may be moving into the kitchen (e.g., as
determined by a Bluetooth.RTM. handshake between a mobile device
the user may be wearing and a smart appliance located in the user's
kitchen), which may be a good time to notify the user regarding
food supplies.
[0118] At operation 1320, the presentation module 210 causes
presentation of the notification to the user in response to
detecting the trigger action. In other words, based on the analysis
module 255 detecting the trigger action of the user, the
presentation module 210 may cause presentation of the notification
to the user.
[0119] FIG. 14 is a flow diagram illustrating an example method
1400 showing communication between various devices in relation to
presenting a notification to the user, according to some example
embodiments. At operation 1410, attribute source 1402 communicates
the attribute data to the data mesh system 150. As described above,
at the operation 310, the attribute module 220 may receive the
attribute data associated with the user. At the operation 320, the
item module 250 extracts the demand indications from the attribute
data. At the operation 330, the analysis module 255 may identify
the commerce item from the attribute data based on the demand
indications. At the operation 340, the characteristic module 225
infers the user characteristics from the attribute data.
[0120] In the example method 1400, the order module 260 facilitates
the user purchase by generating the notification at the operation
1420. The presentation module 210 communicates the notification
from the data mesh system 150 to the user device 1406. At operation
1430, the user device 1406 may receive the notification and present
the notification to the user. Subsequently, the user may select an
option to make the user purchase. The user device 1406 may
communicate an indication of the user selection to make the user
purchase at the operation 1440. At operation 1450, the data mesh
system 150 may receive the user selection to make the user
purchase. Finally, the order module 260 may perform the user
purchase in response to receiving the user selection to make the
user purchase at operation 1460.
[0121] FIG. 15 depicts an example user interface 1500 to facilitate
the purchase, according to some example embodiments. It will be
noted that alternate presentations of the displays of FIG. 15 may
include additional information, graphics, options, and so forth;
other presentations may include less information, or may provide
abridged information for easy use by the user. Notification 1510
may be a text message, such as a Short Message Service (SMS),
Multimedia Messaging Service (MMS), or Enhanced Messaging Service
(EMS) message, or another messaging modality, provided to
notify the user of the user purchase including the order
parameters. In other example embodiments, the notification 1510 may
be a push notification or similar type of notification. Some
notifications may be interactive, enabling the user to make a
selection through the SMS system, mobile application, or other
method. For instance, the user may interact with the notification
1510 using user interface element 1520.
[0122] To help illustrate the concepts described above, FIGS. 16
and 17 illustrate examples of identifying the commerce item and
facilitating the user purchase associated with the commerce item,
according to some example embodiments. Referring now to FIG. 16, a
scene 1600 depicts a living room attached to an open kitchen. In
the example of FIG. 16, the scene 1600 includes a media
entertainment device 1610, a smart television (TV) 1620, a lamp
1630, a mobile computer 1640, a mobile device 1650, a user 1660, a
smart refrigerator 1670, and a kitchen display 1680. Each of the
devices 1610-1650, 1670, and 1680 may be attribute sources coupled
to a network (e.g., the network 104) and operable to communicate
with the data mesh system 150. In various example embodiments, the
user 1660 is carrying a smart device (e.g., a mobile device, a
wearable device, a near field communication (NFC) enabled smart
ring) on their person that may provide real-time data corresponding
to the user 1660. For instance, the user 1660 may be carrying a
mobile device that may provide real-time location data (e.g., as
determined by a GPS component, beacon location detection, or other
location services). In this way, the analysis module 255 tracks,
monitors, or otherwise observes the location of the user 1660 via a
particular device the user is wearing or the location of the user
1660 may be derived from various real-time data associated with the
user's location included in the attribute data (e.g.,
Bluetooth.RTM. handshakes between a device the user is wearing and
another device with a known or fixed location).
[0123] In an example embodiment, the lamp 1630 is a smart lamp
operable to communicate various operating data to the data mesh
system 150 or connected to a smart outlet operable to monitor the
functionality of the lamp 1630. In this example embodiment, the
item module 250 extracts demand indications from portions of the
attribute data corresponding to the lamp 1630. For example, the
demand indications may indicate use of the lamp in a particular way
(e.g., the user 1660 may use a low brightness setting on the lamp)
or that the light bulb of the lamp 1630 has burned out. The
analysis module 255 identifies the commerce item as a light bulb
that needs to be replaced based on the demand indications (e.g.,
detected via sensors of the lamp 1630 or derived via data from a
smart outlet such as reduced power consumption indicating a burnt
out light bulb). Subsequently, the order module 260 may notify the
user of the burnt-out light bulb with an option to reorder a
particular light bulb based on the user characteristics (e.g.,
purchase history of the user). In some instances, the order module
260 may automatically reorder the light bulb without notifying the
user. In various example embodiments, the order module 260
automatically performs the user purchase based on the purchase
criteria (e.g., for this particular category of goods, simply place
an automatic order).
[0124] In another example embodiment, the smart refrigerator 1670
communicates inventory data to the data mesh system 150. For
example, the smart refrigerator may communicate food supply data.
In this example embodiment, the item module 250 extracts demand
indications from the food supply data. Subsequently, the analysis
module 255 identifies the commerce item based on the demand
indications. For instance, the analysis module 255 may identify
milk as the commerce item based on a low inventory level of milk.
The order module 260 determines a quantity of milk to order based
on the user characteristics (e.g., historical purchase data for
milk during the current season of the year). The order module 260
may then generate a notification that includes an option to
purchase milk. The order module 260 causes presentation of the
notification based on the user characteristics. For instance, the
real-time data included in the attribute data may indicate the user
1660 is currently in the kitchen, which may be a good time to
provide the user 1660 the option to reorder milk (the reasoning
being that the user 1660 may be able to inspect the food supply
first). In a further instance, the order module 260 may determine
that the status of the mobile device 1650 is inactive (e.g., turned
off or not in use based on a lack of movement detected from device
accelerometers). In this scenario, the order module 260 may cause
presentation of the notification to the user on another device such
as the kitchen display 1680.
[0125] FIG. 17 illustrates an example of identifying an item and
facilitating a purchase associated with the identified item,
according to some example embodiments. The scene 1700 depicts a
city including the user 1710 driving in a car. In this example, the
item module 250 extracts demand indications such as a location of
the user 1710 or a route of the user 1710 that may be indicative of
a destination 1750 and, thus, the commerce item. Continuing with
this example, the analysis module 255 determines the destination
1750 of the user 1710 from the demand indications or based on, for
example, a route 1730 the user is taking, a time of day, and a day
of the year. In other words, the analysis module 255 may determine
the destination 1750 of the user 1710 based on the demand
indications and the user characteristics or real-time context data
included in the attribute data (e.g., location as determined by a
GPS component of a mobile device). In some example embodiments, the
analysis module 255 or the characteristic module 225 determines the
real-time location of the user based on Bluetooth.RTM. or other
close range communication detections within a radius such as radius
1720. For instance, the analysis module 255 determines that the
user 1710 may be at the destination 1750 if the user 1710 is within
the radius 1740 of the destination 1750. In this scenario, the
destination 1750 may be a coffee shop and the commerce item may be
a cup of coffee. In some example embodiments, the order module 260
automatically places an order for the cup of coffee, or presents a
notification with the option to place the order for the coffee,
while the user 1710 may be en route. In further example
embodiments, the order module 260 determines the order parameters
based on the user characteristics such as past orders for coffee
included in a purchase history of the user. In still further
example embodiments, the presentation module 210 determines the
presentation parameters based on the user being in the car (e.g.,
present an audio alert for the option to place the order and
receive a vocal command from the user to place the order).
[0126] FIGS. 18A and 18B depict example configurations for
communicatively coupling attribute sources, according to some
example embodiments. The example embodiments described herein may
access a vast and rich "Internet of Things" (IoT) dataset that is
predominantly provided via communicatively connected,
interconnected, or otherwise communicatively coupled machines and
devices that may include a multitude of sensors. In example
embodiments, devices and machines that provide the attribute data,
such as the attribute sources, may be communicatively coupled in
many different configurations. For instance, each attribute source
may be communicatively coupled to the networked system 102
independently to provide the networked system 102 access to the
attribute data corresponding to each of the communicatively coupled
attribute sources. FIGS. 18A and 18B depict alternative example
attribute source configurations. It will be appreciated that FIGS.
18A and 18B are merely non-limiting examples of attribute source
configurations and many other configurations or suitable
combinations of configurations may be employed.
[0127] FIG. 18A depicts an example embodiment that may include
attribute sources 1810 communicatively coupled in a decentralized
device-to-device mesh. In this example embodiment, the attribute
data corresponding to a particular device in the mesh may be
received from any one or more of the devices in the mesh. For
instance, the networked system 102 may access the attribute data
corresponding to attribute source E via attribute source H or a
combination of attribute sources H and I in FIG. 18A. In an example
embodiment, the attribute source H or I may aggregate and store the
attribute data corresponding to attribute sources A-F in FIG. 18A.
In some example embodiments, the networked system 102 may access
the attribute data associated with attribute source E by
communicating with attribute source H or I in FIG. 18A.
[0128] FIG. 18B depicts another example embodiment that may include
attribute sources 1820 communicatively coupled to a central
attribute source (e.g., attribute source H in FIG. 18B). The
networked system 102 may access the attribute data associated with
attribute sources A-G via the central attribute source in FIG. 18B.
In some embodiments, the central attribute source may aggregate and
store the attribute data received or accessed from the attribute
sources A-G and provide a centralized access point for the
attribute data associated with all, or some, of the communicatively
coupled attribute sources A-G in FIG. 18B.
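The two configurations above can be pictured as a small graph of sources in which the networked system 102 reaches a given source's attribute data either through neighboring mesh nodes (FIG. 18A) or through a single aggregating hub (FIG. 18B). The sketch below is purely illustrative and not part of the specification; the `AttributeSource` class, its dictionary-based store, and the sample data are all hypothetical.

```python
# Illustrative sketch (not from the specification): attribute sources as
# graph nodes whose data can be reached via neighbors (decentralized
# device-to-device mesh, FIG. 18A) or via one aggregating hub
# (centralized configuration, FIG. 18B).

class AttributeSource:
    def __init__(self, name):
        self.name = name
        self.data = {}          # this source's own attribute data
        self.neighbors = []     # communicatively coupled sources
        self.aggregated = {}    # attribute data cached from other sources

    def couple(self, other):
        # Communicatively couple two sources (mesh link).
        self.neighbors.append(other)
        other.neighbors.append(self)

    def aggregate(self, sources):
        # A hub (e.g., attribute source H) may store the attribute
        # data corresponding to sources A-G.
        for src in sources:
            self.aggregated[src.name] = src.data

    def lookup(self, name, visited=None):
        # Resolve a source's data locally, from the aggregate cache,
        # or by asking neighbors in the mesh.
        if name == self.name:
            return self.data
        if name in self.aggregated:
            return self.aggregated[name]
        visited = visited or {self.name}
        for n in self.neighbors:
            if n.name not in visited:
                visited.add(n.name)
                found = n.lookup(name, visited)
                if found is not None:
                    return found
        return None

# Centralized configuration: H aggregates A-G, so the networked
# system only ever needs to talk to H.
sources = {c: AttributeSource(c) for c in "ABCDEFGH"}
sources["E"].data["ambient_temp_c"] = 21.5
sources["H"].aggregate([sources[c] for c in "ABCDEFG"])
print(sources["H"].lookup("E"))  # {'ambient_temp_c': 21.5}
```

In the decentralized case, the same `lookup` call simply traverses `neighbors` links instead of hitting the hub's cache, which is why either topology can serve the same attribute data to the networked system 102.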
[0129] FIG. 19 depicts example sources 1900 including attribute
sources 1910, according to some example embodiments. In various
example embodiments, the attribute data may include data received,
retrieved, or accessed from the attribute sources 1910. For
example, the attribute sources 1910 may provide data including
everything from a moisture level of a houseplant to a dribbling
rhythm of a basketball. In some embodiments, the attribute data
corresponding to the attribute sources 1910 may be received or
accessed in real-time or near real-time. For instance, the
attribute sources 1910 may communicate or otherwise provide access
to the attribute data as it becomes available. In example
embodiments, the attribute sources 1910 may include user device
sources 1920, user data sources 1930, transportation sources 1940,
materials sources 1950, third party sources 1960, home sources
1970, and a variety of other sources. As will be discussed in
connection with FIG. 20, the attribute sources 1910 may be
associated with a wide variety of sensors, gauges, measurement
components, and other components.
[0130] In an example embodiment, the attribute data may include
data corresponding to the user device sources 1920. The user device
sources 1920 may include such non-limiting examples as a personal
computer (PC), a tablet computer, a laptop computer, a netbook, a
set-top box (STB), a personal digital assistant (PDA), an
entertainment media system, a cellular telephone, a smart phone, a
mobile device, a wearable device (e.g., a smart watch), a smart
home device (e.g., a smart appliance), and other smart devices. As
will be discussed further in connection with FIG. 20, the attribute
data corresponding to the user device sources 1920 may include data
associated with sensors, gauges, and other measurement components
such as environmental sensor data (e.g., ambient temperature),
biometric sensor data (e.g., heart rate), detection data (e.g.,
detection of a Near Field Communication (NFC) beacon), motion data
(e.g., acceleration), position data (e.g., location as determined
by a GPS of a mobile device), and so forth.
[0131] In further example embodiments, the attribute data
corresponding to the user device sources 1920 may include data such
as device type, device model, device name, a unique device
identifier, and other device parameters. In some example
embodiments, the device type data may provide a basis for an
inference associated with the attribute data. For instance, if the
device type data indicates that the device is a mobile device of
the user, location data corresponding to the mobile device may
indicate the location of the user. Similarly, if the device type is
a media entertainment system, the attribute data corresponding to
the media entertainment system may be associated with a home of the
user.
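The device-type inference described above can be reduced to a simple rule table. The sketch below is a hypothetical illustration only; the rule names, device-type strings, and return shape are assumptions, not part of the specification.

```python
# Illustrative sketch (hypothetical rules): using device type data as a
# basis for an inference about the attribute data, per the paragraph
# above -- a mobile device's location suggests the user's location,
# while a media entertainment system's data is tied to the user's home.
def infer_context(device_type, location):
    if device_type == "mobile":
        return {"user_location": location}       # user carries the device
    if device_type == "media_entertainment_system":
        return {"home_location": location}       # fixed device at home
    return {}                                    # no inference available

print(infer_context("mobile", (37.33, -121.89)))
# {'user_location': (37.33, -121.89)}
```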
[0132] The user data sources 1930 may include calendars (e.g., user
calendar events such as birthdays, trips, exams), user profiles
(e.g., demographic information such as age, gender, income level),
purchase histories, browse histories (e.g., search terms), social
media content (e.g., check-ins, posts, connections), other user
data (e.g., bookmarked websites, preferences or settings for
various applications, application usage data such as time spent
using a particular application), and the like. The attribute data
corresponding to the user data sources 1930 may be stored, for
example, by the user device sources 1920 (e.g., a mobile device
that includes a mobile browser with browse history of the user),
application server(s) 140 (e.g., payment history of the user stored
in payment system(s) 144, user profiles stored by an e-commerce
website), the third party server(s) 130 (e.g., social media data
stored in a social networking service), and so on. For example, the
attribute data corresponding to the user device sources 1920 may
include device resource data. The device resource data may include
files stored on the devices or metadata associated with the files.
For instance, the device resources may include digital media files
(e.g., MP3 formatted songs) or apps (e.g., pedometer app). The
metadata associated with the device resources may include usage
data such as number of times a song has been played, amount of time
using a particular app, and so forth.
[0133] As cars and other forms of transportation become
increasingly equipped with sensors and the ability to communicate,
a vast amount of data may be provided by the transportation sources
1940. For example, the attribute data corresponding to the
transportation sources 1940 may include acceleration data, velocity
data, and other sensor data (e.g., brake pad wear data, gear
shifting data). In this example, the attribute data corresponding
to the transportation sources 1940 may provide indications of a
user's driving patterns and styles (e.g., comes to a complete stop
at a stop sign, speeds, or makes finicky use of the brakes).
[0134] The materials sources 1950, such as clothing and structures,
are also increasingly gaining the ability to capture data. In
various example embodiments, the attribute data may include data
corresponding to the materials sources 1950. For example, clothing
may be embedded with sensors to detect motion. Data from these
sensors may provide indications of whether the user is active or
inactive. In another example, clothing may be embedded with
biometric sensors that may provide a continuous feed of biometric
data corresponding to the user. The biometric data may provide
indications of the user's health, athletic ability, and many other
characteristics corresponding to the user. Similarly, structures
may be equipped with sensors to passively or actively monitor the
surrounding environment (e.g., street cameras, traffic cameras, and
other sensors).
[0135] In example embodiments, the attribute data may include data
associated with the third party sources 1960. The third party
sources 1960 may also provide an abundance of data associated with
the user. For instance, the attribute data may include data
accessed from government websites or other public records that may
provide criminal histories, civil citation histories, credit
histories, or other publicly available information.
[0136] Nearly every facet of a smart home may be capable of
providing data associated with the user. The attribute data may
include data corresponding to the home sources 1970. For instance,
the home sources 1970 may include smart appliances, consumables,
utilities, and many other smart home devices. In a few specific
instances, the attribute data may include consumable inventories
and consumption rates of various consumable goods (e.g., milk,
bread) tracked or monitored by smart refrigerators. In another
instance, the attribute data may include utility usage data (e.g.,
electricity, water). Analysis of the utility usage data may
indicate patterns or a status of the user, such as the user being
on vacation, the user being ill (e.g., increasing house thermostat
set temperature to cope with a cold), the user being an energy
conscious consumer, and so on.
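As a toy illustration of the kind of analysis contemplated above, a drop in recent utility usage relative to a longer-term baseline might suggest the user is away. The readings, threshold, and function below are all fabricated assumptions; the specification does not define any particular analysis.

```python
# Illustrative sketch (fabricated daily kWh readings, arbitrary
# threshold): flag a possible vacation when recent usage drops well
# below the longer-term average.
def infer_occupancy_status(daily_kwh, baseline_fraction=0.4):
    if len(daily_kwh) < 10:
        return "insufficient data"
    baseline = sum(daily_kwh[:-3]) / len(daily_kwh[:-3])  # earlier days
    recent = sum(daily_kwh[-3:]) / 3                      # last 3 days
    if recent < baseline_fraction * baseline:
        return "possibly on vacation"
    return "normal"

usage = [12.1, 11.8, 12.5, 13.0, 11.9, 12.2, 12.4, 3.1, 2.8, 2.9]
print(infer_occupancy_status(usage))  # possibly on vacation
```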
[0137] FIG. 20 depicts non-limiting example components 2000 that
may provide attribute data according to some example embodiments.
In example embodiments, I/O components 2010 may include input
components 2020, output components 2030, environmental components
2040, motion components 2050, position components 2060, biometric
components 2070, communication components 2080, detection
components 2090, and a wide gamut of other sensors, gauges, and
measurement components. The I/O components 2010 or a suitable
combination of the I/O components 2010 may be included in any
suitable device or machine such as those included in the attribute
sources 1910 depicted in FIG. 19 to facilitate the functionality
described herein. In various example embodiments, the attribute
data provided by the I/O components 2010 may be accessible to all
or some of the modules described above on a real-time or near
real-time basis. The components 2000 are grouped according to
functionality merely for simplifying the following discussion and
the grouping is in no way limiting.
[0138] The input components 2020 may include alphanumeric input
components (e.g., a keyboard, a touch screen configured to receive
alphanumeric input, a photo-optical keyboard, or other alphanumeric
input components), point-based input components (e.g., a mouse, a
touchpad, a trackball, a joystick, a motion sensor, or other
pointing instrument), tactile input components (e.g., a physical
button, a touch screen that provides location and force of touches
or touch gestures, or other tactile input components), audio input
components (e.g., a microphone), and the like. The input components
2020 may receive input from the user to facilitate the
functionalities described herein. For instance, the user may
interact with a user interface using the input components 2020.
[0139] The output components 2030 may include visual components
(e.g., a display such as a plasma display panel (PDP), a light
emitting diode (LED) display, a liquid crystal display (LCD), a
projector, or a cathode ray tube (CRT)), acoustic components (e.g.,
speakers), haptic components (e.g., a vibratory motor), other
signal generators, and so forth. The output components 2030 may
present information to the user. For example, the output components
2030 may present a user interface to the user or present media
files to the user.
[0140] The environmental components 2040 may include illumination
sensors (e.g., photometer), temperature sensors (e.g., one or more
thermometers that detect ambient temperature), humidity sensors,
pressure sensors (e.g., barometer), acoustic sensors (e.g., one or
more microphones that detect background noise), proximity sensors
(e.g., an infrared sensor that detects nearby objects), gas sensors
(e.g., machine olfaction detection sensors, gas detection
sensors to detect concentrations of hazardous gases for safety or
to measure pollutants in the atmosphere), and so on. The
environmental components 2040 may measure various physical
parameters to provide an indication or signal corresponding to the
physical environment surrounding the environmental components
2040.
[0141] The motion components 2050 may include acceleration sensors
(e.g., accelerometer), gravitation sensors, rotation sensors (e.g.,
gyroscope), and so forth. The motion components 2050 may provide
motion data such as velocity, acceleration, or other force
measurements along the x, y, and z axes. The motion data may be
provided at a regular update rate (e.g., 10 updates per second)
that may be configurable.
[0142] The position components 2060 may include location sensors
(e.g., a Global Position System (GPS) receiver component), altitude
sensors (e.g., altimeters or barometers that detect air pressure
from which altitude may be derived), orientation sensors (e.g.,
magnetometers that provide magnetic field strength along the x, y,
and z axes), and the like. In an example embodiment, the position
components 2060 may provide position data such as latitude,
longitude, altitude, and a time stamp. Similar to the motion
components 2050, the position components 2060 may provide the
position data at a regular update rate that may be configurable.
[0143] The biometric components 2070 may include components to
detect expressions, measure biosignals, or identify people, among
other functions. For example, the biometric components 2070 may
include expression components to detect expressions (also referred
to as "kinesics") such as hand gestures (e.g., an optical component
to detect a hand gesture or a Doppler component to detect hand
motions), vocal expressions (e.g., a microphone to detect changes
in voice pitch that may indicate tension), facial expressions
(e.g., a camera to detect expressions or micro-expressions of a
person such as a smile), body gestures, and eye tracking (e.g.,
detecting the focal point of a person's eyes or patterns in eye
movement). The biometric components 2070 may also include, for
example, biosignal components to measure biosignals such as blood
pressure, heart rate, body temperature, perspiration, and brain
waves (e.g., as determined by an electroencephalogram). In further
examples, the biometric components 2070 may include identification
components to identify people such as retinal scanners (e.g., a
camera component), vocal detectors (e.g., a microphone to receive
audio data for voice identification), facial detectors, fingerprint
detectors, and electroencephalogram sensors (e.g., to identify a
person via unique brain wave patterns).
[0144] Communication may be implemented using a wide variety of
technologies. The I/O components 2010 may include communication
components 2080 operable to communicatively couple machines or
devices. For example, the communication components 2080 may include
a network interface component or other suitable device to interface
with a network (e.g., the network 104). In further examples, the
communication components 2080 may include wired communication
components, wireless communication components, cellular
communication components, Near Field Communication (NFC)
components, Bluetooth.RTM. components (e.g., Bluetooth.RTM. Low
Energy), Wi-Fi.RTM. components, and other communication components
to provide communication via other modalities. In addition, a
variety of information may be derived using the communication
components 2080 such as location via Internet Protocol (IP)
geo-location, location via Wi-Fi.RTM. signal triangulation,
location via detecting an NFC beacon signal that may indicate a
particular location, and so forth.
[0145] The I/O components 2010 may include detection components
2090 that may detect a variety of identifiers. For example, the
detection components 2090 may include Radio Frequency
Identification (RFID) tag reader components, Near Field
Communication (NFC) smart tag detection components, optical reader
components (e.g., an optical sensor to detect one-dimensional bar
codes such as Universal Product Code (UPC) bar code,
multi-dimensional bar codes such as Quick Response (QR) code, Aztec
code, Data Matrix, Dataglyph, MaxiCode, PDF417, Ultra Code, UCC
RSS-2D bar code, and other optical codes), and acoustic detection
components (e.g., microphones to identify tagged audio
signals).
[0146] FIG. 21 is a block diagram 2100 of an example data structure
for the attribute data associated with a particular user according
to example embodiments. In example embodiments, the attribute data
may be associated with a plurality of users such as user 2102,
2104, 2106, 2108, 2110, 2112, and 2114. In an example embodiment,
the attribute data may be accessed for a particular user by using a
user identifier. The attribute data may include profile data 2120,
device data 2122, calendar data 2124, list data 2126, list type
data 2128, interest data 2130, fitment data 2132, garment type data
2134, preference data 2136, measured dimension data 2138, fitness
goal data 2140, reward data 2142, location data 2144, and other
data not shown. In some example embodiments, the attribute data may
be structured such that various portions of the attribute data are
associated with other portions of the attribute data via
relationships. For instance, the calendar data 2124 may include a
calendar event associated with an event name, an event date, and an
event location for the calendar event.
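One way to model the per-user attribute data of FIG. 21, keyed by a user identifier and with related portions linked via object references, is sketched below. This is purely illustrative: the class names, field types, and sample values are assumptions, and only a subset of the data portions listed above is shown.

```python
# Illustrative sketch of FIG. 21's attribute data: accessed for a
# particular user by user identifier, with portions of the data (here,
# calendar data) related to other portions via references.
from dataclasses import dataclass, field

@dataclass
class CalendarEvent:
    event_name: str
    event_date: str
    event_location: str

@dataclass
class AttributeData:
    user_id: str
    profile_data: dict = field(default_factory=dict)
    device_data: dict = field(default_factory=dict)
    calendar_data: list = field(default_factory=list)   # CalendarEvent items
    location_data: dict = field(default_factory=dict)

# Attribute data for a plurality of users, keyed by user identifier.
store = {}
store["user-2102"] = AttributeData(
    user_id="user-2102",
    calendar_data=[CalendarEvent("Birthday", "2015-06-01", "San Jose")],
)
event = store["user-2102"].calendar_data[0]
print(event.event_name, event.event_location)  # Birthday San Jose
```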
[0147] FIG. 22 is a block diagram 2200 of an example data structure
for data associated with a device according to some example
embodiments. In an example embodiment, the device data 2122 of FIG.
21 may include a device identifier, a device name, device resources
data (e.g., files stored on the devices such as browser cookies,
media files), I/O component data, and so forth. In example
embodiments, the device identifier may, for example, comprise an
Internet Protocol (IP) address, a Media Access Control (MAC)
address, other unique identifiers, an International Mobile Station
Equipment Identity (IMEI), or a Mobile Equipment Identifier (MEID).
In an example embodiment, the I/O component data may include
standard device parameters 2202, position data 2204, location data
2206, motion data 2208, environmental data 2210, biometric data
2212, and other data. FIG. 22 merely depicts example attribute data
that may correspond to a particular device, and a variety of other
data not shown may be included in the device data. The standard
device parameters 2202 may include parameters that are standard
across multiple devices included in the IoT. In various example
embodiments, standardized parameters and protocols may facilitate
access and utilization of the attribute data corresponding to such
devices. For example, the attribute data available on an unknown
device may be accessed and utilized without the need to discover or
otherwise determine which parameters are available and which units
of measure are associated with the parameters. Many other schemes
may be employed to discover or otherwise determine available
parameters accessible on a particular device.
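The benefit of the standard device parameters 2202 can be illustrated with a small schema that maps standardized parameter names to known units of measure, so attribute data from an unknown device can be used without a discovery step. The schema, parameter names, and function below are hypothetical; the specification does not prescribe any particular standard.

```python
# Illustrative sketch (hypothetical schema): standardized parameter
# names and units allow attribute data from an unknown device to be
# accessed and utilized without discovering which parameters are
# available or which units of measure they use.
STANDARD_PARAMETERS = {
    "ambient_temperature": "celsius",
    "heart_rate": "beats_per_minute",
    "acceleration_x": "meters_per_second_squared",
}

def read_standard(device_report):
    """Keep only parameters that follow the standard schema, tagging
    each value with its known unit of measure."""
    return {
        name: (value, STANDARD_PARAMETERS[name])
        for name, value in device_report.items()
        if name in STANDARD_PARAMETERS
    }

report = {"ambient_temperature": 21.5, "proprietary_field": 7}
print(read_standard(report))  # {'ambient_temperature': (21.5, 'celsius')}
```

Non-standard fields like the hypothetical `proprietary_field` would instead require one of the "many other schemes" the paragraph mentions for discovering a device's available parameters.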
Modules, Components, and Logic
[0148] Certain embodiments are described herein as including logic
or a number of components, modules, or mechanisms. Modules may
constitute either software modules (e.g., code embodied on a
machine-readable medium or in a transmission signal) or hardware
modules. A "hardware module" is a tangible unit capable of
performing certain operations and may be configured or arranged in
a certain physical manner. In various example embodiments, one or
more computer systems (e.g., a standalone computer system, a client
computer system, or a server computer system) or one or more
hardware modules of a computer system (e.g., a processor or a group
of processors) may be configured by software (e.g., an application
or application portion) as a hardware module that operates to
perform certain operations as described herein.
[0149] In some embodiments, a hardware module may be implemented
mechanically, electronically, or any suitable combination thereof.
For example, a hardware module may include dedicated circuitry or
logic that is permanently configured to perform certain operations.
For example, a hardware module may be a special-purpose processor,
such as a Field-Programmable Gate Array (FPGA) or an Application
Specific Integrated Circuit (ASIC). A hardware module may also
include programmable logic or circuitry that is temporarily
configured by software to perform certain operations. For example,
a hardware module may include software encompassed within a
general-purpose processor or other programmable processor. It will
be appreciated that the decision to implement a hardware module
mechanically, in dedicated and permanently configured circuitry, or
in temporarily configured circuitry (e.g., configured by software)
may be driven by cost and time considerations.
[0150] Accordingly, the phrase "hardware module" should be
understood to encompass a tangible entity, be that an entity that
is physically constructed, permanently configured (e.g.,
hardwired), or temporarily configured (e.g., programmed) to operate
in a certain manner or to perform certain operations described
herein. As used herein, "hardware-implemented module" refers to a
hardware module. Considering embodiments in which hardware modules
are temporarily configured (e.g., programmed), each of the hardware
modules need not be configured or instantiated at any one instance
in time. For example, where a hardware module comprises a
general-purpose processor configured by software to become a
special-purpose processor, the general-purpose processor may be
configured as respectively different special-purpose processors
(e.g., comprising different hardware modules) at different times.
Software may accordingly configure a particular processor or
processors, for example, to constitute a particular hardware module
at one instance of time and to constitute a different hardware
module at a different instance of time.
[0151] Hardware modules can provide information to, and receive
information from, other hardware modules. Accordingly, the
described hardware modules may be regarded as being communicatively
coupled. Where multiple hardware modules exist contemporaneously,
communications may be achieved through signal transmission (e.g.,
over appropriate circuits and buses) between or among two or more
of the hardware modules. In embodiments in which multiple hardware
modules are configured or instantiated at different times,
communications between such hardware modules may be achieved, for
example, through the storage and retrieval of information in memory
structures to which the multiple hardware modules have access. For
example, one hardware module may perform an operation and store the
output of that operation in a memory device to which it is
communicatively coupled. A further hardware module may then, at a
later time, access the memory device to retrieve and process the
stored output. Hardware modules may also initiate communications
with input or output devices, and can operate on a resource (e.g.,
a collection of information).
[0152] The various operations of example methods described herein
may be performed, at least partially, by one or more processors
that are temporarily configured (e.g., by software) or permanently
configured to perform the relevant operations. Whether temporarily
or permanently configured, such processors may constitute
processor-implemented modules that operate to perform one or more
operations or functions described herein. As used herein,
"processor-implemented module" refers to a hardware module
implemented using one or more processors.
[0153] Similarly, the methods described herein may be at least
partially processor-implemented, with a particular processor or
processors being an example of hardware. For example, at least some
of the operations of a method may be performed by one or more
processors or processor-implemented modules. Moreover, the one or
more processors may also operate to support performance of the
relevant operations in a "cloud computing" environment or as a
"software as a service" (SaaS). For example, at least some of the
operations may be performed by a group of computers (as examples of
machines including processors), with these operations being
accessible via a network (e.g., the Internet) and via one or more
appropriate interfaces (e.g., an Application Program Interface
(API)).
[0154] The performance of certain of the operations may be
distributed among the processors, not only residing within a single
machine, but deployed across a number of machines. In some example
embodiments, the processors or processor-implemented modules may be
located in a single geographic location (e.g., within a home
environment, an office environment, or a server farm). In other
example embodiments, the processors or processor-implemented
modules may be distributed across a number of geographic
locations.
Applications
[0155] FIG. 23 illustrates an example mobile device 2300 executing
a mobile operating system (e.g., iOS.TM., Android.TM., Windows.RTM.
Phone, or other mobile operating systems), according to example
embodiments. In one embodiment, the mobile device 2300 includes a
touch screen that may receive tactile information from a user 2302.
For instance, the user 2302 physically touches 2304 the mobile device
2300, and in response to the touch 2304, the mobile device 2300
determines tactile information such as touch location, touch force,
gesture motion, and so forth. In various example embodiments, the
mobile device 2300 displays a home screen 2306 (e.g., Springboard
on iOS.TM.) operable to launch applications or otherwise manage the
mobile device 2300. In some example embodiments, the home screen
2306 provides status information such as battery life,
connectivity, or other hardware statuses. In some implementations,
the user 2302 activates user interface elements by touching an area
occupied by a respective user interface element. In this manner,
the user 2302 may interact with the applications. For example,
touching the area occupied by a particular icon included in the
home screen 2306 causes launching of an application corresponding
to the particular icon.
[0156] Many varieties of applications (also referred to as "apps")
may be executing on the mobile device 2300 such as native
applications (e.g., applications programmed in Objective-C running
on iOS.TM. or applications programmed in Java running on
Android.TM.), mobile web applications (e.g., HTML5), or hybrid
applications (e.g., a native shell application that launches an
HTML5 session). In a specific example, the mobile device 2300
includes a messaging app 2320, an audio recording app 2322, a camera
app 2324, a book reader app 2326, a media app 2328, a fitness app
2330, a file management app 2332, a location app 2334, a browser
app 2336, a settings app 2338, a contacts app 2340, a telephone
call app 2342, other apps (e.g., gaming apps, social networking
apps, biometric monitoring apps), a third party app 2344, and so
forth.
Software Architecture
[0157] FIG. 24 is a block diagram 2400 illustrating an architecture
of software 2402, which may be installed on any one or more of
devices described above. FIG. 24 is merely a non-limiting example
of a software architecture, and it will be appreciated that many
other architectures may be implemented to facilitate the
functionality described herein. The software 2402 may be executing
on hardware such as machine 2500 of FIG. 25 that includes
processors 2510, memory 2530, and I/O components 2550. In the
example architecture of FIG. 24, the software 2402 may be
conceptualized as a stack of layers where each layer may provide
particular functionality. For example, the software 2402 includes
layers such as an operating system 2404, libraries 2406, frameworks
2408, and applications 2410. Operationally, the applications 2410
invoke application programming interface (API) calls 2412 through
the software stack and receive messages 2414 in response to the API
calls 2412, according to some implementations.
[0158] In various implementations, the operating system 2404
manages hardware resources and provides common services. The
operating system 2404 includes, for example, a kernel 2420,
services 2422, and drivers 2424. The kernel 2420 acts as an
abstraction layer between the hardware and the other software
layers in some implementations. For example, the kernel 2420
provides memory management, processor management (e.g.,
scheduling), component management, networking, security settings,
among other functionality. The services 2422 may provide other
common services for the other software layers. The drivers 2424 may
be responsible for controlling or interfacing with the underlying
hardware. For instance, the drivers 2424 may include display
drivers, camera drivers, Bluetooth.RTM. drivers, flash memory
drivers, serial communication drivers (e.g., Universal Serial Bus
(USB) drivers), Wi-Fi.RTM. drivers, audio drivers, power management
drivers, and so forth.
[0159] The libraries 2406 may provide a low-level common
infrastructure that may be utilized by the applications 2410. The
libraries 2406 may include system 2430 libraries (e.g., C standard
library) that may provide functions such as memory allocation
functions, string manipulation functions, mathematic functions, and
the like. In addition, the libraries 2406 may include API libraries
2432 such as media libraries (e.g., libraries to support
presentation and manipulation of various media formats such as
MPEG4, H.264, MP3, AAC, AMR, JPG, PNG), graphics libraries (e.g.,
an OpenGL framework used to render 2D and 3D graphic content
on a display), database libraries (e.g., SQLite to provide various
relational database functions), web libraries (e.g., WebKit to
provide web browsing functionality), and the like. The libraries
2406 may also include a wide variety of other libraries 2434 to
provide many other APIs to the applications 2410.
[0160] The frameworks 2408 may provide a high-level common
infrastructure that may be utilized by the applications 2410. For
example, the frameworks 2408 may provide various graphic user
interface (GUI) functions, high-level resource management,
high-level location services, and so forth. The frameworks 2408 may
provide a broad spectrum of other APIs that may be utilized by the
applications 2410, some of which may be specific to a particular
operating system or platform.
[0161] The applications 2410 include a home application 2450, a
contacts application 2452, a browser application 2454, a book
reader application 2456, a location application 2458, a media
application 2460, a messaging application 2462, a game application
2464, and a broad assortment of other applications such as third
party application 2466. In a specific example, the third party
application 2466 (e.g., an application developed using the
Android.TM. or iOS.TM. software development kit (SDK) by an entity
other than the vendor of the particular platform) may be mobile
software running on a mobile operating system such as iOS.TM.,
Android.TM., Windows.RTM. Phone, or other mobile operating systems.
In this example, the third party application 2466 may invoke the
API calls 2412 provided by the mobile operating system 2404 to
facilitate functionality described herein.
Example Machine Architecture and Machine-Readable Medium
[0162] FIG. 25 is a block diagram illustrating components of a
machine 2500, according to some example embodiments, able to read
instructions from a machine-readable medium (e.g., a
machine-readable storage medium) and perform any one or more of the
methodologies discussed herein. Specifically, FIG. 25 shows a
diagrammatic representation of the machine 2500 in the example form
of a computer system, within which instructions 2516 (e.g.,
software, a program, an application, an applet, an app, or other
executable code) for causing the machine 2500 to perform any one or
more of the methodologies discussed herein may be executed. In
alternative embodiments, the machine 2500 may operate as a
standalone device or may be coupled (e.g., networked) to other
machines. In a
networked deployment, the machine 2500 may operate in the capacity
of a server machine or a client machine in a server-client network
environment, or as a peer machine in a peer-to-peer (or
distributed) network environment. The machine 2500 may comprise,
but not be limited to, a server computer, a client computer, a
personal computer (PC), a tablet computer, a laptop computer, a
netbook, a set-top box (STB), a personal digital assistant (PDA),
an entertainment media system, a cellular telephone, a smart phone,
a mobile device, a wearable device (e.g., a smart watch), a smart
home device (e.g., a smart appliance), other smart devices, a web
appliance, a network router, a network switch, a network bridge, or
any machine capable of executing the instructions 2516,
sequentially or otherwise, that specify actions to be taken by the
machine 2500. Further, while only a single machine 2500 is
illustrated, the term "machine" shall also be taken to include a
collection of machines 2500 that individually or jointly execute
the instructions 2516 to perform any one or more of the
methodologies discussed herein.
[0163] The machine 2500 may include processors 2510, memory 2530,
and I/O components 2550, which may be configured to communicate
with each other via a bus 2502. In an example embodiment, the
processors 2510 (e.g., a Central Processing Unit (CPU), a Reduced
Instruction Set Computing (RISC) processor, a Complex Instruction
Set Computing (CISC) processor, a Graphics Processing Unit (GPU), a
Digital Signal Processor (DSP), an Application Specific Integrated
Circuit (ASIC), a Radio-Frequency Integrated Circuit (RFIC), another
processor, or any suitable combination thereof) may include, for
example, processor 2512 and processor 2514 that may execute
instructions 2516. The term "processor" is intended to include
multi-core processors that may comprise two or more independent
processors (also referred to as "cores") that may execute
instructions contemporaneously. Although FIG. 25 shows multiple
processors, the machine 2500 may include a single processor with a
single core, a single processor with multiple cores (e.g., a
multi-core processor), multiple processors with a single core,
multiple processors with multiple cores, or any combination
thereof.
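The notion of multiple cores executing instructions contemporaneously can be sketched with Python's standard concurrent.futures module; this is a hypothetical illustration of scheduling work across workers, not the claimed apparatus:

```python
import os
from concurrent.futures import ThreadPoolExecutor

def square(n):
    # One unit of work; each call may run on a different worker.
    return n * n

# Size the pool to the number of logical processors reported by the OS.
workers = os.cpu_count() or 1
with ThreadPoolExecutor(max_workers=workers) as pool:
    results = list(pool.map(square, range(8)))

print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```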
[0164] The memory 2530 may include a main memory 2532, a static
memory 2534, and a storage unit 2536 accessible to the processors
2510 via the bus 2502. The storage unit 2536 may include a
machine-readable medium 2538 on which is stored the instructions
2516 embodying any one or more of the methodologies or functions
described herein. The instructions 2516 may also reside, completely
or at least partially, within the main memory 2532, within the
static memory 2534, within at least one of the processors 2510
(e.g., within the processor's cache memory), or any suitable
combination thereof, during execution thereof by the machine 2500.
Accordingly, in various implementations, the main memory 2532,
static memory 2534, and the processors 2510 are considered as
machine-readable media 2538.
[0165] As used herein, the term "memory" refers to a
machine-readable medium 2538 able to store data temporarily or
permanently and may be taken to include, but not be limited to,
random-access memory (RAM), read-only memory (ROM), buffer memory,
flash memory, and cache memory. While the machine-readable medium
2538 is shown in an example embodiment to be a single medium, the
term "machine-readable medium" should be taken to include a single
medium or multiple media (e.g., a centralized or distributed
database, or associated caches and servers) able to store
instructions 2516. The term "machine-readable medium" shall also be
taken to include any medium, or combination of multiple media, that
is capable of storing instructions (e.g., instructions 2516) for
execution by a machine (e.g., machine 2500), such that the
instructions, when executed by one or more processors of the
machine 2500 (e.g., processors 2510), cause the machine 2500 to
perform any one or more of the methodologies described herein.
Accordingly, a "machine-readable medium" refers to a single storage
apparatus or device, as well as "cloud-based" storage systems or
storage networks that include multiple storage apparatus or
devices. The term "machine-readable medium" shall accordingly be
taken to include, but not be limited to, one or more data
repositories in the form of a solid-state memory (e.g., flash
memory), an optical medium, a magnetic medium, other non-volatile
memory (e.g., Erasable Programmable Read-Only Memory (EPROM)), or
any suitable combination thereof. The term "machine-readable
medium" specifically excludes non-statutory signals per se.
[0166] The I/O components 2550 include a wide variety of components
to receive input, provide output, transmit information, exchange
information, capture measurements, and so on.
It will be appreciated that the I/O components 2550 may include
many other components that are not shown in FIG. 25. The I/O
components 2550 are grouped according to functionality merely for
simplifying the following discussion and the grouping is in no way
limiting. In various example embodiments, the I/O components 2550
include output components 2552 and input components 2554. The
output components 2552 include visual components (e.g., a display
such as a plasma display panel (PDP), a light emitting diode (LED)
display, a liquid crystal display (LCD), a projector, or a cathode
ray tube (CRT)), acoustic components (e.g., speakers), haptic
components (e.g., a vibratory motor), other signal generators, and
so forth. The input components 2554 include alphanumeric input
components (e.g., a keyboard, a touch screen configured to receive
alphanumeric input, a photo-optical keyboard, or other alphanumeric
input components), point based input components (e.g., a mouse, a
touchpad, a trackball, a joystick, a motion sensor, or other
pointing instrument), tactile input components (e.g., a physical
button, a touch screen that provides location and force of touches
or touch gestures, or other tactile input components), audio input
components (e.g., a microphone), and the like.
[0167] In some further example embodiments, the I/O components 2550
include biometric components 2556, motion components 2558,
environmental components 2560, or position components 2562 among a
wide array of other components. For example, the biometric
components 2556 include components to detect expressions (e.g.,
hand expressions, facial expressions, vocal expressions, body
gestures, or eye tracking), measure biosignals (e.g., blood
pressure, heart rate, body temperature, perspiration, or brain
waves), identify a person (e.g., voice identification, retinal
identification, facial identification, fingerprint identification,
or electroencephalogram based identification), and the like. The
motion components 2558 include acceleration sensor components
(e.g., accelerometer), gravitation sensor components, rotation
sensor components (e.g., gyroscope), and so forth. The
environmental components 2560 include, for example, illumination
sensor components (e.g., photometer), temperature sensor components
(e.g., one or more thermometers that detect ambient temperature),
humidity sensor components, pressure sensor components (e.g.,
barometer), acoustic sensor components (e.g., one or more
microphones that detect background noise), proximity sensor
components (e.g., infrared sensors that detect nearby objects), gas
sensors (e.g., machine olfaction detection sensors, gas detection
sensors to detect concentrations of hazardous gases for safety
or to measure pollutants in the atmosphere), or other components
that may provide indications, measurements, or signals
corresponding to a surrounding physical environment. The position
components 2562 include location sensor components (e.g., a Global
Positioning System (GPS) receiver component), altitude sensor
components (e.g., altimeters or barometers that detect air pressure
from which altitude may be derived), orientation sensor components
(e.g., magnetometers), and the like.
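The remark above that altitude may be derived from air pressure can be illustrated with the standard barometric formula; the constants below assume the International Standard Atmosphere and are an illustrative sketch, not part of the disclosure:

```python
def altitude_from_pressure(p_hpa, p0_hpa=1013.25):
    """Approximate altitude in meters from air pressure in hPa,
    using the International Standard Atmosphere barometric formula."""
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))

# At standard sea-level pressure the derived altitude is zero.
print(round(altitude_from_pressure(1013.25)))  # 0
```

A barometric altitude sensor component applies this kind of conversion to its raw pressure reading before reporting a position signal.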
[0168] Communication may be implemented using a wide variety of
technologies. The I/O components 2550 may include communication
components 2564 operable to couple the machine 2500 to a network
2580 or devices 2570 via coupling 2582 and coupling 2572,
respectively. For example, the communication components 2564
include a network interface component or another suitable device to
interface with the network 2580. In further examples, communication
components 2564 include wired communication components, wireless
communication components, cellular communication components, Near
Field Communication (NFC) components, Bluetooth.RTM. components
(e.g., Bluetooth.RTM. Low Energy), Wi-Fi.RTM. components, and other
communication components to provide communication via other
modalities. The devices 2570 may be another machine or any of a
wide variety of peripheral devices (e.g., a peripheral device
coupled via a Universal Serial Bus (USB)).
[0169] Moreover, in some implementations, the communication
components 2564 detect identifiers or include components operable
to detect identifiers. For example, the communication components
2564 include Radio Frequency Identification (RFID) tag reader
components, NFC smart tag detection components, optical reader
components (e.g., an optical sensor to detect one-dimensional bar
codes such as Universal Product Code (UPC) bar code,
multi-dimensional bar codes such as Quick Response (QR) code, Aztec
code, Data Matrix, Dataglyph, MaxiCode, PDF417, Ultra Code, UCC
RSS-2D bar code, and other optical codes), acoustic detection
components (e.g., microphones to identify tagged audio signals), or
any suitable combination thereof. In addition, a variety of
information can be derived via the communication components 2564,
such as location via Internet Protocol (IP) geo-location, location
via Wi-Fi.RTM. signal triangulation, location via detecting an NFC
beacon signal that may indicate a particular location, and so
forth.
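As an illustration of one of the optical codes named above, a UPC-A bar code carries a trailing check digit that a reader component can verify; the following hypothetical helper (not from the disclosure) implements the standard modulo-10 check:

```python
def upc_a_is_valid(code):
    """Verify the check digit of a 12-digit UPC-A code string."""
    if len(code) != 12 or not code.isdigit():
        return False
    digits = [int(c) for c in code]
    # Odd positions (1st, 3rd, ...) carry weight 3; even positions weight 1.
    total = 3 * sum(digits[0:11:2]) + sum(digits[1:11:2])
    return (10 - total % 10) % 10 == digits[11]

print(upc_a_is_valid("036000291452"))  # True
```

An optical reader component decodes the bar widths into the digit string and then applies this check before accepting the scan.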
Transmission Medium
[0170] In various example embodiments, one or more portions of the
network 2580 may be an ad hoc network, an intranet, an extranet, a
virtual private network (VPN), a local area network (LAN), a
wireless LAN (WLAN), a wide area network (WAN), a wireless WAN
(WWAN), a metropolitan area network (MAN), the Internet, a portion
of the Internet, a portion of the Public Switched Telephone Network
(PSTN), a plain old telephone service (POTS) network, a cellular
telephone network, a wireless network, a Wi-Fi.RTM. network,
another type of network, or a combination of two or more such
networks. For example, the network 2580 or a portion of the network
2580 may include a wireless or cellular network and the coupling
2582 may be a Code Division Multiple Access (CDMA) connection, a
Global System for Mobile communications (GSM) connection, or other
type of cellular or wireless coupling. In this example, the
coupling 2582 may implement any of a variety of types of data
transfer technology, such as Single Carrier Radio Transmission
Technology (1xRTT), Evolution-Data Optimized (EVDO) technology,
General Packet Radio Service (GPRS) technology, Enhanced Data rates
for GSM Evolution (EDGE) technology, Third Generation Partnership
Project (3GPP) including 3G, fourth generation wireless (4G)
networks, Universal Mobile Telecommunications System (UMTS), High
Speed Packet Access (HSPA), Worldwide Interoperability for
Microwave Access (WiMAX), Long Term Evolution (LTE) standard,
others defined by various standard setting organizations, other
long range protocols, or other data transfer technology.
[0171] In example embodiments, the instructions 2516 are
transmitted or received over the network 2580 using a transmission
medium via a network interface device (e.g., a network interface
component included in the communication components 2564) and
utilizing any one of a number of well-known transfer protocols
(e.g., hypertext transfer protocol (HTTP)). Similarly, in other
example embodiments, the instructions 2516 are transmitted or
received using a transmission medium via the coupling 2572 (e.g., a
peer-to-peer coupling) to devices 2570. The term "transmission
medium" shall be taken to include any intangible medium that is
capable of storing, encoding, or carrying instructions 2516 for
execution by the machine 2500, and includes digital or analog
communications signals or other intangible medium to facilitate
communication of such software.
[0172] Furthermore, the machine-readable medium 2538 is
non-transitory (in other words, not having any transitory signals)
in that it does not embody a propagating signal. However, labeling
the machine-readable medium 2538 as "non-transitory" should not be
construed to mean that the medium is incapable of movement; the
medium should be considered as being transportable from one
physical location to another. Additionally, since the
machine-readable medium 2538 is tangible, the medium may be
considered to be a machine-readable device.
Language
[0173] Throughout this specification, plural instances may
implement components, operations, or structures described as a
single instance. Although individual operations of one or more
methods are illustrated and described as separate operations, one
or more of the individual operations may be performed concurrently,
and nothing requires that the operations be performed in the order
illustrated. Structures and functionality presented as separate
components in example configurations may be implemented as a
combined structure or component. Similarly, structures and
functionality presented as a single component may be implemented as
separate components. These and other variations, modifications,
additions, and improvements fall within the scope of the subject
matter herein.
[0174] Although an overview of the inventive subject matter has
been described with reference to specific example embodiments,
various modifications and changes may be made to these embodiments
without departing from the broader scope of embodiments of the
present disclosure. Such embodiments of the inventive subject
matter may be referred to herein, individually or collectively, by
the term "invention" merely for convenience and without intending
to voluntarily limit the scope of this application to any single
disclosure or inventive concept if more than one is, in fact,
disclosed.
[0175] The embodiments illustrated herein are described in
sufficient detail to enable those skilled in the art to practice
the teachings disclosed. Other embodiments may be used and derived
therefrom, such that structural and logical substitutions and
changes may be made without departing from the scope of this
disclosure. The Detailed Description, therefore, is not to be taken
in a limiting sense, and the scope of various embodiments is
defined only by the appended claims, along with the full range of
equivalents to which such claims are entitled.
[0176] As used herein, the term "or" may be construed in either an
inclusive or exclusive sense. Moreover, plural instances may be
provided for resources, operations, or structures described herein
as a single instance. Additionally, boundaries between various
resources, operations, modules, engines, and data stores are
somewhat arbitrary, and particular operations are illustrated in a
context of specific illustrative configurations. Other allocations
of functionality are envisioned and may fall within a scope of
various embodiments of the present disclosure. In general,
structures and functionality presented as separate resources in the
example configurations may be implemented as a combined structure
or resource. Similarly, structures and functionality presented as a
single resource may be implemented as separate resources. These and
other variations, modifications, additions, and improvements fall
within a scope of embodiments of the present disclosure as
represented by the appended claims. The specification and drawings
are, accordingly, to be regarded in an illustrative rather than a
restrictive sense.
* * * * *