U.S. patent application number 15/791,196 was published by the patent office on 2018-02-15 as publication number 2018/0042523 for a system for monitoring individuals as they age in place.
The applicant listed for this patent is Vision Service Plan. The invention is credited to Richard Chester Klosinski, Jr., Meghan Kathleen Murphy, Jay William Sales, Matthew David Steen, and Matthew Allen Workman.
Application Number: 20180042523 (Ser. No. 15/791,196)
Document ID: /
Family ID: 55436366
Publication Date: 2018-02-15

United States Patent Application 20180042523
Kind Code: A1
Sales; Jay William; et al.
February 15, 2018
SYSTEM FOR MONITORING INDIVIDUALS AS THEY AGE IN PLACE
Abstract
A computer-implemented method, and related system, for
monitoring the wellbeing of an individual by providing eyewear that
includes at least one sensor for monitoring the motion of the user.
In various embodiments, the system receives data generated by the
at least one sensor, uses the received data to determine the user's
movements, and compares the user's movements to previously
established movement patterns of the user.
If the system detects one or more inconsistencies between the
user's current movements as compared to the previously established
movement patterns of the user, the system may notify the user or a
third party of the detected one or more inconsistencies. The system
may similarly monitor a user's compliance with a medical regime and
notify the user or a third party of the user's compliance with the
regime.
Inventors: Sales; Jay William (Citrus Heights, CA); Klosinski, Jr.; Richard Chester (Sacramento, CA); Workman; Matthew Allen (Sacramento, CA); Murphy; Meghan Kathleen (Davis, CA); Steen; Matthew David (Sacramento, CA)

Applicant: Vision Service Plan, Rancho Cordova, CA, US

Family ID: 55436366
Appl. No.: 15/791,196
Filed: October 23, 2017
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number
14/562,454 | Dec 5, 2014 | 9,795,324
62/046,406 | Sep 5, 2014 |

The present application, Ser. No. 15/791,196, is a continuation of application Ser. No. 14/562,454.
Current U.S. Class: 1/1

Current CPC Class:
G02C 11/10 20130101;
A61B 2562/0257 20130101; A61B 5/0476 20130101; A61B 5/1176
20130101; G08B 21/0461 20130101; G09B 5/00 20130101; A61B 3/112
20130101; A61B 5/117 20130101; A61F 2/76 20130101; H04L 63/0861
20130101; A61B 5/165 20130101; A61B 5/0402 20130101; A61B 5/7278
20130101; A61B 2560/0475 20130101; A61B 5/443 20130101; A61B 5/486
20130101; A61B 5/7282 20130101; G06K 9/00597 20130101; G16H 20/40
20180101; A61B 5/14552 20130101; A61B 5/6803 20130101; A61B 7/04
20130101; A61B 5/1128 20130101; G16H 40/63 20180101; A61B 5/024
20130101; G08B 21/0423 20130101; G08B 21/0476 20130101; A61B 5/0205
20130101; A61B 2562/0219 20130101; A61B 2576/00 20130101; G06K
9/00664 20130101; G06K 9/6201 20130101; G09B 5/06 20130101; G16H
50/20 20180101; A61B 5/0531 20130101; A61B 5/4884 20130101; G06K
9/00604 20130101; A61B 5/0022 20130101; A61B 5/1114 20130101; A61B
5/1103 20130101; G09B 19/0092 20130101; A61B 2562/0223 20130101;
G06K 9/00617 20130101; A61B 5/4076 20130101; G06F 21/35 20130101;
A61B 5/0002 20130101; A61B 5/0077 20130101; A61B 5/0816 20130101;
A61B 5/4266 20130101; G16H 40/67 20180101; G07C 9/37 20200101; A61F
2002/7695 20130101; A61B 5/1118 20130101; A61B 5/112 20130101; A63B
24/0062 20130101; A61B 5/1116 20130101; G06K 9/00348 20130101; G08B
21/02 20130101; A61B 5/1032 20130101; A61B 5/7246 20130101

International Class: A61B 5/11 20060101 A61B005/11
Claims
1. A computer-implemented method of monitoring the wellbeing of an
individual, the method comprising the steps of: a. providing a user
with computerized eyewear comprising at least one sensor for
monitoring the motion of the user; b. receiving, by one or more
processors, an indication from the user, for the at least one
sensor to generate a first set of data identifying one or more
movement patterns for the user; c. in response to receiving the
indication, collecting, by one or more processors, via the at least
one sensor, the first set of data; d. generating, by one or more
processors, an established one or more movement patterns for the
user based on the first set of data generated by the at least one
sensor; e. receiving, by one or more processors, a second set of
data generated by the at least one sensor after the established one
or more movement patterns have been generated; f. at least
partially in response to receiving the second set of data generated
by the at least one sensor, determining, by one or more processors,
the user's movements using the received second set of data; g.
detecting, by one or more processors, one or more inconsistencies
between the user's movements based on the second set of data as
compared to the previously established one or more movement
patterns for the user based on the first set of data; h. at least
partially in response to detecting the one or more inconsistencies,
notifying, by one or more processors, at least one recipient of the
one or more inconsistencies, where the at least one recipient is a
recipient selected from a group consisting of: the user and a third
party.
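The pattern-establishment and inconsistency-detection logic recited in steps (d) and (f)-(h) of claim 1 might be sketched as follows. This is illustrative only: the claim does not fix a representation for a "movement pattern," so the sketch assumes a baseline of mean hourly activity counts and a hypothetical deviation tolerance.

```python
# Illustrative sketch of claim 1, steps (d)-(h). The hourly-count
# representation and 50% tolerance are assumptions, not claim terms.
from statistics import mean

def establish_pattern(first_set):
    """first_set: {hour: [activity counts over several days]} -> baseline."""
    return {hour: mean(counts) for hour, counts in first_set.items()}

def detect_inconsistencies(pattern, second_set, tolerance=0.5):
    """Flag hours whose observed activity deviates from the established
    baseline by more than `tolerance` (as a fraction of the baseline)."""
    flagged = []
    for hour, observed in second_set.items():
        baseline = pattern.get(hour)
        if baseline and abs(observed - baseline) / baseline > tolerance:
            flagged.append((hour, baseline, observed))
    return flagged

def notify(recipients, inconsistencies):
    # Placeholder for step (h): message the user and/or a third party.
    for recipient in recipients:
        print(f"notify {recipient}: {len(inconsistencies)} inconsistencies")
```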
2. The computer-implemented method of claim 1, wherein the at least
one sensor comprises at least one sensor selected from a group
consisting of: a. a motion sensor; b. an accelerometer; c. a
gyroscope; d. a geomagnetic sensor; e. a global positioning system
sensor; f. an impact sensor; g. a microphone; h. a forward facing
camera; i. a heart rate monitor; j. a pulse oximeter; k. a blood
alcohol monitor; l. a respiratory rate sensor; and m. a transdermal
sensor.
3. The computer-implemented method of claim 2, wherein the at least
one sensor comprises at least one sensor selected from a group
consisting of: a motion sensor, an accelerometer, a global
positioning sensor, a gyroscope, and a forward facing camera.
4. The computer-implemented method of claim 2, wherein the method
further comprises the steps of: a. calculating, by a processor, a
number of steps taken by the user in a particular day; b. at least
partially in response to calculating the number of steps,
comparing, by a processor, the calculated number of steps taken by
the user in the particular day to a predetermined average number of
steps taken by the user in a day; and c. at least partially in
response to comparing the calculated number of steps to the
predetermined average number of steps, notifying the user or a
third party if the calculated number of steps in the particular day
is less than a predetermined percentage of the predetermined
average number of steps taken by the user in a day.
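The threshold test of claim 4, step (c), could be reduced to a one-line comparison; the 50% default below is an illustrative assumption, since the claim leaves the "predetermined percentage" open.

```python
# Illustrative sketch of claim 4: notify when a day's step count falls
# below a predetermined percentage of the user's average. The 50%
# default is an assumed value, not recited in the claim.
def check_daily_steps(steps_today, average_steps, threshold_pct=50):
    """Return True if a notification should be sent."""
    return steps_today < average_steps * threshold_pct / 100
```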
5. The computer-implemented method of claim 2, further comprising
the steps of: a. detecting, by a processor, whether the user moves
during a predefined time period; and b. at least partially in
response to detecting whether the user moves during the predefined
time period, notifying, by a processor, the at least one recipient
selected from a group consisting of: the user or a third party if
the user does not move during the predefined time period.
6. The computer-implemented method of claim 2, further comprising
the steps of: a. detecting, by a processor, from the received data
generated by the at least one sensor if the user experiences a
sudden acceleration or sudden impact; and b. at least partially in
response to detecting that the user has experienced a sudden
acceleration or sudden impact, notifying, by a processor, the user
or a third party that the user experienced the sudden acceleration
or sudden impact.
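A sudden acceleration or impact, as in claim 6, is commonly detected by thresholding accelerometer magnitude; the 3 g threshold below is an illustrative assumption, as the claim specifies no numeric value.

```python
# Illustrative sketch of claim 6: flag a sudden acceleration or impact
# when the accelerometer magnitude exceeds a threshold (assumed 3 g).
import math

G = 9.81  # standard gravity, m/s^2

def detect_impact(sample, threshold_g=3.0):
    """sample: (ax, ay, az) in m/s^2; True if magnitude exceeds threshold."""
    magnitude = math.sqrt(sum(a * a for a in sample))
    return magnitude > threshold_g * G
```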
7. The computer-implemented method of claim 2, further comprising
the steps of: a. detecting, by a processor, from the received data
generated by the at least one sensor: (1) whether the user is
breathing; and (2) whether the user's heart is beating; and b. at
least partially in response to determining that the user is not
breathing or that the user's heart is not beating, sending a
notification to a third party.
8. The computer-implemented method of claim 2, further comprising
the steps of: a. receiving, by a processor, from the user or third
party, a medicine regime associated with the user; b. storing, by a
processor, the medicine regime in memory; c. receiving, by a
processor, data generated by a forward facing camera associated
with the computerized eyewear; d. analyzing, by a processor, the
received data to determine data selected from a group consisting of
one or more: i. types of medicine taken by the user; ii. times the
medicine is taken by the user; and iii. doses of the medicine taken
by the user; e. at least partially in response to analyzing the
received data, comparing, by a processor, the one or more of the
types of medicine taken, the one or more times the medicine is
taken, or the one or more doses of medicine taken to the stored
medicine regime for the user; f. at least partially in response to
comparing the one or more of the type of medicine taken, the time
the medicine is taken and the dose of medicine taken, identifying,
by a processor, one or more inconsistencies between the stored
medicine regime, and the one or more types of medicine taken, the
one or more times the medicine is taken, or the one or more doses
of medicine taken; g. at least partially in response to identifying
the one or more inconsistencies between the medicine regime and the
one or more of the types of medicine taken, the one or more times
the medicine is taken, or the one or more doses of medicine taken,
sending an alert to the user or a third party of the one or more
inconsistencies.
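The comparison and identification in steps (e)-(f) of claim 8 might be sketched as below. The record fields and the one-hour timing window are hypothetical; the claim does not prescribe a data format or tolerance.

```python
# Illustrative sketch of claim 8, steps (e)-(g): compare observed
# medicine-taking events against the stored regime. Field names
# ('medicine', 'hour', 'dose') and the 1-hour window are assumptions.
def find_regime_inconsistencies(regime, observed):
    """regime/observed: lists of dicts with 'medicine', 'hour', 'dose'."""
    inconsistencies = []
    for expected in regime:
        match = next((o for o in observed
                      if o["medicine"] == expected["medicine"]), None)
        if match is None:
            inconsistencies.append(("missed", expected["medicine"]))
        elif match["dose"] != expected["dose"]:
            inconsistencies.append(("wrong dose", expected["medicine"]))
        elif abs(match["hour"] - expected["hour"]) > 1:
            inconsistencies.append(("wrong time", expected["medicine"]))
    return inconsistencies
```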
9. The computer-implemented method of claim 8, wherein: a. the data
generated comprises one or more images captured by the forward
facing camera; b. the step of analyzing the received data further
comprises: i. detecting, by a processor, one or more pills in the
one or more images; ii. comparing, by a processor, the one or more
detected pills found in the one or more images to one or more known
images of pills stored in a database; iii. identifying, by a
processor, the one or more pills by matching the one or more pills
from the one or more images to the one or more known images of
pills stored in the database; and iv. detecting, by a processor, a
time that the one or more images were taken.
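The matching step of claim 9(b) could be sketched as a lookup against a database of known pill appearances. A real embodiment would match image features from the forward facing camera; the (color, shape) descriptor here is a stand-in assumption.

```python
# Illustrative sketch of claim 9(b)(ii)-(iii): identify a detected pill
# by matching it against known pill descriptors in a database. Reducing
# a pill image to a (color, shape) tuple is an assumption for brevity.
def identify_pill(detected, known_pills):
    """detected: (color, shape); known_pills: {name: (color, shape)}.
    Return the name of the first matching known pill, else None."""
    for name, descriptor in known_pills.items():
        if descriptor == detected:
            return name
    return None
```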
10. The computer-implemented method of claim 1, wherein the
indication is defined by the user.
11-19. (canceled)
20. A computer-implemented method of monitoring the wellbeing of an
individual, the method comprising the steps of: a. providing a user
with computerized eyewear comprising at least one sensor for
monitoring the motion of the user; b. receiving a command for the
at least one sensor to generate a first set of data identifying one
or more user-defined movement patterns for the user; c. in response
to receiving the command, collecting, by one or more processors,
via the at least one sensor, the first set of data; d. generating,
by one or more processors, an established one or more movement
patterns for the user based on the first set of data generated by
the at least one sensor; e. receiving, by one or more processors, a
second set of data generated by the at least one sensor after the
established one or more movement patterns have been generated; f.
at least partially in response to receiving the second set of data
generated by the at least one sensor, determining, by one or more
processors, the user's movements using the received second set of
data; g. detecting, by one or more processors, one or more
inconsistencies between the user's movements based on the second
set of data as compared to the previously established one or more
movement patterns for the user based on the first set of data; h.
at least partially in response to detecting the one or more
inconsistencies, notifying, by one or more processors, at least one
recipient of the one or more inconsistencies, where the at least
one recipient is a recipient selected from a group consisting of:
the user and a third party.
21. The computer-implemented method of claim 20, wherein the at
least one sensor comprises at least one sensor selected from a
group consisting of: a. a motion sensor; b. an accelerometer; c. a
gyroscope; d. a geomagnetic sensor; e. a global positioning system
sensor; f. an impact sensor; g. a microphone; h. a forward facing
camera; i. a heart rate monitor; j. a pulse oximeter; k. a blood
alcohol monitor; l. a respiratory rate sensor; and m. a transdermal
sensor.
22. The computer-implemented method of claim 21, wherein the at
least one sensor comprises at least one sensor selected from a
group consisting of: a motion sensor, an accelerometer, a global
positioning sensor, a gyroscope, and a forward facing camera.
23. The computer-implemented method of claim 21, wherein the method
further comprises the steps of: a. calculating, by a processor, a
number of steps taken by the user in a particular day; b. at least
partially in response to calculating the number of steps,
comparing, by a processor, the calculated number of steps taken by
the user in the particular day to a predetermined average number of
steps taken by the user in a day; and c. at least partially in
response to comparing the calculated number of steps to the
predetermined average number of steps, notifying the user or a
third party if the calculated number of steps in the particular day
is less than a predetermined percentage of the predetermined
average number of steps taken by the user in a day.
24. The computer-implemented method of claim 21, further comprising
the steps of: a. detecting, by a processor, whether the user moves
during a predefined time period; and b. at least partially in
response to detecting whether the user moves during the predefined
time period, notifying, by a processor, the at least one recipient
selected from a group consisting of: the user or a third party if
the user does not move during the predefined time period.
25. The computer-implemented method of claim 21, further comprising
the steps of: a. detecting, by a processor, from the received data
generated by the at least one sensor if the user experiences a
sudden acceleration or sudden impact; and b. at least partially in
response to detecting that the user has experienced a sudden
acceleration or sudden impact, notifying, by a processor, the user
or a third party that the user experienced the sudden acceleration
or sudden impact.
26. The computer-implemented method of claim 21, further comprising
the steps of: a. detecting, by a processor, from the received data
generated by the at least one sensor: (1) whether the user is
breathing; and (2) whether the user's heart is beating; and b. at
least partially in response to determining that the user is not
breathing or that the user's heart is not beating, sending a
notification to a third party.
27. The computer-implemented method of claim 21, further comprising
the steps of: a. receiving, by a processor, from the user or third
party, a medicine regime associated with the user; b. storing, by a
processor, the medicine regime in memory; c. receiving, by a
processor, data generated by a forward facing camera associated
with the computerized eyewear; d. analyzing, by a processor, the
received data to determine data selected from a group consisting of
one or more: i. types of medicine taken by the user; ii. times the
medicine is taken by the user; and iii. doses of the medicine taken
by the user; e. at least partially in response to analyzing the
received data, comparing, by a processor, the one or more of the
types of medicine taken, the one or more times the medicine is
taken, or the one or more doses of medicine taken to the stored
medicine regime for the user; f. at least partially in response to
comparing the one or more of the type of medicine taken, the time
the medicine is taken and the dose of medicine taken, identifying,
by a processor, one or more inconsistencies between the stored
medicine regime, and the one or more types of medicine taken, the
one or more times the medicine is taken, or the one or more doses
of medicine taken; g. at least partially in response to identifying
the one or more inconsistencies between the medicine regime and the
one or more of the types of medicine taken, the one or more times
the medicine is taken, or the one or more doses of medicine taken,
sending an alert to the user or a third party of the one or more
inconsistencies.
28. The computer-implemented method of claim 27, wherein: a. the
data generated comprises one or more images captured by the forward
facing camera; b. the step of analyzing the received data further
comprises: i. detecting, by a processor, one or more pills in the
one or more images; ii. comparing, by a processor, the one or more
detected pills found in the one or more images to one or more known
images of pills stored in a database; iii. identifying, by a
processor, the one or more pills by matching the one or more pills
from the one or more images to the one or more known images of
pills stored in the database; and iv. detecting, by a processor, a
time that the one or more images were taken.
29. The computer-implemented method of claim 20, wherein the
command is defined by the user.
Description
CROSS-REFERENCE TO RELATED APPLICATION
[0001] This application is a continuation of U.S. patent
application Ser. No. 14/562,454, filed Dec. 5, 2014, entitled
"System for Monitoring Individuals as They Age in Place," which
claims the benefit of U.S. Provisional Patent Application No.
62/046,406, filed Sep. 5, 2014, entitled, "Wearable Health Computer
Apparatus, Systems, and Related Methods," the disclosures of which
are hereby incorporated herein by reference in their entirety.
BACKGROUND
[0002] Being able to monitor elderly individuals who live
independently at home has become increasingly important due, in
part, to the high cost of elder care facilities. Accordingly, there
is a need for improved systems and methods for monitoring the
activities and wellbeing of elderly individuals living at home.
There is a similar need for monitoring the activities and wellbeing
of individuals with special needs living outside of an
institutional setting. Various embodiments of the present systems
and methods recognize and address the foregoing considerations, and
others, of prior art systems and methods.
SUMMARY OF THE VARIOUS EMBODIMENTS
[0003] A computer-implemented method of monitoring the wellbeing of
an individual according to various embodiments comprises the steps
of: (1) providing a user with computerized eyewear comprising at
least one sensor for monitoring the motion of the user; (2)
receiving data generated by the at least one sensor; (3)
determining the user's movements using the received data; (4)
comparing the user's movements to previously established one or
more movement patterns for the user; (5) determining whether one or
more inconsistencies exist between the current user's movements and
the previously-established one or more movement patterns; and (6)
at least partially in response to determining that such one or more
inconsistencies exist, notifying a recipient selected from a group
consisting of: the user and/or a third party of the detected one or
more inconsistencies.
[0004] A computer-implemented method of monitoring the wellbeing of
an individual according to further embodiments comprises the steps
of: (1) providing a user with a computerized wearable device
comprising at least one sensor for monitoring actions taken by a
user; (2) receiving a medicine regime associated with the user; (3)
receiving data generated by the at least of the wearable device's
sensors; (4) analyzing the received data generated by the at least
one sensor to determine: (a) the type of medicine taken by the
wearer; (b) the time the medicine is taken by the wearer; and/or
(c) the dose of medicine taken by the wearer; (5) comparing the
medicine regime for the user to the determined one or more of the
type of medicine taken by the wearer, the time the medicine is
taken by the wearer, and/or dose of medicine taken by the wearer;
(6) detecting one or more inconsistencies between the medicine
regime associated with the user and the determined one or more of
the type of medicine taken by the user, the time the medicine is
taken by the user, and/or the dose of medicine taken by the user; and
(7) notifying the user and/or third party of the detected
inconsistencies.
BRIEF DESCRIPTION OF THE DRAWINGS
[0005] Various embodiments of systems and methods for assessing a
user's activities and movements are described below. In the course
of this description, reference will be made to the accompanying
drawings, which are not necessarily drawn to scale and wherein:
[0006] FIG. 1 is a block diagram of a Behavior Pattern Analysis
System in accordance with an embodiment of the present system;
[0007] FIG. 2 is a block diagram of the Pattern Analysis Server of
FIG. 1;
[0008] FIG. 3 is a flowchart that generally illustrates various
steps executed by a Behavior Pattern Analysis Module according to a
particular embodiment; and
[0009] FIG. 4 is a perspective view of computerized eyewear
according to a particular embodiment.
DETAILED DESCRIPTION OF SOME EMBODIMENTS
[0010] Various embodiments will now be described more fully
hereinafter with reference to the accompanying drawings. It should
be understood that the invention may be embodied in many different
forms and should not be construed as limited to the embodiments set
forth herein. Rather, these embodiments are provided so that this
disclosure will be thorough and complete, and will fully convey the
scope of the invention to those skilled in the art. Like numbers
refer to like elements throughout.
Overview
[0011] A wearable health monitoring system, according to various
embodiments, may include a suitable wearable device that is
configured to monitor one or more movements, activities, and/or
health attributes of a wearer (e.g., user). Suitable wearable
devices may include, for example: (1) a pair of eyewear (e.g.,
goggles or eyeglasses); (2) one or more contact lenses; (3) a
wristwatch; (4) an article of clothing (e.g., such as a suitable
shirt, pair of pants, undergarment, compression sleeve, etc.); (5)
footwear; (6) a hat; (7) a helmet; (8) an adhesive strip or other
tag that may be selectively attached to an individual or the
individual's clothing; (9) a computing device that is embedded into
a portion of an individual's body (e.g., under the individual's
skin, or within a medical device, such as a pacemaker); (10) an
orthopedic cast; or (11) any other suitable wearable item. In a
particular example, a wearable health monitoring system embodied as
a pair of eyewear may enable the system to monitor what an
individual is sensing (e.g., touching, seeing, hearing, smelling,
and/or tasting) based at least in part on a proximity of the
eyewear to the wearer's sensory systems (e.g., skin, eyes, mouth,
ears, nose) when worn by the wearer.
[0012] In various embodiments, the system comprises one or more
sensors that are configured to determine one or more current
physical attributes of the wearer (e.g., heart rate, brain wave
activity, movement, body temperature, blood pressure, oxygen
saturation level, etc.). The one or more sensors may
include, for example: (1) one or more heart rate monitors; (2) one
or more electrocardiograms (EKG); (3) one or more
electroencephalograms (EEG); (4) one or more pedometers; (5) one or
more thermometers; (6) one or more transdermal transmitter sensors;
(7) one or more front-facing cameras; (8) one or more eye-facing
cameras; (9) one or more microphones; (10) one or more
accelerometers; (11) one or more blood pressure sensors; (12) one
or more pulse oximeters; (13) one or more respiratory rate sensors;
(14) one or more blood alcohol concentration (BAC) sensors; (15)
one or more near-field communication sensors; (16) one or more
motion sensors; (17) one or more gyroscopes; (18) one or more
geomagnetic sensors; (19) one or more global positioning system
sensors; (20) one or more impact sensors; and/or (21) any other
suitable one or more sensors.
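Because the enumerated sensors produce heterogeneous data, a monitoring system of this kind typically normalizes readings into a common record before analysis. The representation below is purely illustrative and is not described in the specification.

```python
# Illustrative (not from the specification) record for readings from
# the heterogeneous sensors enumerated in paragraph [0012].
from dataclasses import dataclass

@dataclass
class SensorReading:
    sensor: str       # e.g. "heart_rate", "accelerometer", "pulse_oximeter"
    timestamp: float  # seconds since epoch
    value: tuple      # sensor-specific payload, e.g. (72,) for beats/min

reading = SensorReading("heart_rate", 1700000000.0, (72,))
```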
[0013] In particular embodiments, the system is configured to
gather data, for example, using the one or more sensors, about the
wearer (e.g., such as the wearer's body temperature, balance, heart
rate, level of physical activity, diet (e.g., food recently eaten),
compliance with a prescribed medical regimen (e.g., medications
recently taken), position, movements (e.g., body movements, facial
muscle movements), location, distance traveled, etc.). In various
embodiments, the system is configured to, for example: (1) store
the gathered data associated with the user; (2) provide the data to
one or more medical professionals, for example, to aid in the
diagnosis and/or treatment of the user; (3) use the data to predict
one or more medical issues with the user (e.g., the illness or
death of the user); and/or (4) take any other suitable action based
at least in part on the gathered data.
[0014] In a particular implementation, the system's wearable device
is a pair of computerized eyewear that comprises one or more
sensors for monitoring one or more day-to-day activities of an
elderly individual as they "age in place" (e.g., they live in a
non-institutional setting). In particular embodiments, the one or
more sensors are coupled to (e.g., connected to, embedded in, etc.)
the pair of glasses, which may be, for example, a pair of
computerized or non-computerized eyeglasses. In particular
embodiments, the individual is a senior citizen who lives at least
substantially independently.
[0015] In particular embodiments, the wearable computing device
comprises one or more location sensors (e.g., geomagnetic sensors,
etc.), motion sensors (e.g., accelerometers, gyroscopes, magnetic
sensors, pressure sensors, etc.), and/or impact sensors that are
adapted to sense the movement and location of the individual. In
various embodiments, the wearable device is adapted to facilitate
the transmission of this movement information to a remote computing
device (e.g., a handheld computing device, an automated dialing
device, a central server, or any other suitable smart device that
may, in various embodiments, contain a wireless communications
device that can connect to the wearable computing device) that
analyzes the information to determine whether the individual's
movement patterns are consistent with the individual's typical
(e.g., past) movement patterns. If the movement patterns are
inconsistent with the individual's typical movement patterns, the
system may, for example, generate and transmit an alert to a third
party (e.g., a physician, relative of the individual, other
caretaker, police, etc.) informing the third party of the
irregularities in the individual's movement. The third party may
then, for example, check on the individual to make sure that the
individual does not require assistance.
[0016] In further embodiments, the wearable device may be adapted,
for example, to monitor: (1) an individual's compliance with a
prescribed treatment plan (e.g., compliance with a medication
schedule); (2) an individual's compliance with a diet; and/or (3)
whether an individual leaves a prescribed area defined by a
geo-fence (e.g., a virtual fence). The system may do this, for
example, by using any suitable sensors (e.g., location sensors,
cameras, etc.) associated with the wearable device.
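The geo-fence check described in paragraph [0016] amounts to a distance test against a virtual boundary. The sketch below assumes a circular fence and uses the haversine great-circle distance; the specification does not prescribe a fence shape or distance formula.

```python
# Illustrative sketch of the geo-fence monitoring in paragraph [0016]:
# a circular fence around a center point, tested with the haversine
# great-circle distance. The circular shape is an assumption.
import math

def outside_geofence(lat, lon, center, radius_m):
    """True if (lat, lon) lies farther than radius_m from center."""
    r_earth = 6371000.0  # mean Earth radius, meters
    phi1 = math.radians(lat)
    phi2 = math.radians(center[0])
    dphi = math.radians(center[0] - lat)
    dlmb = math.radians(center[1] - lon)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    distance = 2 * r_earth * math.asin(math.sqrt(a))
    return distance > radius_m
```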
Exemplary Technical Platforms
[0017] As will be appreciated by one skilled in the relevant field,
the present systems and methods may be, for example, embodied as a
computer system, a method, or a computer program product.
Accordingly, various embodiments may be entirely hardware or a
combination of hardware and software. Furthermore, particular
embodiments may take the form of a computer program product stored
on a computer-readable storage medium having computer-readable
instructions (e.g., software) embodied in the storage medium.
Various embodiments may also take the form of Internet-implemented
computer software. Any suitable computer-readable storage medium
may be utilized including, for example, hard disks, compact disks,
DVDs, optical storage devices, and/or magnetic storage devices.
[0018] Various embodiments are described below with reference to
block diagram and flowchart illustrations of methods, apparatuses
(e.g., systems), and computer program products. It should be
understood that each block of the block diagrams and flowchart
illustrations, and combinations of blocks in the block diagrams and
flowchart illustrations, respectively, can be implemented by a
computer executing computer program instructions. These computer
program instructions may be loaded onto a general purpose computer,
a special purpose computer, or other programmable data processing
apparatus that can direct a computer or other programmable data
processing apparatus to function in a particular manner such that
the instructions stored in the computer-readable memory produce an
article of manufacture that is configured for implementing the
functions specified in the flowchart block or blocks.
[0019] The computer instructions may execute entirely on the user's
computer, partly on the user's computer, as a stand-alone software
package, partly on a user's computer and partly on a remote
computer, or entirely on the remote computer or server. In the
latter scenario, the remote computer may be connected to the user's
computer through any type of network, including but not limited to:
(1) a local area network (LAN); (2) a wide area network (WAN); (3)
a cellular network; or (4) the connection may be made to an
external computer (for example, through the Internet using an
Internet Service Provider).
[0020] These computer program instructions may also be stored in a
computer-readable memory that can direct a computer or other
programmable data processing apparatus to function in a particular
manner such that the instructions stored in the computer-readable
memory produce an article of manufacture that is configured for
implementing the function specified in the flowchart block or
blocks. The computer program instructions may also be loaded onto a
computer or other programmable data processing apparatus to cause a
series of operational steps to be performed on the computer or
other programmable apparatus to produce a computer-implemented
process (e.g., method) such that the instructions that execute on
the computer or other programmable apparatus provide steps for
implementing the functions specified in the flowchart block or
blocks.
[0021] Example System Architecture
[0022] FIG. 1 is a block diagram of a Behavior Pattern Analysis
System 100 according to particular embodiments. As may be
understood from this figure, the Behavior Pattern Analysis System
100 includes One or More Networks 115, One or More Third Party
Servers 50, a Pattern Analysis Server 120 that includes a Behavior
Pattern Analysis Module 300, a Movement Information Database 140,
One or More Remote Computing Devices 154 (e.g., such as a smart
phone, a tablet computer, a wearable computing device, a laptop
computer, a desktop computer, a Bluetooth device, an automated
dialing apparatus, etc.), and One or More Wearable Health
Monitoring Devices 156, which may, for example, be embodied as one
or more of eyewear, headwear, clothing, a watch, a hat, a helmet, a
cast, an adhesive bandage, a piece of jewelry (e.g., a ring,
earring, necklace, bracelet, etc.), or any other suitable wearable
device. In particular embodiments, the one or more computer
networks 115 facilitate communication between the One or More Third
Party Servers 50, the Pattern Analysis Server 120, the Movement
Information Database 140, the One or More Remote Computing Devices
154, and the one or more Health Monitoring Devices 156.
[0023] The one or more networks 115 may include any of a variety of
types of wired or wireless computer networks such as the Internet,
a private intranet, a mesh network, a public switched telephone
network (PSTN), or any other type of network (e.g., a network that
uses Bluetooth or near field communications to facilitate
communication between computing devices). The communication link
between the One or More Remote Computing Devices 154 and the
Pattern Analysis Server 120 may be, for example, implemented via a
Local Area Network (LAN) or via the Internet.
[0024] FIG. 2 illustrates a diagrammatic representation of the
architecture for the Pattern Analysis Server 120 that may be used
within the Behavior Pattern Analysis System 100. It should be
understood that the computer architecture shown in FIG. 2 may also
represent the computer architecture for any one of the One or More
Remote Computing Devices 154, one or more Third Party Servers 50,
and One or More Health Monitoring Devices 156 shown in FIG. 1. In
particular embodiments, the Pattern Analysis Server 120 may be
suitable for use as a computer within the context of the Behavior
Pattern Analysis System 100 that is configured for monitoring the
behavior (e.g., movements, location, eating and sleeping habits) of
the wearer.
[0025] In particular embodiments, the Pattern Analysis Server 120
may be connected (e.g., networked) to other computing devices in a
LAN, an intranet, an extranet, and/or the Internet as shown in FIG.
1. As noted above, the Pattern Analysis Server 120 may operate in
the capacity of a server or a client computing device in a
client-server network environment, or as a peer computing device in
a peer-to-peer (or distributed) network environment. The Pattern
Analysis Server 120 may be a desktop personal computing device
(PC), a tablet PC, a set-top box (STB), a Personal Digital
Assistant (PDA), a cellular telephone, a web appliance, a network
router, a switch or bridge, or any other computing device capable
of executing a set of instructions (sequential or otherwise) that
specify actions to be taken by that computing device. Further,
while only a single computing device is illustrated, the term
"computing device" shall also be interpreted to include any
collection of computing devices that individually or jointly
execute a set (or multiple sets) of instructions to perform any one
or more of the methodologies discussed herein.
[0026] An exemplary Pattern Analysis Server 120 includes a
processing device 202, a main memory 204 (e.g., read-only memory
(ROM), flash memory, dynamic random access memory (DRAM) such as
synchronous DRAM (SDRAM) or Rambus DRAM (RDRAM), etc.), a static
memory 206 (e.g., flash memory, static random access memory (SRAM),
etc.), and a data storage device 218, which communicate with each
other via a bus 232.
[0027] The processing device 202 represents one or more
general-purpose or specific processing devices such as a
microprocessor, a central processing unit (CPU), or the like. More
particularly, the processing device 202 may be a complex
instruction set computing (CISC) microprocessor, reduced
instruction set computing (RISC) microprocessor, very long
instruction word (VLIW) microprocessor, or processor implementing
other instruction sets, or processors implementing a combination of
instruction sets. The processing device 202 may also be one or more
special-purpose processing devices such as an application specific
integrated circuit (ASIC), a field programmable gate array (FPGA),
a digital signal processor (DSP), network processor, or the like.
The processing device 202 may be configured to execute processing
logic 226 for performing various operations and steps discussed
herein.
[0028] The Pattern Analysis Server 120 may further include a
network interface device 208. The Pattern Analysis Server 120 may
also include a video display unit 210 (e.g., a liquid crystal
display (LCD) or a cathode ray tube (CRT)), an alpha-numeric input
device 212 (e.g., a keyboard), a cursor control device 214 (e.g., a
mouse), and a signal generation device 216 (e.g., a speaker).
[0029] The data storage device 218 may include a non-transitory
computing device-accessible storage medium 230 (also known as a
non-transitory computing device-readable storage medium or a
non-transitory computing device-readable medium) on which is stored
one or more sets of instructions (e.g., the Behavior Pattern
Analysis Module 300) embodying any one or more of the methodologies
or functions described herein. The one or more sets of instructions
may also reside, completely or at least partially, within the main
memory 204 and/or within the processing device 202 during execution
thereof by the Pattern Analysis Server 120--the main memory 204 and
the processing device 202 also constituting computing
device-accessible storage media. The one or more sets of
instructions may further be transmitted or received over a network
115 via a network interface device 208.
[0030] While the computing device-accessible storage medium 230 is
shown in an exemplary embodiment to be a single medium, the term
"computing device-accessible storage medium" should be understood
to include a single medium or multiple media (e.g., a centralized
or distributed database, and/or associated caches and servers) that
store the one or more sets of instructions. The term "computing
device-accessible storage medium" should also be understood to
include any medium that is capable of storing, encoding, or
carrying a set of instructions for execution by the computing
device and that causes the computing device to perform any one or
more of the methodologies of the present invention. The term
"computing device-accessible storage medium" should accordingly be
understood to include, but not be limited to, solid-state memories,
optical and magnetic media, etc.
Exemplary System Platform
[0031] As noted above, a system, according to various embodiments,
is adapted to monitor one or more patterns of behavior and/or one
or more locations of a user of a wearable device. Various aspects
of the system's functionality may be executed by certain system
modules, including the Behavior Pattern Analysis Module 300. The
Behavior Pattern Analysis Module 300 is discussed in greater detail
below.
Behavior Pattern Analysis Module
[0032] FIG. 3A is a flow chart of operations performed by an
exemplary Behavior Pattern Analysis Module 300, which may, for
example, run on the Pattern Analysis Server 120, or any suitable
computing device (such as the One or More Health Monitoring Devices
156, a handheld computing device coupled to communicate with the
One or More Health Monitoring Devices 156 or a suitable mobile
computing device). In particular embodiments, the Behavior Pattern
Analysis Module 300 may assess a user's behavior and determine the
user's location to provide this information to the user or to a
third party.
[0033] The system begins, in various embodiments, at Step 305 by
providing a user with computerized eyewear comprising at least one
sensor for monitoring one or more behaviors of the user and/or any
suitable attributes of the user. In various embodiments, the at
least one sensor may include a location sensor (e.g., a GPS unit),
an accelerometer, a heart rate monitor, an electrocardiogram (EKG)
sensor, an electroencephalogram (EEG) sensor, a pedometer, a
thermometer, a front-facing camera, an eye-facing camera, a
microphone, a blood pressure sensor, a
pulse oximeter, a near-field communication sensor, a motion sensor,
a gyroscope, a geomagnetic sensor, an impact sensor, and/or any
other suitable sensor. In particular embodiments, the computerized
eyewear further comprises: a motion sensor, an accelerometer, a GPS
unit, a gyroscope, and/or a front-facing camera.
[0034] In particular embodiments, the sensors may be coupled to the
eyewear in any suitable way. For example, in various embodiments,
the sensors may be physically embedded into, or otherwise coupled
to the eyewear. In some embodiments, the sensors may be positioned:
(1) along the brow bar of the eyewear; (2) along the temples of the
eyewear; (3) adjacent the lenses of the eyewear; and/or (4) in any
other suitable location.
[0035] In particular embodiments: (1) the sensors are coupled to a
wireless (e.g., Bluetooth, near-field communications, Wi-Fi, etc.)
device that is configured to transmit one or more signals obtained
from the one or more sensors to a handheld wireless device (e.g., a
smartphone, a tablet, an automated dialing device, etc.); and (2)
the step of receiving one or more signals from the one or more
sensors further comprises receiving the one or more signals from
the wireless handheld device via the Internet. In particular
embodiments, one or more of the sensors may be selectively
detachable from the eyewear, or other wearable device. For example,
if a user does not need the temperature sensor, the temperature
sensor may be selectively removed from the eyewear and stored for
future use.
[0036] At Step 310, the system receives data generated by the at
least one sensor. In particular embodiments, the data generated by
the at least one sensor may include data for a heart rate, a heart
rhythm or electrical activity, a brain wave activity, a distance
traveled, a temperature, an image, a sound, a speed traveled, a
blood pressure, an oxygen saturation level, a near-field
communication, a motion, an orientation, a geomagnetic field, a
global position, an impact, a medicine regime, or any other
suitable data.
[0037] In various embodiments, the system may receive the data
substantially automatically after the sensor generates the data. In
some embodiments, the system may receive the data periodically
(e.g., by the second, by the minute, hourly, daily, etc.). For
example, the system may receive the data every thirty seconds
throughout the day. In other embodiments, the system may receive
the data after receiving an indication from the user or a third
party that the system should receive the data. For instance, the
user may speak a voice command to the wearable device requesting
that the device track the user's steps taken. In various
embodiments, the system may receive an indication from the user or
a third party of when to have the system receive the data. For
example, the system may receive an indication from the third party
to have the system receive global positioning data at 8:00 a.m. and
at 2:00 p.m.
[0038] In particular embodiments, the system may receive an
indication from the user or a third party to have particular data
received from a particular sensor at the same time that the system
receives second particular data from a second particular sensor.
For example, when the system receives data that indicates that the
user's speed has increased, the system may at least partially in
response to receiving the increased speed data, also obtain global
position data of the user. In particular embodiments, the system
may receive behavior data during a predefined time period. For
instance, the system may receive behavior data for the user during
a predefined time period when the user should not be moving (e.g.,
11:00 p.m. through 7:00 a.m. because the user should be sleeping).
In various embodiments, the system may receive the data when a
sensor detects movement of the user. For example, the system may
receive data from the global positioning system sensor when the
accelerometer or the gyroscope detects movement of the user.
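The event-triggered sampling described above (reading one sensor only when another detects activity) might be sketched as follows. This is purely an illustrative sketch, not part of the disclosure; the threshold value and the sensor interfaces are assumptions.

```python
# Sketch of event-triggered sampling: the GPS is read only when the
# accelerometer reports movement above a threshold. The threshold and
# the GPS reader are hypothetical stand-ins for the wearable's hardware.

MOVEMENT_THRESHOLD = 0.5  # acceleration delta treated as "movement" (assumed)

def collect_reading(accel_delta, read_gps):
    """Return a GPS fix if the accelerometer delta indicates movement,
    otherwise None (no GPS sample is taken)."""
    if abs(accel_delta) >= MOVEMENT_THRESHOLD:
        return read_gps()
    return None

# Example usage with a stubbed GPS reader:
fix = collect_reading(0.8, lambda: (38.58, -121.49))   # movement -> sampled
idle = collect_reading(0.1, lambda: (38.58, -121.49))  # no movement -> skipped
```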
[0039] In other embodiments, the data generated by the at least one
sensor may be whether the user experiences one of sudden
acceleration and sudden impact. In still other embodiments, the
data generated by the at least one sensor may be a heartbeat and
whether the user is breathing. In yet other embodiments, the data
generated by the at least one sensor may be a medicine regime
associated with the user. For instance, the user or the user's
physician may manually input a medicine regime into the system by
stating the name of the medicine while the user or the user's
physician requests that the front-facing camera capture an image of
the medicine and the medicine bottle. In some embodiments, the
received data generated by the at least one sensor may be one or
more images captured by the front-facing camera. In other
embodiments, the received data generated by the at least one sensor
may be the level of one or more medicines in the user's
bloodstream.
[0040] In various embodiments, the system may receive data from a
single sensor. In other embodiments, the system may receive data
from all of the sensors. In yet other embodiments, the system may
receive multiple data from each of the sensors. In various
embodiments, the system may be configured to receive first data
from a first sensor at the same time that it receives second data
from a second sensor. For example, the system may be configured to
receive a global position from the global positioning system sensor
at the same time that it receives impact data from the impact
sensor.
[0041] In particular embodiments, the system may store the received
data. In various embodiments, the system may store the received
data substantially automatically after receiving the data. In other
embodiments, the system may store the received data after receiving
manual input from the user or a third party requesting that the
system store the data. In various embodiments, the system may store
the received data for a specified period of time. For instance, the
system may store the received data for a day, a month, a year,
etc., in the Movement Information Database 140. In some
embodiments, the system may store the received data on any suitable
server, database, or device. In other embodiments, the system may
store the received data on the Pattern Analysis Server 120. In
still other embodiments, the system may store the received data in
an account associated with the user. In various embodiments, the
system may store the received data with a timestamp of when the
data was received.
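Storing each received reading with a timestamp in a per-user account, as described above, could be sketched as below. The in-memory dictionary is only a stand-in for the Movement Information Database 140; the field layout is an assumption.

```python
import time

# Minimal sketch of storing sensor readings with a receipt timestamp,
# keyed by user account. A plain dict stands in for the database.
store = {}

def record(user_id, sensor, value, now=None):
    """Append a (timestamp, sensor, value) entry under the user's account."""
    now = time.time() if now is None else now
    store.setdefault(user_id, []).append((now, sensor, value))

record("user-1", "heart_rate", 72, now=1000.0)
record("user-1", "steps", 5400, now=1060.0)
```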
[0042] At Step 315, the system analyzes the data generated by the
at least one sensor. In various embodiments, the system analyzes
the data generated by the at least one sensor substantially
automatically after receiving the generated data. In various
embodiments, the system may analyze the data periodically (e.g., by
the second, by the minute, hourly, daily, etc.). For example, the
system may analyze the data every thirty seconds throughout the
day. In other embodiments, the system may analyze the data after
receiving an indication from the user or a third party that the
system should analyze data. For instance, the user may speak a
voice command to the wearable device requesting that the device
analyze the user's steps taken. In various embodiments, the system
may receive an indication from the user or a third party of when to
have the system analyze the data. For example, the system may
receive an indication from the third party to have the system
analyze global positioning data at 8:00 a.m. and at 2:00 p.m.
[0043] In other embodiments, the system may analyze the data to
determine one or more of (1) the type of medicine taken by the
user; (2) the time the medicine is taken by the user; and (3) the
dose of medicine taken by the user. In still other embodiments, the
step of analyzing the received data further comprises detecting one
or more pills in the one or more images, comparing the one or more
detected pills found in the one or more images to known images of
pills stored in a database, identifying the one or more pills by
matching the one or more pills from the one or more images to the
known images of pills stored in the database, and detecting the
time that the image was taken. In various embodiments, the system
analyzes the level of the one or more medicines in the user's
bloodstream.
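The pill-identification step above (matching detected pills against known images in a database) might reduce to a lookup over extracted attributes such as color, shape, and imprint marking. The sketch below assumes the image analysis that extracts those attributes happens elsewhere; the table contents are illustrative only.

```python
# Sketch of matching a detected pill against a table of known pills by
# color, shape, and imprint marking. The attribute extraction from the
# captured image is assumed done upstream; entries are hypothetical.
KNOWN_PILLS = {
    ("yellow", "triangular", "17"): "medicine-A",
    ("white", "round", "L484"): "medicine-B",
}

def identify_pill(color, shape, marking):
    """Return the known medicine name for the attributes, or None."""
    return KNOWN_PILLS.get((color, shape, marking))

match = identify_pill("yellow", "triangular", "17")
```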
[0044] Then, at Step 320, the system determines the user's current
movements using the received data in order to generate one or more
movement patterns for the user. In various embodiments, the system
determines the user's current movements substantially automatically
after receiving the data. In various embodiments, the system may
determine the user's current movements periodically (e.g., by the
second, by the minute, hourly, daily, etc.). For example, the
system may determine the user's current movements every thirty
seconds throughout the day. In other embodiments, the system may
determine the user's current movements after receiving an
indication from the user or a third party that the system should
analyze data. For instance, the user may speak a voice command to
the wearable device requesting that the device analyze the user's
steps taken. In various embodiments, the system may receive an
indication from the user or a third party of when to have the
system analyze the data. For example, the system may receive an
indication from the third party to have the system analyze global
positioning data at 8:00 a.m. and at 2:00 p.m.
[0045] In various embodiments, the system determines the user's
current movements by calculating the number of steps taken by the
user in a particular day. In some embodiments, the system
determines the user's current movements by tracking the distance
traveled by the user for a particular day. In other embodiments,
the system determines the user's current movements by capturing a
series of images from the front-facing camera throughout the day.
In still other embodiments, the system determines the user's
current movements by tracking the orientation of the user using the
gyroscope. In particular embodiments, the current movements of the
user may include actions such as lying down, falling, wandering,
sitting, standing, walking, running, convulsing, shaking,
balancing, etc.
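One simple way a system like this might turn raw accelerometer data into a step count is by counting rising crossings of a magnitude threshold. This sketch is not the disclosed method; the threshold and sample data are illustrative assumptions.

```python
# Sketch of deriving a step count from accelerometer magnitudes by
# counting rising threshold crossings. Threshold and data are assumed.
STEP_THRESHOLD = 1.2  # magnitude above which a step is counted (assumed)

def count_steps(magnitudes):
    """Count rising crossings of the step threshold."""
    steps = 0
    above = False
    for m in magnitudes:
        if m >= STEP_THRESHOLD and not above:
            steps += 1
            above = True
        elif m < STEP_THRESHOLD:
            above = False
    return steps

steps = count_steps([1.0, 1.3, 1.1, 1.4, 1.5, 0.9, 1.25])  # three crossings
```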
[0046] In various embodiments, the user's current movements may
include the user's current location. For example, the user's
current location may be an address, a geographic area, an
intersection, a bus stop, a building, or any other suitable
definable location. In other embodiments, the user's current
movements may help to indicate the user's current status (e.g.,
asleep, awake, conscious, unconscious, alive, deceased, stable,
good, fair, serious, critical, injured, distressed, etc.). In some
embodiments, the user's current behaviors may include compliance
with prescribed treatment regimes. For instance, the user's current
behaviors may indicate that the user has not been complying with
prescribed treatment regimes, as captured by the front-facing camera
showing the user not taking prescribed medicine.
[0047] In various embodiments, the system tracks current movements,
current location, current status, and current compliance to
generate one or more movement patterns, location patterns, status
patterns, and compliance patterns. In some embodiments, the system
generates the one or more patterns substantially automatically
after the system determines the user's current movements, location,
status, and compliance. In some embodiments, the system may
generate the patterns periodically (e.g., by the second, by the
minute, hourly, daily, weekly, monthly, etc.). For example, the
system may generate a movement pattern for each month. In other
embodiments, the system may generate the pattern after receiving an
indication from the user or a third party that the system should
generate the pattern. For instance, the user may speak a voice
command to the wearable device requesting that the device generate
a pattern for the number of steps taken by the user for a typical
day. In various embodiments, the system may receive an indication
from the user or a third party of when to have the system generate
the patterns. For example, the system may receive an indication
from the third party to have the system generate a location pattern
for the location of the user at 8:00 a.m. and at 2:00 p.m. for the
previous month.
[0048] In various embodiments, the movement patterns may include
one or more typical movements made by the user. For example, the
movement pattern may include that the user gets out of bed every
morning. In particular embodiments, the location patterns may
include one or more typical locations of the user. For instance,
the location pattern may include that the user is at a first
particular address in the morning, at a second particular address
during the day, and at the first particular address at night. In
some embodiments, the status patterns may include one or more
typical statuses of the user. For example, the status pattern may
include that the user is awake from 7:00 a.m. until 11:00 p.m. and
asleep from 11:00 p.m. until 7:00 a.m. In other embodiments, the
compliance patterns may include one or more typical compliance
schedules of the user. For example, the compliance pattern may
include that the user is prescribed a medicine that the user takes
every day in the morning with food. In yet other embodiments, the
medicine regime patterns may include one or more typical medicine
regimes for the user. For instance, the medicine regime pattern may
include that the user takes a particular yellow pill, a particular
white pill, and a particular pink pill in the evening with food. In
various embodiments, the system may include one or more typical
levels of one or more medicines in the user's bloodstream. For
example, the typical level of a particular medicine in the user's
bloodstream may be a certain volume at a particular period of
time.
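A generated movement pattern such as "typical steps per day" could be as simple as an average over an observation window, which later comparison steps can use as the previously established baseline. This is an illustrative sketch with assumed data, not the disclosed implementation.

```python
# Sketch of generating a simple movement pattern: the average daily
# step count over an observation window serves as the baseline that
# later comparisons are made against. The data is illustrative.
def build_pattern(daily_steps):
    """Return the average steps per day as a baseline movement pattern."""
    return sum(daily_steps) / len(daily_steps)

baseline = build_pattern([5200, 4800, 5000, 5000])
```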
[0049] In particular embodiments, the system may store the
generated patterns in an account associated with the user. In some
embodiments, the generated patterns may be accessible by the user
or a third party. For instance, the generated patterns may be
diagramed in a chart that is accessible from the wearable device or
from a computing device by the user's physician. In various
embodiments, the system may store the generated patterns in the
Movement Information Database 140. In particular embodiments, the
system may store information in the Movement Information Database
140 regarding past movement patterns associated with the user
(e.g., when the user goes into different rooms in the user's house,
when the user eats, when the user takes a walk, the destinations
along the walk, etc.). In some embodiments, the system may store
information in the Movement Information Database 140 regarding the
user's sleep patterns. In other embodiments, the system may store
information in the Movement Information Database 140 regarding
geo-fences associated with the user. In still other embodiments,
the system may store information in the Movement Information
Database 140 regarding deviations to the user's typical behavior
(e.g., movement) patterns.
[0050] At Step 325, the system compares the user's behaviors (e.g.,
movements) to the previously established one or more patterns for
the user. In some embodiments, the system compares the user's
movement to the previously established one or more movement
patterns for the user substantially automatically after the system
receives the user's current movements. In some embodiments, the
system may compare the user's movement to the previously
established one or more movement patterns periodically (e.g., by
the second, by the minute, hourly, daily, weekly, monthly, etc.).
For example, the system may compare the user's current movement to
the previously established one or more movement patterns every
thirty minutes throughout the day. In other embodiments, the system
may compare the user's movement to the previously established one
or more movement patterns after receiving an indication from the
user or a third party that the system should compare the user's
movement to the previously established movement pattern. For
instance, the user may speak a voice command to the wearable device
requesting that the device compare the user's movements for the
current day to a movement pattern established the previous month.
In various embodiments, the system may receive an indication from
the user or a third party of when to have the system compare the
user's movements to the one or more patterns. For example, the
system may receive an indication from the third party to have the
system compare the user's current location to a location pattern
for the location of the user at 8:00 a.m. and at 2:00 p.m. on a
typical day.
[0051] In some embodiments, the system may compare the user's
movements to a previously established movement pattern by
calculating the number of steps taken by the user in the particular
day to a predetermined average number of steps taken by the user in
a day. In various embodiments, the system may compare the user's
location to a previously established location pattern by
determining the average location of the user at a particular time
of day. In other embodiments, the system may compare the user's
status to a previously established status pattern by determining
the user's average status at particular times of day.
[0052] In still other embodiments, the system may compare the
user's compliance with a prescribed treatment regime by determining
the user's average compliance with the prescribed treatment regime
for a particular day. In yet other embodiments, the system may
compare one or more of the type of medicine taken, the time the
medicine is taken, and the dose of the medicine taken to the stored
medicine regime for the user. In various embodiments, the system
may compare the level of one or more medicines in the user's
bloodstream by determining the average level of the one or more
medicines in the user's bloodstream at particular times of day.
[0053] In particular embodiments, the system may store the
comparisons in an account associated with the user. In some
embodiments, the comparisons may be accessible by the user or a
third party. For instance, the comparisons may be diagramed in a
chart that is accessible from the wearable device or from a
computing device by the user's physician.
[0054] Continuing to Step 330, the system detects one or more
inconsistencies between the user's current movements as compared to
the previously established one or more patterns. In other
embodiments, the system does not detect one or more inconsistencies
between the user's current movements as compared to the previously
established one or more patterns. In various embodiments, the
system may detect the one or more inconsistencies by determining
that the user's current movements are inconsistent with the
previously established patterns. In particular embodiments, the
user's current movements may be inconsistent with previously
established patterns based on the current movements being different
from the established patterns by a particular percentage. For
instance, where the user's movement patterns establish that the
user walks a total of one mile a day, the system may determine that
the user's current movement of walking 1/2 mile for the day is
inconsistent with the user's previously established pattern of
walking one mile a day because there is a difference of 50%.
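The percentage-based inconsistency test described above can be sketched as a simple relative-deviation check. The 50% threshold mirrors the half-mile example in the text but is a tunable assumption, and the function name is hypothetical.

```python
# Sketch of the percentage-based inconsistency test: flag the current
# value when it deviates from the established pattern by at least a
# configured fraction. The threshold value is an assumption.
DEVIATION_THRESHOLD = 0.5  # flag deviations of 50% or more

def is_inconsistent(current, baseline, threshold=DEVIATION_THRESHOLD):
    """True when |current - baseline| / baseline meets the threshold."""
    return abs(current - baseline) / baseline >= threshold

flagged = is_inconsistent(0.5, 1.0)  # half a mile vs. the usual mile -> 50%
ok = is_inconsistent(0.9, 1.0)       # 10% deviation, within normal variation
```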
[0055] In some embodiments, the user's current movements may be
inconsistent with the previously established movement patterns
based on the user's current movements not matching the previously
established movement patterns. For instance, for the movement
pattern that includes that the user gets out of bed every morning,
where the system detects that the user does not get out of bed on a
particular morning, the system may determine that the user's
current movements are inconsistent with the previously established
pattern.
[0056] In other embodiments, the user's current movements may be
inconsistent with the previously established patterns based on the
user's current location not matching the previously established
location patterns. For example, for the location pattern that
includes the user at a first particular address in the morning, at
a second particular address during the day, and at the first
particular address at night, where the system detects that the user
was not at the second particular address during the day, the system
may determine that the user's current movements are inconsistent
with the previously established pattern.
[0057] In still other embodiments, the user's current movements may
be inconsistent with the previously established patterns based on
the user's current status not matching the previously established
status patterns. For instance, for the status pattern that includes
that the user is awake from 7:00 a.m. until 11:00 p.m. and asleep
from 11:00 p.m. until 7:00 a.m., where the system detects that the
user is asleep from 7:00 a.m. until 2:00 p.m., the system may
determine that the user's current movements are inconsistent with
the previously established pattern.
[0058] In yet other embodiments, the system may detect one or more
inconsistencies between the medicine regime associated with the
user and the determined one or more of the type of medicine taken
by the user, the time the medicine is taken by the user, and the
dose of medicine taken by the user. For instance, for a medicine
regime that includes that the user takes a particular pill having a
particular color (e.g., yellow), shape (e.g., triangular, square),
and marking (e.g., the number 17) in the evening with food, where
the system detects that the user did not take the particular yellow
pill on a particular evening with food, the system may determine
that the user's current movements are inconsistent with the
previously established pattern.
[0059] In some embodiments, the system may detect one or more
inconsistencies between the level of the one or more medicines in
the user's bloodstream and the determined typical level of the one
or more medicines in the user's bloodstream. For example, for a
typical level of a particular medicine in the user's bloodstream
that includes that the level is a certain volume at a particular
period of time, where the system detects that the level of the
medicine in the user's bloodstream is less than the typical level,
the system may determine that the user's current movements are
inconsistent with the previously established patterns.
[0060] At Step 335, the system notifies the user and/or a third
party of the detected one or more inconsistencies. In particular
embodiments, in addition to notifying at least one recipient
selected from a group consisting of: the user and the third party,
the system updates the user's account to note that a notification
was sent. In various embodiments, the system notifies the user of
the detected one or more inconsistencies. In some embodiments, the
system notifies the third party of the detected one or more
inconsistencies. In particular embodiments, the system may notify
the user of the detected one or more inconsistencies by displaying
an image on the lens of the eyewear, or on another display
associated with the eyewear. In other embodiments, the system
notifies the user of the one or more inconsistencies by
communicating through a speaker to the user.
[0061] In various embodiments, the third party may be a relative of
the user. In other embodiments, the third party may be a police
department. In particular embodiments, the third party may be an
ambulance service. In some embodiments, the third party may be a
physician. In still other embodiments, the third party may be an
independent living provider. In yet other embodiments, the third
party may be a particular caregiver of the user.
[0062] In some embodiments, the system notifies the user and/or the
third party of the one or more inconsistencies by sending a
notification to the user's and/or the third party's mobile devices.
In particular embodiments, the system notifies the user and/or the
third party of the one or more inconsistencies by email or text
message. In other embodiments, the system may notify the user
and/or the third party of a single inconsistency substantially
immediately after the system detects the inconsistency between the
user's current movements as compared to the previously established
one or more movement patterns. In yet other embodiments, the system
may notify the user and/or the third party of all inconsistencies
detected on a particular day at the end of that day.
[0063] In various embodiments, the system may notify the user
and/or the third party of the one or more inconsistencies after a
particular event. For example, the system may notify the user if
the system determines that the calculated number of steps of the
user for a particular day is less than a predetermined percentage
of the predetermined average number of steps taken by the user in a
day. In some embodiments, the system may notify the user and/or the
third party of the one or more inconsistencies after a particular
period of time. For instance, the system may notify the third party
one hour after the system detects one or more inconsistencies
between the user's current movements as compared to the previously
established one or more movement patterns. In still
other embodiments, the system may notify the user of the one or
more inconsistencies at a particular time of day. As an example,
the system may notify the user of one or more inconsistencies
between the user's current movements as compared to the previously
established one or more movement patterns at the end of the
day.
[0064] In various embodiments, at least partially in response to
detecting whether the user moves during the predefined time period,
the system may notify the user and/or third party if the user does
not move during the predefined time period. In other embodiments,
at least partially in response to detecting one of sudden
acceleration and sudden impact (e.g., such as that associated with
a fall), the system may notify the user and/or the third party that the
user experienced the one of sudden acceleration and sudden impact.
In some embodiments, at least partially in response to not
detecting one of a heartbeat or breathing associated with the user,
the system may notify the user and/or the third party that the
heartbeat and/or breathing of the user cannot be detected. This may
indicate, for example, a medical emergency associated with the user
or a malfunction of one or more system components.
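The sudden-impact and no-movement conditions above can be sketched with a simple accelerometer-magnitude check. The thresholds (an impact above roughly 2.5 g, a stillness band around 1 g of gravity) are illustrative assumptions; a deployed fall detector would use tuned, validated values.

```python
import math

IMPACT_G = 2.5      # assumed: magnitudes above ~2.5 g suggest a sudden impact
STILLNESS_G = 0.05  # assumed: deviation from 1 g within this band suggests no movement

def magnitude(sample: tuple[float, float, float]) -> float:
    """Euclidean norm of a 3-axis accelerometer sample, in g."""
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def detect_impact(samples: list[tuple[float, float, float]]) -> bool:
    """Flag a sudden impact when any sample exceeds the impact threshold."""
    return any(magnitude(s) > IMPACT_G for s in samples)

def detect_no_movement(samples: list[tuple[float, float, float]], window: int) -> bool:
    """Flag no movement when every sample in the most recent window stays
    near 1 g (gravity only, i.e., the device is at rest)."""
    recent = samples[-window:]
    return len(recent) == window and all(
        abs(magnitude(s) - 1.0) < STILLNESS_G for s in recent
    )
```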
[0065] In particular embodiments, the system may notify the user
and/or the third party of detected inconsistencies between the
user's current movements and the previously established movement
patterns. In some embodiments, the system may notify the user
and/or the third party of detected inconsistencies between the
user's current location and the previously established location
patterns. In other embodiments, the system may notify the user
and/or the third party of detected inconsistencies between the
user's current status and the previously established status
patterns. In still other embodiments, the system may notify the
user and/or the third party of detected inconsistencies between the
user's current compliance and the previously established compliance
patterns. In yet other embodiments, the system may notify the user
and/or the third party of detected inconsistencies between the
user's current medicine regime and the previously established
medicine regime patterns. In various embodiments, the system may
notify at least one recipient selected from a group consisting of:
the user and the third party of detected inconsistencies between
the user's current level of one or more medicines and the
previously established typical one or more levels of medicine.
[0066] In particular embodiments, the system may notify the user
and/or the third party of detected inconsistencies between the
stored medicine regime and the one or more of the type of medicine
taken, the time the medicine is taken, and the dose of medicine
taken. In some embodiments, the system may notify at least one
recipient selected from a group consisting of: the user and the
third party if the user removes the wearable device for a
predetermined period of time. In other embodiments, the system may
notify the user and/or the third party if the user does not consume
food for a predetermined period of time. In particular embodiments,
the system may notify the user and/or the third party if the user
does not consume liquids for a predetermined period of time. In
various embodiments, the system may notify the user and/or the
third party if the user's caloric intake is above or below a
predetermined number of calories. In some embodiments, the system
may notify the user and/or the third party if the user's oxygen
levels fall below a predetermined threshold. In other embodiments,
the system may notify the user and/or the third party if the user's
blood sugar drops below a predetermined threshold.
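The threshold-based notifications in this paragraph (oxygen level, blood sugar, caloric intake) share one pattern: compare a reading against a predetermined limit and alert on a crossing. A minimal sketch follows; the specific numeric limits and dictionary layout are assumptions for illustration, and real limits would be set by a clinician.

```python
# Assumed limits for illustration only; not clinically validated values.
THRESHOLDS = {
    "oxygen_saturation_pct": {"min": 90.0},
    "blood_sugar_mg_dl": {"min": 70.0},
    "daily_calories": {"min": 1200.0, "max": 3000.0},
}

def check_vitals(readings: dict[str, float]) -> list[str]:
    """Compare each reading to its configured limits and collect a
    human-readable alert for every limit that is crossed."""
    alerts = []
    for name, value in readings.items():
        limits = THRESHOLDS.get(name, {})
        if "min" in limits and value < limits["min"]:
            alerts.append(f"{name} below {limits['min']}: {value}")
        if "max" in limits and value > limits["max"]:
            alerts.append(f"{name} above {limits['max']}: {value}")
    return alerts
```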
[0067] In various embodiments, the system, when executing the
Behavior Pattern Analysis Module 300, may omit particular steps,
perform particular steps in an order other than the order presented
above, or perform additional steps not discussed directly
above.
Structure of the Eyewear
[0068] As shown in FIG. 4, eyewear 400, according to various
embodiments, includes: (1) an eyewear frame 410; (2) a first temple
412; and (3) a second temple 414. These various components are
discussed in more detail below. In various embodiments, the eyewear
400 may be used as the one or more health monitoring devices 156
shown in FIG. 1.
[0069] Eyewear Frame
[0070] Referring still to FIG. 4, eyewear 400, in various
embodiments, includes any suitable eyewear frame 410 configured to
support one or more lenses 418, 420. In the embodiment shown in
this figure, the eyewear frame 410 has a first end 402 and a second
end 404. The eyewear frame 410 may be made of any suitable material
such as metal, ceramic, polymers or any combination thereof. In
particular embodiments, the eyewear frame 410 is configured to
support the first and second lenses 418, 420 about the full
perimeter of the first and second lenses 418, 420. In other
embodiments, the eyewear frame 410 may be configured to support the
first and second lenses 418, 420 about only a portion of each
respective lens. In various embodiments, the eyewear frame 410 is
configured to support a number of lenses other than two lenses
(e.g., a single lens, a plurality of lenses, etc.). In particular
embodiments, the lenses 418, 420 may include prescription lenses,
sunglass lenses, or any other suitable type of lens (e.g., reading
lenses, non-prescription lenses), which may be formed from glass or
polymers.
[0071] The eyewear frame 410 includes a first and second nose pad
422 (not shown in figures), 424, which may be configured to
maintain the eyewear 400 adjacent the front of a wearer's face such
that the lenses 418, 420 are positioned substantially in front of
the wearer's eyes while the wearer is wearing the eyewear 400. In
particular embodiments, the nose pads 422, 424 may comprise a
material that is configured to be comfortable when worn by the
wearer (e.g., rubber, etc.). In other embodiments, the nose pads
may include any other suitable material (e.g., plastic, metal,
etc.). In still other embodiments, the nose pads may be integrally
formed with the frame 410.
[0072] The eyewear frame 410 includes a first and second hinge 426,
428 that attach the first and second temples 412, 414 to the frame
first and second ends 402, 404, respectively. In various
embodiments, the hinges may be formed by any suitable connection
(e.g., tongue and groove, ball and socket, spring hinge, etc.). In
particular embodiments, the first hinge 426 may be welded to, or
integrally formed with, the frame 410 and the first temple 412 and
the second hinge 428 may be welded to, or integrally formed with,
the frame 410 and the second temple 414.
[0073] First and Second Temples
[0074] As shown in FIG. 4, the first temple 412, according to
various embodiments, is rotatably connected to the frame 410 so
that the first temple 412 may extend anywhere from substantially
perpendicular to substantially parallel to the frame 410. The first
temple 412 has a first and second end 412a, 412b. Proximate the
first temple second end 412b, the first temple 412 includes an
earpiece 413 configured to be supported by a wearer's ear.
Similarly, the second temple 414, according to various embodiments,
is rotatably connected to the frame 410 so that the second temple
414 may extend anywhere from substantially perpendicular to
substantially parallel to the frame 410. The second temple 414 has
a first and second end 414a, 414b. Proximate the second temple
second end 414b, the second temple 414 includes an earpiece 415
configured to be supported by a wearer's ear.
[0075] Sensors
[0076] In various embodiments, the second temple 414 has one or
more sensors 430 connected to the second temple 414. In various
embodiments, the one or more sensors 430 may be coupled to the
frame 410, the first and second temples 412, 414, the first and
second lenses 418, 420, or any other portion of the eyewear 400 in
any suitable way. For instance, the one or more sensors 430 may be
embedded into the eyewear 400, coupled to the eyewear 400, and/or
operatively coupled to the eyewear 400. In various embodiments, the
one or more sensors may be formed at any point along the eyewear
400. For instance, a fingerprint reader may be disposed adjacent
the first temple of the eyewear 400. In various embodiments, the
one or more sensors may be formed in any shape. In addition, the
one or more sensors may be formed on the inner (back) surface of
the frame 410, the first and second temples 412, 414, the first and
second lenses 418, 420, or any other portion of the eyewear 400. In
other embodiments, the one or more sensors may be formed on the
outer (front) surface of the frame 410, the first and second
temples 412, 414, the first and second lenses 418, 420, or any
other portion of the eyewear 400.
[0077] In various embodiments, the one or more sensors 430 that are
coupled to the eyewear (or other wearable device) are adapted to
detect one or more characteristics of the eyewear or a wearer of
the eyewear, wherein the one or more characteristics of the wearer
are associated with the wearer's identity. In various embodiments,
the one or more sensors coupled to the eyewear or other health
monitoring device may include, for example, one or more of the
following: a near-field communication sensor, a gyroscope, a
Bluetooth chip, a GPS unit, an RFID tag (passive or active), a
fingerprint reader, an iris reader, a retinal scanner, a voice
recognition sensor, a heart rate monitor, an electrocardiogram
(EKG), an electroencephalogram (EEG), a pedometer, a thermometer, a
front-facing camera, an eye-facing camera, a microphone, an
accelerometer, a magnetometer, a blood pressure sensor, a pulse
oximeter, a skin conductance response sensor, any suitable
biometric reader, or any other suitable sensor. In some
embodiments, the one or more sensors may include a unique shape, a
unique code, or a unique design physically inscribed into the
eyewear that may be readable by an individual or a remote computing
device. In particular embodiments, the sensors coupled to the
eyewear may include one or more electronic communications devices
such as a near field communication sensor, a Bluetooth chip, an
active RFID, and a GPS unit.
[0078] In various embodiments, the one or more sensors are coupled
to a computing device that is associated with (e.g., embedded
within, attached to) the eyewear or other wearable device. In
particular embodiments, the eyewear or other wearable device
comprises at least one processor, computer memory, suitable
wireless communications components (e.g., a Bluetooth chip) and a
power supply for powering the wearable device and/or the various
sensors.
[0079] As noted above, the one or more sensors may be coupled to a
Bluetooth device that is configured to transmit the one or more
signals to a handheld wireless device, and the step of using the
eyewear to confirm the identity of the wearer of the eyewear
(discussed above in reference to Step 310) further comprises
receiving the one or more signals from the wireless handheld device
(e.g., via the Internet). In particular embodiments, one or more of
the sensors may be detachable from the eyewear. For instance, if a
wearer does not need a temperature sensor or other particular
sensor, the sensor may be removed from the eyewear.
[0080] Exemplary User Experience
[0081] Independent Living of Elderly
[0082] In a particular example of a user using the Behavior Pattern
Analysis Module 300, the user may put on the wearable device in the
morning and continue to wear the device throughout the day. In
various embodiments, the wearable device may be operatively coupled
(e.g., via a suitable wireless or wired connection) to a smart
phone, a laptop, a desktop computer, an automated dialing
apparatus, or any other computing device that can receive signals
from the wearable device and either transmit the signals to a
central system (e.g., via a wireless or wired telephone network) or
analyze the signals and make decisions based on the received
signals (e.g., call for help, notify a loved one, etc.). During
this time, the system will track the movements of the user using
the motion sensor, the accelerometer, the global positioning
sensor, the gyroscope, and the front-facing camera. In this
example, the user may be an elderly or infirm person who desires
to live independently but requires monitoring for
events that deviate from the person's normal routine. Thus, by
wearing the wearable device throughout the day, the device is able
to track the user's movements and create certain patterns based on
these movements for the user. The system may then store these
patterns in a database while continuing to track the user's
movements. Where the system detects that the user has deviated from
the previously established pattern, the system may notify the
user's physician, for example directly from the wearable device, or
via the connected smartphone, computer or the automated dialing
apparatus. Such deviations from the previously established pattern
may include that the user falls, that the user wanders beyond
preset boundaries (e.g., defined by one or more geofences), that
the user begins sleeping longer than usual, that the user stops
moving, or any other deviation from a previously established
pattern of the user's normal routine.
[0083] For example, the user may be an Alzheimer's patient who has
lucid moments and moments of memory loss. The wearable device has
established, as a movement pattern, that the patient takes a walk
around the block every morning. However, if the patient wanders two
blocks over and outside of the user's predetermined geo-fenced
area, which is a deviation from the patient's normal pattern of
movements, the system may alert the patient's caregiver of the
inconsistency between the patient's current actions and the
previously established patterns.
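The geofence check in this example can be sketched as a great-circle distance test against a circular boundary around a home location. The haversine formula, the circular fence shape, and the sample coordinates are illustrative assumptions; the disclosure does not specify how its geofences are represented.

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius, in meters

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in meters between two latitude/longitude points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def outside_geofence(current: tuple[float, float],
                     home: tuple[float, float],
                     radius_m: float) -> bool:
    """True when the current GPS fix lies outside the circular geofence
    centered on `home` with radius `radius_m`."""
    return haversine_m(*current, *home) > radius_m
```

A fix roughly two city blocks (about a kilometer) from the fence center would then trigger a caregiver alert when the fence radius is, say, 500 meters.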
[0084] Monitor Compliance with Medicine Regime
[0085] The system, in a particular example, will also monitor the
user's compliance with a medicine regime. In order to establish the
user's medicine regime pattern, the user may wear the wearable
device to detect when the user takes medicine and what medicine is
taken using the front-facing camera. The user may also speak the
name of the medicine as the wearable device captures an image of
the medicine the user is taking. The system is then able to
establish a pattern of the user taking blood pressure medicine
every morning after monitoring the user for a week. The system may
then monitor the user's current medicine intake to compare the
medicines the user takes and the time that the user takes the
medicine to the previously established medicine regime pattern. If
the user fails to take the blood pressure medicine on a particular
morning, the system may notify the user, the user's caregiver, a
health care provider, or a third party that the user has deviated
from the previously established medicine regime.
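The compliance check in this example (expected daily dose, deviation flagged once the expected time has passed) can be sketched as below. The `REGIME` table, the 8 a.m. expected time, and the two-hour tolerance are hypothetical values for illustration, not part of the disclosed system.

```python
from datetime import datetime, timedelta

# Hypothetical established regime: medicine -> expected daily hour and tolerance.
REGIME = {
    "blood pressure medicine": {"expected_hour": 8, "tolerance_hours": 2},
}

def missed_doses(taken: dict[str, datetime], now: datetime) -> list[str]:
    """Return the regime medicines not yet recorded as taken today,
    once the expected time plus tolerance has passed."""
    missed = []
    for name, rule in REGIME.items():
        deadline = now.replace(
            hour=rule["expected_hour"], minute=0, second=0, microsecond=0
        ) + timedelta(hours=rule["tolerance_hours"])
        took = taken.get(name)
        taken_today = took is not None and took.date() == now.date()
        if now > deadline and not taken_today:
            missed.append(name)
    return missed
```

Under these assumed values, a user who has not taken the blood pressure medicine by 10 a.m. would be flagged, and the system could then notify the user, a caregiver, or another third party as described above.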
CONCLUSION
[0086] Many modifications and other embodiments of the invention
will come to mind to one skilled in the art to which this invention
pertains, having the benefit of the teachings presented in the
foregoing descriptions and the associated drawings. Therefore, it
is to be understood that the invention is not to be limited to the
specific embodiments disclosed and that modifications and other
embodiments are intended to be included within the scope of the
appended claims. Although specific terms are employed herein, they
are used in a generic and descriptive sense only and not for the
purposes of limitation.
* * * * *