U.S. patent application number 17/119618, published by the patent office on 2021-06-17, is directed to adjusting operation of a medication delivery system in response to gesture-indicated activity changes.
The applicant listed for this patent is MEDTRONIC MINIMED, INC. The invention is credited to Lavie Golenberg and Maria Diana Miller.
United States Patent Application 2021/0178068 A1
Application Number: 17/119618
Family ID: 1000005313239
Published: June 17, 2021
First Named Inventor: Miller; Maria Diana; et al.
ADJUSTING OPERATION OF A MEDICATION DELIVERY SYSTEM IN RESPONSE TO
GESTURE-INDICATED ACTIVITY CHANGES
Abstract
A system disclosed here includes an insulin infusion device, a
gesture-based physical behavior detection system that generates
gesture data for a user, and a controller that controls operation
of the insulin infusion device. The controller performs adaptive
training of at least one feature, function, setting, or model
associated with the insulin infusion device, based at least in part
on sensor data indicative of a physiological characteristic of the
user. The controller processes activity-identifying
data that indicates a current behavior pattern of the user, and
that includes gesture data provided by the gesture-based physical
behavior detection system. The controller determines, from the
activity-identifying data, that the current behavior pattern
differs from a currently implemented therapy behavior pattern of
the user, and, in response to the determination, alters the
adaptive training of the at least one feature, function, setting,
or model.
Inventors: Miller; Maria Diana (Santa Rosa Valley, CA); Golenberg; Lavie (Sherman Oaks, CA)
Applicant: MEDTRONIC MINIMED, INC. (Northridge, CA, US)
Family ID: 1000005313239
Appl. No.: 17/119618
Filed: December 11, 2020
Related U.S. Patent Documents
Application Number: 62947988
Filing Date: Dec 13, 2019
Current U.S. Class: 1/1
Current CPC Class: A61M 2005/14288 20130101; G16H 40/40 20180101; G16H 20/17 20180101; G06F 3/017 20130101; A61M 5/1723 20130101
International Class: A61M 5/172 20060101 A61M005/172; G16H 40/40 20060101 G16H040/40; G16H 20/17 20060101 G16H020/17; G06F 3/01 20060101 G06F003/01
Claims
1. A method of operating a medication delivery system comprising a
fluid pump mechanism, an analyte sensor to provide sensor data
indicative of a physiological characteristic of a user, and at
least one controller that regulates operation of the fluid pump
mechanism to deliver medication from the medication delivery system
to the user, based at least in part on the sensor data, the method
comprising: performing adaptive training of at least one feature,
function, setting, or model associated with the medication delivery
system, based at least in part on the sensor data provided by the
analyte sensor; processing activity-identifying data that indicates
a current behavior pattern of the user, the activity-identifying
data comprising gesture data for the user, the gesture data
provided by a gesture-based physical behavior detection system;
determining, from the activity-identifying data, that the current
behavior pattern differs from a currently implemented therapy
behavior pattern of the user; and in response to the determining,
altering the adaptive training of the at least one feature,
function, setting, or model, resulting in an altered adaptive
training scheme.
2. The method of claim 1, further comprising: in response to the
determining, generating a confirmation message for the user, the
confirmation message requesting authorization to alter the adaptive
training, wherein altering the adaptive training occurs in response
to receiving an authorization to alter the adaptive training.
3. The method of claim 1, further comprising: adaptively training
the at least one feature, function, setting, or model in accordance
with the altered adaptive training scheme for a predetermined
period of time; automatically reverting to a previous adaptive
training scheme after the predetermined period of time; and
adaptively training the at least one feature, function, setting, or
model in accordance with the previous adaptive training scheme.
4. The method of claim 1, further comprising: adaptively training
the at least one feature, function, setting, or model in accordance
with the altered adaptive training scheme; and reverting to a
previous adaptive training scheme in response to receiving a
user-initiated command.
5. The method of claim 1, further comprising: processing updated
activity-identifying data that indicates an updated behavior
pattern of the user; detecting, from the updated
activity-identifying data, that the updated behavior pattern
corresponds to a previously implemented therapy behavior pattern of
the user; and reverting to a previous adaptive training scheme in
response to the detecting.
6. The method of claim 1, wherein: the medication delivery system
is controlled to automatically deliver the medication to the user
in accordance with a therapy control algorithm; and the adaptive
training trains at least one therapy-altering factor of the therapy
control algorithm.
7. The method of claim 6, further comprising: changing one or more
therapy-altering factors of the therapy control algorithm, in
response to determining that the current behavior pattern differs
from the currently implemented therapy behavior pattern of the
user.
8. The method of claim 1, wherein the adaptive training trains a
physiological model of the user that simulates physiological
response of the user to delivery of the medication.
9. The method of claim 1, wherein altering the adaptive training
occurs automatically without user input.
10. The method of claim 1, wherein: the processing and determining
steps are performed by a data processing system that communicates
with the medication delivery system; and the data processing system
sends at least one command to the medication delivery system, the
at least one command causing the medication delivery system to
alter the adaptive training.
11. The method of claim 1, wherein the processed
activity-identifying data comprises user status data for the user,
the user status data generated by at least one ancillary system
that monitors the user.
12. At least one non-transitory computer readable medium having
stored thereon program code instructions that are configurable to
cause at least one processor to perform a method comprising:
performing adaptive training of at least one feature, function,
setting, or model associated with a medication delivery system,
based at least in part on sensor data provided by an analyte sensor
that measures a physiological characteristic of a user; processing
activity-identifying data that indicates a current behavior pattern
of the user, the activity-identifying data comprising gesture data
for the user, the gesture data provided by a gesture-based physical
behavior detection system; determining, from the
activity-identifying data, that the current behavior pattern
differs from a currently implemented therapy behavior pattern of
the user; and in response to the determining, altering the adaptive
training of the at least one feature, function, setting, or model,
resulting in an altered adaptive training scheme.
13. The at least one non-transitory computer readable medium of
claim 12, wherein: the medication delivery system operates to
automatically deliver the medication to the user in accordance with
a therapy control algorithm; and the adaptive training trains at
least one therapy-altering factor of the therapy control
algorithm.
14. The at least one non-transitory computer readable medium of
claim 13, wherein the method further comprises: changing one or
more therapy-altering factors of the therapy control algorithm, in
response to determining that the current behavior pattern differs
from the currently implemented therapy behavior pattern of the
user.
15. The at least one non-transitory computer readable medium of
claim 12, wherein the adaptive training trains a physiological
model of the user that simulates physiological response of the user
to delivery of the medication.
16. The at least one non-transitory computer readable medium of
claim 12, wherein the processed activity-identifying data comprises
user status data for the user, the user status data generated by at
least one ancillary system that monitors the user.
17. A system comprising: an insulin infusion device that regulates
delivery of insulin to a user; a gesture-based physical behavior
detection system configured to generate gesture data for the user,
and configured to communicate the gesture data; and at least one
controller that controls operation of the insulin infusion device,
the at least one controller configured to: perform adaptive
training of at least one feature, function, setting, or model
associated with the insulin infusion device, based at least in part
on the sensor data; process activity-identifying data that
indicates a current behavior pattern of the user, the
activity-identifying data comprising gesture data for the user, the
gesture data provided by the gesture-based physical behavior
detection system; determine, from the activity-identifying data,
that the current behavior pattern differs from a currently
implemented therapy behavior pattern of the user; and in response
to the determination, alter the adaptive training of the at least
one feature, function, setting, or model, resulting in an altered
adaptive training scheme.
18. The system of claim 17, wherein the insulin infusion device
comprises the at least one controller.
19. The system of claim 17, wherein the activity-identifying data
comprises user status data for the user, the user status data
generated by at least one ancillary system that monitors
characteristics, status, or condition of the user.
20. The system of claim 17, wherein: the insulin infusion device
operates to automatically deliver insulin to the user in accordance
with a therapy control algorithm; and the adaptive training trains
at least one therapy-altering factor of the therapy control
algorithm.
21. The system of claim 17, wherein the adaptive training trains a
physiological model of the user that simulates physiological
response of the user to delivery of insulin.
Description
CROSS-REFERENCE TO RELATED APPLICATION
[0001] This application claims the benefit of U.S. provisional
patent application No. 62/947,988, filed Dec. 13, 2019.
TECHNICAL FIELD
[0002] The present technology is generally related to the control,
operation, and adjustment of a medication delivery system in
response to changes in patient lifestyle, activity, or eating
habits, as detected by a gesture-based physical behavior detection
system.
BACKGROUND
[0003] Medical therapy delivery systems, such as fluid infusion
devices, are relatively well known in the medical arts for use in
delivering or dispensing an agent, such as insulin or another
prescribed medication, to a patient. A typical medication infusion
device includes a fluid pump mechanism and an associated drive
system that actuates a plunger or piston of a fluid reservoir to
deliver fluid medication from the reservoir to the body of a
patient via a fluid delivery conduit. Use of infusion pump therapy has been
increasing, especially for delivering insulin to diabetic
patients.
[0004] Control schemes have been developed to allow insulin
infusion devices to monitor and regulate a patient's blood glucose
level in a substantially continuous and autonomous manner. An
insulin infusion device can be operated in an automatic mode
wherein basal insulin is delivered at a rate that is automatically
adjusted for the user. Moreover, an insulin infusion device can be
operated to automatically calculate, recommend, and deliver insulin
boluses as needed (e.g., to compensate for meals consumed by the
user). Ideally, the amount of an insulin bolus should be accurately
calculated and administered to maintain the user's blood glucose
within the desired range. In particular, an automatically generated
and delivered insulin bolus should safely manage the user's blood
glucose level and keep it above a defined threshold level. To this
end, an insulin infusion device operating in an automatic mode uses
continuous glucose sensor data and control algorithms to regulate
the user's blood glucose, based on a target glucose setpoint
setting and user-initiated meal announcements that typically
include estimations of the amount of carbohydrates to be consumed
in an upcoming meal.
BRIEF SUMMARY
[0005] The subject matter of this disclosure generally relates to a
system and related operating methodologies for the control,
operation, and adjustment of a medication delivery system, such as
an insulin infusion device. Certain settings, parameters, or
operating modes of the medication delivery system can be adjusted
or modified in response to changes in patient lifestyle, activity,
or eating habits, as detected by a gesture-based physical behavior
detection system.
[0006] In one aspect, the present disclosure provides a method of
operating a medication delivery system having a fluid pump
mechanism, an analyte sensor to provide sensor data indicative of a
physiological characteristic of a user, and at least one controller
that regulates operation of the fluid pump mechanism to deliver
medication from the medication delivery system to the user, based
at least in part on the sensor data. Exemplary embodiments of the
method involve: performing adaptive training of at least one
feature, function, setting, or model associated with the medication
delivery system, based at least in part on the sensor data provided
by the analyte sensor; processing activity-identifying data that
indicates a current behavior pattern of the user, the
activity-identifying data including gesture data for the user, the
gesture data provided by a gesture-based physical behavior
detection system; determining, from the activity-identifying data,
that the current behavior pattern differs from a currently
implemented therapy behavior pattern of the user; and in response
to the determining, altering the adaptive training of the at least
one feature, function, setting, or model, resulting in an altered
adaptive training scheme.
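The patent does not specify an algorithm for this step, but the claimed control flow can be sketched as follows. All names here (AdaptiveTrainingScheme, the learning-rate alteration, the pattern labels) are hypothetical illustrations, not part of the disclosure:

```python
from dataclasses import dataclass

@dataclass
class AdaptiveTrainingScheme:
    """Hypothetical container for parameters being adaptively trained."""
    learning_rate: float
    paused: bool = False

def update_therapy(current_pattern: str,
                   therapy_pattern: str,
                   scheme: AdaptiveTrainingScheme) -> AdaptiveTrainingScheme:
    """Alter the adaptive training scheme when the gesture-indicated
    current behavior pattern differs from the behavior pattern the
    currently implemented therapy assumes."""
    if current_pattern != therapy_pattern:
        # One plausible alteration: slow and pause adaptation so that
        # atypical behavior does not corrupt the trained model.
        return AdaptiveTrainingScheme(
            learning_rate=scheme.learning_rate * 0.1,
            paused=True)
    # Behavior matches the therapy assumption: keep training unchanged.
    return scheme
```

A mismatch (e.g., gesture data indicating travel while the therapy assumes a weekday routine) would yield an altered scheme; a match leaves the existing scheme in place.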
[0007] In another aspect, the disclosure provides a non-transitory
computer readable medium having stored thereon program code
instructions that are configurable to cause at least one processor
to perform a method that involves: performing adaptive training of
at least one feature, function, setting, or model associated with a
medication delivery system, based at least in part on sensor data
provided by an analyte sensor that measures a physiological
characteristic of a user; processing activity-identifying data that
indicates a current behavior pattern of the user, the
activity-identifying data including gesture data for the user, the
gesture data provided by a gesture-based physical behavior
detection system; determining, from the activity-identifying data,
that the current behavior pattern differs from a currently
implemented therapy behavior pattern of the user; and in response
to the determining, altering the adaptive training of the at least
one feature, function, setting, or model, resulting in an altered
adaptive training scheme.
[0008] In yet another aspect, the disclosure provides a system
having: an insulin infusion device that regulates delivery of
insulin to a user; a gesture-based physical behavior detection
system configured to generate gesture data for the user, and
configured to communicate the gesture data; and at least one
controller that controls operation of the insulin infusion device.
The at least one controller is configured to: perform adaptive
training of at least one feature, function, setting, or model
associated with the insulin infusion device, based at least in part
on the sensor data; process activity-identifying data that
indicates a current behavior pattern of the user, the
activity-identifying data including gesture data for the user, the
gesture data provided by a gesture-based physical behavior
detection system; determine, from the activity-identifying data,
that the current behavior pattern differs from a currently
implemented therapy behavior pattern of the user; and in response
to the determination, alter the adaptive training of the at least
one feature, function, setting, or model, resulting in an altered
adaptive training scheme.
[0009] The details of one or more aspects of the disclosure are set
forth in the accompanying drawings and the description below. Other
features, objects, and advantages of the techniques described in
this disclosure will be apparent from the description and drawings,
and from the claims.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] FIG. 1 is a simplified block diagram representation of an
exemplary embodiment of a system that includes a medication
delivery system that responds to changes in patient activity as
indicated by the output of a gesture-based physical behavior
detection system;
[0011] FIG. 2 is a plan view of an exemplary embodiment of an
insulin infusion device that is suitable for use as the medication
delivery system shown in FIG. 1;
[0012] FIG. 3 is a top perspective view of an embodiment of an
insulin infusion device implemented as a patch pump device that is
suitable for use as the medication delivery system shown in FIG.
1;
[0013] FIG. 4 is a perspective view of an exemplary embodiment of a
smart insulin pen that is suitable for use as the medication
delivery system shown in FIG. 1;
[0014] FIG. 5 is a perspective view of an exemplary embodiment of a
smart pen accessory that is suitable for use with the medication
delivery system shown in FIG. 1;
[0015] FIG. 6 is a block diagram representation of an exemplary
embodiment of a computer-based or processor-based device suitable
for deployment in the system shown in FIG. 1;
[0016] FIG. 7 is a block diagram representation of a closed loop
glucose control system arranged in accordance with certain
embodiments;
[0017] FIG. 8 is a block diagram representation of a gesture-based
physical behavior detection system arranged in accordance with
certain embodiments;
[0018] FIG. 9 is a flow chart that illustrates an infusion device
control process according to certain embodiments; and
[0019] FIG. 10 is a flow chart that illustrates a gesture training
process according to certain embodiments.
DETAILED DESCRIPTION
[0020] The following detailed description is merely illustrative in
nature and is not intended to limit the embodiments of the subject
matter or the application and uses of such embodiments. As used
herein, the word "exemplary" means "serving as an example,
instance, or illustration." Any implementation described herein as
exemplary is not necessarily to be construed as preferred or
advantageous over other implementations. Furthermore, there is no
intention to be bound by any expressed or implied theory presented
in the preceding technical field, background, brief summary or the
following detailed description.
[0021] It should be understood that various aspects disclosed
herein may be combined in different arrangements than the
combinations specifically presented in the description and
accompanying drawings. It should also be understood that, depending
on the example, certain acts or events of any of the processes or
methods described herein may be performed in a different sequence,
may be added, merged, or left out altogether (e.g., all described
acts or events may not be necessary to carry out the techniques).
In addition, while certain aspects of this disclosure are described
as being performed by a single module or unit for purposes of
clarity, it should be understood that the techniques of this
disclosure may be performed by a combination of units or modules
associated with, for example, a medical device.
[0022] In one or more examples, the described techniques may be
implemented in hardware, software, firmware, or any combination
thereof. If implemented in software, the functions may be stored as
one or more instructions or code on a computer-readable medium and
executed by a hardware-based processing unit. Computer-readable
media may include non-transitory computer-readable media, which
corresponds to a tangible medium such as data storage media (e.g.,
RAM, ROM, EEPROM, flash memory, or any other medium that can be
used to store desired program code in the form of instructions or
data structures and that can be accessed by a computer).
[0023] Program code instructions may be configurable to be executed
by one or more processors, such as one or more digital signal
processors (DSPs), general purpose microprocessors, controllers,
application specific integrated circuits (ASICs), field
programmable logic arrays (FPGAs), or other equivalent integrated
or discrete logic circuitry. Accordingly, the term "processor" as
used herein may refer to any of the foregoing structure or any
other physical structure suitable for implementation of the
described techniques. Also, the techniques could be fully
implemented in one or more circuits or logic elements.
[0024] Techniques and technologies may be described herein in terms
of functional and/or logical block components, and with reference
to symbolic representations of operations, processing tasks, and
functions that may be performed by various computing components or
devices. Such operations, tasks, and functions are sometimes
referred to as being computer-executed, computerized,
software-implemented, or computer-implemented. It should be
appreciated that the various block components shown in the figures
may be realized by any number of hardware, software, and/or
firmware components configured to perform the specified functions.
For example, an embodiment of a system or a component may employ
various integrated circuit components, e.g., memory elements,
digital signal processing elements, logic elements, look-up tables,
or the like, which may carry out a variety of functions under the
control of one or more microprocessors or other control
devices.
[0025] FIG. 1 is a simplified block diagram representation of an
exemplary embodiment of a system 100 that responds to changes in
the user's activity (e.g., eating, sleeping, exercise, and/or
working habits) by regulating operation of a medication delivery
system 102 in an appropriate manner. In certain embodiments, the
medication delivery system 102 responds to changes in patient
activity as indicated by the output of a gesture-based physical
behavior detection system 104 and/or the output of at least one
ancillary sensor, detector, or measurement system 106 (hereinafter
referred to as ancillary system(s) 106). Certain embodiments of the
system 100 include, without limitation: the medication delivery
system 102 (or device) that regulates delivery of medication to a
user; at least one gesture-based physical behavior detection system
104 that monitors user behavior and/or status to obtain gesture
data that indicates user activity events or behavior; at least one
ancillary system 106; at least one user device 108 that includes or
cooperates with a suitably written and configured patient care
application 110; an analyte sensor 112 to measure a physiological
characteristic of the user, such that sensor data obtained from the
analyte sensor 112 can be used to control, regulate, or otherwise
influence the operation of the medication delivery system 102; and
at least one patient history and outcomes database 114. In
accordance with certain cloud-implemented embodiments, the system
includes at least one data processing system 116, which may be in
communication with any or all of the other components of the system
100. Other configurations and topologies for the system 100 are
also contemplated here, such as a system that includes additional
intermediary, interface, or data repeating devices in the data path
between a sending device and a receiving device.
[0026] At least some of the components of the system 100 are
communicatively coupled with one another to support data
communication, signaling, and/or transmission of control commands
as needed, via at least one communications network 120. The at
least one communications network 120 may support wireless data
communication and/or data communication using tangible data
communication links. FIG. 1 depicts network communication links in
a simplified manner. In practice, the system 100 may cooperate with
and leverage any number of wireless and any number of wired data
communication networks maintained or operated by various entities
and providers. Accordingly, communication between the various
components of the system 100 may involve multiple network links and
different data communication protocols. In this regard, the network
can include or cooperate with any of the following, without
limitation: a local area network; a wide area network; the
Internet; a personal area network; a near-field data communication
link; a cellular communication network; a satellite communication
network; a video services or television broadcasting network; a
network onboard a vehicle; or the like. The components of the
system 100 may be suitably configured to support a variety of
wireless and wired data communication protocols, technologies, and
techniques as needed for compatibility with the at least one
communication network 120.
[0027] The system 100 can support any type of medication delivery
system 102 that is compatible with the features and functionality
described here. For example, the medication delivery system 102 may
be realized as a user-activated or user-actuated fluid delivery
device, such as a manual syringe, an injection pen, a smart insulin
pen, or the like. As another example, the medication delivery
system 102 may be implemented as an electronic device that is
operated to regulate the delivery of medication fluid to the user.
In certain embodiments, however, the medication delivery system 102
includes or is realized as an insulin infusion device, e.g., a
portable patient-worn or patient-carried insulin pump, a smart
insulin pen, or the like. In such embodiments, the analyte sensor
112 includes or is realized as a glucose meter, a glucose sensor,
or a continuous glucose monitor. For the sake of brevity,
conventional techniques related to insulin infusion device
operation, infusion set operation, and other functional aspects of
the systems (and the individual operating components of the
systems) may not be described in detail here. Examples of infusion
pumps may be of the type described in, but not limited to, U.S.
Pat. Nos.: 4,562,751; 4,685,903; 5,080,653; 5,505,709; 5,097,122;
6,485,465; 6,554,798; 6,558,320; 6,558,351; 6,641,533; 6,659,980;
6,752,787; 6,817,990; 6,932,584; and 7,621,893; each of which is
herein incorporated by reference.
[0028] FIG. 2 is a plan view of an exemplary embodiment of an
insulin infusion device 130 suitable for use as the medication
delivery system 102 shown in FIG. 1. The insulin infusion device
130 is a portable medical device designed to be carried or worn by
the patient. The illustrated embodiment of the insulin infusion
device 130 includes a housing 132 adapted to receive an
insulin-containing reservoir (hidden from view in FIG. 2). An
opening in the housing 132 accommodates a fitting 134 (or cap) for
the reservoir, with the fitting 134 being configured to mate or
otherwise interface with tubing 136 of an infusion set 138 that
provides a fluid path to/from the body of the user. In this manner,
fluid communication from the interior of the insulin reservoir to
the user is established via the tubing 136. The illustrated version
of the insulin infusion device 130 includes a human-machine
interface (HMI) 140 (or user interface) that includes elements that
can be manipulated by the user to administer a bolus of fluid
(e.g., insulin), to change therapy settings, to change user
preferences, to select display features, and the like. The insulin
infusion device 130 also includes a display 142, such as a liquid
crystal display (LCD) or another suitable display technology, that
can be used to present various types of information or data to the
user, such as, without limitation: the current glucose level of the
patient; the time; a graph or chart of the patient's glucose level
versus time; device status indicators; etc. The insulin infusion
device 130 may be configured and controlled to support other
features and interactive functions described in more detail
below.
[0029] Generally, a fluid infusion device (such as the insulin
infusion device 130) includes a fluid pump mechanism having a motor
or other actuation arrangement that is operable to linearly
displace a plunger (or stopper) of a fluid reservoir provided
within the fluid infusion device to deliver a dosage of fluid
medication, such as insulin, to the body of a user. Dosage commands
that govern operation of the motor may be generated in an automated
manner in accordance with the delivery control scheme associated
with a particular operating mode, and the dosage commands may be
generated in a manner that is influenced by a current (or most
recent) measurement of a physiological condition in the body of the
user. For a glucose control system suitable for use by diabetic
patients, a closed-loop or automatic operating mode can be used to
generate insulin dosage commands based on a difference between a
current (or most recent) measurement of the interstitial fluid
glucose level in the body of the user and a target (or reference)
glucose setpoint value. In this regard, the rate of infusion may
vary as the difference between a current measurement value and the
target measurement value fluctuates. For purposes of explanation,
the subject matter is described herein in the context of the
infused fluid being insulin for regulating a glucose level of a
user (or patient); however, it should be appreciated that many
other fluids may be administered through infusion, and the subject
matter described herein is not necessarily limited to use with
insulin.
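Real closed-loop controllers of this kind typically use PID or model-predictive algorithms; as a minimal sketch of the principle described above (infusion rate varying with the difference between the measured glucose level and the target setpoint), one could write the following. The constants are illustrative only, not clinical values:

```python
def basal_rate_command(glucose_mgdl: float,
                       setpoint_mgdl: float = 120.0,
                       nominal_rate: float = 1.0,
                       gain: float = 0.01) -> float:
    """Toy proportional controller: command a rate above the nominal
    basal rate when glucose exceeds the setpoint, below it when
    glucose is under the setpoint, and never a negative rate."""
    error = glucose_mgdl - setpoint_mgdl  # mg/dL above or below target
    return max(0.0, nominal_rate + gain * error)
```

At the setpoint the command equals the nominal rate; as the measured value drifts, the command fluctuates with the error, illustrating why "the rate of infusion may vary as the difference ... fluctuates."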
[0030] FIG. 3 is a top perspective view of an embodiment of an
insulin infusion device 146 implemented as a patch pump device that
is suitable for use as the medication delivery system 102 shown in
FIG. 1. The insulin infusion device 146 can be implemented as a
combination device that includes an insertable insulin delivery
cannula and an insertable glucose sensor (both of which are hidden
from view in FIG. 3). In such an implementation, the glucose sensor
may take the place of the separate analyte sensor 112 shown in FIG.
1. The insulin infusion device 146 includes a housing 148 that
serves as a shell for a variety of internal components. FIG. 3
shows the insulin infusion device 146 with a removable fluid
cartridge 150 installed and secured therein. The housing 148 is
suitably configured to receive, secure, and release the removable
fluid cartridge 150. The insulin infusion device 146 includes at
least one user interface feature, which can be actuated by the
patient as needed. The illustrated embodiment of the insulin
infusion device 146 includes a button 152 that is physically
actuated. The button 152 can serve as a multipurpose user interface
element, making it easier for the user to operate the insulin
infusion device 146. In this regard, the button 152 can be used in
connection with one or more of the following functions, without
limitation: waking up the processor and/or electronics of the
insulin infusion device 146; triggering an insertion mechanism to
insert a fluid delivery cannula and/or an analyte sensor into the
subcutaneous space or similar region of the user; configuring one
or more settings of the insulin infusion device 146; initiating
delivery of medication fluid from the fluid cartridge 150;
initiating a fluid priming operation; disabling alerts or alarms
generated by the insulin infusion device 146; and the like. In lieu
of the button 152, the insulin infusion device 146 can employ a
slider mechanism, a pin, a lever, a switch, a touch-sensitive
element, or the like. In certain embodiments, the insulin infusion
device 146 may be configured and controlled to support other
features and interactive functions described in more detail
below.
[0031] FIG. 4 is a perspective view of an exemplary embodiment of a
smart insulin pen 160 suitable for use as the medication delivery
system 102 shown in FIG. 1. The pen 160 includes an injector body 162
and a cap 164. FIG. 4 shows the cap 164 removed from the injector
body 162, such that a delivery needle 166 is exposed. The pen 160
includes suitably configured electronics and processing capability
to communicate with an application running on a user device, such
as a smartphone, to support various functions and features such as:
tracking active insulin; calculating insulin dosages (boluses);
tracking insulin dosages; monitoring insulin supply levels; patient
reminders and notifications; and patient status reporting. In
certain embodiments, the smart insulin pen 160 can receive insulin
dosage recommendations or instructions and/or recommended dosing
times (or a recommended dosing schedule). Moreover, the smart
insulin pen 160 may be configured and controlled to support other
features and interactive functions described in more detail
below.
[0032] FIG. 5 is a perspective view of an exemplary embodiment of a
smart pen accessory 170 that is suitable for use with the
medication delivery system 102 shown in FIG. 1. In particular, the
smart pen accessory 170 cooperates with a "non-smart" insulin pen
that lacks the intelligence and functionality of a smart insulin
pen (as described above). The smart pen accessory 170 can be
realized as a pen cap, a clip-on apparatus, a sleeve, or the like.
The smart pen accessory 170 is attached to an insulin pen 172 such
that the smart pen accessory 170 can measure the amount of insulin
delivered by the insulin pen 172. The insulin dosage data is stored
by the smart pen accessory 170 along with corresponding date/time
stamp information. In certain embodiments, the smart pen accessory
170 can receive, store, and process additional patient-related or
therapy-related data, such as glucose data. Indeed, the smart pen
accessory 170 may also support various features and functions
described above in the context of the smart insulin pen 160. For
example, the smart pen accessory 170 may be configured to receive
insulin dosage recommendations or instructions and/or recommended
dosing times (or a recommended dosing schedule). Moreover, the
smart pen accessory 170 may be configured and controlled to support
other features and interactive functions described in more detail
below.
[0033] The analyte sensor 112 may communicate sensor data to the
medication delivery system 102 for use in regulating or controlling
operation of the medication delivery system 102. Alternatively or
additionally, the analyte sensor 112 may communicate sensor data to
one or more other components in the system 100, such as, without
limitation: a user device 108 (for use with the patient care
application 110); a data processing system 116; and/or a patient
history and outcomes database 114.
[0034] The system 100 can support any number of user devices 108
linked to the particular user or patient. In this regard, a user
device 108 may be, without limitation: a smartphone device; a
laptop, desktop, or tablet computer device; a medical device; a
wearable device; a global positioning system (GPS) receiver device;
a system, component, or feature onboard a vehicle; a smartwatch
device; a television system; a household appliance; a video game
device; a media player device; or the like. For the example
described here, the medication delivery system 102 and the at least
one user device 108 are owned by, operated by, or otherwise linked
to a user/patient. Any given user device 108 can host, run, or
otherwise execute the patient care application 110. In certain
embodiments, for example, the user device 108 is implemented as a
smartphone with the patient care application 110 installed thereon.
In accordance with another example, the patient care application
110 is implemented in the form of a website or webpage, e.g., a
website of a healthcare provider, a website of the manufacturer,
supplier, or retailer of the medication delivery system 102, or a
website of the manufacturer, supplier, or retailer of the analyte
sensor 112. In accordance with another example, the medication
delivery system 102 executes the patient care application 110 as a
native function.
[0035] In certain embodiments, at least some of the features or
output of the gesture-based physical behavior detection system 104
and/or the ancillary system(s) 106 can be used to influence
features, functions, and/or therapy-related operations of the
medication delivery system 102. In particular, the systems 104, 106
may be suitably configured and operated to generate and provide
output (e.g., data, control signals, markers, or flags) that
indicates whether the user's behavior or activity is out of the
ordinary, is unusual, or has significantly changed relative to a
currently implemented or active therapy behavior pattern of the
user, such that the medication delivery system 102 can dynamically
respond in an appropriate manner that contemplates a change in user
activity. Changes in user activity patterns, behavior, routine, or
lifestyle may include, for example: working longer hours than
usual; working on a different schedule than usual; eating,
drinking, walking, or exercising more than usual combined with an
unusual geographic location (which might indicate that the user is
on vacation or is traveling for business); a new or altered
exercise regimen; a change in diet or eating schedule; the addition
of a daily walking or running routine as a result of a new dog;
weekly participation in a sport due to the start of a recreational
league; or the like.
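For illustration only, the out-of-the-ordinary determination described above might be sketched as a comparison of aggregated activity metrics against the currently implemented therapy behavior pattern. All names, fields, and the tolerance value below are assumptions of this sketch, not part of the disclosed system:

```python
from dataclasses import dataclass

@dataclass
class ActivitySummary:
    """Aggregated activity metrics over a rolling window (fields illustrative)."""
    meals_per_day: float
    exercise_minutes_per_day: float
    sleep_hours_per_day: float

def activity_changed(current: ActivitySummary,
                     baseline: ActivitySummary,
                     tolerance: float = 0.25) -> bool:
    """Return True when any metric deviates from the baseline (the currently
    implemented therapy behavior pattern) by more than the relative tolerance."""
    pairs = [
        (current.meals_per_day, baseline.meals_per_day),
        (current.exercise_minutes_per_day, baseline.exercise_minutes_per_day),
        (current.sleep_hours_per_day, baseline.sleep_hours_per_day),
    ]
    return any(base and abs(cur - base) > tolerance * base
               for cur, base in pairs)
```

A flag of this general kind could accompany the output (data, control signals, markers, or flags) that the systems 104, 106 provide to the medication delivery system 102.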
[0036] As described in more detail below, the gesture-based
physical behavior detection system 104 includes one or more
sensors, detectors, measurement devices, and/or readers to
automatically detect certain user gestures that correlate to user
behavior, eating habits, work habits, or the like (e.g.,
work-related physical activity, commuting, eating at common meal
times, eating particular portion sizes, sleeping, exercising, or
watching television). The gesture-based physical behavior detection
system 104 may communicate gesture data to the medication delivery
system 102, the user device 108, and/or the data processing system
116 for processing in an appropriate manner for use in regulating
or controlling certain functions of the medication delivery system
102. For example, the gesture data may be communicated to a user
device 108, such that the user device 108 can process the gesture
data and inform the user or the medication delivery system 102 as
needed (e.g., remotely regulate or control certain functions of the
medication delivery system 102). As another example, the
gesture-based physical behavior detection system 104 may
communicate the gesture data to one or more cloud computing systems
or servers (such as a remote data processing system 116) for
appropriate processing and handling in the manner described
herein.
[0037] Similarly, an ancillary system 106 may include one or more
sensors, detectors, measurement devices, and/or readers that obtain
ancillary user status data that correlates to user activity,
detectable behavior, eating habits, etc. In certain embodiments, an
ancillary system 106 may include, cooperate with, or be realized as
any of the following, without limitation: a heartrate monitor
linked to the user; a blood pressure monitor linked to the user; a
respiratory rate monitor linked to the user; a vital signs monitor
linked to the user; a microphone; a thermometer (for the user's
body temperature and/or the environmental temperature); a sweat
detector linked to the user; an activity tracker linked to the
user; a global positioning system (GPS); a clock, calendar, or
appointment application linked to the user; a pedometer linked to
the user; or the like. An ancillary system 106 may be configured
and operated to communicate its output (user status data) to one or
more components of the system 100 for analysis, processing, and
handling in the manner described herein. In certain embodiments,
user status data obtained from one or more ancillary systems 106
supplements the gesture data obtained from the gesture-based
physical behavior detection system 104, such that user habits,
physical behavior, and activity events are accurately and reliably
detected. For example, the user's location (obtained from GPS
location data) can be useful to identify a change in the user's
lifestyle or behavior patterns, wherein the change results from a
move to a new city, a vacation, a business trip, or the like.
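As one hypothetical illustration of using GPS location data to notice such a change, the distance of a location fix from the user's usual region could be checked with the haversine formula. The distance threshold and the notion of a single fixed "home" location are assumptions of this sketch:

```python
import math

def far_from_home(lat: float, lon: float,
                  home_lat: float, home_lon: float,
                  threshold_km: float = 100.0) -> bool:
    """Flag GPS fixes well outside the user's usual region, using the
    haversine great-circle distance on a spherical Earth model."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat), math.radians(home_lat)
    dp = math.radians(home_lat - lat)
    dl = math.radians(home_lon - lon)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a)) > threshold_km
```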
[0038] In certain embodiments, the gesture-based physical behavior
detection system 104 and the medication delivery system 102 are
implemented as physically distinct and separate components, as
depicted in FIG. 1. In such embodiments, the gesture-based physical
behavior detection system 104 is external to the medication
delivery system 102 and is realized as an ancillary component,
relative to the medication delivery system 102. In accordance with
alternative embodiments, however, the medication delivery system
102 and the gesture-based physical behavior detection system 104
can be combined into a single hardware component or provided as a
set of attached hardware devices. For example, the medication
delivery system 102 may include the gesture-based physical behavior
detection system 104 or integrate the functionality of the system
104. Similarly, the analyte sensor 112 can be incorporated with the
medication delivery system 102 or the gesture-based physical
behavior detection system 104. These and other arrangements,
deployments, and topologies of the system 100 are contemplated by
this disclosure.
[0039] The at least one patient history and outcomes database 114
includes historical data related to the user's physical condition,
physiological response to the medication regulated by the
medication delivery system 102, activity patterns or related
information, eating patterns and habits, work habits, and the like.
In accordance with embodiments where the medication delivery system
102 is an insulin infusion device and the analyte sensor 112 is a
glucose meter, sensor, or monitor, the database 114 can maintain
any of the following, without limitation: historical glucose data
and corresponding date/time stamp information; insulin delivery and
dosage information; user-entered stress markers or indicators;
gesture data (provided by the gesture-based physical behavior
detection system 104) and corresponding date/time stamp
information; ancillary user status data (provided by one or more
ancillary systems 106) and corresponding date/time stamp data; diet
or food intake history for the user; location information; and/or
any other information that may be generated by or used by the
system 100 for purposes of controlling the operation of the
medication delivery system 102. In certain embodiments, the at
least one patient history and outcomes database 114 can receive and
maintain training data that is utilized to train, configure, and
initialize the system 100 based on historical user behavior,
physiological state, operation of the medication delivery system
102, and user-identified activity events.
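The kinds of entries enumerated above might be organized as time-stamped records. The schema below is purely illustrative; the field names and types are assumptions of this sketch, not drawn from the disclosure:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional, Tuple

@dataclass
class PatientHistoryRecord:
    """One time-stamped entry in a patient history and outcomes database."""
    timestamp: datetime
    glucose_mg_dl: Optional[float] = None            # historical glucose data
    insulin_delivered_units: Optional[float] = None  # delivery/dosage information
    stress_marker: Optional[str] = None              # user-entered indicator
    gesture_label: Optional[str] = None              # from detection system 104
    ancillary_status: dict = field(default_factory=dict)  # e.g., heart rate
    location: Optional[Tuple[float, float]] = None   # location information
```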
[0040] A patient history and outcomes database 114 may reside at a
user device 108, at the medication delivery system 102, at a data
processing system 116, or at any network-accessible location (e.g.,
a cloud-based database or server system). In certain embodiments, a
patient history and outcomes database 114 may be included with the
patient care application 110. The patient history and outcomes
database 114 enables the system 100 to generate recommendations,
warnings, and guidance for the user and/or to regulate the manner
in which the medication delivery system 102 functions to administer
therapy to the user, based on detected changes in the user's
activity (which may be temporary, ongoing for an extended period of
time, or somewhat permanent in nature).
[0041] In accordance with certain embodiments, any or all of the
components shown in FIG. 1 can be implemented as a computer-based
or a processor-based device, system, or component having suitably
configured hardware and software written to perform the functions
and methods needed to support the features described herein. In
this regard, FIG. 6 is a simplified block diagram representation of
an exemplary embodiment of a computer-based or processor-based
device 200 that is suitable for deployment in the system 100 shown
in FIG. 1.
[0042] The illustrated embodiment of the device 200 is intended to
be a high-level and generic representation of one suitable
platform. In this regard, any computer-based or processor-based
component of the system 100 can utilize the architecture of the
device 200. The illustrated embodiment of the device 200 generally
includes, without limitation: at least one controller (or
processor) 202; a suitable amount of memory 204 that is associated
with the at least one controller 202; device-specific items 206
(including, without limitation: hardware, software, firmware, user
interface (UI), alerting, and notification features); a power
supply 208 such as a disposable or rechargeable battery; a
communication interface 210; at least one application programming
interface (API) 212; and a display element 214. Of course, an
implementation of the device 200 may include additional elements,
components, modules, and functionality configured to support
various features that are unrelated to the primary subject matter
described here. For example, the device 200 may include certain
features and elements to support conventional functions that might
be related to the particular implementation and deployment of the
device 200. In practice, the elements of the device 200 may be
coupled together via at least one bus or any suitable
interconnection architecture 216.
[0043] The at least one controller 202 may be implemented or
performed with a general purpose processor, a content addressable
memory, a microcontroller unit, a digital signal processor, an
application specific integrated circuit, a field programmable gate
array, any suitable programmable logic device, discrete gate or
transistor logic, discrete hardware components, or any combination
designed to perform the functions described here. Moreover, the at
least one controller 202 may be implemented as a combination of
computing devices, e.g., a combination of a digital signal
processor and a microprocessor, a plurality of microprocessors, one
or more microprocessors in conjunction with a digital signal
processor core, or any other such configuration.
[0044] The memory 204 may be realized as at least one memory
element, device, module, or unit, such as: RAM memory, flash
memory, EPROM memory, EEPROM memory, registers, a hard disk, a
removable disk, a CD-ROM, or any other form of storage medium known
in the art. In this regard, the memory 204 can be coupled to the at
least one controller 202 such that the at least one controller 202
can read information from, and write information to, the memory
204. In the alternative, the memory 204 may be integral to the at
least one controller 202. As an example, the at least one
controller 202 and the memory 204 may reside in an ASIC. At least a
portion of the memory 204 can be realized as a computer storage
medium that is operatively associated with the at least one
controller 202, e.g., a tangible, non-transitory computer-readable
medium having computer-executable instructions stored thereon. The
computer-executable instructions are configurable to be executed by
the at least one controller 202 to cause the at least one
controller 202 to perform certain tasks, operations, functions, and
processes that are specific to the particular embodiment. In this
regard, the memory 204 may represent one suitable implementation of
such computer-readable media. Alternatively or additionally, the
device 200 could receive and cooperate with computer-readable media
(not separately shown) that is realized as a portable or mobile
component or platform, e.g., a portable hard drive, a USB flash
drive, an optical disc, or the like.
[0045] The device-specific items 206 may vary from one embodiment
of the device 200 to another. For example, the device-specific
items 206 will support: sensor device operations when the device
200 is realized as a sensor device; smartphone features and
functionality when the device 200 is realized as a smartphone;
activity tracker features and functionality when the device 200 is
realized as an activity tracker; smart watch features and
functionality when the device 200 is realized as a smart watch;
medical device features and functionality when the device is
realized as a medical device; etc. In practice, certain portions or
aspects of the device-specific items 206 may be implemented in one
or more of the other blocks depicted in FIG. 6.
[0046] If present, the UI of the device 200 may include or
cooperate with various features to allow a user to interact with
the device 200. Accordingly, the UI may include various
human-to-machine interfaces, e.g., a keypad, keys, a keyboard,
buttons, switches, knobs, a touchpad, a joystick, a pointing
device, a virtual writing tablet, a touch screen, a microphone, or
any device, component, or function that enables the user to select
options, input information, or otherwise control the operation of
the device 200. The UI may include one or more graphical user
interface (GUI) control elements that enable a user to manipulate
or otherwise interact with an application via the display element
214. The display element 214 and/or the device-specific items 206
may be utilized to generate, present, render, output, and/or
annunciate alerts, alarms, messages, or notifications that are
associated with operation of the medication delivery system 102,
associated with a status or condition of the user, associated with
operation, status, or condition of the system 100, etc.
[0047] The communication interface 210 facilitates data
communication between the device 200 and other components as needed
during the operation of the device 200. In the context of this
description, the communication interface 210 can be employed to
transmit or stream device-related control data, patient-related
user status (e.g., gesture data or status data), device-related
status or operational data, sensor data, calibration data, and the
like. It should be appreciated that the particular configuration
and functionality of the communication interface 210 can vary
depending on the hardware platform and specific implementation of
the device 200. In practice, an embodiment of the device 200 may
support wireless data communication and/or wired data
communication, using various data communication protocols. For
example, the communication interface 210 could support one or more
wireless data communication protocols, techniques, or
methodologies, including, without limitation: RF; IrDA (infrared);
Bluetooth; BLE; ZigBee (and other variants of the IEEE 802.15
protocol); IEEE 802.11 (any variation); IEEE 802.16 (WiMAX or any
other variation); Direct Sequence Spread Spectrum; Frequency
Hopping Spread Spectrum; cellular/wireless/cordless
telecommunication protocols; wireless home network communication
protocols; paging network protocols; magnetic induction; satellite
data communication protocols; wireless hospital or health care
facility network protocols such as those operating in the WMTS
bands; GPRS; and proprietary wireless data communication protocols
such as variants of Wireless USB. Moreover, the communication
interface 210 could support one or more wired/cabled data
communication protocols, including, without limitation: Ethernet;
powerline; home network communication protocols; USB; IEEE 1394
(Firewire); hospital network communication protocols; and
proprietary data communication protocols.
[0048] The at least one API 212 supports communication and
interactions between software applications and logical components
that are associated with operation of the device 200. For example,
one or more APIs 212 may be configured to facilitate compatible
communication and cooperation with the patient care application
110, and to facilitate receipt and processing of data from sources
external to the device 200 (e.g., databases or remote devices and
systems).
[0049] The display element 214 is suitably configured to enable the
device 200 to render and display various screens, recommendation
messages, alerts, alarms, notifications, GUIs, GUI control
elements, drop down menus, auto-fill fields, text entry fields,
message fields, or the like. Of course, the display element 214 may
also be utilized for the display of other information during the
operation of the device 200, as is well understood. Notably, the
specific configuration, operating characteristics, size,
resolution, and functionality of the display element 214 can vary
depending upon the implementation of the device 200.
[0050] As mentioned above, the medication delivery system 102 is
suitably configured and programmed to support an automatic mode to
automatically control delivery of insulin to the user. In this
regard, FIG. 7 is a simplified block diagram representation of a
closed loop glucose control system 300 arranged in accordance with
certain embodiments. The system 300 depicted in FIG. 7 functions to
regulate the rate of fluid infusion into a body of a user based on
feedback from an analyte concentration measurement taken from the
body. In particular embodiments, the system 300 is implemented as
an automated control system for regulating the rate of insulin
infusion into the body of a user based on a glucose concentration
measurement taken from the body. The system 300 is designed to
model the physiological response of the user to control an insulin
infusion device 302 in an appropriate manner to release insulin 304
into the body 306 of the user in a similar concentration profile as
would be created by fully functioning human β-cells when
responding to changes in blood glucose concentrations in the body.
Thus, the system 300 simulates the body's natural insulin response
to blood glucose levels; it not only makes efficient use of insulin,
but also accounts for other bodily functions, since insulin has both
metabolic and mitogenic effects.
[0051] Certain embodiments of the system 300 include, without
limitation: the insulin infusion device 302; a glucose sensor
system 308 (e.g., the analyte sensor 112 shown in FIG. 1); and at
least one controller 310, which may be incorporated in the insulin
infusion device 302 as shown in FIG. 7. The glucose sensor system
308 generates a sensor signal 314 representative of blood glucose
levels 316 in the body 306, and provides the sensor signal 314 to
the at least one controller 310. The at least one controller 310
receives the sensor signal 314 and generates commands 320 that
regulate the timing and dosage of insulin 304 delivered by the
insulin infusion device 302. The commands 320 are generated in
response to various factors, variables, settings, and control
algorithms utilized by the insulin infusion device 302. For
example, the commands 320 (and, therefore, the delivery of insulin
304) can be influenced by a target glucose setpoint value 322 that
is maintained and regulated by the insulin infusion device 302.
Moreover, the commands 320 (and, therefore, the delivery of insulin
304) can be influenced by any number of adaptive parameters and
factors 324. The adaptive parameters and factors 324 may be
associated with or used by: a therapy control algorithm of the
insulin infusion device 302; a digital twin model of the patient,
which can be used to recommend manual insulin dosages; a meal
prediction algorithm; a user glucose prediction algorithm; or the
like. In this context, the adaptive parameters and factors 324 may
include, without limitation: a time constant; a threshold value; a
glucose limit; an insulin delivery limit; and/or a gain value.
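The feedback relationship above can be illustrated with a deliberately simplified control law. The proportional rule, gain, setpoint, and delivery limit below are assumptions of this sketch; the actual therapy control algorithm and the adaptive parameters and factors 324 are not specified here:

```python
def insulin_command(glucose_mg_dl: float,
                    setpoint_mg_dl: float = 120.0,
                    gain_units_per_hr_per_mg_dl: float = 0.01,
                    max_rate_units_per_hr: float = 2.0) -> float:
    """Map a sensor glucose reading to an insulin delivery-rate command.
    A purely proportional law clamped to [0, max_rate]; glucose at or
    below the setpoint yields zero commanded delivery."""
    error = glucose_mg_dl - setpoint_mg_dl
    rate = gain_units_per_hr_per_mg_dl * error
    return min(max(rate, 0.0), max_rate_units_per_hr)
```

In the terms of FIG. 7, the return value plays the role of a command 320, the setpoint corresponds to the target glucose setpoint value 322, and the gain and delivery limit stand in for adaptive parameters and factors 324.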
[0052] Generally, the glucose sensor system 308 includes a
continuous glucose sensor, sensor electrical components to provide
power to the sensor and generate the sensor signal 314, a sensor
communication system to carry the sensor signal 314 to the at least
one controller 310, and a sensor system housing for the electrical
components and the sensor communication system. As mentioned above
with reference to FIG. 6, the glucose sensor system 308 may be
implemented as a computer-based or processor-based component having
the described configuration and features.
[0053] Typically, the at least one controller 310 includes
controller electrical components and software to generate commands
for the insulin infusion device 302 based on the sensor signal 314,
the target glucose setpoint value 322, the adaptive parameters and
factors 324, and other user-specific parameters, settings, and
factors. The at least one controller 310 may include a controller
communication system to receive the sensor signal 314 and issue the
commands 320.
[0054] Generally, the insulin infusion device 302 includes a fluid
pump mechanism 328, a fluid reservoir 330 for the medication (e.g.,
insulin), and an infusion tube to infuse the insulin 304 into the
body 306. In certain embodiments, the insulin infusion device 302
includes an infusion communication system to handle the commands
320 from the at least one controller 310, electrical components and
programmed logic to activate the fluid pump mechanism 328 motor
according to the commands 320, and a housing to hold the components
of the insulin infusion device 302. Accordingly, the fluid pump
mechanism 328 receives the commands 320 and delivers the insulin
304 from the fluid reservoir 330 to the body 306 in accordance with
the commands 320. It should be appreciated that an embodiment of
the insulin infusion device 302 can include additional elements,
components, and features that may provide conventional
functionality that need not be described herein. Moreover, an
embodiment of the insulin infusion device 302 can include
alternative elements, components, and features if so desired, as
long as the intended and described functionality remains in place.
In this regard, as mentioned above with reference to FIG. 6, the
insulin infusion device 302 may be implemented as a computer-based
or processor-based component having the described configuration
and features, including the display element 214 or other
device-specific items 206 as described above.
[0055] The at least one controller 310 is configured and programmed
to regulate the operation of the fluid pump mechanism 328 and other
functions of the insulin infusion device 302. The at least one
controller 310 controls the fluid pump mechanism 328 to deliver the
fluid medication (e.g., insulin) from the fluid reservoir 330 to
the body 306. As mentioned above, the at least one controller 310
can be housed in the infusion device housing, wherein the infusion
communication system is an electrical trace or a wire that carries
the commands 320 from the at least one controller 310 to the fluid
pump mechanism 328. In alternative embodiments, the at least one
controller 310 can be housed in the sensor system housing, wherein
the sensor communication system is an electrical trace or a wire
that carries the sensor signal 314 from the sensor electrical
components to the at least one controller 310. In accordance with
some embodiments, the at least one controller 310 has its own
housing or is included in a supplemental or ancillary device. In
other embodiments, the at least one controller 310, the insulin
infusion device 302, and the glucose sensor system 308 are all
located within one common housing.
[0056] Referring again to FIG. 1, the gesture-based physical
behavior detection system 104 employs at least one sensor to obtain
corresponding user-specific sensor data. The obtained user-specific
sensor data is processed or analyzed by the gesture-based physical
behavior detection system 104 and/or by another suitably configured
device or component of the system 100 to determine whether the
user's current behavior reflects a significant or measurable change
in activity, relative to a currently implemented, active, or
monitored therapy behavior pattern of the user. The obtained
user-specific sensor data may also be processed or analyzed to
obtain certain activity-related parameters, characteristics, and/or
metadata for the user. For example, the obtained user-specific
sensor data may identify, include, or indicate any or all of the
following, without limitation: timestamp data corresponding to the
occurrence of detected events; a type, category, or classification
of the detected physical behavior or activity; location data; user
posture or position information; etc. In some examples, the type,
category, or classification of detected physical behavior or
activity can correspond to activity duration and/or intensity.
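The event attributes listed above can be pictured as a per-event record; the field names and types below are illustrative assumptions only:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional, Tuple

@dataclass
class GestureEvent:
    """One detected gesture/activity event with its associated metadata."""
    timestamp: datetime                             # when the event was detected
    category: str                                   # type/classification, e.g. "eating"
    duration_s: float                               # activity duration in seconds
    intensity: str                                  # e.g. "low", "moderate", "high"
    location: Optional[Tuple[float, float]] = None  # optional location data
    posture: Optional[str] = None                   # user posture or position
```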
[0057] The gesture-based physical behavior detection system 104 may
include, cooperate with, or be realized as a motion-based physical
behavior detection system, an activity-based physical behavior
detection system, an image or video based activity detection
system, or the like. In certain embodiments, the system 104 may be
realized as a unitary "self-contained" wearable system that
communicates with one or more other components of the system 100.
For example, the system 104 can be implemented with at least one
wearable device such as an activity monitor device, a smart watch
device, a smart bracelet or wristband device, or the like. In some
embodiments, the system 104 may be realized as at least one
portable or wearable device that includes or communicates with one
or more external or ancillary sensor devices, units, or components.
For example, the system 104 can be implemented with a wearable or
portable smart device that is linked with one or more external
sensors worn or carried by the user. These and other possible
deployments of the system 104 are contemplated by this disclosure.
In this regard, United States patent publication number US
2020/0135320 and United States patent publication number US
2020/0289373 disclose gesture-based physical behavior detection
systems that are suitable for use as the system 104; the entire
content of these United States patent documents is incorporated by
reference herein.
[0058] FIG. 8 is a block diagram representation of a gesture-based
physical behavior detection system 400 arranged in accordance with
certain embodiments. The system 400 is suitable for use with the
system 100 shown in FIG. 1. In certain embodiments, the system 400 is
deployed as a wearable electronic device in the form factor of a
bracelet or wristband that is worn around the wrist or arm of a
user's dominant hand. The system 400 may optionally be implemented
using a modular design, wherein individual modules include one or
more subsets of the disclosed components and overall functionality.
The user may choose to add specific modules based on personal
preferences and requirements.
[0059] The system 400 includes a battery 402 and a power management
unit (PMU) 404 to deliver power at the proper supply voltage levels
to all electronic circuits and components. The PMU 404 may also
include battery-recharging circuitry, as well as hardware, such as
switches, that allows power to specific electronic circuits and
components to be cut off when not in
use.
[0060] When there is no movement-based or gesture-based behavior
event in progress, most circuitry and components in the system 400
are switched off to conserve power. Only circuitry and components
that are required to detect or help predict the start of a behavior
event of interest may remain enabled. For example, if no motion is
being detected, all sensor circuits but an accelerometer 406 may be
switched off and the accelerometer 406 may be put in a low-power
wake-on-motion mode or in another lower power mode that consumes
less power and fewer processing resources than its high-performance
active mode. A controller 408 of the system 400 may
also be placed into a low-power mode to conserve power. When motion
or a certain motion pattern is detected, the accelerometer 406
and/or the controller 408 may switch into a higher power mode and
additional sensors such as, for example, a gyroscope 410 and/or a
proximity sensor 412 may also be enabled. When a potential start of
a movement-based or gesture-based event is detected, memory
variables for storing event-specific parameters, such as gesture
types, gesture duration, etc., can be initialized.
[0061] In another example, upon detection of user motion, the
accelerometer 406 switches into a higher power mode, but other
sensors remain switched off until the data from the accelerometer
406 indicates that the start of a behavior event has likely
occurred. At that point in time, additional sensors such as the
gyroscope 410 and the proximity sensor 412 may be enabled.
[0062] In another example, when there is no behavior event in
progress, both the accelerometer 406 and gyroscope 410 are enabled
but at least one of either the accelerometer 406 or the gyroscope
410 is placed in a lower power mode compared to their regular power
mode. For example, the sampling rate may be reduced to conserve
power. Similarly, the circuitry required to transfer data from the
system 400 to a destination device may be placed in a lower power
mode. For example, the radio circuitry 414 could be disabled until
a possible or likely start of a behavior event has been determined.
Alternatively, it may remain
enabled but in a low power state to maintain the connection between
the system 400 and one or more other components of the system 100,
but without transferring user status data, sensor data, or the
like.
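The staged power-up scheme described above can be sketched as a small state manager. This is a minimal illustration, not the patent's implementation; all class, method, and mode names here (SensorPowerManager, on_probable_event_start, and so on) are invented for the example.

```python
# Illustrative sketch of the staged sensor power-up scheme: only the
# accelerometer stays awake until motion suggests a behavior event may be
# starting, then ancillary sensors and the radio are enabled.
LOW_POWER, ACTIVE, OFF = "low_power", "active", "off"

class SensorPowerManager:
    def __init__(self):
        # Idle state: accelerometer in wake-on-motion mode, everything else off.
        self.modes = {"accelerometer": LOW_POWER, "gyroscope": OFF,
                      "proximity": OFF, "radio": OFF}

    def on_motion_detected(self):
        # Motion detected: the accelerometer leaves its low-power mode.
        self.modes["accelerometer"] = ACTIVE

    def on_probable_event_start(self):
        # Accelerometer data indicates an event has likely begun: enable the
        # gyroscope and proximity sensor; keep the radio connected but quiet.
        self.modes["gyroscope"] = ACTIVE
        self.modes["proximity"] = ACTIVE
        self.modes["radio"] = LOW_POWER

    def on_event_end(self):
        # Event over: return to the idle configuration to conserve power.
        self.modes = {"accelerometer": LOW_POWER, "gyroscope": OFF,
                      "proximity": OFF, "radio": OFF}
```

The two-step wake-up (motion first, event-likelihood second) is what lets the gyroscope and radio stay dark during incidental arm movement.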
[0063] In yet another example, all motion-detection related
circuitry may be switched off if, based on certain metadata, it is
determined that the occurrence of a particular behavior event, such
as a food intake event, is unlikely. This may be desirable to
further conserve power. Metadata used to make this determination
may, among other things, include one or more of the following: time
of the day; location; ambient light levels; proximity sensing;
detection that the system 400 has been removed from the wrist or
hand; detection that the system 400 is being charged; or the like.
Metadata may be generated and collected by the system 400.
Alternatively, metadata may be collected by another device that is
external to the system 400 and is configured to directly or
indirectly exchange information with the system 400. It is also
possible that some metadata is generated and collected by the
system 400, while other metadata is generated and collected by a
device that is external to the system 400. In case some or all of
the metadata is generated and collected external to the system 400,
the system 400 may periodically or from time to time power up its
radio circuitry 414 to retrieve metadata related information from
another device.
[0064] In certain embodiments, some or all of the sensors may be
turned on or placed in a higher power mode if certain metadata
indicates that the occurrence of a particular behavior event, such
as the user beginning to work, jog, or eat, is likely. Metadata
used to make this determination may, among other things, include
one or more of the following: time of the day; location; ambient
light levels; proximity sensing; historical user behavior patterns.
Some or all of the metadata may be collected by the system 400 or
by an ancillary device that cooperates or communicates with the
system 400, as mentioned above.
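The metadata gating in paragraphs [0063] and [0064] amounts to a predicate over contextual signals. The following toy function combines a few of the listed metadata inputs; the threshold of one hour around historical mealtimes and all parameter names are assumptions made for illustration only.

```python
# Toy predicate: decide whether a behavior event (e.g., food intake) is
# plausible enough to justify keeping motion-detection circuitry powered.
def event_plausible(hour, on_wrist, charging, typical_meal_hours):
    # Device removed from the wrist or charging: events are unlikely,
    # so motion-detection circuitry can be switched off entirely.
    if not on_wrist or charging:
        return False
    # Historical user behavior patterns: only sense near usual mealtimes.
    return any(abs(hour - h) <= 1 for h in typical_meal_hours)
```

In a real system the decision would weigh many more signals (location, ambient light, proximity) and would likely be probabilistic rather than a hard cutoff.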
[0065] User status data used to track certain aspects of a user's
behavior may be stored locally inside memory 416 of the system 400
and processed locally using the controller 408 of the system 400.
User status data may also be transferred to the medication delivery
system 102, the patient care application 110, and/or one or more of
the databases 114 mentioned above with reference to FIG. 1 (such
that the user status data can be processed, analyzed, or otherwise
utilized by the applications or components that receive the user
status data). It is also possible that some of the processing and
analysis are performed locally by the system 400, while further
processing and analysis are performed by one or more other
components of the system 100.
[0066] The detection of the start of a behavior event, such as the
start of a work activity, may trigger the power up and/or
activation of additional sensors and circuitry, such as a camera
418. Power up and/or activation of additional sensors and circuitry
may occur at the same time as the detection of the behavior event
of interest or some time thereafter. Specific sensors and circuitry
may be turned on only at specific times during a detected event,
and may be switched off otherwise to conserve power. It is also
possible that the camera 418 only gets powered up or activated upon
explicit user intervention such as, for example, pushing and
holding a button 420. Releasing the button 420 may turn off the
camera 418 to conserve power.
[0067] When the camera 418 is powered up, a projecting light source
422 may also be enabled to provide visual feedback to the user
about the area that is within view of the camera or to otherwise
illuminate the field of view. Alternatively, the projecting light
source 422 may only be activated sometime after the camera 418 has
been activated. In certain cases, additional conditions may need to
be met before the projecting light source 422 is activated. Such
conditions may include: the determination that the projecting light
source 422 is likely aiming in the direction of the object of
interest; the determination that the system 400 is not moving
excessively; or the like. In some embodiments, one or more light
emitting diodes (LEDs) 426 may be used as the projecting light
source 422.
[0068] Images may be tagged with additional information or metadata
such as: camera focal information; proximity information from the
proximity sensor 412; ambient light levels information from an
ambient light sensor 424; timestamp information; etc. Such
additional information or metadata may be used during the
processing and analysis of the user status data.
[0069] The projecting light source 422 may also be used to
communicate other information. As an example, an ancillary device
may use inputs from one or more proximity sensors 412, process
those inputs to determine if the camera 418 is within the proper
distance range from the object of interest, and use one or more
light sources to communicate that the camera is within the proper
distance range, that the user needs to increase the distance
between the camera and the object of interest, or that the user needs
to reduce the distance between the camera and the object of
interest.
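The distance-range feedback just described maps a proximity reading onto one of three light signals. The sketch below is illustrative only; the centimeter bounds are invented placeholders, not values from the disclosure.

```python
# Toy distance-feedback mapping for the projecting light source: the
# range bounds (15 cm and 40 cm) are hypothetical defaults.
def distance_signal(distance_cm, lo=15, hi=40):
    if distance_cm < lo:
        return "move_back"     # camera too close to the object of interest
    if distance_cm > hi:
        return "move_closer"   # camera too far from the object of interest
    return "in_range"          # proper distance range for image capture
```

Each returned label would drive a distinct light color or blink pattern, per the signaling mechanisms discussed in paragraph [0072].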
[0070] The projecting light source 422 may also be used in
combination with the ambient light sensor 424 to communicate to the
user if the ambient light is insufficient or too strong for an
adequate quality image capture. The projecting light source 422 may
also be used to communicate information including, but not limited
to, a low battery situation or a functional defect.
[0071] The projecting light source 422 may also be used to
communicate dietary coaching information. As an example, the
projecting light source 422 might, among other things, indicate if
not enough or too much time has expired since a previous food
intake event, or may communicate to the user how he/she is doing
against specific dietary goals.
[0072] Signaling mechanisms to convey specific messages using one
or more projecting light sources 422 may include, but are not
limited to, one or more of the following: specific light
intensities or light intensity patterns; specific light colors or
light color patterns; specific spatial or temporal light patterns.
Multiple mechanisms may also be combined to signal one specific
message.
[0073] A microphone 428 may be used by the user to add specific or
custom labels or messages to a detected event and/or image. In
certain embodiments, audio captured by the microphone 428 can be
processed to assist in the determination of whether the user is
eating, drinking, commuting, exercising, working, or resting. Audio
snippets may be processed by a voice recognition engine.
[0074] In certain embodiments, the accelerometer 406 (possibly
combined with other sensors, including other inertial sensors) may,
in addition to tracking at least one parameter that is directly
related to a gesture-based behavior event, also be used to track
one or more parameters that are not directly related to that
particular event. Such parameters may, among other things, include
physical activity, sleep, stress, or illness.
[0075] In addition to the particular sensors, detectors, and
components mentioned above, the system 400 may include or cooperate
with any number of other sensors 430 as appropriate for the
particular embodiment. For example, and without limitation, the
system 400 may include or cooperate with any or all of the
following: a heartrate monitor; a physiological characteristic or
analyte sensor; a continuous glucose monitor; a GPS receiver; and
any other sensor, monitor, or detector mentioned elsewhere herein.
The system 400 obtains user status data from one or more of its
sensors, detectors, and sources, wherein the user status data
indicates a stressful activity of the user. The user status data
can be analyzed and processed by the system 400 (and/or by one or
more other components of the system 100) to determine whether the
user's current behavior is consistent with normally expected
behavior or activity. In certain embodiments, the system 400 and/or
an ancillary system 106 or device determines the user's activity
and related behavior primarily based on the output of user-worn
motion sensors, movement sensors, one or more inertial sensors
(e.g., one or more accelerometers and/or one or more gyroscopes),
one or more GPS sensors, one or more magnetometers, one or more
force or physical pressure sensors, or the like, which are suitably
configured, positioned, and arranged to measure physical movement
or motion of the user's limbs, digits, joints, facial features,
head, and/or other body parts.
[0076] In some embodiments, the system 400 includes at least one
haptic interface 440 that is suitably configured and operated to
provide haptic feedback as an output. The at least one haptic
interface 440 generates output(s) that can be experienced by the
sense of touch by the user, e.g., mechanical force, vibration,
movement, temperature changes, or the like. Haptic feedback
generated by the at least one haptic interface 440 may represent or
be associated with one or more of the following, without
limitation: reminders; alerts; confirmations; notifications;
messages; numerical values (such as measurements); status
indicators; or any other type of output provided by the system
400.
[0077] In certain embodiments, the user status data (e.g., sensor
data) is provided to a gesture recognizer unit or processor. To
this end, sensor data may be sent in raw format. Alternatively, a
source of sensor data may perform some processing (e.g., filtering,
compression, or formatting) on raw sensor data before sending the
processed sensor data to the gesture recognizer unit. The gesture
recognizer unit analyzes the incoming sensor data and converts the
incoming sensor data into a stream of corresponding gestures, which
may be predetermined or otherwise classified or categorized. The
gesture recognizer unit may use one or more ancillary inputs (such
as the output from one or more ancillary systems 106) to aid in the
gesture determination process. Nonlimiting examples of an ancillary
input include: time of day; the probability of a specific gesture
occurring based on statistical analysis of historical gesture data
for that user; geographical location; heart rate; other
physiological sensor inputs. Other ancillary inputs are also
possible.
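To make the gesture recognizer's role concrete, the sketch below classifies a window of accelerometer samples using signal energy plus one ancillary input (time of day). The energy thresholds, gesture labels, and mealtime prior are all invented for this example; an actual recognizer would typically use a trained classifier over richer features.

```python
# Hedged sketch of a gesture recognizer: raw sensor window in, gesture
# label out, with an ancillary time-of-day prior weighting the decision.
def recognize_gesture(window, hour, mealtime_hours=(8, 12, 19)):
    # Mean squared acceleration magnitude as a crude motion-energy feature.
    energy = sum(s * s for s in window) / len(window)
    # Ancillary input: probability of an eating gesture is higher near
    # the user's historical mealtimes.
    near_mealtime = any(abs(hour - h) <= 1 for h in mealtime_hours)
    if energy < 0.5:
        return "rest"
    if energy < 4.0 and near_mealtime:
        return "bite"            # moderate wrist motion near a usual mealtime
    return "generic_motion"
```

Filtering or compression of the raw samples, as the text notes, could happen at the sensor source before a window ever reaches this stage.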
[0078] The output of the gesture recognizer unit--the detected
gestures--can be sent to an event detector or processor. The event
detector analyzes the incoming stream of gestures to determine if
the start of an event of interest (e.g., eating a meal, going to
bed, working out) has occurred, whether an event is ongoing,
whether an event has ended, or the like. Although this description
mentions meal detection, the gesture-based physical behavior
detection system 400 may be suitably configured to monitor other
types of physical behavior or activities. Such activities include,
without limitation: reading; sleeping; smoking; getting dressed;
driving; walking; commuting; working; exercising; turning down a
bed; making a bed; brushing teeth; combing hair; talking on the
phone; inhaling or injecting a medication; and activities related
to hand hygiene or personal hygiene.
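The event detector consumes the recognizer's gesture stream and decides when an event of interest starts and ends. The following sketch declares a meal event after a run of consecutive "bite" gestures; the minimum-count rule and the label names are assumptions for illustration, not the disclosed detection logic.

```python
# Illustrative event detector: scan an ordered gesture stream and return
# (start_index, end_index) pairs for runs of at least min_bites 'bite'
# gestures, treating each run as one detected meal event.
def detect_meal_events(gestures, min_bites=3):
    events, start, bites = [], None, 0
    for i, g in enumerate(gestures):
        if g == "bite":
            if start is None:
                start = i       # potential start of an event of interest
            bites += 1
        else:
            if bites >= min_bites:
                events.append((start, i - 1))   # event has ended
            start, bites = None, 0
    if bites >= min_bites:                      # event still ongoing at end
        events.append((start, len(gestures) - 1))
    return events
```

The same run-detection pattern generalizes to the other monitored activities listed above (brushing teeth, injecting a medication, and so on) with different gesture labels and thresholds.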
[0079] Referring again to FIG. 1, the output of the gesture-based
physical behavior detection system 104 (in some embodiments,
supplemented with the output of the ancillary system(s) 106) can be
used to detect significant or relevant changes in the patient's
activity or usual routine. For example, gesture detection can be
leveraged to identify any or all of the following events, without
limitation: a change in eating habits or eating patterns (e.g.,
eating larger or smaller portions, eating different types of
foods); eating, working, sleeping, or exercising at unusual times;
eating, working, sleeping, or exercising for periods of time that
are longer or shorter than usual; consuming unusual types of food
or drink; eating at restaurants that are out of the ordinary.
Detected changes in the user's activity can inform certain
functions, features, and/or therapy-related operations of the
system 100. For example, detected changes in patient activity can
influence one or more parameters of a closed loop medication
delivery algorithm employed by the medication delivery system 102.
Thus, a "nominal" or "default" therapy control algorithm may be
active or implemented when the user's activity tracks a typical or
routine pattern of behavior, and a different, altered, modified, or
updated therapy control algorithm may be temporarily activated or
utilized when the currently detected user activity changes. Any
number of different types, classifications, categories, or levels
of therapy control algorithms can be supported by the system
described herein, as needed to contemplate detectable (and
distinguishable) user activity patterns. In some examples,
different therapy control algorithms may be associated with
different locations and/or time frames. For example, the system can
support: one or more therapy control algorithms for city A and one
or more other therapy control algorithms for city B; one or more
therapy control algorithms for weekends and one or more other
therapy control algorithms for weekdays; one or more therapy
control algorithms for each season; and/or one or more therapy
control algorithms for each month.
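The location- and time-frame-specific selection described above can be pictured as a most-specific-match lookup. This is a schematic sketch only; the table keys, algorithm names, and fallback order are invented for the example.

```python
# Illustrative context-based selection of a therapy control algorithm:
# prefer a (city, weekend-flag) match, then city-only, then weekend-only,
# then fall back to the nominal/default algorithm.
def select_therapy_algorithm(city, is_weekend, table, default="nominal"):
    return (table.get((city, is_weekend))
            or table.get((city, None))
            or table.get((None, is_weekend))
            or default)
```

Seasonal or monthly algorithms, also contemplated above, would simply add more key dimensions to the same lookup.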
[0080] As another example, detected changes in activity can trigger
modified adjustment of a digital twin model of the patient that
simulates the patient's physiological response to medication. In
this regard, an adaptive training or learning scheme can be used to
dynamically train the digital twin model in an ongoing manner.
However, in response to detected changes in the user's activity, a
different, modified, or altered adaptive training scheme can be
used to dynamically train the digital twin model in a manner that
is appropriate for the currently detected behavior pattern. Any
number of different types, classifications, categories, or levels
of adaptive training schemes can be supported by the system
described herein, as needed to contemplate detectable (and
distinguishable) user activity patterns.
[0081] As another example, detected changes in activity can trigger
adjustment of a currently active meal prediction algorithm, a
currently active glucose prediction algorithm (when the analyte
sensor 112 is realized as a glucose sensor), and/or a currently
active dosage calculation algorithm. Similarly, detected changes in
activity can trigger the selection, activation, or enablement of a
different meal prediction algorithm, a different glucose prediction
algorithm (when the analyte sensor 112 is realized as a glucose
sensor), and/or a different dosage calculation algorithm. Any
number of different types, classifications, categories, or levels
of these therapy-related algorithms can be supported by the system
described herein, as needed to contemplate detectable (and
distinguishable) user activity patterns.
[0082] As yet another example, detected changes in user activity
may reset, pause, alter, or restart an adaptive training scheme or
an adaptive learning period during which at least one feature,
function, setting, or model associated with the medication delivery
system is adaptively trained or adjusted. In accordance with
certain embodiments, operation of the medication delivery device
102 can be controlled or regulated based on a determination that
the user's normal or routine activity has changed by some
measurable amount. For example, if the system 100 determines (from
an analysis of user gesture data) that the user's regular work,
sleep, and/or eating schedule has changed, then certain adaptive
training or learning functions of the medication delivery device
102 can be temporarily halted or frozen under the assumption that
something unusual has occurred in the user's normally predictable
daily routine. Accordingly, information and data that would
normally be collected and used for adaptive training/learning need
not be considered during periods of detected unusual or different
user behavior. In accordance with certain embodiments, the system
100 need not take such temporary action until multiple occurrences
of a triggering event have been recorded. For example, the system
100 may declare that a change in the user's behavior has occurred
when an unusual or new activity has been detected multiple times
over a designated period of time, such as a week, if an unusual or
new activity has been detected at least X days in a row, or the
like. Accordingly, a single instance of an unusual event or a
one-time change in user behavior need not result in any changes to
the adaptive training or learning functions of the medication
delivery device 102.
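The multiple-occurrence rule above, under which a single unusual day does not disturb adaptive training, can be sketched as a simple streak check. The three-day threshold and the boolean-per-day record format are assumptions for illustration; the text leaves the exact count and window open.

```python
# Sketch of the "X days in a row" trigger: adaptive training is paused
# only when the most recent x daily records are all flagged unusual.
def should_pause_training(daily_unusual_flags, x=3):
    # daily_unusual_flags: one boolean per day, newest last.
    if len(daily_unusual_flags) < x:
        return False            # not enough history to declare a change
    return all(daily_unusual_flags[-x:])
```

A count-within-a-window variant (e.g., X unusual days out of the last seven) would serve the same debouncing purpose.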
[0083] FIG. 9 is a flow chart that illustrates an infusion device
control process 500 according to certain embodiments. This example
assumes that the medication delivery system 102 includes an insulin
infusion device, and that the analyte sensor 112 is a glucose
sensor, such as a continuous glucose monitor that communicates with
the insulin infusion device. Accordingly, the process 500 receives
or processes glucose sensor data (task 502) and controls the
operation of the insulin infusion device to regulate the delivery
of insulin to the user (task 504). Certain therapy-related
functions or features of the insulin infusion device are influenced
by or based at least in part on the glucose sensor data, such that the
insulin infusion device can respond to ongoing changes in the
user's glucose level.
[0084] This description assumes that the process 500 begins with
the insulin infusion device operating in a normal or default mode
that contemplates the user behaving in a predictable manner that is
consistent with historical activity, eating patterns, work
schedules, and the like. Accordingly, the process 500 performs
adaptive training of at least one feature, function, setting,
parameter, factor, or model that is associated with the insulin
infusion device, e.g., basal amount, bolus schedule, a digital twin
simulation of the patient, a glucose prediction algorithm, a
delivery control algorithm for insulin (or parameters thereof such
as a time constant, a gain value, a threshold, or a limit), an
insulin dosage or delivery limit, or the like (task 506). In
certain implementations, the adaptive training is based at least in
part on the glucose sensor data provided by the glucose sensor. The
adaptive training enables the insulin infusion device and/or
associated models, algorithms, or control schemes to dynamically
adapt and adjust to changes in the user's glucose level, as
measured by the glucose sensor. The adaptive training may also be
influenced by gesture data and/or ancillary data that is associated
with the user, as described above.
[0085] As mentioned previously, the insulin infusion device can be
controlled to operate in a closed loop mode or a hybrid closed loop
mode (where basal insulin is automatically controlled, and insulin
boluses are manually controlled) to automatically deliver insulin
to the user in accordance with a therapy control algorithm (e.g., a
closed loop insulin delivery algorithm). In this context, the
adaptive training can be used to train at least one
therapy-altering factor of the therapy control algorithm, such as,
without limitation: a time constant; a threshold; a glucose limit;
an insulin delivery limit; or the like.
[0086] The system 100 may include, utilize, or cooperate with a
physiological model of the user (a digital twin) that simulates a
physiological response of the user to delivery of medication, such
as insulin. The physiological model can be used with or without an
insulin infusion device. For example, the physiological model can
be used in connection with the patient care application 110 to
provide therapy guidance or recommendations to a user of a manual
insulin delivery system or device (syringes, an injection pen, or
the like). In certain implementations, the adaptive training can be
used to train the physiological model, such that the model
accurately simulates the actual response of the user. Various
techniques, such as machine learning or artificial intelligence,
can be leveraged to adaptively train the model.
[0087] The insulin infusion device may include, utilize, or
cooperate with other adaptive algorithms, e.g., a meal prediction
algorithm, a glucose prediction algorithm, an insulin dosage
calculation algorithm, or a bolus calculation algorithm. In certain
embodiments, the adaptive training can be used to train these and
possibly other adaptive algorithms, settings, or control
schemes.
[0088] The process 500 receives activity-identifying data that
indicates a current behavior pattern of the user (e.g., physical
activity information including time, type, duration, and/or
intensity; meal information including time, duration, portion size,
and/or carbohydrate amount) (task 508) and analyzes or processes at
least some of the received activity-identifying data to determine
whether the currently detected behavior pattern differs from a
currently implemented therapy or expected behavior pattern of the
user (e.g., typical physical activity or meal activity for the
user) (query task 510). As mentioned above, the
activity-identifying data is generated by sensors, detector units,
or other sources of data that are included with or associated with
a suitably configured gesture-based physical behavior detection
system 104, 400 (e.g., the accelerometer 406, the gyroscope 410,
the proximity sensor 412, one or more other sensors 430, the
microphone 428, and/or the camera 418). Accordingly, the
activity-identifying data may be generated at least in part from
gesture data obtained for the user. Depending on
the particular embodiment, at least some of the
activity-identifying data may include user status data generated or
provided by at least one ancillary system 106 or device (other than
the gesture-based physical behavior detection system 104, 400) that
monitors certain characteristics, status, or condition of the user.
Accordingly, the activity-identifying data may be generated at
least in part from such user status data.
[0089] The process 500 continues by analyzing or processing at
least some of the received activity-identifying data to determine
whether there has been a change in the user's activity (e.g., a
threshold number of events outside the expected behavior pattern of
the user are detected during a particular timeframe) (query task
510). If analysis of the activity-identifying data does not reveal
a significant or relevant change in the user's activity (e.g., a
threshold number of events outside the expected behavior pattern of
the user are not detected during a particular timeframe) (the "No"
branch of query task 510), then the process 500 continues to
operate the insulin infusion device and perform adaptive training,
as described above. Accordingly, FIG. 9 depicts the "No" branch of
query task 510 leading back to task 502. If, however, the
activity-identifying data indicates that the current behavior
pattern differs from the currently implemented therapy behavior
pattern of the user (the "Yes" branch of query task 510), then the
process 500 takes appropriate action to temporarily halt, modify,
or change the currently implemented adaptive training routine to
compensate for the change in activity.
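One pass through the FIG. 9 loop (tasks 502 through 514) can be rendered schematically as follows. Every name here is a stub standing in for behavior the text describes, and the event-count threshold is an invented placeholder; this is not the actual control algorithm.

```python
# Schematic single cycle of process 500: continue adaptive training on the
# glucose data unless enough activity events fall outside the expected
# behavior pattern, in which case the training is altered (paused here).
def control_cycle(glucose, activity_events, expected_pattern, trainer,
                  threshold=3):
    unusual = [e for e in activity_events if e not in expected_pattern]
    if len(unusual) >= threshold:
        trainer["paused"] = True            # task 514: alter adaptive training
    else:
        trainer["history"].append(glucose)  # task 506: keep adapting normally
    return trainer
```

In the flow chart this decision is query task 510; the "No" branch corresponds to the else clause, which loops back to receiving fresh glucose sensor data.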
[0090] In accordance with some embodiments, if an activity change
is indicated by the gesture data, then the process 500 generates a
confirmation message or notification for the user (task 512). A
message may document or explain the detected situation, for
example, "Great job! We noticed that you have been walking more
lately. Would you like us to update your therapy settings
accordingly?" As another example, a confirmation message may simply
ask the user to approve certain actions: "A change in your usual
daily routine has been detected. OK to update your insulin delivery
scheme?" As yet another example, a notification may read as
follows: "New behavior pattern detected. Please update dynamic
system learning and training." The confirmation message requests
user authorization to alter the adaptive training. In this regard,
the confirmation message may include an active user interface
element, such as a button or a link, that allows the user to
confirm or initiate the altering of the adaptive training. This
description assumes that the process 500 receives user
authorization to alter the adaptive training and proceeds to alter
(e.g., temporarily stop, modify, reset, change, or reclassify) the
adaptive training (task 514). Thus, the adaptive training is
altered in response to a detected activity change and further in
response to receiving user authorization to alter the adaptive
training. In alternative embodiments, altering the adaptive
training occurs automatically in response to a detected activity
change, without any user input or involvement. In certain
scenarios, the adaptive training proceeds in accordance with the
altered, changed, or modified adaptive training scheme.
[0091] Although not always required, operation of the insulin
infusion device may be adjusted or changed in an
activity-correlated manner to compensate for the detected change in
activity, the manner or extent of the activity change, the time
period associated with the changed activity, etc. In certain
embodiments, the process 500 changes at least one therapy-altering
factor of the currently active therapy control algorithm to obtain
an appropriate updated therapy control algorithm that compensates
for the detected activity change (task 516). For this scenario, the
process 500 continues by operating the insulin infusion device to
automatically deliver the insulin medication to the user in
accordance with the updated therapy control algorithm (task
518).
[0092] In some implementations, a remote data processing system
(e.g., a cloud-based system such as the data processing system 116
shown in FIG. 1) receives and processes the activity-identifying
data to determine whether the user's behavior pattern is different
than the currently implemented therapy behavior pattern and, if so,
sends at least one command, instruction, or control signal to one
or more destination devices, such as the insulin infusion device.
The at least one command, instruction, or control signal causes the
destination device to alter, pause, modify, or change the adaptive
training. In certain embodiments, a user device 108 receives and
processes the activity-identifying data, and communicates with a
destination device to influence or impact the adaptive training. In
yet other embodiments, the medication delivery system 102 (e.g.,
the insulin infusion device) receives and processes the
activity-identifying data in the manner described herein, and
initiates altering, changing, or pausing of the adaptive
training.
[0093] In certain scenarios, the system reverts to a previous
adaptive training scheme, a previous therapy control algorithm,
and/or a previous therapy-related algorithm (e.g., meal prediction,
glucose prediction, dosage calculation) when certain criteria are
satisfied. If appropriate to revert to a previous state (the "Yes"
branch of query task 520), then the process 500 may return to task
502 and continue as described above, with a previous state
activated, selected, or implemented. For example, the immediately
preceding adaptive training scheme may be reinstated, or any
previously utilized adaptive training scheme can be utilized. As
another example, if the current adaptive training scheme was
paused, then it can be restarted to resume training as usual. If
reverting to a previous state is inappropriate, untimely, or
unnecessary (the "No" branch of query task 520), then the process
500 exits so that the system may continue using the current
adaptive training scheme, therapy control algorithm, and other
therapy-related algorithms (if applicable).
[0094] Reverting back to a previous state may be performed
automatically after a predetermined period of time. In this regard,
if the altered adaptive training scheme has been active for the
designated period of time (e.g., 12 hours, one day, or eight
hours), then the process 500 can automatically revert to a previous
state. As another example, reverting back to a previous state may
require user involvement. Thus, in accordance with some
embodiments, the process 500 reverts back to a previous adaptive
training scheme only in response to receiving a user-initiated
command, instruction, or confirmation. As yet another example,
reverting back to a previous state may be initiated when updated or
current activity-identifying data indicates that an updated
(current) behavior pattern of the user corresponds to a previously
implemented therapy behavior pattern. In other words, if ongoing
activity-identifying data suggests that the user's behavior is
again following a historical and identifiable pattern, then the
process 500 may revert back to a previous adaptive training scheme
that is appropriate for the currently detected behavior
pattern.
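The automatic time-based revert described above reduces to a small rule. The sketch below uses the 12-hour example period from the text; the scheme labels and function name are hypothetical.

```python
# Sketch of the automatic-revert rule: return to the previous adaptive
# training scheme once the altered scheme has been active for the
# designated period (12 hours is one of the example periods in the text).
def maybe_revert(active_scheme, previous_scheme, hours_active,
                 revert_after_hours=12):
    if active_scheme != previous_scheme and hours_active >= revert_after_hours:
        return previous_scheme
    return active_scheme
```

The user-confirmation and pattern-match revert variants would replace the elapsed-time condition with a received user command or with a detected return to a historical behavior pattern, respectively.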
[0095] As mentioned above, the process 500 (at query task 510)
determines whether the user's activity or behavior has changed,
based on the received activity-identifying data. The
activity-identifying data may include, for example, any of the
following: raw (uncharacterized or unprocessed) or processed sensor
data generated by the gesture-based physical behavior detection
system 104, 400; gesture data provided by the system 104, 400; raw
(uncharacterized or unprocessed) or processed sensor data generated
by one or more ancillary systems 106; and raw (uncharacterized or
unprocessed) or processed sensor data generated by the analyte
sensor 112. In certain embodiments, the device or system that
analyzes user activity has already been trained with historical
data such that it can compare the received activity-identifying
data against historical data, trends, patterns, and/or conditions
that are known to be correlated with user activities or
behavior.
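One simple way to realize the comparison described above, where received activity-identifying data is matched against historical patterns known to correlate with user activities, is a nearest-centroid check over a feature vector derived from the gesture and sensor data. The following sketch assumes the data has already been reduced to numeric features; the function names and the centroid representation are illustrative assumptions, not the claimed implementation.

```python
import math

def nearest_pattern(features, pattern_centroids):
    """Return the historical behavior pattern whose stored feature centroid
    is closest (Euclidean distance) to the current feature vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(pattern_centroids, key=lambda name: dist(features, pattern_centroids[name]))

def activity_changed(features, pattern_centroids, current_therapy_pattern):
    """Query task 510 (sketch): has the user's current behavior diverged
    from the pattern the therapy is presently tuned to?"""
    return nearest_pattern(features, pattern_centroids) != current_therapy_pattern
```

In practice the trained device or system could use any comparable classification technique in place of the distance check shown here.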
[0096] FIG. 10 is a flow chart that illustrates a gesture training
process 600 according to certain embodiments. As mentioned above,
the system 100 can be initialized or trained with historical data
for purposes of determining physical behavior events, based on
obtained gesture data and/or ancillary user status data.
Accordingly, the process 600 can be employed with certain
embodiments to train the activity detection feature. It should be
appreciated that other methodologies, including those that need not
employ "training" per se, can be utilized in an implementation of
the system 100.
[0097] The process 600 obtains gesture training data, which is
provided by the gesture-based physical behavior detection system
104, 400 during one or more training sessions or periods of time
(task 602). Alternatively or additionally, the process 600 obtains
ancillary user status training data, which is provided by one or
more ancillary systems 106 during one or more training sessions or
periods of time (task 604). The process 600 may also obtain
activity or behavior marker data, which may be entered by the user,
during the training sessions or periods of time (task 606). The
marker data can be obtained in response to the user interacting
with one or more user devices 108 to record, flag, mark, or
otherwise identify points in time or periods of time at which the
user is engaging in a particular activity or physical behavior. The
activity marker data may also include information that
characterizes or describes the type of activity or behavior and/or
other metadata related to the recorded activities. For example, the
user can indicate points in time or periods of time corresponding
to activities such as: working; sleeping; eating; watching
television or a movie; driving or commuting; exercising; travelling
overseas; vacationing; walking the dog; playing video games;
playing music; etc. The process 600 may continue by temporally
correlating the obtained training data (e.g., gesture training data
and/or ancillary user status training data) with the obtained
marker data (task 608). The temporal correlation can be utilized to
identify and record certain activity-indicating or
lifestyle-indicating gestures performed by the user and/or to
identify and record user status information obtained during the
marked period of user activity (task 610), along with corresponding
time/date stamp data.
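The temporal correlation of tasks 608 and 610 can be sketched as attributing each time-stamped training sample to the user-entered marker interval that contains it. This is a hypothetical illustration under simple assumptions (markers as start/end/activity tuples, samples as timestamp/value pairs); the disclosed system may correlate the data in other ways.

```python
def correlate(markers, samples):
    """Task 608 (sketch): temporally correlate gesture/ancillary training
    samples with user-entered activity markers. A sample is attributed to
    an activity when its timestamp falls within the marked interval,
    producing (timestamp, activity, sample) records for task 610."""
    labeled = []
    for start, end, activity in markers:
        for ts, sample in samples:
            if start <= ts <= end:
                labeled.append((ts, activity, sample))
    return labeled
```

The retained time/date stamp allows the labeled records to be replayed or re-segmented later if the marker data is revised.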
[0098] The training process 600 may continue by classifying,
categorizing, or labeling certain physical behavior events (as
indicated by the collected training data) that correspond to the
user's activities, lifestyle, eating habits, daily routine, and/or
behavior habits (task 612). In this regard, the system 100 can be
trained in a way that links detectable user activity to the
operation and control of the medication delivery system 102.
Accordingly, the process 600 may generate and save one or more
activity-correlated therapy control algorithms, settings, device
configurations, adaptive training schemes, or the like.
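The classifying and labeling of task 612 can be sketched as grouping the temporally correlated samples by activity and summarizing each group into a per-activity profile, from which activity-correlated therapy settings or adaptive training schemes could then be derived. The profile representation (gesture frequency counts) is an illustrative assumption only.

```python
from collections import Counter, defaultdict

def label_behavior_events(labeled_samples):
    """Task 612 (sketch): group (timestamp, activity, sample) records by
    activity and summarize each group as a gesture-frequency profile that
    can seed activity-correlated therapy settings."""
    grouped = defaultdict(list)
    for ts, activity, sample in labeled_samples:
        grouped[activity].append(sample)
    return {activity: Counter(samples) for activity, samples in grouped.items()}
```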
[0099] The various tasks performed in connection with a process
disclosed herein may be performed by software, hardware, firmware,
or any combination thereof. It should be appreciated that an
embodiment of an illustrated process may include any number of
additional or alternative tasks, the tasks shown in the figures
need not be performed in the illustrated order, and a disclosed
process may be incorporated into a more comprehensive procedure or
process having additional functionality not described in detail
herein. Moreover, one or more of the tasks shown in a figure could
be omitted from an embodiment of the depicted process as long as
the intended overall functionality remains intact.
[0100] While at least one exemplary embodiment has been presented
in the foregoing detailed description, it should be appreciated
that a vast number of variations exist. It should also be
appreciated that the exemplary embodiment or embodiments described
herein are not intended to limit the scope, applicability, or
configuration of the claimed subject matter in any way. Rather, the
foregoing detailed description will provide those skilled in the
art with a convenient road map for implementing the described
embodiment or embodiments. It should be understood that various
changes can be made in the function and arrangement of elements
without departing from the scope defined by the claims, which
includes known equivalents and foreseeable equivalents at the time
of filing this patent application.
* * * * *