U.S. patent application number 17/521879 was published by the patent office on 2022-05-19 for smart windowing to reduce power consumption of a head-mounted camera used for iPPG.
This patent application is currently assigned to Facense Ltd. The applicant listed for this patent is Facense Ltd. The invention is credited to Ari M. Frank, Gil Thieberger, and Arie Tzvieli.
Application Number | 20220151504 (17/521879) |
Family ID | 1000006009591 |
Publication Date | 2022-05-19 |
United States Patent Application | 20220151504 |
Kind Code | A1 |
Tzvieli; Arie; et al. | May 19, 2022 |
Smart windowing to reduce power consumption of a head-mounted camera used for iPPG
Abstract
Disclosed herein is utilization of windowing for efficient
capturing of imaging photoplethysmogram signals (iPPG signals) with
head-mounted cameras (e.g., cameras mounted to frames of
smartglasses). In order to save power involved in obtaining iPPG
signals, in one embodiment, a head-mounted camera with an image
sensor that supports changing of its region of interest (ROI) is
utilized to capture images of a region comprising skin on a user's
head. A computer calculates quality scores for iPPG signals
extracted from windows in the images, and selects a proper subset
of the iPPG signals whose quality scores reach a threshold. The
computer then reads from the camera at least one ROI that covers
one or more of the windows from which the proper subset of the iPPG
signals is extracted. Optionally, the at least one ROI read from
the camera covers below 75% of the skin region's area.
Inventors: | Tzvieli; Arie; (Berkeley, CA); Frank; Ari M.; (Haifa, IL); Thieberger; Gil; (Kiryat Tivon, IL) |

Applicant: |
Name | City | State | Country | Type
Facense Ltd. | Kiryat Tivon | | IL | |
Assignee: | Facense Ltd., Kiryat Tivon, IL |
Family ID: | 1000006009591 |
Appl. No.: | 17/521879 |
Filed: | November 9, 2021 |
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number
63140453 | Jan 22, 2021 |
63122961 | Dec 9, 2020 |
63113846 | Nov 14, 2020 |
Current U.S. Class: | 1/1 |
Current CPC Class: | A61B 5/02438 20130101; A61B 5/6803 20130101; H04N 5/347 20130101; A61B 5/02427 20130101; A61B 5/0205 20130101; A61B 5/1455 20130101; A61B 5/14546 20130101 |
International Class: | A61B 5/024 20060101 A61B005/024; A61B 5/00 20060101 A61B005/00; A61B 5/145 20060101 A61B005/145; A61B 5/1455 20060101 A61B005/1455; A61B 5/0205 20060101 A61B005/0205 |
Claims
1. A system configured to utilize windowing for efficient capturing
of imaging photoplethysmogram signals (iPPG signals), comprising:
an inward-facing head-mounted camera configured to capture images
of a region comprising skin on a user's head (skin region)
utilizing an image sensor that supports changing of its region of
interest (ROI); and a computer configured to: calculate quality
scores for iPPG signals extracted from windows in the images;
select a proper subset of the iPPG signals whose quality scores
reach a threshold; and read from the camera at least one ROI that
covers one or more of the windows from which the proper subset of
the iPPG signals is extracted; wherein the at least one ROI read
from the camera covers below 75% of the skin region's area.
2. The system of claim 1, wherein the quality scores for the iPPG
signals are proportional to a ratio AC/DC, where the AC component
represents absorption of pulsatile arterial blood, and the DC
component represents overall light absorption of tissue, venous
blood, and non-pulsatile arterial blood.
3. The system of claim 1, wherein the quality scores for the iPPG
signals are calculated using a machine learning-based approach that
utilizes at least one of the following signal quality metrics as
feature values: correlation of the iPPG signals with an iPPG beat
template, correlation of the iPPG signals with an iPPG beat
template after linearly stretching or compressing to the length of
the iPPG beat template, correlation of a resampled dynamic time
warping version of the iPPG signals with an iPPG beat template,
percentage of the iPPG signals that are not clipped, and
signal-to-noise ratios of the iPPG signals.
4. The system of claim 1, wherein the quality scores for the iPPG
signals are calculated based on a ratio of power of the iPPG
signals around the pulse rate to power of noise in a passband of a
bandpass filter used in the calculation of the iPPG signals.
5. The system of claim 1, wherein the at least one ROI read from
the camera covers below 10% of the skin region's area.
6. The system of claim 1, wherein the computer is further
configured to read from the camera the at least one ROI at an
average frame rate higher than a maximal frame rate at which
full-resolution images can be read from the camera.
7. The system of claim 1, wherein the image sensor further supports
changing its binning value, and the computer is further configured
to: apply at least two different binning values to at least one of
the windows, calculate at least two quality scores for iPPG signals
extracted from the at least one of the windows when the at least
two different binning values were applied, respectively, select a
binning value with a corresponding quality score that is maximal,
and read from the camera at least one of the at least one ROI
according to the binning value.
8. The system of claim 7, wherein using binning with the selected
binning value reduces at least in half the time it takes the
computer to read the camera compared to reading the at least one
ROI in full resolution.
9. The system of claim 1, wherein the at least one ROI comprises
multiple ROIs, the image sensor supports setting multiple ROIs, and
the multiple ROIs are captured simultaneously by the camera.
10. The system of claim 1, wherein the at least one ROI comprises
multiple ROIs, and the multiple ROIs are captured serially by the
camera.
11. The system of claim 1, wherein the windows are selected to
cover an area expected to undergo a detectable change in hemoglobin
concentration due to a certain physiological response.
12. The system of claim 11, wherein the computer is further
configured to select two different proper subsets of the iPPG
signals for two different physiological responses, and to utilize
two different ROIs to cover two different windows from which the
two different proper subsets of the iPPG signals are extracted.
13. A method comprising: capturing images of a region comprising
skin on a user's head (skin region) utilizing an inward-facing
head-mounted camera comprising an image sensor that supports
changing of its region of interest (ROI); calculating quality
scores for imaging photoplethysmogram signals (iPPG signals)
extracted from windows in the images; selecting a proper subset of
the iPPG signals whose quality scores reach a threshold; and
reading from the camera at least one ROI that covers one or more of
the windows from which the proper subset of the iPPG signals is
extracted; wherein the at least one ROI read from the camera covers
below 75% of the skin region's area.
14. The method of claim 13, further comprising calculating the
quality scores for the iPPG signals using a machine learning-based
approach that utilizes at least one of the following signal quality
metrics as feature values: correlation of the iPPG signals with an
iPPG beat template, correlation of the iPPG signals with an iPPG
beat template after linearly stretching or compressing to the
length of the iPPG beat template, correlation of a resampled
dynamic time warping version of the iPPG signals with an iPPG beat
template, percentage of the iPPG signals that are not clipped, and
signal-to-noise ratios of the iPPG signals.
15. The method of claim 13, further comprising reading from the
camera the at least one ROI at an average frame rate higher than a
maximal frame rate at which full-resolution images can be read from
the camera.
16. The method of claim 13, wherein the image sensor further
supports changing its binning value, and further comprising:
applying at least two different binning values to at least one of
the windows, calculating at least two quality scores for iPPG
signals extracted from the at least one of the windows when the at
least two different binning values were applied, respectively,
selecting a binning value with a corresponding quality score that
is maximal, and reading from the camera at least one of the at
least one ROI according to the binning value.
17. The method of claim 13, wherein the windows are selected to
cover an area expected to undergo a detectable change in hemoglobin
concentration due to a certain physiological response, and further
comprising: selecting two different proper subsets of the iPPG
signals for two different physiological responses, and utilizing
two different ROIs to cover two different windows from which the
two different proper subsets of the iPPG signals are extracted.
18. A non-transitory computer readable medium storing one or more
computer programs configured to cause a processor-based system to
execute steps comprising: capturing images of a region comprising
skin on a user's head (skin region) utilizing an inward-facing
head-mounted camera comprising an image sensor that supports
changing of its region of interest (ROI); calculating quality
scores for imaging photoplethysmogram signals (iPPG signals)
extracted from windows in the images; selecting a proper subset of
the iPPG signals whose quality scores reach a threshold; and
reading from the camera at least one ROI that covers one or more of
the windows from which the proper subset of the iPPG signals is
extracted; wherein the at least one ROI read from the camera covers
below 75% of the skin region's area.
19. The non-transitory computer readable medium of claim 18,
wherein the image sensor further supports changing its binning
value, and further comprising instructions configured to cause a
processor-based system to execute steps comprising: applying at
least two different binning values to at least one of the windows,
calculating at least two quality scores for iPPG signals extracted
from the at least one of the windows when the at least two
different binning values were applied, respectively, selecting a
binning value with a corresponding quality score that is maximal,
and reading from the camera at least one of the at least one ROI
according to the binning value.
20. The non-transitory computer readable medium of claim 18,
further comprising instructions configured to cause a
processor-based system to read from the camera the at least one ROI
at an average frame rate higher than a maximal frame rate at which
full-resolution images can be read from the camera.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority to U.S. Provisional Patent
Application No. 63/113,846, filed Nov. 14, 2020, U.S. Provisional
Patent Application No. 63/122,961, filed Dec. 9, 2020, and U.S.
Provisional Patent Application No. 63/140,453 filed Jan. 22,
2021.
BACKGROUND
[0002] Photoplethysmography (PPG) is a simple and widely used
optical technique for detecting blood volume changes in the
microvascular bed of tissue, which can be used to calculate values
of a wide range of physiological signals such as heart rate,
respiration rate, and blood pressure, to name a few. An imaging
photoplethysmogram signal (iPPG signal) is a type of PPG signal
that is recorded in a non-contact method using a camera. iPPG
signals can be useful for long-term monitoring of physiological
signals and can be obtained in a manner that is both comfortable and
unobtrusive using head-mounted devices. For example, lightweight
cameras can be embedded in head-mounted systems, such as
smartglasses, in order to collect images of regions of the face,
from which iPPG signals can be extracted. However, operating
cameras over long periods may require expenditure of a lot of power
from the limited supplies typically available to battery-operated
wearable devices. Thus, there is a need for a way to acquire iPPG
signals in an efficient manner, in order to save power and enable
longer device operating times.
SUMMARY
[0003] Some embodiments described herein utilize head-mounted
sensors to obtain images of an area on a user's head. These images
may be indicative of blood volume changes due to pulsatile blood
flow in the area on the user's head, from which imaging
photoplethysmogram signals (iPPG signals) are extracted. As opposed
to contact photoplethysmogram devices, imaging photoplethysmography
(iPPG) does not require contact with the skin, and iPPG signals may
be obtained by a non-contact sensor,
such as a video camera. Other names known in the art for iPPG
include: remote photoplethysmography (rPPG), remote
photoplethysmographic imaging, remote imaging photoplethysmography,
remote-PPG, and multi-site photoplethysmography (MPPG).
[0004] In order to save power involved in obtaining iPPG signals
using head-mounted cameras, some embodiments described herein
involve cameras that utilize an image sensor that supports changing
of its region of interest (ROI). This enables evaluation of
different windows in the images to determine which one or more
windows provide iPPG signals of a desired quality. Then after the
selection of the one or more windows, they are read in order to
extract the iPPG signals from those windows (and not from the full
images). Thus, after the selection of the one or more windows,
extracting iPPG signals becomes more power efficient, because less
data needs to be read from the camera, transmitted to a processor
and/or processed by a processor (compared to extraction of iPPG
signals from the full images).
[0005] One aspect of this disclosure involves a system that
utilizes windowing for efficient capturing of imaging
photoplethysmogram signals (iPPG signals). In one embodiment, the
system includes at least an inward-facing head-mounted camera and a
computer. The camera captures images of a region comprising skin on
a user's head (skin region) utilizing an image sensor that supports
changing of its region of interest (ROI). The computer calculates
quality scores for iPPG signals extracted from windows in the
images, and selects a proper subset of the iPPG signals whose
quality scores reach a threshold. The computer then proceeds to
read from the camera at least one ROI that covers one or more of
the windows from which the proper subset of the iPPG signals is
extracted. Optionally, the at least one ROI read from the camera
covers below 75% of the skin region's area. Optionally, the at
least one ROI read from the camera covers below 10% of the skin
region's area. Optionally, the computer is further configured to
read from the camera the at least one ROI at an average frame rate
higher than a maximal frame rate at which full-resolution images
can be read from the camera.
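The selection flow described in this paragraph can be sketched as follows. The window grid, the threshold value, and the helper `quality_score` are illustrative assumptions for this sketch, not details taken from the disclosure:

```python
import numpy as np

def quality_score(signal):
    # Illustrative placeholder score: peak-to-peak amplitude over the mean
    # (an AC/DC-style ratio; see paragraph [0006] for disclosed examples).
    signal = np.asarray(signal, dtype=float)
    return (signal.max() - signal.min()) / max(abs(signal.mean()), 1e-9)

def select_windows(frames, grid=(4, 4), threshold=0.02):
    """Split full-frame images into a grid of windows, score the iPPG
    signal extracted from each window, and keep the proper subset of
    windows whose quality score reaches the threshold."""
    n, h, w = frames.shape
    gh, gw = h // grid[0], w // grid[1]
    selected = []
    for r in range(grid[0]):
        for c in range(grid[1]):
            # Spatially average each window to get one sample per frame.
            win = frames[:, r * gh:(r + 1) * gh, c * gw:(c + 1) * gw]
            signal = win.mean(axis=(1, 2))
            score = quality_score(signal)
            if score >= threshold:
                selected.append(((r * gh, c * gw, gh, gw), score))
    return selected
```

The camera's ROI would then be set to cover only the returned window coordinates, so subsequent readouts skip the rest of the skin region.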
[0006] Various types of quality scores may be utilized to select
the iPPG signals. In one example, the quality scores for the iPPG
signals are proportional to a ratio AC/DC, where the AC component
represents absorption of the pulsatile arterial blood, and the DC
component represents the overall light absorption of the tissue,
venous blood, and non-pulsatile arterial blood. In another example,
the quality scores for the iPPG signals are calculated using a
machine learning-based approach that utilizes at least one of the
following signal quality metrics as feature values: correlation of
the iPPG signals with an iPPG beat template, correlation of the
iPPG signals with an iPPG beat template after linearly stretching
or compressing to the length of the iPPG beat template, correlation
of a resampled dynamic time warping version of the iPPG signals with
an iPPG beat template, percentage of the iPPG signals that are not
clipped, and signal-to-noise ratios of the iPPG signals. In yet
another example, the quality scores for the iPPG signals are
calculated based on a ratio of power of the iPPG signals around the
pulse rate to power of noise in a passband of a bandpass filter
used in the calculation of the iPPG signals.
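As a rough illustration of the AC/DC example above, the score can be computed by comparing spectral energy in a typical pulse band against the mean light absorption. The band limits and the FFT-based implementation here are assumptions made for the sketch, not the disclosed method:

```python
import numpy as np

def ac_dc_quality(signal, fs=30.0, band=(0.7, 3.5)):
    """AC/DC quality score for an iPPG signal: the pulsatile (AC)
    component is estimated from spectral energy in a typical
    heart-rate band, and divided by the slowly varying (DC)
    baseline light absorption."""
    sig = np.asarray(signal, dtype=float)
    dc = sig.mean()
    spectrum = np.fft.rfft(sig - dc)
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    # Normalize total in-band magnitude by the signal length.
    ac = np.sqrt(np.sum(np.abs(spectrum[in_band]) ** 2)) / len(sig)
    return ac / max(abs(dc), 1e-9)
```

A window with a strong pulsatile component scores markedly higher than a flat window, which is what makes the score usable for the threshold-based selection of claim 1.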
[0007] In one embodiment, the image sensor of the camera also
supports changing its binning value. In this embodiment, the
computer performs the following: applies at least two different
binning values to at least one of the windows, calculates at least
two quality scores for iPPG signals extracted from the at least one
of the windows when the at least two different binning values were
applied, respectively, selects a binning value with a corresponding
quality score that is maximal, and reads from the camera at least
one of the at least one ROI according to the binning value.
Optionally, using binning with the selected binning value reduces
at least in half the time it takes the computer to read the camera
compared to reading the at least one ROI in full resolution.
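The binning-selection steps of this paragraph can be sketched as below; `read_window_fn` and the candidate binning values are hypothetical names introduced for illustration:

```python
import numpy as np

def quality_score(signal):
    # Illustrative AC/DC-style score: peak-to-peak over the mean.
    signal = np.asarray(signal, dtype=float)
    return (signal.max() - signal.min()) / max(abs(signal.mean()), 1e-9)

def select_binning(read_window_fn, binning_values=(1, 2, 4)):
    """Try each candidate binning value on the same window, score the
    resulting iPPG signal, and keep the binning value whose quality
    score is maximal."""
    best_value, best_score = None, float("-inf")
    for b in binning_values:
        signal = read_window_fn(binning=b)  # frames read with binning b
        score = quality_score(signal)
        if score > best_score:
            best_value, best_score = b, score
    return best_value, best_score
```

The chosen binning value is then applied to subsequent ROI readouts, trading spatial resolution for shorter readout time and lower power.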
[0008] Another aspect of this disclosure involves a method that
includes at least the following steps: capturing images of a region
comprising skin on a user's head (skin region) utilizing an
inward-facing head-mounted camera comprising an image sensor that
supports changing of its region of interest (ROI); calculating
quality scores for imaging photoplethysmogram signals (iPPG
signals) extracted from windows in the images; selecting a proper
subset of the iPPG signals whose quality scores reach a threshold;
and reading from the camera at least one ROI that covers one or
more of the windows from which the proper subset of the iPPG
signals is extracted; wherein the at least one ROI read from the
camera covers below 75% of the skin region's area.
[0009] In one embodiment, the method involves calculating the
quality scores for the iPPG signals using a machine learning-based
approach that utilizes at least one of the following signal quality
metrics as feature values: correlation of the iPPG signals with an
iPPG beat template, correlation of the iPPG signals with an iPPG
beat template after linearly stretching or compressing to the
length of the iPPG beat template, correlation of a resampled
dynamic time warping version of the iPPG signals with an iPPG beat
template, percentage of the iPPG signals that are not clipped, and
signal-to-noise ratios of the iPPG signals.
[0010] In one embodiment, the at least one ROI is read from the
camera at an average frame rate higher than a maximal
frame rate at which full-resolution images can be read from the
camera.
[0011] Yet another aspect of this disclosure involves a
non-transitory computer readable medium storing one or more
computer programs configured to cause a processor-based system to
execute steps of one or more embodiments of the aforementioned
method.
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] The embodiments are herein described by way of example only,
with reference to the following drawings:
[0013] FIG. 1 illustrates an embodiment of a system that utilizes
windowing for efficient capturing of imaging photoplethysmogram
signals (iPPG signals);
[0014] FIG. 2 illustrates quality scores calculated for iPPG
signals extracted from different areas on the cheek;
[0015] FIG. 3 illustrates an embodiment of a system that operates a
camera asynchronously;
[0016] FIG. 4 illustrates an embodiment of smartglasses that
include a contact head-mounted photoplethysmography device in the
nosepiece and a camera coupled to the frame;
[0017] FIG. 5 illustrates an embodiment of a system that detects an
abnormal medical event;
[0018] FIG. 6 illustrates an example of smartglasses that include
two PPG devices and which utilize selection of advantageous timings
to reconstruct informative portions of a PPG signal;
[0019] FIG. 7 illustrates an embodiment of a system that collects
images used for iPPG, which utilizes multiple light sources;
[0020] FIG. 8 illustrates an example of capturing interlaced images
of a region illuminated from different illumination directions;
and
[0021] FIG. 9A and FIG. 9B are schematic illustrations of possible
embodiments for computers.
DETAILED DESCRIPTION
[0022] Herein the terms "photoplethysmogram signal",
"photoplethysmographic signal", "photoplethysmography signal", and
other similar variations are interchangeable and refer to the same
type of signal. A photoplethysmogram signal may be referred to as a
"PPG signal", or an "iPPG signal" when specifically referring to a
PPG signal obtained from a camera. The terms "photoplethysmography
device", "photoplethysmographic device", "photoplethysmogram
device", and other similar variations are also interchangeable and
refer to the same type of device that measures a signal from which
it is possible to extract the photoplethysmogram signal. The
photoplethysmography device may be referred to as "PPG device".
[0023] Sentences in the form of "a sensor configured to measure a
signal indicative of a photoplethysmogram signal" refer to at least
one of: (i) a contact PPG device, such as a pulse oximeter that
illuminates the skin and measures changes in light absorption,
where the changes in light absorption are indicative of the PPG
signal, and (ii) a non-contact camera that captures images of the
skin, where a computer extracts the PPG signal from the images
using an imaging photoplethysmography (iPPG) technique. Other names
known in the art for iPPG include: remote photoplethysmography
(rPPG), remote photoplethysmographic imaging, remote imaging
photoplethysmography, remote-PPG, multi-site photoplethysmography
(MPPG), camera-based blood perfusion, camera-based hemoglobin
concentration, and camera-based blood flow. Additional names known
in the art for iPPG from facial images include: facial hemoglobin
concentration map, facial hemoglobin concentration changes, dynamic
hemoglobin concentration/information extraction, facial blood flow
map, facial blood flow changes, facial blood pulsation, facial
blood perfusion, and transdermal optical imaging.
[0024] A PPG signal is often obtained by using a pulse oximeter,
which illuminates the skin and measures changes in light
absorption. Another possibility for obtaining the PPG signal is
using an imaging photoplethysmography (iPPG) device. As opposed to
contact PPG devices, iPPG does not require contact with the skin
and is obtained by a non-contact sensor, such as a video
camera.
[0025] A time series of values measured by a PPG device, which is
indicative of blood flow changes due to pulse waves, is typically
referred to as a waveform (or PPG waveform to indicate it is
obtained with a PPG device). Analysis of PPG signals usually
includes the following steps: filtration of a PPG signal (such as
applying bandpass filtering and/or heuristic filtering), extraction
of feature values from fiducial points in the PPG signal (and in
some cases may also include extraction of feature values from
non-fiducial points in the PPG signal), and analysis of the feature
values.
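The filtration and fiducial-point steps listed above might be sketched, in a much-simplified form, as follows; the moving-average detrend and the threshold-based peak picker are stand-ins for the bandpass/heuristic filtering and fiducial-point detectors discussed in the cited literature:

```python
import numpy as np

def analyze_ppg(signal, fs=100.0):
    """Illustrative pipeline: detrend the PPG waveform, then extract a
    simple fiducial feature (systolic peak locations and the implied
    inter-beat intervals)."""
    sig = np.asarray(signal, dtype=float)
    # Crude filtration: subtract a 1-second moving average (removes drift).
    k = int(fs)
    baseline = np.convolve(sig, np.ones(k) / k, mode="same")
    filtered = sig - baseline
    # Fiducial points: local maxima above half the maximum amplitude.
    thr = 0.5 * filtered.max()
    peaks = [i for i in range(1, len(filtered) - 1)
             if filtered[i] > thr
             and filtered[i] >= filtered[i - 1]
             and filtered[i] > filtered[i + 1]]
    ibis = np.diff(peaks) / fs  # inter-beat intervals in seconds
    return peaks, ibis
```

Feature values such as the inter-beat intervals returned here are the kind of fiducial-point-derived inputs that downstream analysis of the feature values would consume.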
[0026] One type of features that is often used when performing
calculations involving PPG signals involves fiducial points related
to the waveforms of the PPG signal and/or to functions thereof
(such as various derivatives of the PPG signal). There are many
known techniques to identify the fiducial points in the PPG signal,
and to extract the feature values. Examples of features that can be
extracted from the PPG signal, together with schematic
illustrations of the feature locations on the PPG signal, can be
found in the following four publications and their references: (i)
Charlton, Peter H., et al. "Assessing mental stress from the
photoplethysmogram: a numerical study." Physiological measurement
39.5 (2018): 054001; (ii) Ahn, Jae Mok. "New aging index using
signal features of both photoplethysmograms and acceleration
plethysmograms." Healthcare informatics research 23.1 (2017):
53-59; (iii) Peltokangas, Mikko, et al. "Parameters extracted from
arterial pulse waves as markers of atherosclerotic changes:
performance and repeatability." IEEE journal of biomedical and
health informatics 22.3 (2017): 750-757; and (iv) Peralta, Elena,
et al. "Optimal fiducial points for pulse rate variability analysis
from forehead and finger photoplethysmographic signals"
Physiological measurement 40.2 (2019): 025007. Although these four
references describe manual feature selection, the features may be
selected using any appropriate feature engineering technique,
including using automated feature engineering tools.
[0027] Unless there is a specific reference to a specific
derivative of the PPG signal, phrases of the form of "based on the
PPG signal" refer to the PPG signal and any derivative thereof.
Algorithms for filtration of the PPG signal (and/or the images in
the case of iPPG), extraction of feature values from fiducial
points in the PPG signal, and analysis of the feature values
extracted from the PPG signal are well known in the art, and can be
found for example in the following references: (i) Allen, John.
"Photoplethysmography and its application in clinical physiological
measurement." Physiological measurement 28.3 (2007); (ii) Elgendi,
Mohamed. "On the analysis of fingertip photoplethysmogram signals."
Current cardiology reviews 8.1 (2012); (iii) Holton, Benjamin D.,
et al. "Signal recovery in imaging photoplethysmography."
Physiological measurement 34.11 (2013), (iv) Sun, Yu, and Nitish
Thakor. "Photoplethysmography revisited: from contact to
noncontact, from point to imaging." IEEE Transactions on Biomedical
Engineering 63.3 (2015), (v) Kumar, Mayank, Ashok Veeraraghavan,
and Ashutosh Sabharwal. "DistancePPG: Robust non-contact vital
signs monitoring using a camera." Biomedical optics express 6.5
(2015), and (vi) Wang, Wenjin, et al. "Algorithmic principles of
remote PPG." IEEE Transactions on Biomedical Engineering 64.7
(2016).
[0028] In the case of iPPG, the input comprises images having
multiple pixels. The images from which the iPPG signal and/or
hemoglobin concentration patterns are extracted may undergo various
preprocessing to improve the signal, such as color space
transformation, blind source separation using algorithms such as
independent component analysis (ICA) or principal component
analysis (PCA), and various filtering techniques, such as
detrending, bandpass filtering, and/or continuous wavelet transform
(CWT). Various preprocessing techniques known in the art that may
assist in extracting iPPG signals from images are discussed in
Zaunseder et al. (2018), "Cardiovascular assessment by imaging
photoplethysmography--a review", Biomedical Engineering 63(5),
617-634.
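A minimal sketch of such a preprocessing chain is shown below: spatial averaging per color channel, normalization to remove illumination drift, a simple chrominance-style color combination (one well-known alternative to ICA/PCA source separation), and bandpass filtering. The specific coefficients and band limits are assumptions for the sketch:

```python
import numpy as np

def preprocess_ippg(frames_rgb, fs=30.0):
    """Sketch of an iPPG preprocessing chain: spatial averaging per
    color channel, detrending by normalization, a chrominance-style
    color-space combination, and FFT-based bandpass filtering."""
    # frames_rgb: (n_frames, h, w, 3) array of skin-region pixels.
    means = frames_rgb.reshape(frames_rgb.shape[0], -1, 3).mean(axis=1)
    # Normalize each channel to remove slow illumination drift.
    norm = means / means.mean(axis=0) - 1.0
    r, g, b = norm[:, 0], norm[:, 1], norm[:, 2]
    # Simple chrominance combination emphasizing pulsatile absorption.
    s = 3 * r - 2 * g
    # Bandpass via FFT masking to a typical pulse band (0.7-3.5 Hz).
    spec = np.fft.rfft(s)
    freqs = np.fft.rfftfreq(len(s), d=1.0 / fs)
    spec[(freqs < 0.7) | (freqs > 3.5)] = 0
    return np.fft.irfft(spec, n=len(s))
```

The returned time series is the cleaned iPPG signal from which fiducial points and quality scores would subsequently be computed.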
[0029] Various embodiments described herein involve calculations
based on machine learning approaches. Herein, the terms "machine
learning approach" and/or "machine learning-based approaches" refer
to learning from examples using one or more approaches. Examples of
machine learning approaches include: decision tree learning,
association rule learning, regression models, nearest neighbors
classifiers, artificial neural networks, deep learning, inductive
logic programming, support vector machines, clustering, Bayesian
networks, reinforcement learning, representation learning,
similarity and metric learning, sparse dictionary learning, genetic
algorithms, rule-based machine learning, and/or learning classifier
systems. Herein, a "machine learning-based model" is a model
trained using one or more machine learning approaches.
[0030] Herein, "feature values" (also known as feature vector,
feature data, numerical features, and inputs) may be considered
input to a computer that utilizes a model to perform the
calculation of a value (e.g., an output, "target value", or label)
based on the input. It is to be noted that the terms "feature" and
"feature value" may be used interchangeably when the context of
their use is clear. However, a "feature" typically refers to a
certain type of value, and represents a property, while "feature
value" is the value of the property with a certain instance (i.e.,
the value of the feature in a certain sample).
[0031] In addition to feature values generated based on
measurements taken by sensors mentioned in a specific embodiment,
at least some feature values utilized by a computer of the specific
embodiment may be generated based on additional sources of data
that were not specifically mentioned in the specific embodiment.
Some examples of such additional sources of data include:
contextual information, information about the user,
measurements of the environment, and values of physiological
signals of the user obtained by other sensors.
[0032] Sentences in the form of "inward-facing head-mounted camera"
refer to a camera configured to be worn on a user's head and to
remain pointed at the region it captures (sometimes referred to as
ROI), which is on the user's face, also when the user's head makes
angular and lateral movements. A head-mounted camera (which may be
inward-facing and/or outward-facing) may be physically coupled to a
frame worn on the user's head, may be physically coupled to
eyeglasses using a clip-on mechanism (configured to be attached to
and detached from the eyeglasses), may be physically coupled to a
hat or a helmet, or may be mounted to the user's head using any
other known device that keeps the camera in a fixed position
relative to the user's head.
[0033] The term "smartglasses" refers to any type of a device that
resembles eyeglasses, which includes a frame configured to be worn
on a user's head and electronics to operate one or more
sensors.
[0034] The term "visible-light camera" refers to a non-contact
device designed to detect at least some of the visible spectrum,
such as a video camera with optical lenses and a CMOS or CCD sensor;
a visible-light camera may also be sensitive to near-infrared
wavelengths below 1050 nanometers. The term "thermal camera" refers to a
non-contact device that measures electromagnetic radiation having
wavelengths longer than 2500 nanometer (nm) and does not touch the
region it measures. A thermal camera may include one sensing
element (pixel), or multiple sensing elements that are also
referred to herein as "sensing pixels", "pixels", and/or
focal-plane array (FPA). A thermal camera may be based on an
uncooled thermal sensor, such as a thermopile sensor, a
microbolometer sensor (where microbolometer refers to any type of a
bolometer sensor and its equivalents), a pyroelectric sensor, or a
ferroelectric sensor.
[0035] A reference to a "camera" herein may relate to various types
of devices. In one example, a camera may be a visible-light camera.
In another example, a camera may capture light in the ultra-violet
range. In another example, a camera may capture near-infrared
radiation (e.g., wavelengths between 750 and 2000 nm). And in still
another example, a camera may be a thermal camera.
[0036] The term "temperature sensor" refers to a device that
measures temperature and/or temperature change. The temperature
sensor may be a contact thermometer (such as a thermistor, a
thermocouple), and/or a non-contact thermal camera (such as a
thermopile sensor, a microbolometer sensor, or a cooled infrared
sensor). Some examples of temperature sensors useful to measure
skin temperature include: thermistors, thermocouples,
thermoelectric sensors, thermopiles, microbolometers, and
pyroelectric sensors. Some examples of temperature sensors useful
to measure environment temperature include: thermistors, resistance
temperature detectors, thermocouples, thermopiles, and
semiconductor-based sensors.
[0037] The term "movement sensor" refers to a sensor comprising one
or more of the following components: a 3-axis gyroscope, a 3-axis
accelerometer, and a magnetometer. The movement sensor may also
include a sensor that measures barometric pressure.
[0038] The term "acoustic sensor" refers to a device that converts
sound waves into an electrical signal. The acoustic sensor may be a
microphone, such as a dynamic microphone, a piezoelectric
microphone, a fiber-optic microphone, a Micro-Electrical-Mechanical
System (MEMS) microphone, and/or other known sensors that measure
sound waves.
[0039] Herein, the term "blood pressure" is indicative of one or
more of the following: the systolic blood pressure of the user, the
diastolic blood pressure of the user, and the mean arterial
pressure (MAP) of the user. It is specifically noted that the term
"blood pressure" is not limited to the systolic and diastolic blood
pressure pair.
[0040] The terms "substance intake" or "intake of substances" refer
to any type of food, beverage, medications, drugs,
smoking/inhaling, and any combination thereof.
[0041] FIG. 1 illustrates an embodiment of a system that utilizes
windowing for efficient capturing of imaging photoplethysmogram
signals (iPPG signals). The system includes at least an
inward-facing head-mounted camera 552 and a computer 556.
Optionally, the camera 552 and/or the computer 556 are coupled to a
frame of smartglasses 550.
[0042] The inward-facing head-mounted camera 552, which is referred
to herein as "the camera 552", captures images 554 of a region that
includes skin on a user's head utilizing an image sensor that
supports changing of its region of interest (ROI).
[0043] In CMOS-based camera image sensors, such as an image sensor
that may be used by the camera 552 in some embodiments, the term
"region of interest" (ROI) may also be known as: window of interest
readout, windowing, sub-windowing, region of interest readout,
programmable region of interest, area of interest, partial readout
window, random pixel access, and direct pixel addressing. In
CCD-based camera image sensors, which may be used by the camera 552
in other embodiments, the term "region of interest" may also be
known as partial scanning.
[0044] For "an image sensor that supports changing of its ROI",
changing the ROI is a feature that allows reading only a portion of
the captured pixels, thereby increasing the readout speed of the
ROI and optionally also reducing the camera's duty cycle. Some
image sensors also allow multiple ROI readouts in order to simplify
the operation of multiple windowing. Sentences of the form "set the
ROI according to a subset of pixels" or "place the ROI around
pixels covering an object" refer to setting the coordinates of the
ROI to cover the "subset of pixels" or "pixels covering an object".
Herein, pixels are considered to "cover" a region/object if they
detect light reflected from that region/object.
[0045] The computer 556 calculates quality scores for iPPG signals
that are extracted from windows in the images 554. Optionally, this
step is performed in order to assess the quality of iPPG signals
extracted from different windows in the images. The computer 556
selects a proper subset of the iPPG signals whose quality scores
reach a threshold. The computer 556 then reads from the camera 552
at least one ROI that covers one or more of the windows from which
the proper subset of the iPPG signals is extracted. Optionally, the
computer 556 issues commands 555 to the camera 552, which describe
the at least one ROI and/or other parameters to facilitate the
reading of at least one ROI. Optionally, the at least one ROI read
from the camera 552 covers below 75% of the region that includes
skin on the user's head. Optionally, the at least one ROI read from
the camera 552 covers below 25% of said region's area. Optionally,
the at least one ROI read from the camera 552 covers below 10% of
said region's area.
[0046] In some embodiments, the images 554 are partitioned
according to a grid, such that each window includes one or more
pixels that fall within a certain square of the grid. For example,
the images 554 may be partitioned into a 10.times.10 grid, a
20.times.20 grid, or a grid with some other dimensions (in these
examples, each square or contiguous subset of squares may be
considered a window). FIG. 2 illustrates quality scores calculated
for iPPG signals extracted from different areas on the cheek
corresponding to squares of a grid 553. Each square in the grid
includes a different subset of pixels that are detected by the
image sensor of the camera 552. In other embodiments, windows in
the images 554 may not be part of a grid, may not be the same size,
and/or at least some of the windows may overlap.
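The grid partitioning described above can be sketched as follows; this is a minimal illustration, assuming per-window mean pixel intensity as the raw iPPG sample per frame (the function name and grid dimensions are illustrative, not taken from the disclosure):

```python
import numpy as np

def grid_windows(frame, rows=10, cols=10):
    """Partition a 2-D frame into a rows x cols grid and return the
    mean pixel intensity of each window (one raw iPPG sample per
    window per frame). Integer edge splits mean border windows may
    differ in size by one pixel."""
    h, w = frame.shape
    row_edges = np.linspace(0, h, rows + 1, dtype=int)
    col_edges = np.linspace(0, w, cols + 1, dtype=int)
    means = np.empty((rows, cols))
    for i in range(rows):
        for j in range(cols):
            win = frame[row_edges[i]:row_edges[i + 1],
                        col_edges[j]:col_edges[j + 1]]
            means[i, j] = win.mean()
    return means

# Example: a 40x40 frame split into a 10x10 grid of 4x4 windows.
frame = np.arange(1600, dtype=float).reshape(40, 40)
m = grid_windows(frame, 10, 10)
```

Stacking these per-frame window means over time yields one candidate iPPG signal per grid square, to which the quality scoring below can be applied.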
[0047] The computer 556 may utilize one or more computational
approaches (also referred to herein as "iPPG algorithms"), which
are known in the art and/or mentioned herein,
to extract the iPPG signals from the images 554. Some examples of
iPPG algorithms known in the art are surveyed in Zaunseder, et al.
"Cardiovascular assessment by imaging photoplethysmography--a
review," in Biomedical Engineering/Biomedizinische Technik 63.5
(2018): 617-634.
[0048] The quality scores for the iPPG signals may be calculated
using various known and/or novel methods. Some examples of
approaches to quality scores are based on determining
signal-to-noise levels, waveform morphology analysis, and/or
machine learning-based approaches, as discussed below. Optionally,
calculating the quality scores may involve calculation of one or
more Signal Quality Indexes for PPG signals, which are mentioned in
Elgendi, M. in "Optimal Signal Quality Index for Photoplethysmogram
Signals", Bioengineering (Basel, Switzerland) vol. 3,4 21.
(September 2016), which is incorporated herein by reference.
[0049] In some embodiments, the quality score for an iPPG signal
(which is in itself a PPG signal) includes a factor that is
proportional to the AC/DC ratio calculated from the iPPG signal.
PPG signals are typically composed of a pulsatile component (AC)
and a non-pulsatile component (DC). The AC component is
synchronized with the heart and related to arterial pulsation,
while the DC component is related to various factors such as light
absorption in the tissue, veins, and diastolic arterial blood
volume. The AC component of a PPG waveform usually has a
fundamental frequency, typically around 1 Hz, depending on the
heart rate. This AC component is superimposed onto a typically
larger DC component, which varies slowly due to respiration,
vasomotor activity, and vasoconstrictor waves. Thus, PPG waveforms
that display a pulsatile component at frequencies corresponding to
the heart rate and have a higher AC/DC ratio may be considered to
have a higher quality than PPG waveforms with a lower AC/DC ratio.
The AC/DC ratios can therefore be utilized to ascertain the quality
of iPPG signals.
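The AC/DC quality factor can be sketched numerically; this is an illustrative estimate that takes the DC component as the signal mean and the AC component as the dominant spectral amplitude in a plausible heart-rate band (the band limits and helper name are assumptions, not from the disclosure):

```python
import numpy as np

def ac_dc_ratio(ppg, fps, band=(0.7, 3.0)):
    """Quality factor proportional to the AC/DC ratio of a PPG
    window. DC is the signal mean; AC is estimated as the peak
    spectral amplitude in a band covering plausible heart rates
    (band is in Hz)."""
    ppg = np.asarray(ppg, dtype=float)
    dc = ppg.mean()
    spectrum = np.abs(np.fft.rfft(ppg - dc))
    freqs = np.fft.rfftfreq(len(ppg), d=1.0 / fps)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    ac = spectrum[in_band].max() * 2.0 / len(ppg)  # peak amplitude
    return ac / dc

# Synthetic PPG: DC level 100 with a 1.2 Hz pulsatile component of
# amplitude 2, sampled at 30 frames per second for 10 seconds.
fps = 30.0
t = np.arange(0, 10, 1.0 / fps)
clean = 100.0 + 2.0 * np.sin(2 * np.pi * 1.2 * t)
ratio = ac_dc_ratio(clean, fps)
```

For this synthetic signal the estimate recovers the constructed AC/DC ratio of 2/100 = 0.02.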
[0050] Calculating the quality scores for the iPPG signals may
involve utilization of machine learning approaches. In one example,
the quality scores are calculated using a machine learning-based
approach that utilizes various PPG signal quality metrics as
features. Some examples of quality metrics that may be utilized
include one or more of the following: correlation of an iPPG signal
with an iPPG beat template, correlation of the iPPG signal with an
iPPG beat template after linearly stretching or compressing to the
length of the iPPG beat template, correlation of a resampled
dynamic time warping version of the iPPG signal with an iPPG beat
template, percentage of the iPPG signal that is not clipped, and a
signal-to-noise ratio of the iPPG signal. Additional details
regarding calculation of these quality metrics are provided in the
publication Li, Qiao, and Gari D. Clifford "Dynamic time warping
and machine learning for signal quality assessment of pulsatile
signals", Physiological measurement (2012), which is incorporated
herein by reference. This publication describes a machine learning
approach in which a multilayer perceptron neural network combines
several individual signal quality metrics and physiological
contexts, which may be applicable to some embodiments.
[0051] In one example, a method for calculating the quality scores
for the iPPG signals, which may be implemented by the computer 556,
includes the following steps: Step 1, calculating an iPPG beat
template (e.g., by averaging beats in a predefined window); Step 2,
applying dynamic time warping to the iPPG beats; Step 3,
calculating signal quality metrics for each iPPG beat, for example
by applying one or more of direct matching (for each beat,
calculate the correlation coefficient with the iPPG beat template),
linear resampling (select each beat between two fiducial points,
linearly stretch or compress the beat to the length of the iPPG
beat template, and calculate the correlation coefficient), dynamic
time warping (resample the beat to the length of the iPPG beat
template, and calculate the correlation coefficient), and clipping
detection (determine periods of saturation to a maximum or a
minimum value within each beat, determine the smallest fluctuation
to be ignored, and calculate the percentage of the beat that is not
clipped); and Step 4, fusing the signal quality information for a
decision, such as (i) a simple heuristic fusion of the signal
quality metrics, or (ii) a machine learning-based approach for
quality estimation, such as feeding a multi-layer perceptron neural
network with feature values comprising the signal quality metrics,
the simple heuristic fusion, and the number of beats detected
within the window.
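Steps 1-4 above can be sketched as follows; this simplified illustration covers the beat template, linear resampling, clipping detection, and a simple heuristic fusion (option (i) of Step 4). The clipping limits and fusion weights are illustrative assumptions:

```python
import numpy as np

def beat_template(beats):
    """Step 1: average equally long beats into an iPPG beat template."""
    return np.mean(beats, axis=0)

def resample_to(beat, length):
    """Linearly stretch/compress a beat to a target length
    (the 'linear resampling' metric of Step 3)."""
    x_old = np.linspace(0.0, 1.0, len(beat))
    x_new = np.linspace(0.0, 1.0, length)
    return np.interp(x_new, x_old, beat)

def beat_quality(beat, template, clip_lo=0.0, clip_hi=255.0):
    """Steps 3-4: correlate the resampled beat with the template,
    measure the fraction of samples not clipped at the sensor
    limits, and fuse the two metrics with a simple heuristic."""
    r = resample_to(beat, len(template))
    corr = np.corrcoef(r, template)[0, 1]
    not_clipped = np.mean((beat > clip_lo) & (beat < clip_hi))
    return 0.5 * (corr + not_clipped)

# Toy beats: half-sine pulses of slightly different lengths.
beats = [np.sin(np.pi * np.linspace(0, 1, n)) * 100 + 50
         for n in (28, 30, 32)]
template = beat_template([resample_to(b, 30) for b in beats])
score = beat_quality(beats[0], template)
```

Well-formed, unclipped beats score close to 1, while noisy or saturated beats pull the fused score down.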
[0052] Another known method for calculating the quality scores for
the iPPG signals is based on the idea that a PPG signal has a
fundamental frequency of oscillation equal to the pulse rate, and
the spectral power of the PPG signal is concentrated in a small
frequency band around the pulse rate. The spectral power of the
noise is distributed over the passband of the bandpass filter, such
as [0.5 Hz, 5 Hz]. The quality scores for the iPPG signals can
include a factor estimated as the ratio of (i) the power of the
recorded signal around the pulse rate, to (ii) the power of the
noise in the passband of the bandpass filter.
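This spectral quality factor can be sketched as follows, assuming an FFT-based power estimate and an illustrative half-bandwidth around the pulse rate (the function and parameter names are assumptions):

```python
import numpy as np

def spectral_quality(ppg, fps, pulse_hz, half_band=0.2,
                     passband=(0.5, 5.0)):
    """Ratio of (i) spectral power concentrated near the pulse rate
    to (ii) the remaining noise power within the bandpass filter's
    passband (here [0.5 Hz, 5 Hz])."""
    ppg = np.asarray(ppg, dtype=float) - np.mean(ppg)
    power = np.abs(np.fft.rfft(ppg)) ** 2
    freqs = np.fft.rfftfreq(len(ppg), d=1.0 / fps)
    near_pulse = np.abs(freqs - pulse_hz) <= half_band
    in_pass = (freqs >= passband[0]) & (freqs <= passband[1])
    signal_power = power[near_pulse & in_pass].sum()
    noise_power = power[in_pass & ~near_pulse].sum()
    return signal_power / max(noise_power, 1e-12)

fps = 30.0
t = np.arange(0, 20, 1.0 / fps)
rng = np.random.default_rng(0)
clean = np.sin(2 * np.pi * 1.1 * t)            # pure 1.1 Hz pulse
noisy = clean + 2.0 * rng.standard_normal(len(t))
q_clean = spectral_quality(clean, fps, pulse_hz=1.1)
q_noisy = spectral_quality(noisy, fps, pulse_hz=1.1)
```

A clean pulsatile signal concentrates nearly all passband power near the pulse rate, so its score dwarfs that of the noisy signal.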
[0053] Following the calculation of the quality scores for the iPPG
signals, the computer 556 selects a proper subset of the iPPG
signals whose quality scores reach a threshold. In one example, the
threshold can be a fixed numeric value. In another example, the
threshold is dynamic, and is selected such that at least a certain
number of iPPG signals (e.g., the top 10%) are selected or at most
a certain number of iPPG signals (e.g., at most 25%) are selected.
In other examples, techniques for smart selection of windows of
iPPG signals, which are known in the art, may be employed. Two
examples of approaches that may be utilized are included in the
following publications, which are incorporated herein by reference:
(i) Feng, Litong, et al. "Dynamic ROI based on K-means for remote
photoplethysmography" 2015 IEEE International Conference on
Acoustics, Speech and Signal Processing (ICASSP), and (ii) Bobbia,
Serge, et al. "Real-time temporal superpixels for unsupervised
remote photoplethysmography" Proceedings of the IEEE Conference on
Computer Vision and Pattern Recognition Workshops. 2018.
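The fixed and dynamic thresholds described above can be sketched as follows; this is a minimal illustration in which the dynamic threshold is realized as a quantile of the observed scores (one possible realization, not the only one):

```python
import numpy as np

def select_windows(quality_scores, fixed_threshold=None,
                   top_fraction=None):
    """Return indices of windows whose iPPG quality scores reach a
    threshold. The threshold is either a fixed numeric value, or
    dynamic: chosen so that roughly the top fraction of windows is
    selected."""
    scores = np.asarray(quality_scores, dtype=float)
    if fixed_threshold is not None:
        thr = fixed_threshold
    else:
        thr = np.quantile(scores, 1.0 - top_fraction)
    return np.flatnonzero(scores >= thr)

scores = [0.2, 0.9, 0.4, 0.8, 0.1, 0.7, 0.3, 0.6, 0.5, 0.95]
fixed = select_windows(scores, fixed_threshold=0.75)
top = select_windows(scores, top_fraction=0.3)
```

With these toy scores both strategies select the same three windows (indices 1, 3, and 9), whose scores exceed the others.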
[0054] Following the selection of the proper subset of iPPG
signals, the computer 556 reads from the camera 552 at least one
ROI that covers one or more of the windows from which the proper
subset of the iPPG signals is extracted. Optionally, this
selective reading is achieved by issuing commands 555 which are
sent to the camera 552 and cause it to provide data obtained by
pixels belonging to the at least one ROI.
[0055] When the at least one ROI includes multiple ROIs, there
may be different ways in which these multiple ROIs may be captured
by the camera 552. In one example, in which the at least one ROI
includes multiple ROIs, the image sensor of the camera 552 supports
setting multiple ROIs, and the multiple ROIs are captured
simultaneously by the camera 552. In another example, in which the
at least one ROI includes multiple ROIs, the multiple ROIs are
captured serially by the camera 552.
[0056] In some embodiments, during most of the time the at least
one ROI is read from the camera 552, full image data (e.g.,
including all of the pixels) is not read. This selective reading can
confer several advantages. In one example, reading the at least one
ROI and not the full images can save power needed by the camera 552
to operate, by the system to transmit the data, and/or by the
computer 556 that processes the data. In another example, reading
the at least one ROI and not the full images enables the camera 552
to operate at a higher frame rate. In one example, the computer 556
reads from the camera the at least one ROI at an average frame rate
that is more than double the maximal frame rate at which
full-resolution images can be read from the camera 552.
[0057] The interaction between the camera 552 and the computer 556,
in some embodiments, can be viewed as a two-stage approach: first,
full images (or high-resolution images) are captured (the images
554) and analyzed to select at least one ROI that supports
high-quality iPPG signals, and then the at least one ROI is read
for a certain time (which involves analyzing a lower resolution
than the images 554). This enables saving power and/or operating at
a higher frequency (a higher sampling rate for the iPPG signals)
while the at least one ROI is read instead of the higher-resolution
images. The process of selecting the at least one ROI may occur at
different times and/or under different conditions. In one example,
the selection is performed once every time the system is turned on
and/or worn by the user. In another example, the selection of the
at least one ROI is done periodically, e.g., every five seconds or
every minute. In another example, quality scores for the iPPG
signals extracted from the at least one ROI are calculated
periodically (e.g., every few seconds). If the quality scores fall
below a threshold, this triggers a new selection of the at least
one ROI (i.e., capturing and analyzing higher resolution images).
[0058] In some embodiments, the image sensor used by the camera 552
to capture the images 554 supports changing its binning value. In
this embodiment, the computer 556 may perform the following: apply
at least two different binning values to at least one of the
windows, and calculate respective quality scores for the iPPG
signals extracted from the at least one of the windows when each of
the binning values was applied. The computer 556 may then select
the binning value whose corresponding quality score is maximal
(from among the scores for the different binning values evaluated),
and read from the camera 552 at least one of the at least one ROI
according to that binning value. Optionally, using the selected
binning value reduces by at least half the time it takes the
computer 556 to read the camera 552, compared to reading the at
least one ROI at full resolution (without binning). Optionally,
when the camera 552 supports a single binning value per image,
multiple ROIs with different binning values are captured serially.
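The binning selection can be sketched as follows; since the actual camera driver interface is not specified in the disclosure, the capture and iPPG scoring steps are represented by hypothetical callbacks:

```python
def pick_binning(capture_window, score_ippg, binning_values=(1, 2, 4)):
    """Capture the same window at several binning values, score the
    iPPG signal extracted from each capture, and return the binning
    value whose quality score is maximal. `capture_window` and
    `score_ippg` are hypothetical callbacks standing in for the
    camera driver and the iPPG quality pipeline."""
    scores = {b: score_ippg(capture_window(b)) for b in binning_values}
    best = max(scores, key=scores.get)
    return best, scores

# Toy stand-ins: pretend 2x2 binning gives the best quality score
# (e.g., via improved SNR at the cost of spatial resolution).
fake_scores = {1: 0.6, 2: 0.9, 4: 0.7}
best, scores = pick_binning(lambda b: b, lambda sig: fake_scores[sig])
```

The trade-off this searches over is that heavier binning improves per-pixel SNR and readout speed while reducing spatial resolution, so the best value depends on the window being measured.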
[0059] Different types of physiological responses manifest via
blood flow changes at different regions on the face. These blood
flow changes are often detectable via iPPG signals extracted from
images of the different regions. Thus, selection of ROIs to read
may depend, in some embodiments, on a physiological response that
is to be detected based on the iPPG signals. In one embodiment, the
windows (that include the at least one ROI) are selected to cover
an area expected to undergo a detectable change in hemoglobin
concentration due to a certain physiological response. In one
example, for a certain person, a migraine may be manifested via
changes to blood flow on the forehead, while a stroke may be
manifested via changes to blood flow on both the forehead and a
cheek. Therefore, when calculating the iPPG signals for detecting
the migraine, the computer 556 may select windows that include a
first subset of the ROIs distributed over the forehead, and for
detecting the stroke, the computer 556 may select windows that
include a second subset of the ROIs distributed over the forehead
and the cheek. In another example, the computer 556 selects two
different proper subsets of the iPPG signals for two different
physiological responses, and utilizes two different ROIs to cover
two different windows from which the two different proper subsets
of the iPPG signals are extracted.
[0060] The following method may be used by systems modeled
according to FIG. 1. The steps described below may be performed by
running a computer program having instructions for implementing the
method. Optionally, the instructions may be stored on a
computer-readable medium, which may optionally be a non-transitory
computer-readable medium. In response to execution by a system
including a processor and memory, the instructions cause the system
to perform the following steps:
[0061] In Step 1, capturing images of a region comprising skin on a
user's head utilizing an inward-facing head-mounted camera
comprising an image sensor that supports changing of its region of
interest (ROI).
[0062] In Step 2, calculating quality scores for imaging
photoplethysmogram signals (iPPG signals) extracted from windows in
the images captured in Step 1.
[0063] In Step 3, selecting a proper subset of the iPPG signals
whose quality scores reach a threshold.
[0064] And in Step 4, reading from the camera at least one ROI that
covers one or more of the windows from which the proper subset of
the iPPG signals is extracted. Optionally, the at least one ROI
read from the camera covers below 75% of the region's area.
Optionally, reading from the camera the at least one ROI in this
step is done at an average frame rate higher than a maximal frame
rate at which full-resolution images can be read from the
camera.
[0065] In one embodiment, calculating the quality scores for the
iPPG signals in Step 2 involves using a machine learning-based
approach that utilizes at least one of the following signal quality
metrics as feature values: correlation of the iPPG signals with an
iPPG beat template, correlation of the iPPG signals with an iPPG
beat template after linearly stretching or compressing to the
length of the iPPG beat template, correlation of a resampled
dynamic time warping version of the iPPG signals with an iPPG
template, percentage of the iPPG signals that are not clipped, and
signal-to-noise ratios of the iPPG signals.
[0066] In one embodiment, the image sensor of the camera utilized
in Steps 1 and 4 supports changing its binning value, and the
method optionally involves the following steps: applying at least two
different binning values to at least one of the windows,
calculating at least two quality scores for iPPG signals extracted
from the at least one of the windows when the at least two
different binning values were applied, respectively, selecting a
binning value with a corresponding quality score that is maximal,
and reading from the camera at least one of the at least one ROI
according to the binning value.
[0067] In one embodiment, the windows utilized in Step 2 are
selected to cover an area expected to undergo a detectable change
in hemoglobin concentration due to a certain physiological
response. Optionally, the method includes a step involving
selecting two different proper subsets of the iPPG signals for two
different physiological responses, and utilizing two different ROIs
to cover two different windows from which the two different proper
subsets of the iPPG signals are extracted.
[0068] Without limiting the disclosed embodiments, an advantage of
the present invention pertains to its utility for efficient
processing of iPPG signals. Typically, calculations involving iPPG
signals (referred to herein as "iPPG calculations") involve a form
of a Discrete Fourier Transform (DFT) of the pixel values, followed
by a band-pass filter. Since the DFT takes a limited number of
samples in a limited sampling window, this is similar to
multiplying the original signal by a box function that is zero
everywhere outside of the sampling window. Multiplication in the
time domain translates into convolution in the Fourier domain, thus
the DFT returns the spectrum of the original signal convolved with
a sinc function, which significantly reduces the sparsity of the
original signal. This means that not all the images from which iPPG
signals are extracted provide the same amount of information that
is useful for the iPPG calculations. Some images are captured at
times that make them more informative for the iPPG calculations
(referred to as advantageous timings), while other images are
captured at times that make them less informative for the iPPG
calculations. However, if the system could capture the images
asynchronously according to the frequency of the iPPG signal to be
recovered, then the output of the DFT would have shorter sinc
tails, which would improve its reconstruction.
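The leakage argument can be illustrated numerically: a sinusoid whose frequency falls between DFT bins spreads its power over sinc-shaped tails, reducing spectral sparsity relative to a capture whose timing matches the signal (the frame rate and window length below are illustrative):

```python
import numpy as np

fs = 30.0   # illustrative frame rate (Hz)
n = 128     # samples in the DFT sampling window

def spectral_sparsity(freq_hz):
    """Fraction of the total spectral power held by the single
    largest DFT bin for a unit sinusoid at freq_hz observed through
    an n-sample box window (higher means a sparser spectrum)."""
    t = np.arange(n) / fs
    power = np.abs(np.fft.rfft(np.sin(2 * np.pi * freq_hz * t))) ** 2
    return power.max() / power.sum()

aligned = spectral_sparsity(fs * 8 / n)       # exactly on a DFT bin
misaligned = spectral_sparsity(fs * 8.5 / n)  # between bins: leakage
```

The bin-aligned case keeps essentially all power in one bin, while the half-bin-offset case smears a large share of the power into sinc tails, which is the loss of sparsity that asynchronous, signal-aligned capture aims to avoid.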
[0069] FIG. 3 illustrates a system that operates a camera
asynchronously. Optionally, the asynchronous operation of the
camera is intended to make collection of imaging photoplethysmogram
signals (iPPG signals) and/or calculations involving iPPG signals
more efficient than existing techniques in which a camera is
operated continually and synchronously. In one embodiment, the
system includes a camera 564, a contact sensor 562, and a computer
568. Optionally, one or more of the camera 564, the contact sensor
562, and the computer 568 may be head-mounted. Optionally, one or
more of the camera 564, the contact sensor 562, and the computer
568 may be coupled to a frame of smartglasses 560.
[0070] The camera 564 captures images of a region comprising skin
on a user's head. In one example, the region includes at least a
portion of a cheek of the user. In another example, the region
includes at least a portion of a temple of the user. In still
another example, the region includes at least a portion of the
forehead of the user. In some embodiments, the camera 564 may be a
head-mounted camera, while in other embodiments, the camera 564 may
be a remote camera. In one example, the camera 564 may be a webcam. In
another example, the camera 564 belongs to a battery-operated
non-head-mounted mobile device, located more than 10 cm from the
region.
[0071] The contact sensor 562 measures a signal 563 indicative of
cardiac activity of the user. For example, the signal 563 may be
indicative of electrical potential changes due to the cardiac
activity and/or arterial blood volume changes due to the cardiac
activity.
[0072] In one embodiment, the contact sensor 562 includes a contact
photoplethysmography device and the signal 563 is a
photoplethysmogram signal (PPG signal). In one example, the contact
photoplethysmography device may be embedded in a watch or band worn
on the user's wrist. In another example, the contact
photoplethysmography device may be embedded in a head-mounted
system such as embedded in a nosepiece of a temple of a pair of
smartglasses.
[0073] In another embodiment, the contact sensor 562 includes an
electrocardiograph, and the signal 563 is an electrocardiogram
signal. In one example, the electrocardiograph includes electrodes
attached to the user's body, which are embedded in a smart shirt
worn by the user. In another example, the electrocardiograph is
embedded in a device worn by the user, such as a watch or
smartglasses. In yet
another example, the electrocardiograph is embedded in a patch
affixed to the user's body.
[0074] The computer 568 detects, based on the signal 563,
advantageous timings at which to capture images 569 for a purpose
of extracting imaging photoplethysmogram signals (iPPG signals)
from the images. The computer 568 commands the camera 564 to
capture the images 569 according to the advantageous timings.
Optionally, the computer 568 extracts portions of the iPPG signals
from the captured images 569. Optionally, the computer 568 detects
a physiological response from the extracted iPPG signals, such as
the user's heart rate and/or the user's blood pressure.
[0075] In some embodiments, advantageous timings refer to times at
which certain events are expected to be detectable in an iPPG
signal extracted from images captured by the camera 564. Thus, in
some examples, advantageous timings may be periodic, occurring at
predetermined intervals relative to the user's cardiac activity, as
detected via the signal 563. For example, advantageous timings may
be calculated as time intervals relative to points in time
representing the start of an R-wave (e.g., when the contact sensor
562 includes an electrocardiograph) or the arrival of a pulse wave
or a systolic peak (e.g., when the contact sensor 562 includes
a contact photoplethysmography device). For example, for some iPPG
calculations, images captured at times of the following fiducial
points are more informative than images captured at other times:
the systolic notch (which is the minimum at the PPG signal onset),
the systolic peak (which is the maximum of the PPG signal), and in
some cases also the dicrotic notch and/or the diastolic peak (which
is the first local maximum of the PPG signal after the dicrotic
notch and before 0.8 of the duration of the cardiac cycle).
Additionally or alternatively, for some iPPG calculations that are
based on the derivatives of the PPG signal, it may be beneficial to
capture the images at times optimized for one or more of the
following fiducial points in the first and/or second derivatives of
the PPG signal: the maximum slope peak in systolic of the velocity
photoplethysmogram (VPG), the local minima slope in systolic of
VPG, the global minima slope in systolic of VPG, the maximum slope
peak in diastolic of VPG, the maximum of the acceleration
photoplethysmogram (APG), and the minimum of the APG.
[0076] It is noted that "advantageous timings" refers to specific
points in time and/or intervals in time that are characterized by a
certain offset relative to the time of a cardiac activity reference
event. Examples of cardiac activity reference events include one or
more of the following: times of ventricular systoles, times of QRS
spikes (e.g., as measured by an electrocardiograph), or times of
systolic peaks (e.g., as measured by a contact photoplethysmography
device at a certain location in the body). In some embodiments,
"advantageous timings" may be characterized by one or more
intervals relative to cardiac activity reference events. For
example, in one embodiment, advantageous timings may be a window
0.1 seconds wide around the expected time of a systolic peak.
Herein, "advantageous timings" refers to a group in which timings
that are more informative for the PPG calculation than the average
fixed-rate sample significantly outnumber timings that are less
informative. Optionally, the advantageous timings may include just
the best sampling timings for the PPG calculation.
[0077] Typically, due to its being obtained by a contact sensor,
the signal 563 will be less noisy than iPPG signals extracted from
images captured by the camera 564. Additionally, iPPG signals have a
characteristic and predictable form of a PPG signal. Thus, as
described above, knowing when a certain reference event occurred in
the signal 563 can be utilized to determine, with high accuracy,
advantageous timings during which certain features of the iPPG
signals will occur.
[0078] In some embodiments, calculating the advantageous timings of
a certain feature that may be extracted from an iPPG signal (e.g.,
systolic notch, systolic peak, maxima or minima of VPG, etc.)
involves determination of offsets between the signal 563 and iPPG
signals extracted from images captured by the camera 564. The
offset between the signal 563 and an iPPG signal typically
corresponds to the difference in time in which certain cardiac
activity-related events manifest in the signal 563 and the iPPG
signal. When the signal 563 manifests cardiac events before their
manifestation in iPPG signal, this offset may be referred to as a
"delay". For example, a start of a cardiac cycle appears earlier in
a signal of a contact ECG (e.g., via an ECG R-peak) compared to an
iPPG signal extracted from images captured by a head-mounted
camera.
[0079] Calculating the advantageous timings may, in some
embodiments, involve collection of data over multiple cardiac
cycles, as follows. Images captured by the camera 564 are evaluated
in order to detect the times at which the certain feature occurs
in iPPG signals extracted from the images 569. These times of
occurrences are compared to the times of reference cardiac events
in the signal 563 (in the same cardiac cycle or an adjacent cardiac
cycle) in order to determine the timing offset at which the certain
feature occurs in iPPG signal (relative to the time of occurrence
of the reference cardiac event). Examples of reference cardiac
events include an R-peak in an ECG signal or a systolic peak in a
PPG signal. By collecting this data over multiple cardiac cycles,
statistics of the timing offsets can be calculated, such as the
average timing offset or parameters of the distribution of the
timing offsets. These statistics may then be used to select the
advantageous timings.
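Collecting timing-offset statistics over multiple cardiac cycles can be sketched as follows (the event times are toy data and the helper name is illustrative):

```python
import numpy as np

def timing_offset_stats(reference_times, feature_times):
    """Per-cycle offsets between reference cardiac events (e.g.,
    ECG R-peaks in the signal 563) and the matching feature
    occurrences in the iPPG signal (e.g., systolic peaks); returns
    the mean and standard deviation used to select advantageous
    timings."""
    offsets = np.asarray(feature_times) - np.asarray(reference_times)
    return offsets.mean(), offsets.std()

# Toy data: R-peaks once per second; iPPG systolic peaks arrive
# roughly 0.25 s later with small cycle-to-cycle jitter.
r_peaks = np.arange(0.0, 10.0, 1.0)
ippg_peaks = r_peaks + 0.25 + np.array([0.01, -0.01, 0.0, 0.02, -0.02,
                                        0.0, 0.01, -0.01, 0.0, 0.0])
mean_off, std_off = timing_offset_stats(r_peaks, ippg_peaks)
```

The mean offset anchors where in the cardiac cycle the feature is expected, and the spread determines how wide the capture window around it must be.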
[0080] In one embodiment, the computer 568 selects advantageous
timings so they fall within a window of a predetermined size (e.g.,
.+-.0.05 seconds) around the expected time of a certain feature in
the iPPG signal, while taking into account the offset time between
manifestation of cardiac events in the signal 563 and the iPPG
signal. For example, at least some of the advantageous timings are
selected to fall in a window that is .+-.0.05 seconds around the
expected time of a systolic peak in the iPPG signal.
[0081] In another embodiment, based on the distribution of the
timing offsets between manifestation of cardiac events in the
signal 563 and an iPPG signal (e.g., as measured over multiple
cardiac cycles), the computer 568 selects the advantageous timings
to capture at least a predetermined proportion of those occurrences
of the certain feature (e.g., at least 95% or at least 99%). For
example, at least some of the advantageous timings are selected to
form a window in which 95% of the dicrotic notches in the iPPG
signal are expected to fall.
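Selecting a window expected to capture a predetermined proportion of the feature occurrences can be sketched with quantiles of the observed offset distribution (an illustrative realization; the synthetic offsets below are assumptions):

```python
import numpy as np

def coverage_window(offsets, coverage=0.95):
    """Window of offsets (relative to the cardiac reference event)
    expected to capture the given proportion of feature occurrences,
    taken as the central `coverage` interval of the observed offset
    distribution."""
    lo = np.quantile(offsets, (1.0 - coverage) / 2.0)
    hi = np.quantile(offsets, 1.0 - (1.0 - coverage) / 2.0)
    return lo, hi

# Synthetic offsets: a feature arriving ~0.25 s after the reference
# event with 20 ms of cycle-to-cycle jitter.
rng = np.random.default_rng(1)
offsets = 0.25 + 0.02 * rng.standard_normal(2000)
lo, hi = coverage_window(offsets, 0.95)
inside = np.mean((offsets >= lo) & (offsets <= hi))
```

By construction, roughly 95% of the observed occurrences fall inside the returned window, which is the proportion the capture schedule is designed to cover.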
[0082] Knowing the advantageous timings for capturing the images
that are more informative for the iPPG calculations can reduce the
power consumption significantly by reducing the average frame rate
and/or reducing the amount of image processing calculations. In one
example, the advantageous timings (during an average cardiac cycle)
cover less than 25% of the duration of the average cardiac
cycle.
[0083] In some embodiments, the advantageous timings are utilized
by the computer 568 to operate the camera 564 in an asynchronous
mode instead of at a usual fixed frame rate. Optionally, the number
of the images 569 captured during the advantageous timings, when
the camera 564 is operated in the asynchronous mode, is less than
20% of the number of images that would have been captured were the
camera 564 to capture images continually at a fixed frame rate.
Optionally, the volume of image data captured in the images 569
during the advantageous timings is less than 20% of the volume of
image data that would have been captured, were the camera 564 to
capture images continually at the fixed frame rate and fixed
resolution.
[0084] As discussed above, calculating the advantageous timings may
be done by analyzing the signal 563 to detect occurrences of a
certain reference cardiac event and then determining a certain
offset (delay) after which the one or more features (e.g., certain
fiducial points) are expected to be detected in iPPG signals
extracted from images captured by the camera 564.
[0085] In one embodiment, the advantageous timings are detected
based on analyzing measurements of a contact PPG device (that
typically provides a signal with lower noise than an iPPG signal
extracted from the images). Optionally, the computer 568 calculates
a delay between the PPG signal and the iPPG signals (e.g.,
difference in detection times of systolic peaks), and adjusts the
advantageous timings based on the delay. Because in this
embodiment, the contact sensor 562 is used to trigger the camera
564, it is usually preferred that the contact sensor 562 be located
at a location to which the pulse wave has a shorter travel time
from the heart, compared to the location measured by the camera
564. When this is not the case, the computer 568 may predict the
timing of the next pulse wave at the camera's location based on the
timing of the current pulse wave at the contact sensor's
location.
[0086] In another embodiment, in which the contact sensor 562
includes an electrocardiogram (ECG) device and the signal 563
includes an ECG signal, the offset between the ECG and iPPG signals
may be utilized to set the advantageous timings. This offset may be
calculated using various suitable methods, such as measuring the
difference between the times of dicrotic notches in the iPPG signals and
R-peaks in the ECG signal, and/or measuring the time difference
between iPPG peaks (e.g., systolic peaks) and ECG R-peaks.
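One way to estimate such an offset is sketched below; the pairing rule (each R-peak matched with the first iPPG peak that follows it) and the example times are assumptions for illustration:

```python
def mean_event_offset(ecg_r_peak_times, ippg_peak_times):
    """Average delay from each ECG R-peak to the first iPPG systolic peak
    following it (both lists in seconds, assumed sorted ascending)."""
    offsets = []
    for r in ecg_r_peak_times:
        following = [p for p in ippg_peak_times if p > r]
        if following:
            offsets.append(following[0] - r)
    return sum(offsets) / len(offsets)

# Assumed example: iPPG systolic peaks arrive ~0.3 s after each R-peak.
delay = mean_event_offset([5.0, 5.9, 6.8], [5.3, 6.2, 7.1])
```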
[0087] Blood flow to the head can be dynamic, and depend on several
factors such as the cardiac activity level, posture, etc. Thus,
selected advantageous timings may become less accurate over time
(e.g., they may not cover manifestation of certain features they
were intended to). This may necessitate their recalculation and
selection using the process described above.
[0088] In some embodiments, advantageous timings are reselected
after a predetermined period has elapsed. For example, advantageous
timings may be reselected every minute, every five minutes, or
every thirty minutes, etc. Additionally or alternatively,
advantageous timings may be reselected when the physiological state
changes in a significant manner. For example, change above a
predetermined threshold to a physiological signal such as the heart
rate, blood pressure, or heart rate variability may trigger the
reselection of advantageous timings. In another example, a change
in posture (e.g., from sitting to standing) or a change in the
activity level (e.g., from being stationary to walking), can
trigger a reselection of the advantageous timings. Additionally or
alternatively, advantageous timings may be reselected when their
quality deteriorates. For example, if advantageous timings are
selected to cover certain fiducial points in the iPPG signals
(e.g., systolic peaks or dicrotic notches), but these are not
detected in the iPPG signals extracted from images collected during
the advantageous timings (e.g., because of a shift in the offsets
between the signal 563 and the iPPG signals), then lack of
identification of the fiducial points in the iPPG signals may
trigger a reselection of the advantageous timings.
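The reselection triggers listed above can be combined in a simple policy check; all parameter names and threshold values here are illustrative assumptions:

```python
def should_reselect(seconds_since_selection, heart_rate_change_bpm,
                    posture_or_activity_changed, fiducial_detection_ratio,
                    max_age_s=300.0, hr_change_threshold=10.0,
                    min_detection_ratio=0.8):
    """True when any of the reselection triggers fires: the timings are
    older than a predetermined period, the heart rate changed
    significantly, posture/activity changed, or too few fiducial points
    were detected during the advantageous timings."""
    return (seconds_since_selection >= max_age_s
            or heart_rate_change_bpm > hr_change_threshold
            or posture_or_activity_changed
            or fiducial_detection_ratio < min_detection_ratio)
```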
[0089] In some embodiments, the computer 568 may command the camera
to operate in different ways during times that are not advantageous
timings. In one embodiment, the computer 568 may command the camera
564 to refrain from capturing images during at least some of the
periods that do not include advantageous timings (e.g., intervals
in which there are no advantageous timings). Optionally, the
periods which do not correspond to advantageous timings, during
which camera 564 is commanded not to capture images, cover at least
50% of the time. In another embodiment, the computer 568 commands
the camera 564 to operate in a low-power mode for at least some of
the time between the advantageous timings. Optionally, capturing
the images according to the advantageous timings (and operating in
a low power mode between advantageous timings) reduces duty cycle
of the camera 564 to below half compared to duty cycle with a fixed
frame rate, to achieve essentially the same iPPG quality level.
Optionally, the iPPG quality level is determined based on accuracy
of annotation of fiducial points in the iPPG signals extracted from
the images 569. For example, the iPPG quality level may be a value
indicating the percentage of fiducial points (e.g., systolic peaks)
correctly annotated (e.g., within a certain tolerance from the
times determined based on the signal 563). Optionally, the iPPG
quality level is determined based on accuracy of physiological
signals determined based on the images 569. For example, the iPPG
quality level may be a value of the accuracy of the heart rate,
expressed as a divergence in the calculated heart rate (e.g., a
difference from the heart rate determined based on the signal
563).
[0090] In other embodiments, the computer 568 may command the
camera 564 to operate at different binning levels during
advantageous timings and times that do not include advantageous
timings. In one embodiment, the computer 568 commands the camera
564 to capture a second set of images that are interlaced between
the images 569. Optionally, the images 569 are captured with a
first level of binning, and the second set of images are captured
with a second level of binning that is higher than the first level
of binning. Optionally, the second set of images are captured with
a lower resolution compared to the images 569. Optionally, the
second level of binning results in at least four times reduction in
image resolution compared to the first level of binning, and a
number of images captured with the second level of binning equals,
or is greater than, a number of images captured with the first
level of binning.
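A minimal sketch of such an interlaced schedule, assuming binning is expressed as a per-frame level and that level 4 stands in for the at-least-four-times resolution reduction (both assumptions for illustration):

```python
def binning_schedule(frame_times, advantageous_windows,
                     low_binning=1, high_binning=4):
    """Assign each frame a binning level: the first (lower) level inside
    an advantageous window, the second (higher) level for the interlaced
    frames between windows."""
    def inside(t):
        return any(start <= t <= end for (start, end) in advantageous_windows)
    return [(t, low_binning if inside(t) else high_binning)
            for t in frame_times]

schedule = binning_schedule([0.0, 0.1, 0.2], [(0.05, 0.15)])
```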
[0091] Having the second set of images (which are interlaced
between the advantageous timings) can be used for various purposes,
such as averaging noise affecting the iPPG signals and/or reducing
impairment of the iPPG signals by incident light by calculating
normalized AC/DC ratios based on the images 569 and the second set
of images.
[0092] In one example, a method to calculate the iPPG signals,
which involves reducing impairment by incident light, includes the
following steps: In step 1, extracting the blood perfusion signals
at sub-regions of the region by spatially averaging the images. In
step 2, extracting the AC components of the blood perfusion signals
using a band-pass filter (such as 0.5 Hz to 5 Hz) that receives
either the images or the images and the second set of images. In
step 3, extracting the DC components of the blood perfusion signals
using a low-pass filter (such as 0.3 Hz cutoff) that receives
either the images or the images and the second set of images. And
in step 4, calculating the normalized AC/DC ratios at the
sub-regions to reduce impairment to the iPPG signals by incident
light.
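The four steps can be sketched as follows for one sub-region, with a moving average standing in for the stated 0.3 Hz low-pass filter and the residual around it standing in for the 0.5 Hz to 5 Hz band-pass (a substitution made to keep the sketch dependency-free; a real implementation would use proper filters):

```python
import numpy as np

def normalized_ac_dc(perfusion, fs, dc_window_s=3.0):
    """Steps 2-4 for one sub-region: `perfusion` is the spatially
    averaged intensity per frame (step 1, done upstream). The DC
    component is a moving average (standing in for the low-pass filter),
    the AC component is the residual around it (standing in for the
    band-pass filter), and the normalized AC/DC ratio reduces the
    impairment by incident light. Edges of the moving average are
    distorted; a real filter would handle them properly."""
    perfusion = np.asarray(perfusion, dtype=float)
    n = max(1, int(dc_window_s * fs))
    dc = np.convolve(perfusion, np.ones(n) / n, mode='same')
    ac = perfusion - dc
    return ac / dc
```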
[0093] In some embodiments, the computer 568 sets a sampling rate
of the contact sensor 562 in proportion to a regularity of the
user's heart rate. For example, the regularity may be a value
proportional to the heart rate variability (HRV). In one example,
the more regulated the user's heart rate is (e.g., lower values of
HRV), the lower the sampling rate of the contact sensor 562 can be,
because it is easier for the computer 568 to predict the timing of
the next pulse (and advantageous timings are expected to be more
accurate).
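A possible mapping from heart-rate regularity to sampling rate is sketched below; the use of SDNN as the HRV measure, the linear scaling, and all constants are illustrative assumptions:

```python
def contact_sampling_rate_hz(sdnn_ms, base_hz=25.0, reference_sdnn_ms=50.0,
                             min_hz=5.0, max_hz=100.0):
    """Scale the contact sensor's sampling rate with heart-rate
    irregularity (SDNN as the HRV measure): a more regular rhythm means
    the next pulse is easier to predict, so a lower rate suffices."""
    rate = base_hz * (sdnn_ms / reference_sdnn_ms)
    return min(max_hz, max(min_hz, rate))
```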
[0094] FIG. 4 illustrates an embodiment in which the contact sensor
562 comprises a contact head-mounted photoplethysmography device
(in the nosepiece of a pair of smartglasses), the signal 563 is a
photoplethysmogram signal (PPG signal), and the camera 564 is
head-mounted (coupled to the frame). Advantageous timings 565 that
are illustrated include times corresponding to occurrence of
systolic peaks.
[0095] In one embodiment, the computer 568 calculates an offset
between the PPG signal (the signal 563) and iPPG signals, and
selects the advantageous timings while accounting for the delay.
For example, the delay represents offsets between when certain
fiducial points, such as systolic peaks, dicrotic notches, etc.,
appear in the PPG signal and when they manifest in iPPG signals
extracted from images taken with the camera 564. In this example,
the computer 568 calculates the advantageous timings by adding
the offsets to the times at which the fiducial points are detected
in the PPG signal.
[0096] In one embodiment, in which the camera 564 belongs to a
battery-operated non-head-mounted mobile device located more than
10 cm from the region, the computer 568 includes a head-mounted
computer and a non-head-mounted computer that communicate over
a wireless communication channel. In this embodiment, the
advantageous timings include significantly more timings of images
that are more informative compared to images that are less
informative for the purpose of extracting the iPPG signals.
Optionally, capturing the images 569 according to the advantageous
timings, in this embodiment, reduces duty cycle of the camera 564
to below half compared to a duty cycle with a fixed frame rate, to
achieve essentially the same iPPG quality level.
[0097] In another embodiment, in which the camera 564 and the
contact sensor 562 are mounted in a smartwatch, capturing the
images 569 according to the advantageous timings reduces duty cycle
of the camera 564 to below half compared to a duty cycle with a
fixed frame rate, to achieve essentially the same iPPG quality
level. In this embodiment, the computer 568 may calculate blood
pressure for the user based on a difference in pulse arrival times
in the signal 563 and iPPG signals extracted from images captured
by the camera 564, as discussed in more detail in U.S. Pat. No.
10,349,887 titled "Blood pressure measuring smartglasses", which is
incorporated herein by reference. In one example, the contact
sensor 562 includes a contact photoplethysmography device (disposed
in the smartwatch), the signal 563 is a photoplethysmogram signal
(PPG signal), and the advantageous timings comprise times
corresponding to occurrence of at least one of the following types
of fiducial points in the PPG signal: systolic notches, systolic
peaks, dicrotic notches, and diastolic peaks. Optionally, the
computer 568 calculates delays between the PPG signal and the iPPG
signals, and selects the advantageous timings while accounting for
the delays. For example, a delay between the PPG signal and an iPPG
signal may correspond to the time difference between manifestation
of a certain type of fiducial points (e.g., systolic peaks) in the
PPG signal and the iPPG signal. In this example, selecting the
advantageous timings while accounting for the delay may involve
setting the advantageous timings to include a window surrounding
the expected manifestation time of the certain fiducial point, when
the expected manifestation time is obtained by adding the delay to
the time of manifestation of the certain fiducial point in the PPG
signal.
[0098] The following method may be used by systems modeled
according to FIG. 3. The steps described below may be performed by
running a computer program having instructions for implementing the
method. Optionally, the instructions may be stored on a
computer-readable medium, which may optionally be a non-transitory
computer-readable medium. In response to execution by a system
including a processor and memory, the instructions cause the system
to perform the following steps:
[0099] In Step 1, capturing, by a camera (e.g., the camera 564),
images of a region comprising skin on a user's head.
[0100] In Step 2, measuring, by a contact sensor (e.g., the contact
sensor 562), a signal indicative of cardiac activity of the
user.
[0101] In Step 3, detecting, based on the signal, advantageous
timings for capturing the images for a purpose of extracting
imaging photoplethysmogram signals (iPPG signals) from the
images.
[0102] And in Step 4, commanding the camera to capture the images
according to the advantageous timings.
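The four steps above can be wired together as a single control loop; the four callables below are placeholders for the hardware and signal-processing components, not APIs from the disclosure:

```python
def run_ippg_capture(measure_contact_signal, capture_images,
                     detect_timings, extract_ippg):
    """Steps 1-4 as one control loop: measure the contact-sensor signal
    (Step 2), detect advantageous timings from it (Step 3), command the
    camera to capture per those timings (Steps 1 and 4), and extract the
    iPPG signals from the resulting images."""
    signal = measure_contact_signal()
    timings = detect_timings(signal)
    images = capture_images(timings)
    return extract_ippg(images)
```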
[0103] In one embodiment, the method includes a step of extracting
portions of the iPPG signals from the images captured in Step 4.
Optionally, the method also includes a step of detecting a
physiological response from the extracted iPPG signals, such as the
user's heart rate and/or the user's blood pressure.
[0104] In one embodiment, the method optionally includes a step of
commanding the camera to operate in a low-power mode for at least
some of the time between the advantageous timings. Optionally,
capturing the images according to the advantageous timings reduces
duty cycle of the camera to below half compared to the duty cycle
with a fixed frame rate, to achieve essentially the same iPPG
quality level.
[0105] In one embodiment, the contact sensor utilized in Step 2 and
the camera utilized in Step 1 are head-mounted, and the method
optionally includes a step of commanding the camera to capture a
second set of images that are interlaced between the images. In
this embodiment, the images captured in Step 1 are captured with a
first level of binning, and the second set of images are captured
with a second level of binning that is higher than the first level
of binning. Optionally, the method also includes a step of reducing
impairment of the iPPG signals by incident light by calculating
normalized AC/DC ratios based on the images and the second set of
images, and/or utilizing the second set of images for averaging
noise affecting the iPPG signals.
[0106] In one embodiment, the advantageous timings include
significantly more timings of images that are more informative
compared to images that are less informative for the purpose of
extracting the iPPG signals, and capturing the images according to
the advantageous timings reduces duty cycle of the camera to below
half compared to duty cycle with a fixed frame rate to achieve
essentially the same iPPG quality level.
[0107] In one embodiment, the contact sensor utilized in Step 2
includes an electrocardiograph, and the signal measured in Step 2
is an electrocardiogram signal. Optionally, the method includes a
step of calculating a delay between the electrocardiogram signal
and the iPPG signals, and choosing the advantageous timings while
accounting for the delay.
[0108] FIG. 5 illustrates a system that detects an abnormal medical
event. Some embodiments of the illustrated system utilize
advantageous timings to reconstruct informative portions of a
photoplethysmogram signal in order to efficiently detect the
abnormal medical event based on an asymmetrical change to blood
flow recognizable from the portions of the photoplethysmogram
signal. In one embodiment, the system includes at least a first
device 582, a second device 584, and a computer 588. Optionally, at
least some of these components are head-mounted and/or coupled to a
frame of smartglasses 580.
[0109] The first device 582 measures a first signal 583 that is
indicative of a first photoplethysmogram signal (PPG.sub.S1) at a
first region on a side of the user's head. The second device 584
measures a second signal 586 that is indicative of a second
photoplethysmogram signal (PPG.sub.S2) at a second region, which is
on the other side of the user's head. Optionally, PPG.sub.S1
arrives at the first region before PPG.sub.S2 arrives at the second
region, such that for a typical cardiac cycle, a systolic peak will
manifest in PPG.sub.S1 before it manifests in PPG.sub.S2.
[0110] It is noted that sentences in the form of "first and second
regions on different sides of the head" refer to either (i) the
first region on the right side of the head and the second region on
the left side of the head, respectively, or (ii) the first region
on the left side of the head and the second region on the right
side of the head, respectively. The right and left sides of the
head are identified according to the vertical symmetry axis that
divides a human face, which passes through the middle of the
forehead and the tip of the nose.
[0111] Various types of devices may be utilized in order to obtain
PPG.sub.S1 and PPG.sub.S2. In some embodiments, the first device
582 and/or the second device 584 are contact photoplethysmography
devices (contact PPG devices). Herein, a "contact
photoplethysmography device" is a photoplethysmography device that
comes in contact with the user's skin, and typically occludes the
area being measured. An example of a contact photoplethysmography
device is the well-known pulse oximeter. It is to be noted that in
some embodiments, in order to bring the contact PPG device close,
such that it touches the skin, various apparatuses may be utilized,
such as spacers (e.g., made from rubber or plastic), and/or
adjustable inserts that can help bridge possible gaps. In other
embodiments, the first device 582 and/or the second device 584 may
be cameras that are utilized to obtain imaging photoplethysmography
signals (iPPG signals).
[0112] In one embodiment, the second device 584 includes an
inward-facing head-mounted camera having more than 30 pixels.
Optionally, this camera captures images of a region covering a skin
area greater than 2. Optionally, the camera is located more than 5
mm from the second region (i.e., the closest distance between a
point in the second region and the camera is larger than 5 mm). In
this embodiment, PPG.sub.S2 comprises one or more iPPG signals that
are recognizable from color changes in the images captured by the
camera. In one example, the second region is located on a cheek of
the user and/or above one of the user's eyes.
[0113] In another embodiment, the second device 584 includes an
inward-facing head-mounted camera that utilizes an image sensor
comprising at least 3.times.3 pixels that are configured to detect
electromagnetic radiation having wavelengths in at least a portion
of the range of 200 nm to 1200 nm. Optionally, in this embodiment,
the system includes an active light source that illuminates a
portion of the second region. In one example, the active light
source is a head-mounted light source that illuminates the portion
of the second region with electromagnetic radiation having
wavelengths in at least a portion of the range of 750 nm to 1200
nm.
[0114] Herein, sentences of the form "an iPPG signal is
recognizable from color changes in images" refer to effects of
blood volume changes due to pulse waves that may be extracted from
a series of images. These changes may be identified and/or utilized
by a computer (e.g., in order to generate a signal indicative of
the blood volume at the region), but need not necessarily be
recognizable to the naked eye (e.g., because of their subtlety, the
short duration in which they occur, or involvement of light outside
of the visible spectrum). For example, blood flow may cause facial
skin color changes that correspond to different concentrations of
oxidized hemoglobin, as the volume of blood at a certain region
varies over the different stages of a cardiac pulse and/or with
different magnitudes of cardiac output.
[0115] In one embodiment, the first device 582 and the second
device 584 each include head-mounted contact photoplethysmography
devices. Optionally, the head-mounted contact photoplethysmography
devices communicate with the computer 588 over wired communication
links. Having the first and second signals be transmitted to the
computer 588 over wired communication links and not over a wireless
communication link may assist in preserving user privacy, in some
embodiments.
[0116] In another embodiment, the first device 582 includes a
head-mounted contact photoplethysmography device and the second
device 584 includes an ear-mounted contact photoplethysmography
device (e.g., a PPG sensor in an earbud). Such a design may be
beneficial in various situations. For example, some smartglasses
designs support larger batteries compared to earbuds. As a result,
the combination of a first head-mounted sensor (operating at a
higher duty cycle) and a second ear-mounted sensor may provide the
benefit of extending the earbud's operation time (until draining
its battery), and maybe even reducing the manufacturing cost of the
earbud by reducing its hardware requirement specification as a
result of the asynchronous sampling.
[0117] The computer 588 detects, based on the first signal 583,
advantageous timings to measure the second signal 586 for the
purpose of reconstructing informative portions of PPG.sub.S2. The
computer 588 commands the second device 584 to measure the second
signal during the advantageous timings. The computer 588 then
detects the abnormal medical event based on an asymmetrical change
to blood flow recognizable in PPG.sub.S1 and the informative
portions of PPG.sub.S2. Optionally, the computer 588 commands the
second device 584 to measure the second signal during the
advantageous timings by issuing commands 587 that indicate when the
second device 584 is to operate to measure the second signal 586.
In one example, the commands 587 include timings and/or intervals
of timings at which the second device 584 is to operate. In another
example, the commands 587 comprise signals that prompt the second
device 584 to take a measurement (in response to receiving a
signal), and the computer 588 sends a signal for each of the
advantageous timings (when that time arrives).
[0118] Examples of computers that may be utilized in embodiments
described herein, such as the computer 588, the computer 556, the
computer 568, and computer 608, are computers modeled according to
computer 400 or computer 410 illustrated in FIG. 9A and FIG. 9B,
respectively. It is to be noted that the use of the singular term
"computer" is intended to imply one or more computers, which
jointly perform the functions attributed to "the computer" herein.
In particular, in some embodiments, some functions attributed to a
"computer" (e.g., one of the aforementioned computers) may be
performed by a processor on a wearable device (e.g., smartglasses)
and/or a computing device of the user (e.g., smartphone), while
other functions may be performed on a remote processor, such as a
cloud-based server. For example, some operations that are performed
by the computer 588 in some embodiments, such as preprocessing
PPG.sub.S1 and/or the informative portions of PPG.sub.S2, may be
performed by a processor on a pair of smartglasses or a smartphone,
while other functions, such as determining whether the user is
experiencing the abnormal medical event, may be performed on a
remote processor, such as a cloud-based server. In other
embodiments, essentially all functions attributed to the computer
herein may be performed by a processor on a wearable device (e.g.,
smartglasses 580 to which the first and second devices are coupled)
and/or some other device carried by the user, such as a smartwatch
or smartphone.
[0119] Obtaining the PPG signals PPG.sub.S1 and PPG.sub.S2 (or its
informative portions) from measurements taken by the first device
582 and/or the second device 584 may involve, in some embodiments,
performing various preprocessing operations in order to assist in
calculations and/or in extraction of the PPG signals. Optionally,
the measurements may undergo various preprocessing steps prior to
being used by the computer to detect the abnormal medical event,
and/or as part of the process of the detection of the abnormal
medical event. Some non-limiting examples of the preprocessing
include: normalization of pixel intensities (e.g., to obtain a
zero-mean unit variance time series signal), and conditioning a
time series signal by constructing a square wave, a sine wave, or a
user defined shape, such as that obtained from an ECG signal or a
PPG signal as described in U.S. Pat. No. 8,617,081.
[0120] In some embodiments, in which at least the first device
582 and/or the second device 584 are cameras, images taken by the
cameras may undergo various preprocessing to improve the signal,
such as color space transformation (e.g., transforming RGB images
into a monochromatic color or images in a different color space),
blind source separation using algorithms such as independent
component analysis (ICA) or principal component analysis (PCA), and
various filtering techniques, such as detrending, bandpass
filtering, and/or continuous wavelet transform (CWT). Various
preprocessing techniques known in the art that may assist in
extracting PPG signals from images are discussed in Zaunseder et
al. (2018), "Cardiovascular assessment by imaging
photoplethysmography--a review", Biomedical Engineering 63(5),
617-634. An example of preprocessing that may be used in some
embodiments is given in U.S. Pat. No. 9,020,185, titled "Systems
and methods for non-contact heart rate sensing", which describes
how time-series signals obtained from video of a user can be
filtered and processed to separate an underlying pulsing signal by,
for example, using an ICA algorithm.
[0121] The purpose, in some embodiments described herein, for
measuring the second signal during the advantageous timings is to
reconstruct informative portions of PPG.sub.S2. These informative
portions of PPG.sub.S2 generally include measurements taken at
times when PPG.sub.S2 displays properties that are useful for
making determinations based on PPG.sub.S2, such as determinations
made by machine learning-based algorithms. As stated above, such
useful times may include times of manifestation of fiducial points
and/or times in which derivatives of PPG.sub.S2 display certain
properties (such as maxima or minima). The idea is that the
advantageous timings are selected such that measurements taken
during these times convey practically the same quality of
information about PPG.sub.S2 as could be obtained by continuous
measurements of the second signal (which can be used for a complete
reconstruction of PPG.sub.S2), for the purpose of performing
typical calculations on PPG signals. These calculations may include
determination of values of physiological signals such as heart
rate, heart rate variability, respiration rate, blood pressure,
detection of parameters related to blood flow, and/or detection of
an abnormal medical event. Thus, the reconstructed informative
portions of PPG.sub.S2 may be viewed as subsets of values of the
fully reconstructed PPG.sub.S2 (were the second device 584 to
measure continuously) that are obtained from portions of the second
signal measured during the advantageous timings.
[0122] In some embodiments, the advantageous timings are detected
based on analyzing the first signal in order to calculate a delay
between PPG.sub.S1 and PPG.sub.S2 (e.g., difference in detection
times of systolic peaks in PPG.sub.S1 and PPG.sub.S2). The values
of the advantageous timings are set according to the delay. Because
in this embodiment, the first device 582 is used to select timings
for the second device 584, it is usually preferred that the first
device 582 be located at a location to which the pulse wave has a
shorter travel time from the heart, compared to the location
measured by the second device 584. When this is not the case, the
computer 588 may predict the timing of the next pulse wave at the
second device's location based on the timing of the current pulse
wave at the first device's location.
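A sketch of the prediction described above, assuming a roughly steady beat interval estimated from the last two systolic peaks at the first device (the estimation rule and example values are illustrative assumptions):

```python
def predict_next_arrival_at_second(first_peak_times, first_to_second_delay):
    """Predict when the next pulse wave reaches the second device's
    location, from systolic-peak times at the first device and the
    measured first-to-second delay, assuming the next beat interval
    matches the last one."""
    interval = first_peak_times[-1] - first_peak_times[-2]
    return first_peak_times[-1] + interval + first_to_second_delay

# Peaks 0.8 s apart at the first device, 0.05 s first-to-second delay.
t_next = predict_next_arrival_at_second([10.0, 10.8], 0.05)
```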
[0123] In one embodiment, the computer 588 selects advantageous
timings so they fall within a window of a predetermined size (e.g.,
.+-.0.05 seconds) around the expected time of a certain feature in
PPG.sub.S2, while taking into account the offset time between
manifestation of cardiac events in the first signal 583 and the
second signal 586 (the aforementioned "delay"). For example, at
least some of the advantageous timings are selected to fall in a
window that is .+-.0.05 seconds around the expected time of a
systolic peak in the PPG.sub.S2.
[0124] In another embodiment, the computer 588 calculates a
distribution of the timing offsets between manifestation of cardiac
events in the first signal 583 and the second signal 586
(e.g., as measured over multiple cardiac cycles). Based on this
distribution, the computer 588 selects the advantageous timings to
capture at least a predetermined proportion of those occurrences of
the certain feature (e.g., at least 95% or at least 99%). For
example, at least some of the advantageous timings are selected to
form a window in which 95% of the dicrotic notches in PPG.sub.S2
are expected to fall.
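A distribution-based window of this kind can be chosen, for example, by trimming the observed offsets symmetrically; the two-sided trim is an illustrative choice, not the method prescribed by the disclosure:

```python
def coverage_window(observed_offsets, coverage=0.95):
    """Choose a (start, end) offset window expected to capture at least
    the given proportion of feature occurrences, by trimming the
    observed offsets equally from both tails of their distribution."""
    s = sorted(observed_offsets)
    n = len(s)
    trim = int(n * (1.0 - coverage) / 2.0)
    return (s[trim], s[n - 1 - trim])

# 100 observed dicrotic-notch offsets; the window keeps >= 95% of them.
window = coverage_window([i / 100.0 for i in range(100)])
```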
[0125] Various types of abnormal medical events cause asymmetrical
changes to blood flow, which may be detected depending on the
locations of the first and second regions. Some examples of medical
conditions that may be detected by the computer 588, in some
embodiments, include one or more of the following: ischemic stroke,
a migraine, a headache, cellulitis, dermatitis, ear infection, and
congestive heart failure (CHF) exacerbation. However, usually when
PPG devices are used to measure signals that are indicative of the
blood flow, the higher the sampling rate of each PPG device, the
more power it consumes.
[0126] In order to reduce the power consumption of the devices, in
some embodiments, the more informative timings for sampling the PPG
signal (referred to as advantageous timings) are estimated based on
the first signal that is sampled at a higher rate compared to the
second signal, and the second device (which is operated at a lower
duty cycle) is triggered asynchronously to sample the PPG signal at
the advantageous timings for discrete reconstruction. This mode of
operation can reduce the power consumption of the second device 584
by both reducing its average sampling rate and reducing the amount
of signals to process.
[0127] In some embodiments, a sum of the periods during which the
second device 584 measures the second signal 586 according to the
advantageous timings is less than half a sum of periods during
which the first device 582 measures the first signal 583.
Optionally, the first device 582 and the second device 584 are
contact PPG devices, and the sum of the periods during which the
second device 584 measures the second signal 586 is less than 10%
the sum of periods during which the first device 582 measures the
first signal 583.
[0128] As discussed in more detail above, "advantageous timings"
may refer to specific points in time and/or time intervals that are
characterized by a certain offset relative to the time of a cardiac
activity reference event. Additionally, advantageous timings may be
selected to include different types of events occurring during
certain points in time and/or intervals of time.
[0129] In some embodiments, advantageous timings include expected
times of manifestation of one or more of the following fiducial
points in PPG.sub.S2: the systolic notch (which is the minimum at
the PPG signal onset), the systolic peak (which is the maximum of
the PPG signal), and in some cases also the dicrotic notch and/or
the diastolic peak (which is the first local maximum of the PPG
signal after the dicrotic notch and before 0.8 of the duration of
the cardiac cycle). Additionally or alternatively, for some
calculations based on the derivatives of the PPG signal, it may be
beneficial to reconstruct PPG.sub.S2 to have advantageous timings
include one or more of the following fiducial points in the first
and/or second derivatives of the PPG.sub.S2 signal: the maximum
slope peak in systolic of the velocity photoplethysmogram (VPG),
the local minima slope in systolic of VPG, the global minima slope
in systolic of VPG, the maximum slope peak in diastolic of VPG, the
maximum of the acceleration photoplethysmogram (APG), and the
minimum of the APG.
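Two of the derivative-based fiducial points above (the VPG maximum and the APG maximum) can be located as sketched below; the numerical-gradient approach is an illustrative assumption:

```python
import numpy as np

def vpg_apg_maxima(ppg, fs):
    """Locate the sample indices of the VPG maximum (first derivative of
    the PPG segment) and the APG maximum (second derivative)."""
    ppg = np.asarray(ppg, dtype=float)
    vpg = np.gradient(ppg) * fs   # velocity photoplethysmogram
    apg = np.gradient(vpg) * fs   # acceleration photoplethysmogram
    return int(np.argmax(vpg)), int(np.argmax(apg))
```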
[0130] In some embodiments, the advantageous timings include
significantly more timings for measuring the second signal that are
more informative than measuring the second signal at a fixed rate
for the purpose of reconstructing the informative portions of
PPG.sub.S2 versus timings for measuring the second signal that are
less informative than measuring the second signal at the fixed rate
for the purpose of reconstructing the informative portions of
PPG.sub.S2.
[0131] In some embodiments, measuring the second signal during the
advantageous timings involves measuring the second signal less than
50% of the time (compared to continuous measuring that is not
restricted to the advantageous timings). Thus, for example, less
than 50% of the timings of measurements with the second device 584
(when operated without restricting to advantageous timings) are
considered timings belonging to the advantageous timings.
Optionally, measuring the second signal during the advantageous
timings involves measuring the second signal less than 20% of the
time (e.g., less than 20% of the timings of measurements with the
second device 584 are considered timings belonging to the
advantageous timings).
[0132] Blood flow to the head can be dynamic, and depend on several
factors such as the cardiac activity level, posture, etc. Thus,
selected advantageous timings may become less accurate over time
(e.g., they may no longer cover the manifestation of certain
features they were intended to cover). This may necessitate their
adjustment in order
to continue to reconstruct the informative portions of PPG.sub.S2.
Such an adjustment may involve recalculation of the advantageous
timings using the process described above or a temporary change
(e.g., widening of intervals around expected times of manifestation
of fiducial points).
[0133] In one embodiment, the system illustrated in FIG. 5 includes
a head-mounted movement sensor, such as inertial measurement unit
(IMU) 581, which measures movement signal 585, which is indicative
of movements of the user's body. When the user moves at certain
levels and/or for certain durations, this may cause change to the
user's blood pressure that changes the delay between PPG.sub.S1 and
PPG.sub.S2. Optionally, in this embodiment, if the movement signal
585 reaches a predetermined threshold, the computer 588 adjusts the
advantageous timings as a function of the movement signal 585. For
example, the computer 588 may readjust the advantageous timings, by
performing their reselection if the predetermined threshold is
reached.
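This trigger logic can be sketched as follows (a hypothetical illustration; the function names and threshold value are assumptions, not from the disclosure):

```python
# Hypothetical sketch: readjust the advantageous timings when the
# movement signal reaches a predetermined threshold.
def maybe_readjust(movement_samples, threshold, current_timings, reselect):
    """Return reselected timings if any movement sample reaches the
    threshold; otherwise keep the current selection."""
    if any(m >= threshold for m in movement_samples):
        return reselect()  # e.g., rerun the timing-selection process
    return current_timings
```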
[0134] In some embodiments, an additional photoplethysmography
device is utilized in the detection of the abnormal medical event.
This device may be mounted on a limb of the user, and communicate
with the computer 588 over a wireless communication link.
Optionally, the additional device takes measurements during times
set according to the advantageous timings while accounting for a
delay that is a function of delay between pulse arrival times to
the first region and the limb. In this embodiment, the computer 588
may reconstruct a PPG signal from the limb based on the timed
measurements, and detect an abnormal medical condition based on an
asymmetrical change to blood flow recognizable in the measurements
(at the first and second regions and the limb), and/or a change to
pulse arrival times recognizable in these measurements. Examples of
optional wireless communication links include Bluetooth Low Energy
(BLE) or ZigBee. In addition, because head-mounted PPG devices
usually provide a higher-quality PPG signal compared to wrist/leg
mounted PPG devices, timing the limb based PPG device according to
the head-mounted PPG device may provide the benefit of timing the
asynchronous samples based on a higher-quality signal.
[0135] Herein, detecting the abnormal medical event may mean
detecting that the user is suffering from the abnormal medical
event and/or that there is an onset of the abnormal medical event.
Additionally, an "abnormal" medical event may be a medical event
that the user has yet to experience, or does not experience most of
the time.
[0136] In some embodiments, detecting the abnormal medical event
may involve calculating one or more of the following values: an
indication of whether or not the user is experiencing the abnormal
medical event, a value indicative of an extent to which the user is
experiencing the abnormal medical event, a duration since the onset
of the abnormal medical event, and a duration until an onset of the
abnormal medical event.
[0137] Detection of an abnormal medical event may involve, in some
embodiments, detection of an asymmetrical change to blood flow.
When the blood flow on both sides of the head and/or body is
monitored, asymmetric changes can sometimes be recognized. These
changes are typically different from symmetric changes that can be
caused by factors such as physical activity (which typically
affects the blood flow on both sides in the same way). An
asymmetric change to the blood flow can mean that one side has been
affected by an event, such as a stroke, which does not influence
the other side. In one example, the asymmetric change to blood flow
involves a change in blood flow velocity on the left side of the head
that is at least 10% greater or 10% lower than a change in blood
flow velocity on the right side of the head. In another example,
the asymmetric change to blood flow involves a change in the volume
of blood that flows during a certain period in the left side of the
head that is at least 10% greater or 10% lower than the volume of
blood that flows during the certain period in the right side of the
head. In yet another example, the asymmetric change to blood flow
involves a change in the direction of the blood flow on one side of
the head (e.g., as a result of a stroke), which is not necessarily
observed at the symmetric location on the other side of the
head.
[0138] Referring to an asymmetrical change to blood flow as being
"recognizable in PPG.sub.S1 and the informative portions of
PPG.sub.S2" means that values extracted from PPG.sub.S1 and the
informative portions of PPG.sub.S2 provide an indication that an
asymmetric change to the blood flow has occurred. That is, a
difference that has emerged in PPG.sub.S1 and the informative
portions of PPG.sub.S2 may reflect a change in blood flow velocity
on one side of the head, a change in blood flow volume, and/or a
change in blood flow direction, as described in the examples above.
It is to be noted that the change in blood flow does not need to
be directly quantified from the values PPG.sub.S1 and the
informative portions of PPG.sub.S2 in order for it to be
"recognizable in PPG.sub.S1 and the informative portions of
PPG.sub.S2". Rather, in some embodiments, feature values generated
based on PPG.sub.S1 and the informative portions of PPG.sub.S2 may
be used by a machine learning-based model to detect a phenomenon,
such as the abnormal medical event, which is associated with the
asymmetrical change in blood flow.
[0139] In some embodiments, the computer 588 detects the abnormal
medical event by utilizing previously taken PPG signals of the
user, from a period that precedes the current abnormal medical
event being detected at that time. This enables an asymmetrical
change to be observed, since it provides a baseline according to
which it is possible to compare current PPG.sub.S1 and informative
portions of PPG.sub.S2, such that it may be determined that a
change to blood flow on one side of the head is not the same as a
change on the other side of the head.
[0140] A baseline for the blood flow may be calculated in various
ways. In a first example, the baseline is a function of the average
measurements of the user (which include previously taken PPG.sub.S1
and informative portions of PPG.sub.S2), which were taken before
the occurrence of the abnormal medical event. In a second example,
the baseline may be a function of the situation the user is in,
such that previous measurements taken during similar situations are
weighted higher than previous measurements taken during less
similar situations. A PPG signal may show different characteristics
in different situations because of the different mental and/or
physiological states of the user in the different situations. As a
result, such a situation-dependent baseline can improve the
accuracy of detecting the abnormal medical event. In a third
example, the baseline may be a function of an intake of some
substances (such as food, beverage, medications, and/or drugs),
such that previous measurements taken after consuming similar
substances are weighted higher than previous measurements taken
after not consuming the similar substances and/or after consuming
less similar substances. A PPG signal may show different
characteristics after the user consumes different substances
because of the different mental and/or physiological states the
user may enter after consuming the substances, especially when the
substances include things such as medications, drugs, alcohol,
and/or certain types of food. As a result, such a
substance-dependent baseline can improve the accuracy of detecting
the abnormal medical event.
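The situation- or substance-dependent weighting described above can be sketched as a similarity-weighted average (a hypothetical illustration; how similarity scores between situations are obtained is not specified here):

```python
# Hypothetical sketch: baseline as a weighted average of previous
# measurements, where measurements from situations more similar to
# the current one receive higher weight.
def weighted_baseline(prev_values, similarity_weights):
    """Return the similarity-weighted average of previous values."""
    total = sum(similarity_weights)
    return sum(v * w for v, w in zip(prev_values, similarity_weights)) / total
```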
[0141] There are various types of abnormal medical events that may
be detected based on PPG signals that reflect an asymmetrical
change to blood flow, which is recognizable in PPG.sub.S1 and the
informative portions of PPG.sub.S2.
[0142] In some embodiments, the abnormal medical event may involve
the user experiencing an ischemic stroke. An occurrence of an
ischemic stroke often involves a blood clot that changes the blood
flow to certain regions of the brain. One or more of several
mechanisms may be the cause of changes to blood flow that are
observed following an onset of an ischemic stroke. Blood flow may
change due to a stroke because of flaccid muscles (on one side of
the face) that use less oxygen and demand less blood. In such an
event, local regulation mechanisms may generate signals to the
smooth muscles that decrease the diameter of the arteries (which
can reduce blood flow). Additionally or alternatively, blood flow
may change due to a stroke because of nerve control changes that
occur due to reduced blood flow to the brain (a neurogenic
mechanism); the same nerves that control the muscles can also be
involved in the control of the constriction/dilation of blood
vessels. Another possible cause of changes to blood flow involves
obstruction-related passive changes. Blood that flows through the
major vessels (at the base of the brain, either the carotid (front)
or vertebral (back) arteries) must flow out through one of the
branches. When one pathway is blocked or restricted (due to the
stroke), more blood has to go through collateral pathways (which
may change the blood flow). Thus, changes to the blood flow in the
face (and other areas of the head), especially if they are
asymmetric, can be early indicators of a stroke.
[0143] In one embodiment, the abnormal medical event is ischemic
stroke, and the asymmetrical change to the blood flow recognizable
in PPG.sub.S1 and PPG.sub.S2 involves an increase in asymmetry
between blood flow on the different sides of the head, with respect
to a baseline asymmetry between blood flow on the different sides
of the head. Herein, the term "ischemic stroke" also includes
Transient Ischemic Attack (TIA), known as "mini stroke".
[0144] In some embodiments, the abnormal medical event may involve
the user having a migraine or another form of headache. With
migraines and other headaches, vasoconstriction of facial or
cranial blood vessels may lead to asymmetric changes in blood flow
between the left and right sides of the head. Compensatory
mechanisms may change smooth muscle constriction around blood
vessels, further exacerbating this asymmetry. This vasoconstriction
can lead to differential surface blood flow, muscle contraction,
and facial temperature changes, leading to asymmetric blood flow.
As each individual's particular patterns of vasoconstriction would
be unique to the individual, the asymmetric phenomena may be
different for different users. Thus, measuring deviation from the
user's baseline blood flow patterns may increase the accuracy of
detecting these asymmetric phenomena, in some embodiments.
Additionally, the time course of migraine or headache usually
involves an occurrence over the course of minutes to hours (from
the onset of changes to blood flow), and usually occurs with a
characteristic pattern, allowing it to be differentiated from signs
of other medical, artificial or external causes, which manifest
different patterns of blood flow and/or time courses.
[0145] In one embodiment, the abnormal medical event is migraine,
and the asymmetrical change to the blood flow recognizable in
PPG.sub.S1 and the informative portions of PPG.sub.S2 is indicative
of a pattern of a certain change to facial blood flow, which is
associated with at least one previous migraine attack, determined
based on data comprising previous PPG.sub.S1 and previous
informative portions of PPG.sub.S2, which were measured more than 5
minutes before the previous migraine attack. Optionally, the time
of the beginning of the previous migraine attack corresponds to the
time at which the user became aware of the previous migraine
attack.
[0146] In another embodiment, the abnormal medical event is
headache, and the asymmetrical change to the blood flow
recognizable in PPG.sub.S1 and the informative portions of
PPG.sub.S2 is indicative of at least one of: a change in
directionality of facial blood flow, and reduction in blood flow to
one side of the face.
[0147] In one embodiment, PPG.sub.S1 arrives at the first region
before PPG.sub.S2 arrives at the second region. Optionally, the
first device 582 and the second device 584 are both embedded in the
smartglasses 580. For example, one of these devices may be embedded
in the nosepiece of the smartglasses 580, while the other may be
embedded in a temple of the smartglasses 580. Optionally, the
asymmetrical change to the blood flow that is recognizable in
PPG.sub.S1 and the informative portions of PPG.sub.S2 corresponds
to a deviation of PPG.sub.S1 and PPG.sub.S2 compared to a baseline
that is calculated based on previous measurements of PPG.sub.S1 and
PPG.sub.S2 taken before the abnormal medical event.
[0148] FIG. 6 illustrates an example of smartglasses that include
two PPG devices, which utilize selection of advantageous timings to
reconstruct informative portions of PPG.sub.S2. In this embodiment,
the first device 582 is embedded in a temple of the smartglasses
580 and the second device 584 is embedded in the nosepiece.
PPG.sub.S1 manifests before PPG.sub.S2 (as evident from the pulse
wave being shifted to the left in PPG.sub.S1 relative to
PPG.sub.S2), and therefore can be utilized to determine the
advantageous timings 590, which in this example are intervals in
which the systolic peaks of PPG.sub.S2 are anticipated.
Measurements of PPG.sub.S2 taken during the advantageous timings
590 are considered in this embodiment to be the "informative
portions of PPG.sub.S2".
[0149] Detecting the abnormal medical event may involve comparison
to a baseline that is based on previously taken measurements. In
one example, the computer 588 may calculate a baseline for the
difference between systolic blood pressure values calculated based
on PPG.sub.S1 and PPG.sub.S2 (or only the informative portions of
PPG.sub.S2); this baseline difference is denoted
.DELTA..sub.baseline. Optionally, .DELTA..sub.baseline is
calculated based on PPG.sub.S1 and PPG.sub.S2 measured at several
different occasions, on different days. At a current time the
computer 588 calculates one or more current differences between
systolic blood pressure values calculated based on PPG.sub.S1 and
the informative portions of PPG.sub.S2; this current difference is
denoted .DELTA..sub.current. When the difference between
.DELTA..sub.baseline and .DELTA..sub.current exceeds a certain
threshold, the computer 588 may detect an abnormal medical event
(e.g., a possible stroke). For example, the computer 588 may alert
about an abnormal medical event if
|.DELTA..sub.baseline-.DELTA..sub.current|>.delta., for a
predetermined value .delta., such as .delta.=5 mmHg or .delta.=10 mmHg.
Optionally, the computer 588 detects the abnormal medical event if
|.DELTA..sub.baseline-.DELTA..sub.current|>.delta. for systolic
blood pressure values calculated based on successive measurements
taken during a predetermined period. For example, the abnormal
medical event is detected if
|.DELTA..sub.baseline-.DELTA..sub.current|>.delta. for systolic
blood pressure values calculated during a period of at least one
minute or a period of at least five minutes.
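The threshold rule in this paragraph can be sketched as follows (function and parameter names are illustrative assumptions):

```python
# Hypothetical sketch of the |Δ_baseline − Δ_current| > δ rule: flag an
# abnormal event only if the condition holds for every successive
# current difference measured over the predetermined period.
def abnormal_event_detected(delta_baseline, delta_current_series, delta=5.0):
    """Return True when all current differences deviate from the
    baseline difference by more than delta (e.g., 5 mmHg)."""
    return all(abs(delta_baseline - d) > delta for d in delta_current_series)
```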
[0150] In some embodiments, the computer 588 calculates first and
second systolic blood pressure values based on PPG.sub.S1 and the
informative portions of PPG.sub.S2, and the asymmetrical change to the
blood flow recognizable in PPG.sub.S1 and the informative portions of
PPG.sub.S2 involves an increase in a difference between the first
and second systolic blood pressure values that exceeds a threshold.
To calculate each of the systolic blood pressure values, the
computer 588 may utilize one or more approaches known in the art,
such as the approaches mentioned in Hosanee, Manish et al.
"Cuffless Single-Site Photoplethysmography for Blood Pressure
Monitoring." Journal of clinical medicine vol. 9,3 723. 7 Mar.
2020, doi:10.3390/jcm9030723, which is incorporated herein by
reference. In other embodiments, the computer 588 may utilize an
additional signal indicative of cardiac activity to calculate the
first and second systolic blood pressure values based on Pulse
Arrival Times (PATs) at the first and second regions, respectively.
In one example, the additional signal indicative of cardiac
activity is an electrocardiogram signal measured by an
electrocardiograph (ECG) device. In another example, the additional
signal indicative of cardiac activity is a PPG signal measured at a
third region, at which pulse waves arrive earlier than they do at
the first and second regions. Additional discussion regarding
methods for calculating blood pressure based on PATs may be found
in U.S. Pat. No. 10,349,887, titled "Blood pressure measuring
smartglasses", which is incorporated herein by reference.
[0151] In some embodiments, a pulse arrival time (PAT) from a PPG
signal represents a time at which the value representing blood
volume (in the waveform represented in the PPG) begins to rise
(signaling the arrival of the pulse). Alternatively, the PAT may
represent a different time, with respect to the pulse waveform,
such as the time at which a value representing blood volume reaches
a maximum or a certain threshold, or the PAT may be the average of
the time the blood volume is above a certain threshold. Another
approach that may be utilized to calculate a PAT from an iPPG
signal is described in Sola et al., "Parametric estimation of pulse
arrival time: a robust approach to pulse wave velocity", in
Physiological measurement 30.7 (2009): 603, which describes a family
of PAT estimators based on the parametric modeling of the anacrotic
phase of a pressure pulse.
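One of the simpler PAT definitions mentioned above, the time at which the blood-volume value reaches a certain threshold, might be sketched as follows (an illustrative assumption, not the parametric estimator of Sola et al.):

```python
# Hypothetical sketch: PAT as the first time the blood-volume waveform
# reaches a given fraction of its min-to-max range.
def pulse_arrival_time(times, values, fraction=0.5):
    """Return the first time at which the waveform reaches the given
    fraction of its range, or None if it never does."""
    vmin, vmax = min(values), max(values)
    level = vmin + fraction * (vmax - vmin)
    for t, v in zip(times, values):
        if v >= level:
            return t
    return None
```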
[0152] In some embodiments, the computer 588 utilizes a machine
learning-based approach to detect the abnormal medical event. This
involves generating feature values based on PPG.sub.S1, the
informative portions of PPG.sub.S2, and possibly other data, as
discussed below. The computer 588 then utilizes a model to
calculate, based on the feature values, a value indicative of
whether the user is experiencing the abnormal medical event.
Optionally, the model is generated from previously taken
measurements of PPG.sub.S1 and PPG.sub.S2 of the user taken at
times for which it was known whether the user experienced the
abnormal medical event. Optionally, the model is generated from
previously taken measurements of PPG.sub.S1 and PPG.sub.S2 of one
or more other users taken at times for which it was known whether
the one or more other users experienced the abnormal medical event.
Optionally, previous measurements of PPG.sub.S2 may be restricted to
include values of informative portions, or may be complete
measurements (not restricted to the informative portions).
[0153] Various types of feature values may be generated based on
PPG signals and utilized in embodiments described herein. In one
embodiment, the computer 588 generates feature values based on data
that includes PPG.sub.S1 and the informative portions of PPG.sub.S2
(e.g., values from measurements of the first signal 583 and the
second signal 586 during some current time period) and/or the
previous measurements of PPG.sub.S1 and PPG.sub.S2 (e.g., values
from measurements of the first and the second signals during one or
more earlier time periods).
[0154] In one example, the feature values may include values of the
first signal 583 and/or the second signal 586. In another example,
the feature values may include values of PPG.sub.S1 and the
informative portions of PPG.sub.S2 (taken in current or earlier
time periods), such as amplitude values of PPG signals and/or
feature values derived from waveforms in PPG.sub.S1 and/or
PPG.sub.S2. Optionally, these feature values may relate to
properties of a pulse waveform, which may be a specific pulse
waveform (which corresponds to a certain beat of the heart), or a
window of pulse waveforms (e.g., an average property of pulse
waveforms in a certain window of time). Some examples of feature
values that may be generated based on a pulse waveform include: the
area under the pulse waveform, the amplitude of the pulse waveform,
a derivative and/or second derivative of the pulse waveform, a
pulse waveform shape, pulse waveform energy, and pulse transit time
(to the respective region at which it is measured). Some additional
examples of features may be indicative of one or more of the
following: a magnitude of a systolic peak, a magnitude of a
diastolic peak, duration of the systolic phase, and duration of the
diastolic phase.
[0155] In another example, the feature values may include values
indicative of pulse arrival times (PATs) calculated based on
PPG.sub.S1 and the informative portions of PPG.sub.S2 or the
previous measurements of PPG.sub.S1 and PPG.sub.S2. In still
another example, at least one of the feature values is indicative
of a difference in maximal amplitudes between PPG.sub.S1 and the
informative portions of PPG.sub.S2 relative to a difference in
maximal amplitudes between the previous measurements of PPG.sub.S1
and PPG.sub.S2. And in yet another example, at least one of the
feature values is indicative of a difference in pulse arrival times
between PPG.sub.S1 and PPG.sub.S2 relative to a pulse arrival time
between the previous measurements of PPG.sub.S1 and PPG.sub.S2.
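The last two example feature values can be sketched with a hypothetical helper (the actual feature-generation procedure is not specified at this level of detail):

```python
# Hypothetical sketch: difference in maximal amplitudes between the
# current PPG_S1 and PPG_S2, relative to the same difference in the
# previous (baseline) measurements.
def relative_amplitude_feature(cur_s1, cur_s2, prev_s1, prev_s2):
    """Return the current S1-S2 max-amplitude gap minus the previous gap."""
    return (max(cur_s1) - max(cur_s2)) - (max(prev_s1) - max(prev_s2))
```

An analogous helper could compare pulse arrival times instead of maximal amplitudes.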
[0156] In some embodiments, at least some feature values may be
generated based on other data sources (in addition to PPG signals).
In some examples, at least some feature values may be generated
based on other sensors, such as movement sensors (which may be
head-mounted, wrist-worn, or carried by the user some other way),
head-mounted thermal cameras (e.g., as mentioned above), or other
sensors used to measure the user. In other examples, at least some
feature values may be indicative of environmental conditions, such
as the temperature, humidity, and/or extent of illumination (e.g.,
as obtained utilizing an outward-facing head-mounted camera).
Additionally, some feature values may be indicative of physical
characteristics of the user, such as age, sex, weight, Body Mass
Index (BMI), skin tone, and other characteristics and/or situations
the user may be in (e.g., level of tiredness, consumption of
various substances, etc.).
[0157] Stress is a factor that can influence the diameter of the
arteries, and thus influence calculated values that relate to the
PPG signals and/or blood flow. In one embodiment, the computer
receives a value indicative of a stress level of the user, and
generates at least one of the feature values based on the received
value. Optionally, the value indicative of the stress level is
obtained using a thermal camera.
[0158] Hydration is a factor that affects blood viscosity, which
can affect the speed at which the blood flows in the body. In one
embodiment, the computer 588 receives a value indicative of a
hydration level of the user, and generates at least one of the
feature values based on the received value. Optionally, the system
includes an additional camera that detects intensity of radiation
that is reflected from a region of exposed skin of the user, where
the radiation is in spectral wavelengths chosen to be
preferentially absorbed by tissue water. In one example, said
wavelengths are chosen from three primary bands of wavelengths of
approximately 1100-1350 nm, approximately 1500-1800 nm, and
approximately 2000-2300 nm. Optionally, measurements of the
additional camera are utilized by the computer 588 as values
indicative of the hydration level of the user.
[0159] The model utilized to detect the abnormal medical event may
be generated, in some embodiments, based on data obtained from one
or more users, corresponding to times in which the one or more
users were not affected by the abnormal medical event, and
additional data obtained while the abnormal medical event occurred
and/or following that time. Thus, this training data may reflect
PPG signals and/or blood flow both at normal times, and changes to
PPG signals and/or blood flow that may ensue due to the abnormal
medical event. This data may be used to generate samples, each
sample including feature values generated based on PPG signals of a
user and optionally additional data (as described above), and a
label. The label is a value related to the status of the abnormal
medical event. For example, the label may be indicative of whether
the user, at the certain time, experienced the abnormal medical
event. In another example, the label may be indicative of the
extent or severity of the abnormal medical event at the certain
time. In yet another example, the label may be indicative of the
duration until an onset of the abnormal medical event. In still
another example, the label may be indicative of the duration that
has elapsed since the onset of the abnormal medical event.
[0160] The following method may be used by systems modeled
according to FIG. 5. The steps described below may be performed by
running a computer program having instructions for implementing the
method. Optionally, the instructions may be stored on a
computer-readable medium, which may optionally be a non-transitory
computer-readable medium. In response to execution by a system
including a processor and memory, the instructions cause the system
to perform the following steps:
[0161] In Step 1, measuring, by first and second devices, first and
second signals indicative of photoplethysmogram signals (PPG.sub.S1
and PPG.sub.S2, respectively) at first and second regions on
different sides of a user's head. For example, the first and second
devices used in this step may be the first device 582 and the
second device 584.
[0162] In Step 2, detecting, based on the first signal,
advantageous timings for measuring the second signal for a purpose
of reconstructing informative portions of PPG.sub.S2.
[0163] In Step 3, commanding the second device to measure the
second signal during the advantageous timings.
[0164] And in Step 4, detecting the abnormal medical event based on
an asymmetrical change to blood flow recognizable in PPG.sub.S1 and
the informative portions of PPG.sub.S2.
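Steps 1 through 4 can be sketched as a single pipeline (device objects and the detection function are placeholders, not APIs from this disclosure):

```python
# Hypothetical sketch of Steps 1-4; the device objects, timing selector,
# and detector are assumed interfaces, shown only to make the data flow
# concrete.
def run_detection(first_device, second_device, select_timings, detect):
    ppg_s1 = first_device.measure()                   # Step 1: first signal
    timings = select_timings(ppg_s1)                  # Step 2: advantageous timings
    ppg_s2_parts = second_device.measure_at(timings)  # Step 3: timed measurement
    return detect(ppg_s1, ppg_s2_parts)               # Step 4: detect the event
```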
[0165] In one example, the abnormal medical event detected in Step
4 is ischemic stroke, and detecting the asymmetrical change
comprises detecting an increase in asymmetry between blood flow on
the different sides of the head, with respect to a baseline
asymmetry between blood flow on the different sides of the
head.
[0166] In another example, the abnormal medical event detected in
Step 4 is a migraine, and detecting the asymmetrical change
comprises detecting a pattern of a certain change to facial blood
flow, which is associated with at least one previous migraine
attack, determined based on data comprising previous PPG.sub.S1 and
PPG.sub.S2, which were measured more than 5 minutes before the
previous migraine attack.
[0167] In yet another example, the abnormal medical event detected
in Step 4 is a headache, and detecting the asymmetrical change
comprises detecting at least one of: a change in directionality of
facial blood flow, and reduction in blood flow to one side of the
face.
[0168] In one embodiment, the method optionally includes a step of
calculating first and second systolic blood pressure values based
on PPG.sub.S1 and PPG.sub.S2. In this embodiment, detecting the
asymmetrical change in Step 4 involves detecting an increase in a
difference between the first and second systolic blood pressure
values that exceeds a threshold.
[0169] In one embodiment, the method optionally includes the
following steps: (i) generating feature values based on data
comprising PPG.sub.S1, the informative portions of PPG.sub.S2, and
the previous measurements of PPG.sub.S1 and PPG.sub.S2, and (ii)
utilizing a model for calculating, based on the feature values, a
value indicative of whether the user is experiencing the abnormal
medical event. Optionally, at least one of the feature values is
indicative of at least one of the following: a difference in
maximal amplitudes between PPG.sub.S1 and the informative portions
of PPG.sub.S2 relative to a difference in maximal amplitudes
between the previous measurements of PPG.sub.S1 and the informative
portions of PPG.sub.S2, and a difference in a pulse arrival time
between PPG.sub.S1 and PPG.sub.S2 relative to a pulse arrival time
between the previous measurements of PPG.sub.S1 and PPG.sub.S2.
[0170] FIG. 7 illustrates a system that collects images used for
iPPG, which utilizes multiple light sources. In one embodiment, the
system includes an inward-facing head-mounted camera 604 (also
referred to herein as "camera 604"), a first head-mounted light
source 602a and a second head-mounted light source 602b (also
referred to herein as "light sources 602a and 602b", respectively),
and a computer 608. In some embodiments, the system may optionally
include additional components, such as a movement sensor 607 and/or
frames of smartglasses 600, to which one or more of the
aforementioned components of the system may be coupled.
[0171] The camera 604 captures images of a region comprising skin
on a user's head. In one example, the region includes a portion of
the user's forehead. In another example, the region includes a
portion of a cheek of the user. In some embodiments, the camera 604
includes a CMOS or a CCD image sensor. Optionally, the image sensor
does not include a near-infrared filter that filters below a
wavelength of 945 nm.
[0172] In some embodiments, at least a portion of the region
captured in images taken by the camera 604 is illuminated from
different illumination directions, during certain times in which
the system operates. For example, the portion may be illuminated by
multiple light emitting diodes (LEDs) positioned at different
locations. The discussion below refers to utilization of two light
sources that illuminate the region from two different illumination
directions, however some embodiments may include more than two
light sources that illuminate the region from more than two
illumination directions.
[0173] The light sources 602a and 602b illuminate at least a
portion of the region from different illumination directions,
differing by at least 10.degree.. That is, the difference between
the illumination direction of the first light source 602a and the
second light source 602b is at least 10.degree.. Optionally, the
illumination directions differ by more than 40.degree.. In one
embodiment, the light sources 602a and 602b, as well as the camera
604, are coupled to the frame of the smartglasses 600. Optionally,
the light sources 602a and 602b are located on different sides of
the camera 604. Optionally, the light sources 602a and 602b are
located more than 2 cm away from each other. Optionally, the light
sources 602a and 602b are located more than 4 cm away from each
other.
[0174] Herein, the "illumination direction" of a light source is
represented by a vector in the direction of the center of the light
emitted by the light source (i.e., the vector representing the
average direction of a ray of light emitted by the light source).
When directions of different light sources are compared with
respect to a region on the face that is being illuminated by them,
such as the case of the "at least a portion of the region"
mentioned above, the difference is defined as the difference in the
angle that is expressed in terms of either (i) the angle formed at
the point of the intersection of the two vectors (if the two
vectors intersect), or (ii) the angle at an intersection of
projections of the two vectors on the plane whose normal is at the
center of the "at least a portion of the region", if the two
vectors do not intersect. FIG. 8 illustrates the first light source
602a and the second light source 602b that illuminate at least a
portion of the region 605 from different illumination directions
which differ by an angle .alpha.>10.degree..
[0175] Herein, stating that "at least a portion of the region" is
illuminated by the light sources 602a and 602b means that light
emitted by the light sources 602a and 602b, possibly at different
times (due to their operation being synchronized as described
below), reaches the portion of the region and is reflected by it.
This reflected light is detected by the image sensor of the camera
604 (and thus affects values of at least some of the pixels in the
interlaced images 606). In one example, the "at least a portion of
the region" comprises the entire region, such that all the pixel
values in the interlaced images 606 may be affected by the
illumination of the light sources 602a and 602b. In other examples,
the "at least a portion of the region" consists of less than the whole
region, i.e., some pixel values in the interlaced images 606 are
not affected by the illumination of the light sources 602a and
602b, since light from the light sources 602a and/or 602b does not
reach certain areas on the face that are visible in the interlaced
images 606. In one example, the portion of the region illuminated
by both the light sources 602a and 602b comprises less than half of
the area of the region that is captured in the interlaced images
606.
[0176] It is to be noted that in the explanation above, references
to light emitted by a light source reaching or not reaching a
certain area may also be interpreted as the intensity of the light
emitted by the light source that reaches the certain area being
above or below a certain threshold. Thus, in the discussion above,
a weak intensity of illumination of an area (which falls below the
certain threshold) may be considered to not illuminate the
area.
[0177] The computer 608 synchronizes the operation of the light
sources 602a and 602b and the camera 604, such that the camera 604
captures an interlaced sequence of images 606. The captured
interlaced sequence of images 606 includes: a first sequence of
images captured while illumination of the portion of the region by
the first light source 602a is more intense than illumination of
the portion of the region by the second light source 602b, and a
second sequence of images captured while the illumination of the
region by the second light source 602b is more intense than the
illumination of the region by the first light source 602a.
Optionally, while the images in the first sequence are captured,
the portion of the region is not illuminated by the second light
source 602b. Optionally, while the images in the second sequence
are captured, the portion of the region is not illuminated by the
first light source 602a.
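A non-limiting sketch of how such a synchronization schedule might be derived is given below; the function name, the fixed segment length, and the frame period are illustrative assumptions rather than parameters of the embodiment.

```python
# Hypothetical sketch: derive per-frame timings indicating which light source
# ('A' for 602a, 'B' for 602b) should dominate while each image is captured.
# The segment length and frame period are illustrative assumptions.

def build_schedule(n_frames, segment=2, frame_period_ms=33):
    """Return (frame_start_ms, active_source) tuples; frames alternate
    between sources in runs of `segment` frames, e.g. A,A,B,B,A,A,..."""
    schedule = []
    for i in range(n_frames):
        source = 'A' if (i // segment) % 2 == 0 else 'B'
        schedule.append((i * frame_period_ms, source))
    return schedule

build_schedule(6)
# -> [(0, 'A'), (33, 'A'), (66, 'B'), (99, 'B'), (132, 'A'), (165, 'A')]
```

The commands 609a and 609b discussed below could then be issued according to such a schedule.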
[0178] The first and second sequences of images may be interlaced
with variable segment sizes that may be predetermined and/or
optimized according to the performance of an algorithm that
processes the images. For example, assuming that images belonging
to the first sequence are denoted 1,2,3,4, . . . and images belonging
to the second sequence are denoted a,b,c,d, . . . then the images
may be captured according to various schemes in order to form an
interlaced sequence of images. In one example, the images may be
interlaced successively to form the interlaced sequence
[a,1,b,2,c,3,d,4, . . . ]. In a second example, the images may be
interlaced dynamically, such as [1,a,2,3,b,4,5,c,d,e,6 . . . ] or
[a,b,1,2,3,c,d,4,5,6, . . . ]. Optionally, dynamic interlacing may
be performed according to the quality of iPPG signals extracted from
each of the sequences of images, in order to obtain a better
signal from combining the sequences. FIG. 8 illustrates an example
in which images are interlaced according to the pattern
[1,2,a,b,3,4,c,d,5,6,e,f, . . . ].
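One possible way to realize the dynamic interlacing described above is to allot frames to each illumination state in proportion to a quality score; the following sketch assumes scalar quality scores per source and is not taken from the embodiment.

```python
# Illustrative sketch of dynamic interlacing (not from the embodiment): each
# frame is allotted to light source 'A' (602a) or 'B' (602b) so that the
# share of frames per source tracks its iPPG signal quality score.

def dynamic_interlace(quality_a, quality_b, n_frames):
    labels = []
    count_a = 0
    target_a = quality_a / (quality_a + quality_b)  # desired share for 'A'
    for i in range(1, n_frames + 1):
        if count_a < target_a * i:
            labels.append('A')
            count_a += 1
        else:
            labels.append('B')
    return labels

dynamic_interlace(3.0, 1.0, 8)
# -> ['A', 'A', 'A', 'B', 'A', 'A', 'A', 'B']
```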
[0179] In some embodiments, the computer 608 may issue commands
609a to the first light source 602a and/or the second light source
602b, which include operating parameters for the first light source
602a and/or the second light source 602b, such as timings at which
to emit light in order to obtain a desired sequence of
illumination that enables capturing the interlaced sequence of
images 606. Additionally or alternatively, the computer 608 may
issue commands 609b to the camera 604 in order for it to operate in
a manner that captures the interlaced sequence of images 606. For
example, the commands 609b may include timings for capturing images
and/or a desired frame rate at which to capture images.
[0180] In some embodiments, the computer 608 extracts one or more
iPPG signals based on the interlaced images 606 (which may also be
referred to herein as "calculating iPPG signals"). Optionally, the
computer 608 extracts the iPPG signals from regions in the
interlaced images 606, which cover the portion of the region
illuminated (possibly at different times) by the first light source
602a and the second light source 602b. Optionally, to extract the
iPPG signals, the computer 608 may utilize one or more of the
computational approaches mentioned herein. Optionally, the computer
608 calculates values of a physiological response based on iPPG
signals recognizable in the interlaced sequence of images 606, such
as the heart rate, respiration rate, and/or blood pressure. For
example, the computer 608 may utilize one or more computational
approaches known in the art for calculating such physiological
responses based on a PPG signal with an input that is an iPPG
signal extracted from the interlaced images 606.
[0181] In some embodiments, the computer 608 calculates signal
quality indexes for the one or more iPPG signals extracted from the
interlaced images 606, such as the quality score indexes mentioned
in Elgendi, Mohamed, "Optimal Signal Quality Index for
Photoplethysmogram Signals," Bioengineering (Basel, Switzerland),
vol. 3, no. 4, article 21, 22 Sep. 2016, which is incorporated herein by
reference. These signal quality indexes are based on various
properties such as perfusion, kurtosis, skewness, relative power,
non-stationarity, zero crossing, entropy, and the matching of
systolic wave detectors. In one example, the computer 608
calculates signal to noise ratios for the one or more iPPG signals
extracted from the interlaced images 606.
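Two of the cited quality measures, skewness and kurtosis of the PPG waveform, can be sketched in a few lines; a full implementation would follow the indexes defined in Elgendi (2016).

```python
# Illustrative sketch: skewness and (non-excess) kurtosis of a sampled PPG
# window, two of the signal quality measures cited above. Population moments
# are used for simplicity.
import statistics

def skewness(x):
    m = statistics.fmean(x)
    s = statistics.pstdev(x)
    return sum((v - m) ** 3 for v in x) / (len(x) * s ** 3)

def kurtosis(x):
    m = statistics.fmean(x)
    s = statistics.pstdev(x)
    return sum((v - m) ** 4 for v in x) / (len(x) * s ** 4)

skewness([1, 2, 3, 4, 5])   # symmetric window -> 0.0
kurtosis([1, 2, 3, 4, 5])   # -> 1.7
```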
[0182] In some embodiments, the computer 608 calculates separate
iPPG signals from each of the first and second sequences that are
interlaced. Thus, for example, in one embodiment, the computer 608
may extract a first iPPG signal from the images [1,2,3,4, . . . ]
and extract a second iPPG signal from the images [a,b,c,d, . . . ].
Thus, algorithms utilizing the iPPG signals as input, e.g., in
order to detect a physiological response, may do so based on
multiple iPPG signals that may have different properties, such as
different noise levels, due to the images from which they were
generated being illuminated differently.
[0183] In one embodiment, the separate iPPG signals that are
extracted from each of the first and second sequences, which are
part of the interlaced images 606, are combined into a single iPPG
signal by averaging their values. Optionally, the separate iPPG
signals are combined using a dynamic weighted average approach in
which the weight assigned to each of the iPPG signals is
proportional to a signal quality index calculated for that signal
over a certain period (e.g., a duration of a few seconds).
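The dynamic weighted average can be sketched as follows; the quality indexes are assumed to be precomputed scalars, one per iPPG signal, over the certain period.

```python
# Sketch of the dynamic weighted average (signal quality indexes are assumed
# to be precomputed scalars, one per time-aligned iPPG signal).

def combine_ippg(signals, qualities):
    """Combine time-aligned iPPG signals, weighting each by its quality."""
    total_q = sum(qualities)
    n = len(signals[0])
    return [sum(q * sig[t] for sig, q in zip(signals, qualities)) / total_q
            for t in range(n)]

combine_ippg([[1.0, 2.0], [3.0, 4.0]], [1.0, 3.0])
# -> [2.5, 3.5]
```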
[0184] In still other embodiments, in order to extract iPPG
signals, the computer 608 utilizes an algorithm that suppresses
some surface reflections embedded in the images based on
differences between certain images captured while the portion of
the region was illuminated from the different illumination
directions.
[0185] The following is an example of such an algorithm for
extracting iPPG signals (also referred to herein as an "iPPG
algorithm"), which is adapted to suppress some of the surface
reflections embedded in the interlaced images 606 based on
differences between images captured while the region was
illuminated from different directions. For simplicity, the
following description of the algorithm involves one color, but the
extension of the algorithm to handle multiple colors is
straightforward. The channels of the selected color from each frame
(of the interlaced images 606) are divided into grids of N pixels
(such as 1.times.1, 4.times.4, 10.times.10). The grids confining
the face region are the Regions Of Interest (ROIs). Optionally,
these ROIs are tracked across frames with a motion tracker (some
head-mounted setups do not require such a tracker). The spatial
average of the intensity of a pixel y.sub.i.sup.L(t) within
ROI.sub.i, when it is illuminated by illumination L (L=1, . . . , k
for k different illumination states), at time t, is modeled as:
y.sub.i.sup.L(t)=I.sub.i.sup.L(.alpha..sub.i.sup.L p(t)+b.sub.i.sup.L)+q.sub.i.sup.L(t)
[0186] The illumination L may have different values based on the
way of operating the light sources and the number of light sources.
For example, when there are two light sources that are operated
interchangeably, then L has two states (i.e., k=2). When the
computer 608 operates the light sources in more than two
combinations, then L shall have additional states (i.e., k>2).
Assuming there are two light sources operating interchangeably,
I.sub.i.sup.L is the incident light intensity in ROI.sub.i when
illuminated by light source L. .alpha..sub.i.sup.L is the strength
of blood perfusion in ROI.sub.i when illuminated by light source L,
b.sub.i.sup.L is the surface reflectance from the skin in ROI.sub.i
when illuminated by light source L, and q.sub.i.sup.L(t) is the
camera quantization noise for the camera 604 which is capturing the
images. p(t) denotes the pulsatile pulse wave, which is indicative
of the volume of pulsatile blood and represents a value that is
independent of the illumination of ROI.sub.i.
[0187] Some part of the incident light penetrates beneath the skin
surface, and gets modulated by the pulsatile pulse wave p(t) due to
light absorption, before being back-scattered. .alpha..sub.i.sup.L
represents the strength of modulation of light back-scattered from
the subsurface due to the pulse wave change, and primarily depends
on the average blood perfusion in the selected ROI.sub.i and on the
wavelength of the incident light. When multiple colors are used,
.alpha..sub.i.sup.L depends also on the wavelength, in contrast with
b.sub.i.sup.L, which essentially does not depend on the wavelength;
this difference should further improve the result.
[0188] Thus, when incident illumination I.sub.i.sup.L falls on skin
ROI.sub.i, there is typically little difference between values of
the components .alpha..sub.i.sup.1p(t) and .alpha..sub.i.sup.2p(t)
since sub-surface reflection typically involves numerous scattering
events which make this value less dependent on the direction of
illumination. In contrast, there is typically a much bigger
difference between b.sub.i.sup.1 and b.sub.i.sup.2, which is the
surface reflection that does not involve as many scattering events
as sub-surface reflection, and as such, can greatly depend on
differences in the illumination directions between the light
sources.
[0189] Then y.sub.i.sup.L(t) may be temporally filtered using a
bandpass filter, such as [0.5 Hz,5 Hz], to reject the out of band
component of the skin surface reflection
(I.sub.i.sup.Lb.sub.i.sup.L) and other noise outside the band of
interest to obtain filtered pixel intensities y.sub.i.sup.L(t).
When the pulse rate is known (such as when received from a contact
PPG), then the bandpass filter that is applied can be much
narrower.
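As a crude stand-in for the [0.5 Hz, 5 Hz] bandpass, subtracting a long moving average removes the near-DC surface-reflection term I.sub.i.sup.Lb.sub.i.sup.L, and a short moving average suppresses high-frequency noise. The window sizes below assume an illustrative frame rate and are not from the embodiment; a real implementation would use a proper bandpass design.

```python
# Crude illustrative stand-in for the bandpass step (not the embodiment's
# filter): a long moving average tracks the near-DC surface-reflection term,
# and subtracting it high-passes the trace; a short moving average then
# low-passes away high-frequency noise. Window sizes are assumptions.

def moving_average(x, w):
    out = []
    for i in range(len(x)):
        window = x[max(0, i - w + 1):i + 1]
        out.append(sum(window) / len(window))
    return out

def crude_bandpass(y, slow_w=30, fast_w=3):
    slow = moving_average(y, slow_w)
    detrended = [a - b for a, b in zip(y, slow)]  # remove near-DC component
    return moving_average(detrended, fast_w)      # smooth residual noise
```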
[0190] The filtered values y.sub.i.sup.L(t) can then be used to
obtain an estimation of the PPG signal {circumflex over (p)}(t). In
one example, y.sub.i.sup.L(t) for i=1, . . . , N and L=1, . . . , k
are combined using a weighted average to obtain the iPPG signal,
using the following formula: {circumflex over
(p)}(t)=.SIGMA..sub.i=1 . . . N,L=1 . . . k G.sub.i.sup.L
y.sub.i.sup.L(t), where the weights G.sub.i.sup.L may
be computed based on the idea of maximal ratio diversity, as
discussed in Park, et al. (2018), "Direct-global separation for
improved imaging photoplethysmography" In Proceedings of the IEEE
Conference on Computer Vision and Pattern Recognition Workshops
(pp. 1375-1384), which is incorporated herein by reference. Maximum
ratio diversity assigns weights that are proportional to the
root-mean-squared (RMS) value of the signal component, and
inversely proportional to the mean-squared noise in y.sub.i.sup.L,
in order to maximize the signal-to-noise ratio of the overall
calculated iPPG signal.
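The weighted combination can be sketched as below, with the per-trace signal and noise estimates treated as given; how those estimates are derived is described in Park et al. (2018), and the split used here is a placeholder.

```python
# Sketch of maximal-ratio-style weighting: each filtered trace y_i^L(t) gets
# a weight proportional to its RMS signal estimate divided by its
# mean-squared noise estimate, and the weighted traces are summed into an
# estimate of p(t). Signal/noise estimates are taken as given placeholders.

def mrc_combine(traces, signal_rms, noise_ms):
    weights = [r / m for r, m in zip(signal_rms, noise_ms)]
    total = sum(weights)
    weights = [w / total for w in weights]  # normalize to sum to 1
    n = len(traces[0])
    return [sum(w * tr[t] for w, tr in zip(weights, traces))
            for t in range(n)]
```

With equal signal and noise estimates this reduces to a plain average of the traces; a noisier trace receives proportionally less weight.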
[0191] Thus, this iPPG algorithm has an advantage of dynamically
assigning different weights to y.sub.i.sup.L(t) based on the noise
that is detected with different illumination directions, which can
strengthen the signal from which {circumflex over (p)}(t) is
estimated, compared to a scenario in which only images illuminated
from a single direction are used.
[0192] In some embodiments, the need to suppress some of the
surface reflections embedded in images captured by the camera 604 can
arise under certain conditions. For example, suppression of surface
reflections may help improve iPPG signal extraction when the
illumination from the environment changes significantly over a
short period, such as when the user moves about. In one embodiment,
the system includes a head-mounted movement sensor 607, such as an
inertial measurement unit (IMU). In this embodiment, the computer
608 suppresses some of the surface reflections embedded in the
images (e.g., using the aforementioned iPPG algorithm) after
detecting a movement above a threshold. Optionally, the computer 608
reduces the rate of suppressing some of the surface reflections
embedded in the images after measuring movement below the threshold
for a second
certain duration. For example, when movement falls below the
threshold, the computer 608 may refrain from instructing the light
sources 602a and 602b to operate intermittently or reduce the times
in which they are utilized to illuminate the region of the face
captured in the images 606.
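A minimal sketch of this movement-gated policy follows; the threshold and hold-off values are illustrative assumptions, not values from the embodiment.

```python
# Illustrative movement-gated policy (threshold and hold-off values are
# assumptions): suppression turns on when IMU movement exceeds the
# threshold, and turns off again only after movement stays below the
# threshold for `hold_off` consecutive readings.

class SuppressionGate:
    def __init__(self, threshold=0.5, hold_off=3):
        self.threshold = threshold
        self.hold_off = hold_off
        self.quiet_count = 0
        self.active = False

    def update(self, movement):
        """Feed one movement magnitude; return whether suppression is on."""
        if movement > self.threshold:
            self.active = True
            self.quiet_count = 0
        else:
            self.quiet_count += 1
            if self.quiet_count >= self.hold_off:
                self.active = False
        return self.active
```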
[0193] Different forms of illumination may be utilized to obtain
the interlaced sequence of images 606. In some embodiments, the
first and second light sources 602a and 602b emit essentially at
the same spectrum band. For example, spectrum bands of light
emitted from the first and second light sources 602a and 602b
overlap or are within less than 10 nm of each other. Optionally,
spectrum bands of light emitted from the first and second light
sources 602a and 602b fall within the NIR spectrum range (e.g., a
narrow range somewhere between 780 nm and 1100 nm). In one example,
the first and second light sources 602a and 602b emit light at a
wavelength of 850 nm or 940 nm. In another example, the first and
second light sources 602a and 602b emit light in a visible
wavelength, such as 550 nm. Optionally, the first and second light
sources 602a and 602b emit interchangeably (i.e., the first light
source 602a does not emit while the second light source 602b is
emitting light, and vice versa). Optionally, each of the first and
second light sources 602a and 602b includes at least two emitters
that emit light in two different spectrum bands. Optionally, the
computer 608 calculates iPPG signal quality indexes for signals
extracted when each of the illumination bands was used, and selects
the band for which the signal quality index was
higher.
[0194] In other embodiments, the first and second light sources
602a and 602b emit light falling in different spectrum bands. For
example, the first light source 602a emits light at a wavelength of
850 nm and the second light source 602b emits light at a wavelength
of 940 nm. Optionally, the computer 608 evaluates signal quality
indexes calculated for iPPG signals extracted from images captured
while the region was illuminated by light with different spectrum
bands emitted by the first and second light sources 602a and 602b,
and selects for each light source a spectrum band that yields an
iPPG signal with a quality that reaches a certain threshold. In one
example, the computer 608 may calculate signal to noise ratios for
iPPG signals extracted from images illuminated at the different
spectrum bands, and then instruct the camera 604 to capture more
images using illumination with a spectrum band that yields a higher
value of signal to noise.
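Band selection by signal-to-noise ratio can be sketched as a small helper over precomputed per-band SNR estimates; the band labels and values below are illustrative.

```python
# Illustrative band selection: given precomputed SNR estimates for iPPG
# signals extracted under each illumination band, choose the band with the
# highest SNR for subsequent captures. Labels and values are assumptions.

def select_band(snr_by_band):
    return max(snr_by_band, key=snr_by_band.get)

select_band({'850nm': 4.2, '940nm': 6.1})
# -> '940nm'
```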
[0195] When polarized light hits the skin surface without
penetrating the skin, the light's polarization is mostly retained.
However, when the polarized light penetrates the skin, the light
loses its polarization due to the many scattering events that
occur. This difference enables the system to utilize
cross-polarization to reduce noise due to non-penetrating
reflections. In one embodiment, the first light source 602a and/or
the second light source 602b emit light with a certain polarization,
and the camera 604 comprises a polarizer that filters out light with
the certain polarization. For example, the first light source 602a
and/or the second light source 602b may emit light with horizontal
polarization, while the camera 604 includes a
vertical polarizer. With this form of cross-polarization, most of
the horizontal polarized surface reflection is rejected, which will
strengthen the signal obtained from sub-surface reflections that
are modulated by pulsatile blood flow.
[0196] Another approach that may be utilized in some embodiments to
strengthen iPPG signals is for the first light source 602a and/or
the second light source 602b to emit structured light (i.e.,
utilize a structured illumination pattern such as stripes to
illuminate the region). Thus, some portions of the region may
receive more light than others. In one example of structured
illumination, a high-frequency binary pattern is projected onto the
skin from at least one of the first light source 602a and the
second light source 602b. The skin areas that are directly lighted
contain both the direct (surface) component and the global component
of the light, while the skin areas that are not directly lighted
contain only the global component. Because the direct reflections
act as an
all-pass filter while the sub-surface scattering acts as a low-pass
filter, by comparing adjacent lighted and non-lighted regions, the
computer 608 can differentiate between the penetrating
and non-penetrating reflections. Additional discussion regarding
techniques the computer 608 may employ to better differentiate
between penetrating and non-penetrating reflections based on
analyzing skin areas that are directly lighted and skin areas that
are not directly lighted is provided in the aforementioned
reference of Park et al. (2018).
[0197] In some embodiments, the interlaced sequence of images 606
may be utilized by the computer 608 to generate an avatar for the
user. Being able to better estimate the extents of surface
reflections, based on the interlaced sequence of images, may help
the computer 608 to render a better avatar that suffers less from
inaccuracies due to inaccurate facial models and/or due to not
being able to differentiate between specular and diffuse
reflections.
[0198] The following method may be used by systems modeled
according to FIG. 7. The steps described below may be performed by
running a computer program having instructions for implementing the
method. Optionally, the instructions may be stored on a
computer-readable medium, which may optionally be a non-transitory
computer-readable medium. In response to execution by a system
including a processor and memory, the instructions cause the system
to perform the following steps:
[0199] In Step 1, synchronizing operations of first and second
head-mounted light sources, that illuminate at least a portion of a
region comprising skin on a user's head from different illumination
directions differing by at least 10.degree.. In one example, the
first and second head-mounted light sources are the first and
second head-mounted light sources 602a and 602b, respectively.
Optionally, synchronizing their operation involves issuing commands
609a that are indicative of timings at which to operate each of the
light sources.
[0200] And in Step 2, capturing, by an inward-facing head-mounted
camera, an interlaced sequence of images of the region comprising:
a first sequence of images captured while illumination of the
region by the first light source is more intense than illumination
of the region by the second light source, and a second sequence of
images captured while the illumination of the region by the second
light source is more intense than the illumination of the region by
the first light source.
[0201] In one embodiment, the method optionally includes a step of
calculating iPPG signals based on the interlaced sequence of images
captured in Step 2. Optionally, the method further includes steps
involving: utilizing an algorithm configured to suppress some
surface reflections embedded in the images, for calculating the
imaging photoplethysmogram signals, based on differences between
images captured while the portion of the region was illuminated
from the different illumination directions.
[0202] In one embodiment, the first and second light sources emit
at different spectrum bands, and the method includes the following
steps: calculating signal to noise ratios for the different
spectrum bands, and capturing more images using illumination in the
spectrum band that yields the higher signal to noise ratio.
[0203] US Patent Application 2019/0223737A1, which is herein
incorporated by reference in its entirety and is a previous patent
application of the Applicant of this invention, discusses and
illustrates in paragraphs 0040-0049, together with their associated
drawings, various examples of head-mounted systems equipped with
head-mounted cameras, which can be adapted to be utilized with some
of the embodiments herein. For example, these paragraphs illustrate
various inward-facing head-mounted cameras coupled to an eyeglasses
frame, illustrate cameras that capture regions on the periorbital
areas, illustrate an optional computer that may include a
processor, memory, a battery and/or a communication module,
illustrate inward-facing head-mounted cameras coupled to an
augmented reality device, illustrate head-mounted cameras coupled
to a virtual reality device, illustrate head-mounted cameras
coupled to a sunglasses frame, illustrate cameras configured to
capture various regions, such as the forehead, the upper lip, the
cheeks, and sides of the nose, illustrate inward-facing
head-mounted cameras mounted to protruding arms, illustrate various
inward-facing head-mounted cameras having multi-pixel sensors (FPA
sensors) configured to capture various regions, illustrate
head-mounted cameras that are physically coupled to a frame using a
clip-on device configured to be attached/detached from a pair of
eyeglasses in order to secure/release the device to/from the
eyeglasses, illustrate a clip-on device that holds at least an
inward-facing camera, a processor, a battery, and a wireless
communication module, illustrate right and left clip-on devices
configured to be attached behind an eyeglasses frame, illustrate a
single-unit clip-on device configured to be attached behind an
eyeglasses frame, and illustrate right and left clip-on devices
configured to be attached/detached from an eyeglasses frame and
having protruding arms to hold the inward-facing head-mounted
cameras.
[0204] It is noted that the elliptic and other shapes of the
regions captured by cameras and other sensing devices in some of
the drawings are just for illustration purposes, and the actual
shapes of the regions are usually not as illustrated. Furthermore,
illustrations and discussions of a camera represent one or more
cameras, where each camera may have the same field of view (FOV)
and/or different FOVs. A camera includes multiple sensing elements,
and the illustrated region captured by the camera usually refers to
the total region captured by the camera, which is made of multiple
regions that are respectively captured by the different sensing
elements. The positions of the cameras in the figures are just for
illustration, and the cameras may be placed at other positions.
[0205] Various embodiments described herein involve a head-mounted
system (HMS) that may be connected, using wires and/or wirelessly,
with a device carried by the user and/or a non-wearable device. The
HMS may include a battery, a computer, sensors, and a
transceiver.
[0206] FIG. 9A and FIG. 9B are schematic illustrations of possible
embodiments for computers (400, 410) that are able to realize one
or more of the embodiments discussed herein that include a
"computer". The computer (400, 410) may be implemented in various
ways, such as, but not limited to, a microcontroller, a computer on
a chip, a system-on-chip (SoC), a system-on-module (SoM), a
processor with its required peripherals, a server computer, and/or
any other computer form capable of executing a set of computer
instructions. Further, references to a computer or a processor
include any collection of one or more computers and/or processors
(which may be at different locations) that individually or jointly
execute one or more sets of computer instructions. This means that
the singular term "computer" is intended to imply one or more
computers, which jointly perform the functions attributed to "the
computer". In particular, some functions attributed to the computer
may be performed by a computer on a wearable device (e.g.,
smartglasses) and/or a computer of the user (e.g., smartphone),
while other functions may be performed on a remote computer, such
as a cloud-based server.
[0207] The computer 400 includes one or more of the following
components: processor 401, memory 402, computer readable medium
403, user interface 404, communication interface 405, and bus 406.
The computer 410 includes one or more of the following components:
processor 411, memory 412, and communication interface 413.
[0208] Functionality of various embodiments may be implemented in
hardware, software, firmware, or any combination thereof. If
implemented at least in part in software, implementing the
functionality may involve a computer program that includes one or
more instructions or code stored or transmitted on a
computer-readable medium and executed by one or more processors.
Computer-readable media may include computer-readable storage
media, which corresponds to a tangible medium such as data storage
media, and/or communication media including any medium that
facilitates transfer of a computer program from one place to
another. Computer-readable medium may be any media that can be
accessed by one or more computers to retrieve instructions, code,
data, and/or data structures for implementation of the described
embodiments. A computer program product may include a
computer-readable medium. In one example, the computer-readable
medium 403 may include one or more of the following: RAM, ROM,
EEPROM, optical storage, magnetic storage, biologic storage, flash
memory, or any other medium that can store computer readable
data.
[0209] A computer program (also known as a program, software,
software application, script, program code, or code) can be written
in any form of programming language, including compiled or
interpreted languages, declarative or procedural languages. The
program can be deployed in any form, including as a standalone
program or as a module, component, subroutine, object, or another
unit suitable for use in a computing environment. A computer
program may correspond to a file in a file system, may be stored in
a portion of a file that holds other programs or data, and/or may
be stored in one or more files that may be dedicated to the
program. A computer program may be deployed to be executed on one
or more computers that are located at one or more sites that may be
interconnected by a communication network.
[0210] Computer-readable medium may include a single medium and/or
multiple media (e.g., a centralized or distributed database, and/or
associated caches and servers) that store one or more sets of
instructions. In various embodiments, a computer program, and/or
portions of a computer program, may be stored on a non-transitory
computer-readable medium, and may be updated and/or downloaded via
a communication network, such as the Internet. Optionally, the
computer program may be downloaded from a central repository, such
as Apple App Store and/or Google Play. Optionally, the computer
program may be downloaded from a repository, such as an open source
and/or community run repository (e.g., GitHub).
[0211] At least some of the methods described herein are
"computer-implemented methods" that are implemented on a computer,
such as the computer (400, 410), by executing instructions on the
processor (401, 411). Additionally, at least some of these
instructions may be stored on a non-transitory computer-readable
medium.
[0212] As used herein, references to "one embodiment" (and its
variations) mean that the feature being referred to may be included
in at least one embodiment of the invention. Separate references to
embodiments may refer to the same embodiment, may illustrate
different aspects of an embodiment, and/or may refer to different
embodiments.
[0213] Sentences in the form of "X is indicative of Y" mean that X
includes information correlated with Y, up to the case where X
equals Y. Sentences in the form of "provide/receive an indication
(of whether X happened)" may refer to any indication method.
[0214] The word "most" of something is defined as above 51% of the
something (including 100% of the something). Both a "portion" of
something and a "region" of something refer to a value between a
fraction of the something and 100% of the something. The word
"region" refers to an open-ended claim language, and a camera said
to capture a specific region on the face may capture just a small
part of the specific region, the entire specific region, and/or a
portion of the specific region together with additional region(s).
The phrase "based on" indicates an open-ended claim language, and
is to be interpreted as "based, at least in part, on".
Additionally, stating that a value is calculated "based on X" and
following that, in a certain embodiment, that the value is
calculated "also based on Y", means that in the certain embodiment,
the value is calculated based on X and Y. Variations of the terms
"utilize" and "use" indicate an open-ended claim language, such
that sentences in the form of "detecting X utilizing Y" are
intended to mean "detecting X utilizing at least Y", and sentences
in the form of "use X to calculate Y" are intended to mean
"calculate Y based on X".
[0215] The terms "first", "second" and so forth are to be
interpreted merely as ordinal designations, and shall not be
limited in themselves. A predetermined value is a fixed value
and/or a value determined any time before performing a calculation
that utilizes the predetermined value. When appropriate, the word
"value" may indicate a "predetermined value". The word "threshold"
indicates a "predetermined threshold", which means that the value
of the threshold, and/or the logic used to determine whether the
threshold is reached, is known before starting the computations
to determine whether the threshold is reached.
[0216] The embodiments of the invention may include any variety of
combinations and/or integrations of the features of the embodiments
described herein. Although some embodiments may depict serial
operations, the embodiments may perform certain operations in
parallel and/or in different orders from those depicted. Moreover,
the use of repeated reference numerals and/or letters in the text
and/or drawings is for the purpose of simplicity and clarity and
does not in itself dictate a relationship between the various
embodiments and/or configurations discussed. The embodiments are
not limited in their applications to the order of steps of the
methods, or to details of implementation of the devices, set in the
description, drawings, or examples. Moreover, individual blocks
illustrated in the figures may be functional in nature and
therefore may not necessarily correspond to discrete hardware
elements.
[0217] Certain features of the embodiments, which may have been,
for clarity, described in the context of separate embodiments, may
also be provided in various combinations in a single embodiment.
Conversely, various features of the embodiments, which may have
been, for brevity, described in the context of a single embodiment,
may also be provided separately or in any suitable sub-combination.
Embodiments described in conjunction with specific examples are
presented by way of example, and not limitation. Moreover, many
alternatives, modifications, and variations will be apparent to
those skilled in the art. It is to be understood
that other embodiments may be utilized and structural changes may
be made without departing from the scope of the embodiments.
Accordingly, this disclosure is intended to embrace all such
alternatives, modifications, and variations that fall within the
spirit and scope of the appended claims and their equivalents.
* * * * *