U.S. patent application number 14/877171 was filed with the patent office on 2015-10-07 for systems and methods for generating orthotic device models by surface mapping and extrusion.
The applicant listed for this patent is Sols Systems Inc. The invention is credited to Daniel Bersak, Ciaran N. Murphy, Kegan L. Schouwenburg, and Jeff Smith.
United States Patent Application 20160101571
Kind Code: A1
Application Number: 14/877171
Family ID: 55654838
Publication Date: April 14, 2016
Schouwenburg, Kegan L.; et al.
SYSTEMS AND METHODS FOR GENERATING ORTHOTIC DEVICE MODELS BY
SURFACE MAPPING AND EXTRUSION
Abstract
A method for generating an orthotic device is disclosed. In one
implementation, the method includes generating, based on image data
representative of a body part of a patient, three-dimensional model
data representative of the body part. A surface defined by the
three-dimensional model data is identified, the surface
corresponding to a contact site at which an orthotic device is to
contact the body part. CAD model data is generated that defines a
volume corresponding to an extrusion operation performed on the
identified surface. The CAD model data is transmitted to a
three-dimensional printer to produce the orthotic device based on
the CAD model data.
Inventors: Schouwenburg, Kegan L. (New York, NY); Bersak, Daniel
(Brooklyn, NY); Smith, Jeff (Brooklyn, NY); Murphy, Ciaran N. (West
Brookfield, MA)

Applicant: Sols Systems Inc., New York, NY, US

Family ID: 55654838
Appl. No.: 14/877171
Filed: October 7, 2015
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number
62061667 | Oct 8, 2014 |
62061668 | Oct 8, 2014 |
Current U.S. Class: 602/5; 382/128

Current CPC Class: B33Y 80/00 20141201; A61F 5/0127 20130101; B33Y
50/02 20141201; G05B 19/4099 20130101; G05B 2219/49007 20130101;
A61B 5/0077 20130101; B29C 64/386 20170801; A61B 5/1074 20130101;
G05B 2219/35134 20130101; B33Y 30/00 20141201; A61B 5/4851 20130101;
A61B 5/6898 20130101; B33Y 50/00 20141201; B33Y 10/00 20141201

International Class: B29C 67/00 20060101 B29C067/00; G06T 17/00
20060101 G06T017/00; A61F 5/01 20060101 A61F005/01; G06T 7/00
20060101 G06T007/00
Claims
1. A method comprising: receiving, by a processing device, image
data captured by a client device, wherein the image data is
representative of a body part of a patient; generating, based on
the image data, three-dimensional model data representative of the
body part; identifying, by the processing device, a surface defined
by the three-dimensional model data, the surface corresponding to a
contact site at which an orthotic device is to contact the body
part; generating CAD model data, wherein the CAD model data defines
a volume corresponding to an extrusion operation performed on the
identified surface; and transmitting the CAD model data to a
three-dimensional printer, wherein the three-dimensional printer is
to produce the orthotic device based on the CAD model data.
2. The method of claim 1, wherein the image data comprises at least
one of a still image or a video.
3. The method of claim 1, wherein the client device is a mobile
device of the patient.
4. The method of claim 3, further comprising: receiving, by the
processing device, patient information data, wherein the patient
information data is received from the client device.
5. The method of claim 1, wherein the orthotic device is selected
from a group consisting of a helmet, body armor, sports equipment,
a prosthetic, a cast, a splint, a clothing article, a furniture
piece, a vehicle seat, a vehicle or robotic control mechanism, a
physical therapy device, a glove, and a surgical instrument.
6. The method of claim 1, wherein the body part is selected from a
group consisting of a foot, a leg, a torso, an arm, a hand, a
wrist, a neck, and a head.
7. The method of claim 1, wherein the image data comprises inertial
measurement unit data, and wherein generating the three-dimensional
model data of the body part further comprises generating the
three-dimensional model data of the body part based on the inertial
measurement unit data.
8. The method of claim 1, wherein generating the CAD model data
comprises: extruding, by the processing device, the identified
surface to define the volume; and integrating the defined volume
into a parametric CAD model to generate the CAD model data.
9. The method of claim 1, wherein the orthotic device produced by
the three-dimensional printer is to comprise a first
patient-registered surface and a second non-patient-registered
surface.
10. The method of claim 1, wherein generating the three-dimensional
model data of the body part comprises: identifying a unique visual
pattern within a captured image of the image data; determining an
orientation of the unique visual pattern; and extracting depth
information from the unique visual pattern, wherein the
three-dimensional model data is generated based at least in part on
the depth information.
11. A system comprising: a memory; and a processing device coupled
to the memory, wherein the processing device is to: receive image
data captured by a client device, wherein the image data is
representative of a body part of a patient; generate, based on the
image data, three-dimensional model data representative of the body
part; identify a surface defined by the three-dimensional model
data, the surface corresponding to a contact site at which an
orthotic device is to contact the body part; generate CAD model
data, wherein the CAD model data defines a volume corresponding to
an extrusion operation performed on the identified surface; and
transmit the CAD model data to a three-dimensional printer, wherein
the three-dimensional printer is to produce the orthotic device
based on the CAD model data.
12. The system of claim 11, wherein the image data comprises at
least one of a still image or a video.
13. The system of claim 11, wherein the client device is a mobile
device of the patient.
14. The system of claim 13, wherein the processing device is
further to: receive patient information data, wherein the patient
information data is received from the client device.
15. The system of claim 11, wherein the orthotic device is selected
from a group consisting of a helmet, body armor, sports equipment,
a prosthetic, a cast, a splint, a clothing article, a furniture
piece, a vehicle seat, a vehicle or robotic control mechanism, a
physical therapy device, a glove, and a surgical instrument.
16. The system of claim 11, wherein the body part is selected from
a group consisting of a foot, a leg, a torso, an arm, a hand, a
wrist, a neck, and a head.
17. The system of claim 11, wherein the image data comprises inertial
measurement unit data, and wherein to generate the
three-dimensional model data of the body part, the processing
device is further to: generate the three-dimensional model data of
the body part based on the inertial measurement unit data.
18. The system of claim 11, wherein to generate the CAD model data,
the processing device is further to: extrude the identified surface
to define the volume; and integrate the defined volume into a
parametric CAD model to generate the CAD model data.
19. The system of claim 11, wherein the orthotic device produced by
the three-dimensional printer is to comprise a first
patient-registered surface and a second non-patient-registered
surface.
20. The system of claim 11, wherein to generate the
three-dimensional model data of the body part, the processing
device is further to: identify a unique visual pattern within a
captured image of the image data; determine an orientation of the
unique visual pattern; and extract depth information from the
unique visual pattern, wherein the three-dimensional model data is
to be generated based at least in part on the depth
information.
21. An orthotic device comprising: a patient-registered surface,
wherein the patient-registered surface is shaped to fit to a body
part of a patient, and wherein the patient-registered surface was
produced by a three-dimensional printer using data generated from
images of the body part of the patient; and a non-patient
registered surface.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of priority of both U.S.
Provisional Patent Application No. 62/061,667, filed on Oct. 8,
2014, and U.S. Provisional Patent Application No. 62/061,668, filed
on Oct. 8, 2014, which are hereby incorporated by reference herein
in their entireties.
TECHNICAL FIELD
[0002] This disclosure relates to the field of corrective orthotic
devices, in particular, to generating models from user-captured
data to produce orthotic devices.
BACKGROUND
[0003] An orthotic insert is a type of orthotic device that, when
inserted into a shoe and applied to a foot, supports the foot by
redistributing ground reaction forces while properly aligning foot
joints during motion. Orthotic inserts are typically used to treat
biomechanical deformities as well as inflammatory conditions (e.g.,
plantar fasciitis) in patients.
[0004] Various methods have been employed to produce orthotic
inserts. For example, plaster cast, gait scanning, and laser
scanning methods attempt to capture plantar geometry in a weight
bearing position. However, such methods are generally slow in
acquiring orthotic data, are expensive, and are limited in the
range of characteristics that they can provide to the resulting
orthotic device. In such methods, the orthotic device is
customizable insofar as it is designed with a particular ailment in
mind, while the treatment is implemented as a one-size-fits-all
solution that may be far from optimal for some patients.
[0005] Moreover, current methods of orthotic insert production are
generally limited to the machining of hard materials (top down
approaches). This also limits the range of characteristics
(flexibility, shock absorption, weight, etc.) of the end product.
Shapes of the orthotic inserts tend to be mixed and matched from a
database, which may result in orthotic inserts that are unique to a
particular lab or production facility but not to a particular
patient.
SUMMARY
[0006] The following is a simplified summary of the disclosure in
order to provide a basic understanding of some aspects of the
disclosure. This summary is not an extensive overview of the
disclosure. It is intended to neither identify key or critical
elements of the disclosure, nor delineate any scope of the
particular implementations of the disclosure or any scope of the
claims. Its sole purpose is to present some concepts of the
disclosure in a simplified form as a prelude to the more detailed
description that is presented later.
[0007] In one aspect of the present disclosure, a method includes
generating, based on image data representative of a body part of a
patient, three-dimensional model data representative of the body
part. A surface defined by the three-dimensional model data is
identified, the surface corresponding to a contact site at which an
orthotic device is to contact the body part. CAD model data is
generated that defines a volume corresponding to an extrusion
operation performed on the identified surface. The CAD model data
is transmitted to a three-dimensional printer to produce the
orthotic device based on the CAD model data.
[0008] In another aspect of the present disclosure, an orthotic
device includes a patient-registered surface that is shaped to fit
to a body part of a patient, the patient-registered surface having
been produced by a three-dimensional printer using data generated
from three-dimensional data or images of the body part of the
patient. The orthotic device further includes a non-patient
registered surface.
[0009] In one or more of the disclosed implementations, computing
devices for performing the operations of the above described
implementations are also disclosed. Additionally, in
implementations of the disclosure, a computer-readable storage
medium stores instructions for performing the operations of the
above described implementations.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] The present disclosure is illustrated by way of example, and
not by way of limitation, in the figures of the accompanying
drawings, in which:
[0011] FIG. 1A illustrates an example system architecture in
accordance with an implementation of the disclosure;
[0012] FIG. 1B is a block diagram illustrating features of a model
generation component in accordance with an implementation of the
disclosure;
[0013] FIG. 2A illustrates the capture of image data according to an
implementation of the disclosure;
[0014] FIG. 2B illustrates the capture of image data according to
another implementation of the disclosure;
[0015] FIG. 3 illustrates image processing for model generation
according to an implementation of the disclosure;
[0016] FIG. 4A is an illustration of a density lattice used in
generating an orthotic insert according to an implementation of the
disclosure;
[0017] FIG. 4B shows an orthotic insert produced according to an
implementation of the disclosure;
[0018] FIG. 4C shows a view of CAD model data of an extruded
surface generated based on 3D model data according to an
implementation of the disclosure;
[0019] FIG. 4D shows an alternate view of CAD model data of an
extruded surface generated based on 3D model data according to an
implementation of the disclosure;
[0020] FIG. 4E shows a view of CAD model data of an extruded
surface generated based on 3D model data according to an
implementation of the disclosure;
[0021] FIG. 5 is a flow diagram illustrating a method for producing
a patient-registered orthotic device according to an implementation
of the disclosure; and
[0022] FIG. 6 is a block diagram illustrating an exemplary computer
system for use in accordance with an implementation of the
disclosure.
DETAILED DESCRIPTION
[0023] Implementations are described for producing orthotic devices
from user-captured data. Image data of a body part of a patient
(e.g., the patient's foot) can be captured using a client device,
such as a mobile device having a camera. An interface implemented
on the client device can instruct a user of the device (e.g., the
patient, a physician, an assistant, etc.) to capture image data
(e.g., images and/or video) of the body part. For example, the
interface may utilize various indicators (e.g., visual cues) to
guide data capture. If the device includes an inertial measurement
unit (IMU), then IMU data may also be captured during the image
data capture, which may facilitate downstream data processing. The
device can also allow for data entry of patient information, which
can include physical parameters related to the patient (e.g.,
height, weight, age, pre-existing medical conditions, etc.).
Implementations are also described for generating
patient-registered orthotic devices by surface matching and
extrusion.
[0024] The captured image data, IMU data, and patient information
may be transmitted to a server, which uses the received data to
generate a three-dimensional (3D) model of the body part. In doing
so, relevant measurements related to the body part may be
extracted, and are in turn used to generate a CAD model of an
orthotic device (e.g., an orthotic insert). The CAD model may be
transmitted, for example, to a manufacturing facility that can
print the orthotic device using a 3D printer.
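By way of illustration only, the server-side flow described above can be sketched as a simple pipeline. All names below (`ScanUpload`, `build_3d_model`, `generate_cad_model`, `process_upload`) are hypothetical and do not appear in the disclosure; the function bodies are placeholders standing in for the photogrammetry and CAD operations.

```python
from dataclasses import dataclass


@dataclass
class ScanUpload:
    """Data sent from a client device to the modeling server."""
    images: list        # still images and/or video frames
    imu_samples: list   # optional inertial measurement unit readings
    patient_info: dict  # height, weight, age, conditions, ...


@dataclass
class OrthoticOrder:
    model_3d: dict   # 3D model data of the body part
    cad_model: dict  # CAD model defining the printable volume


def build_3d_model(upload):
    # Placeholder: a real system would run photogrammetry here, using
    # the IMU samples to help estimate camera orientation per frame.
    return {"surface": "plantar", "source_images": len(upload.images)}


def generate_cad_model(model_3d):
    # Placeholder: identify the contact surface and extrude it into a
    # solid volume within a parametric CAD model.
    return {"volume_from": model_3d["surface"], "operation": "extrude"}


def process_upload(upload):
    model_3d = build_3d_model(upload)
    cad_model = generate_cad_model(model_3d)
    return OrthoticOrder(model_3d=model_3d, cad_model=cad_model)


order = process_upload(ScanUpload(images=["heel.jpg", "arch.jpg"],
                                  imu_samples=[], patient_info={}))
print(order.cad_model["operation"])  # extrude
```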
[0025] The implementations of the present disclosure provide
several advantages over traditional orthotic device technologies,
and orthotic insert technologies in particular. The implementations
described herein effectively transform a mobile device into a
scanner that is portable, lightweight, and accurate, allowing for a
patient to perform his/her own scans without requiring trained
medical personnel. In addition, a digital 3D model of a foot in
digital format facilitates error checking of the quality of the
scan through visualization and comparison with prior scans and
studies. The digital 3D model can continue to be inspected,
re-used, or re-processed by the algorithm or updated based on new
algorithms and/or updated patient data. Once complete, the digital
3D model can be transmitted from a patient device or a device of
the physician to a manufacturing facility for production of the
orthotic insert, alleviating the need to physically transport a
mold or cast (which typically cannot be re-used) to the
manufacturing facility. The level of accuracy and consistency in
the digital 3D model of a foot, as provided by the disclosed
implementations, outperforms plaster cast and gait scanning
methods. Moreover, the implementations described herein can produce
an orthotic insert with variable density by incorporating surface
and sub-surface structures that further customize the mechanical
properties of the orthotic insert, resulting in an orthotic insert
that is extremely thin, cushioned, and patient-optimized all at
once.
[0026] The term "orthotic device", as used herein, refers to any
device worn by or externally applied to an individual that provides
neuromuscular support to the individual, provides skeletal support
to the individual, and/or provides prophylactic functionality. The
term "corrective device", as used herein, refers to a type of
orthotic device that provides a therapeutic benefit to an
individual (e.g., who may be referred to herein as "a patient")
when worn by or externally applied to the individual. While the
implementations herein are described with respect to orthotic
devices for treating or supporting a patient's foot (i.e., orthotic
shoe inserts), it is to be understood that the systems and methods
described herein are applicable to the production of other types of
devices. For example, the implementations described herein are
generally applicable to the production of devices that are
customized to fit to the human body, devices utilizing
customization and optimization related to human activity,
bio-mechanics, and anatomy, processes that can be applied to
consumer devices with or without specialized hardware or skill, and
devices that utilize components or structures that would be
difficult to produce in mass quantities with traditional
manufacturing approaches. Such devices may include, but are not
limited to, helmets, body armor, sports equipment, prosthetics,
casts, splints, clothing, furniture, vehicle seats, vehicle or
robotic control mechanisms, physical therapy devices, gloves,
surgical instruments, and sterile medical packing.
[0027] FIG. 1A illustrates an example system architecture 100, in
accordance with an implementation of the disclosure, for generating
an orthotic device. The system architecture 100 includes client
devices 110A-110Z, a modeling server 120, a production server 130,
3D printers 132A-132Z, a data store 140, and a network 150.
[0028] In one implementation, the client devices 110A-110Z may each
include computing devices such as personal computers (PCs),
laptops, mobile phones, smart phones, tablet computers, netbook
computers etc. Client devices 110A-110Z may also be referred to as
"user devices". An individual user may be associated with (e.g.,
own and/or use) one or more client devices (e.g., one or more of
client devices 110A-110Z). Client devices 110A-110Z may each be
owned and utilized by different users at different locations. As
used herein, a "user" may refer generally to an individual operator
of one or more of client devices 110A-110Z, and may be a patient
for which an orthotic device is to be produced, a clinician or
physician who may be involved in the preparation of the orthotic
device in conjunction with, or on behalf of, the patient, an
assistant to the patient, etc.
[0029] The client devices 110A-110Z may each implement user
interfaces 112A-112Z, respectively. Each of user interfaces
112A-112Z may allow a user of the respective client device
110A-110Z to send and receive information to one or more of the
modeling server 120 and the production server 130. For example, one
or more of the user interfaces 112A-112Z may be a web browser
interface that can access, retrieve, present, and/or navigate
content (e.g., web pages such as Hyper Text Markup Language (HTML)
pages) provided by the modeling server 120. In one implementation,
one or more of the user interfaces 112A-112Z may be a standalone
application (e.g., a mobile app), which may have been provided by
the modeling server 120 (e.g., as a downloadable application), that
allows a user of a respective client device 110A-110Z to send and
receive information to the modeling server 120. In one
implementation, the user interfaces 112A-112Z guide their
respective users in capturing image data of a body part, which is
utilized downstream by the modeling server 120 to generate a 3D
model of the body part. The term "image data" is intended to
include any type of visual data that can be captured by an optical
instrument (e.g., a photographic camera, a charge-coupled device
(CCD) camera, an infrared camera, etc.), including videos, static
images, and video frames. In certain implementations, image data
may include 3D data obtained from or generated by a 3D scanner
(e.g., point cloud data).
[0030] In one implementation, one or more of the client devices
110A-110Z may capture and store image data 114A-114Z, respectively,
which may include one or more static images, videos, and/or audio
data (e.g., which may be embedded within the video data or may be a
separate audio track). The image data 114A-114Z may be made
accessible to other devices of the system architecture 100 via the
network 150. For example, captured image data may be transmitted to
(e.g., streamed in real-time during capture or transmitted at a
later time after capturing the data) the modeling server 120 and/or
the data store 140. Each of client devices 110A-110Z may also
capture IMU data 116A-116Z, respectively, which may include
gyroscopic data, magnetometer data, GPS data, etc., captured by the
respective client device while in use. For example, IMU data 116A
captured while a user is operating the client device 110A to
capture image data 114A of his/her foot may be used to estimate the
orientation of the client device 110A (e.g., if the client device
is a mobile device with a camera), and may later be leveraged to
identify frames of video that best capture the foot to facilitate
downstream data processing. The IMU data 116A-116Z may be made
accessible to other devices of the system architecture 100 via the
network 150. In some implementations, IMU data captured by the
client device 110A may be used to help guide a user of the client
device 110A in capturing image data. For example, the IMU data may
indicate an orientation of the client device 110A (e.g., to
determine if the client device 110A is level) to facilitate the
accurate capture of particular views of the body part (e.g., if the
camera of the client device 110A is not level to the ground, an
indication may be generated that guides the user to properly orient
the camera).
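As one possible realization of such guidance (an assumption for illustration, not the disclosed implementation), the accelerometer's gravity reading can be compared against the device's vertical axis to decide whether the camera is level; the 5-degree tolerance and cue strings below are arbitrary choices for the sketch.

```python
import math


def tilt_from_level(ax, ay, az):
    """Angle (degrees) between the gravity reading and the axis expected
    to point straight down; 0 means the camera is level with the ground."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0.0:
        raise ValueError("no gravity signal")
    # Clamp to guard against floating-point drift outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, az / g))))


def capture_guidance(ax, ay, az, tolerance_deg=5.0):
    """Return a cue for the user interface (visual, audio, or haptic)."""
    tilt = tilt_from_level(ax, ay, az)
    if tilt <= tolerance_deg:
        return "level: OK to capture"
    return f"tilt the device {tilt:.0f} degrees back toward level"


print(capture_guidance(0.0, 0.0, 9.81))  # level: OK to capture
print(capture_guidance(0.0, 4.0, 8.96))
```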
[0031] In one implementation, the modeling server 120 may be one or
more computing devices (such as a rackmount server, a router
computer, a server computer, a personal computer, a mainframe
computer, a laptop computer, a tablet computer, a desktop computer,
etc.), data stores (e.g., hard disks, memories, databases),
networks, software components, and/or hardware components. The
modeling server 120 may include a model generation component 160
(which may be executed by a processing device of the modeling
server 120) that is capable of generating three-dimensional (3D)
models of a patient's body part (e.g., the patient's foot) based on
image data captured by the patient/user of one of client devices
110A-110Z, as well as additional patient-related data (e.g.,
medical data). The model generation component 160 may also be
capable of generating a CAD model of an orthotic device based on
the 3D model of the patient's body part. In some implementations,
the model generation component 160 may be implemented on a
different device than modeling server 120. For example, in some
implementations, one or more of client devices 110A-110Z may
implement the model generation component 160, and modeling server
120 may be omitted from the system architecture 100. In other
implementations, the modeling server 120 may be combined with the
production server 130 as a single server. In one implementation,
the modeling server 120 may utilize high performance computing
resources (e.g., available via the Internet) by outsourcing data
processing functions to high-performance computing devices.
[0032] In one implementation, the production server 130 may be one
or more computing devices (such as a rackmount server, a router
computer, a server computer, a personal computer, a mainframe
computer, a laptop computer, a tablet computer, a desktop computer,
etc.), data stores (e.g., hard disks, memories, databases),
networks, software components, and/or hardware components. The
production server 130 may receive CAD model data from the modeling
server 120 via the network 150. The CAD model data may be
converted, by the production server 130, into a format suitable for
3D printing prior to transmitting the data to one or more of the 3D
printers 132A-132Z.
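A common example of such a conversion, offered here only as an illustrative sketch, is tessellating the CAD volume into triangles and serializing them in the ASCII STL format that many slicing tools and 3D printers accept. The minimal writer below is hypothetical, not the disclosed server's format; it writes zero facet normals for the slicer to recompute.

```python
def write_ascii_stl(name, triangles):
    """Serialize triangles [((x,y,z), (x,y,z), (x,y,z)), ...] as ASCII STL."""
    lines = [f"solid {name}"]
    for v1, v2, v3 in triangles:
        lines.append("  facet normal 0 0 0")
        lines.append("    outer loop")
        for x, y, z in (v1, v2, v3):
            lines.append(f"      vertex {x:g} {y:g} {z:g}")
        lines.append("    endloop")
        lines.append("  endfacet")
    lines.append(f"endsolid {name}")
    return "\n".join(lines)


# Smoke test with a single triangle.
stl = write_ascii_stl("insert", [((0, 0, 0), (1, 0, 0), (0, 1, 0))])
print(stl.splitlines()[0])  # solid insert
```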
[0033] In one implementation, the 3D printers 132A-132Z are
communicatively coupled to the production server 130, as
illustrated. In some implementations, one or more of the 3D
printers 132A-132Z may be coupled to one or more of the devices of
the system architecture 100 in addition to the production server
130, which may be communicatively coupled to these devices via the
network 150. Each of the 3D printers 132A-132Z may be capable of
one or more of fused deposition modeling, stereolithography,
selective laser sintering, or any type of 3D printing technology as
would be understood by one of ordinary skill in the art.
[0034] In one implementation, the data store 140 may be a memory
(e.g., random access memory), a cache, a drive (e.g., a hard
drive), a flash drive, a database system, or another type of
component or device capable of storing data. The data store 140 may
also include multiple storage components (e.g., multiple drives or
multiple databases) that may also span multiple computing devices
(e.g., multiple server computers), and may be cloud-based. In some
implementations, the data store 140 may be a part of the modeling
server 120. In some implementations, the data store 140 may be
distributed among and accessible to one or more of the client
devices 110A-110Z, the modeling server 120, and the production
server 130. One or more of the devices of the system architecture
100 may utilize the data store 140 to store public and private
data, and the data store 140 may be configured to provide secure
storage for private data (e.g., patient-specific information).
[0035] In one implementation, the network 150 may include a public
network (e.g., the Internet), a private network (e.g., a local area
network (LAN) or wide area network (WAN)), a wired network (e.g.,
Ethernet network), a wireless network (e.g., an 802.11 network or a
Wi-Fi network), a cellular network (e.g., a Long Term Evolution
(LTE) network), routers, hubs, switches, server computers, or a
combination thereof. In some implementations, the network 150 may
be a combination of different types of networks. Image data
114A-114Z and IMU data 116A-116Z of any of client devices 110A-110Z
may be transmitted to modeling server 120 and/or production server
130 via the network 150. Likewise, 3D model data and CAD model data
may be transmitted from the modeling server 120 to any one of the
client devices 110A-110Z and the production server 130 via the
network 150.
[0036] FIG. 1B is a block diagram illustrating features of a model
generation component 160 in accordance with an implementation of
the disclosure. The model generation component 160 may be the same
as its identically named counterpart of FIG. 1A. In one
implementation, the model generation component 160 includes a
client interface module 162, an image processing module 164, a 3D
model generation module 166, and a CAD model generation module 168.
More or fewer components may be included in the model generation
component 160 without loss of generality. For example, two or more
of the modules may be combined into a single module, or one of the
modules may be divided into two or more modules. In one
implementation, one or more of the modules may reside on different
computing devices (e.g., different server computers, on a client
device, distributed among multiple client devices, etc.). The model
generation component 160 was described as being implemented by the
modeling server 120 of FIG. 1A, but may be implemented by any of
the client devices 110A-110Z or the production server 130. For
example, a client device (e.g., client device 110A) may be
programmed to perform some or all of the functions of the model
generation component 160. When the model generation component 160
is implemented on a client device, any functions described with
respect to the model generation component 160 that "receive",
"transmit", "generate", "retrieve", "identify", "determine",
"select", etc., are understood to refer to functions performed by
sub-systems or sub-modules within the client device rather than
across a network (e.g., the network 150), as would be appreciated
by one of ordinary skill in the art.
[0037] In one implementation, the model generation component 160 is
communicatively coupled to the data store 140. For example, the
model generation component 160 may be coupled to the data store 140
via a network (e.g., via network 150). As described with respect to
FIG. 1A, the data store 140 may be a memory (e.g., a random access
memory), a cache, a drive (e.g., a hard drive), a flash drive, a
database system, or another type of component or device capable of
storing data. The data store 140 may also include multiple storage
components (e.g., multiple drives or multiple databases) that may
also span multiple computing devices (e.g., multiple server
computers), and may be cloud-based. In one implementation, the data
store 140 may include patient information 142, patient image data
144, 3D model data 146, and CAD model data 148. While data store
140 is illustrated and described with respect to a single patient,
it is to be understood that data store 140 may store data
associated with multiple patients, and the implementations
described herein may be performed for multiple patients
concurrently.
[0038] The term "patient information", as used herein, refers to
any alphanumeric data that may describe one or more physical
aspects of a patient. The patient information 142 may include, but
is not limited to, the patient's height, weight, age, ethnicity, a
pre-existing medical condition (e.g., a podiatric medical
condition), a measured foot length, shoe size, etc. In some
implementations, the patient information is provided by the patient
(e.g., using one of the client devices 110A-110Z). For example,
prior to being prompted to capture image data or after capturing
image data, the user may be provided with a user interface, such as
a fillable-form interface, that allows the user to enter
physiological and/or medical data associated with the patient
(e.g., the user may be the patient, a physician of the patient, or
an assistant of the patient or physician). The data may then be
transmitted to the modeling server (e.g., modeling server 120) for
processing. In some implementations, the patient information may be
provided by a physician (e.g., using one of client devices
110A-110Z and/or using modeling server 120). In some
implementations, both the patient and the physician may provide
portions of the patient information. In some implementations, some
or all of the information may have been previously stored in the
data store. In some implementations, the patient information may be
provided by more than one client device (e.g., provided by more
than one of client devices 110A-110Z).
[0039] In one implementation, the patient image data 144 includes
images and/or video captured by a user of a client device (e.g.,
one or more of client devices 110A-110Z). The patient image data
144 may also include IMU data stored along with the images and/or
video (and may be time-synchronized with video frames), which may
be used by the model generation component 160 during image
processing.
[0040] In one implementation, the model generation component 160
utilizes the client interface module 162 to send/receive
information to/from a client device. The client interface module
162 may provide an interface for requesting information from the
client device. At the user end, a user interface (e.g., user
interface 112A) may be in the form of a web page or a standalone
application that may provide an interface to enter patient
information, as well as instructions to the user as to how to
capture the image data related to a body part of the patient.
[0041] In one implementation, as illustrated in FIG. 2A, a user may
orient a client device 204 (which may correspond to any of client
devices 110A-110Z) to capture images and/or video of a foot 202 of
the patient in accordance with instructions/indicators
provided on a display 208 of the client device 204 (e.g., a visual
cue). In one implementation, the instructions/indicators may
alternatively or additionally include audio cues and/or haptic
feedback (e.g., a vibration to indicate proper orientation of the
client device 204 with respect to the foot 202). A relative
coordinate axis 210 may be defined which may serve as a reference
point for captured IMU data. As an illustrative implementation, a
user captures video of the foot 202 at different orientations with
respect to the client device 204 and takes two or more still images
of the weight bearing heel and arch. The video and/or images may be
captured using a built-in camera 206 of the client device 204.
[0042] As illustrated in FIG. 2B, indicators (e.g., visual cues)
may be generated for display within an image window 212 within
display 208 as the client device 204 is capturing image data of the
foot 202. The indicators, such as reticles 214, may serve as
indicators for guiding the user to capture relevant regions of the
foot in weight-bearing and non-weight-bearing positions. In one
implementation, indicators may include overlaid grids, horizontal
lines, vertical lines, and foot-shaped targets/overlays. As the
user orients the built-in camera 206 and captures image data, the
client device 204 may provide additional indicators, such as
instructions 216, to inform the user of his/her progress and next
steps. In one implementation, visual, audio, and/or haptic cues may
indicate successful progress or problems with the capture process.
For example, IMU data may be used by the client device 204 to
determine that the camera motion has exceeded a threshold
translational or angular speed and/or acceleration, and a warning
indication may be generated indicating the data captured is
potentially unreliable and/or that the data should be recaptured.
In some implementations, visual indicators may change shape as
video is captured in order to guide the user through particularly
helpful views of the foot 202. For example, such visual indicators
may include arrows that direct the user. Examples of successful
views may be generated for display as a result of successful
capture or to provide suggested orientations to the user. For
example, side view 218 and back view 220 may be generated for
display as examples. Side view 218 and back view 220 may also
correspond to a 3D model of the foot generated by the model
generation component 160, which may have computed model data and
relevant measurements 222 and transmitted the relevant data back to
the client device for display. In one implementation, the client
device may generate a 3D model of the foot in real-time (e.g., when
the model generation component 160 is implemented on the client
device).
[0043] Referring back to FIG. 1B, in one implementation, the model
generation component 160 utilizes the image processing module 164.
Individual frames may be selected from the video based on image
quality (e.g., focus) and IMU data (if available) in order to best
represent the multitude of viewing angles contained in the video.
In one implementation, relative position, translational speed, and
angular speed of a camera of the client device are captured by the
IMU of the client device. The data captured by the IMU may be used by
the image processing module 164 to determine the uniqueness of
information in a given video frame relative to other video frames
based on their respective positions/orientations, which may
be used to save processing time by eliminating video frames that
contain redundant data. In addition, the speed of the client device
may serve as an indicator of when motion artifacts (e.g., motion
blur from rolling shutter cameras) are likely to be present, thus
allowing frames captured during periods of fast camera movement to
be eliminated.
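The frame-selection idea in this paragraph can be sketched as follows. The per-frame records and speed thresholds are illustrative assumptions, not values from the disclosure; a real implementation would also score optical quality (e.g., focus):

```python
# Hypothetical per-frame records: (frame_index, translational_speed,
# angular_speed), as derived from time-synchronized IMU data.
MAX_TRANS_SPEED = 0.25   # m/s, illustrative threshold
MAX_ANG_SPEED = 1.0      # rad/s, illustrative threshold

def select_frames(frames):
    """Drop frames captured during fast camera motion, where motion
    artifacts (e.g., rolling-shutter blur) are likely to be present."""
    return [i for i, v, w in frames
            if v <= MAX_TRANS_SPEED and w <= MAX_ANG_SPEED]
```

A frame is kept only if both its translational and angular speeds are below the thresholds; frames with redundant views could be pruned in a similar pass.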
[0044] In one implementation, once the frames are selected, each of
the selected frames is filtered to remove background objects based
on, for example, colors appearing in the frame, leaving only the
foot in the images, as illustrated in FIG. 3. In some
implementations, a special colored and/or patterned sock may be
worn by the patient during image data capture to isolate the foot
from background objects in the video frames and still images. The
pattern may be designed to aid in point finding, matching, and 3D
reconstruction. Example patterns include a set of symbols randomized
in size, orientation, and color. In general, the pattern
may be chosen such that its visual detail is robust against
aberrations that may occur due to the camera/lens components and
lighting conditions. In some implementations, 3D reconstruction may
be performed with foot images without an article worn by the
patient (e.g., based on skin tone, sub-surface scattering effects,
etc.).
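The color-based background filtering described above can be sketched with a simple per-pixel mask. The function name, the reference sock color, and the per-channel tolerance are assumptions for illustration; a production pipeline would more likely operate in a perceptual color space:

```python
def mask_foot(pixels, sock_rgb, tol=60):
    """Keep only pixels within `tol` (per channel) of the sock color;
    all other pixels are treated as background and zeroed out.
    `pixels` is a row-major list of rows of (r, g, b) tuples."""
    def is_sock(p):
        return all(abs(c - s) <= tol for c, s in zip(p, sock_rgb))
    return [[p if is_sock(p) else (0, 0, 0) for p in row]
            for row in pixels]
```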
[0045] In one implementation, the model generation component 160
utilizes the 3D model generation module 166 to generate 3D model
data describing the body part (e.g., a 3D reconstruction of the
body part). In one implementation, the frames selected by the image
processing module 164 are used as inputs into an algorithm that
performs registration between each frame using unique "key-points",
or identifiable groups of pixels in the images. The algorithm may
be a multi-view 3D reconstruction algorithm, a
structure-from-motion algorithm, or another suitable algorithm. The
relative location of the camera for each frame and the identified
key-points can be combined to identify the locations of the
key-points in 3D space as a point cloud representing an outer
surface of the foot. From this approach, the plantar surface
geometry can be obtained. In some implementations, a similar
approach may be applied to other body parts for the purposes of
obtaining 3D geometry, including, but not limited to other orthotic
devices, prosthetics, and organs. In some implementations, the
approach may be applied to inanimate objects (e.g., medical tools)
rather than to body parts.
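At the core of such multi-view reconstruction is triangulation: given camera poses for two frames and a key-point matched between them, the key-point's 3D location can be recovered by linear (DLT) triangulation. The following is a minimal numpy sketch of that single step under idealized assumptions (known, noise-free projection matrices), not the patent's full algorithm:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """DLT triangulation: recover a 3D point from its projections
    x1, x2 (normalized image coordinates) in two views with 3x4
    projection matrices P1, P2."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]                    # null vector of A (homogeneous point)
    return X[:3] / X[3]           # dehomogenize

# Two views: an identity camera and a camera translated 1 unit along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.5, 0.2, 3.0])
x1 = X_true[:2] / X_true[2]                                  # view 1
x2 = (X_true + np.array([-1.0, 0.0, 0.0]))[:2] / X_true[2]   # view 2
X_rec = triangulate(P1, P2, x1, x2)
```

Repeating this over all matched key-points yields the point cloud representing the outer surface of the foot.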
[0046] To determine certain orientations of the foot, the 3D point
cloud data representing the surface of the foot, normal vectors of
the points, and the IMU data, combined with successive calculations
and transformations of a 3D object-oriented bounding box, may be used.
The point cloud model may also be converted into a "solid" model of
the foot for visualization purposes. The 3D model provides the
capability to visualize the foot in situations, for example, when
it is difficult or impractical to obtain a physical model of the
foot.
[0047] In one implementation, the 3D model generation module 166
and the image processing module 164 may derive various measurements
from the 3D model and selected images. The measurements include,
but are not limited to, foot length, weight bearing arch height
(e.g., maximum weight bearing height of the plantar arch at the
arch location), weight bearing subtalar joint angle, total adjusted
arch height, arch location (e.g., max plantar arch height location
in sagittal and transverse planes in a neutral pose), arch width
(e.g., midfoot width, width at peak arch), arch angle, prescription
arch height, ball location (e.g., sesamoid location along the
sagittal plane), mid heel location (e.g., mid calcaneus location
along the sagittal plane), ball width (e.g., maximum forefoot
width), and heel width (e.g., maximum plantar calcaneus foot
width). The selected images used to perform these derivations may
include, but are not limited to, an image of a heel of the foot in
the frontal plane, and an image of a medial arch of the foot in the
sagittal plane under weight bearing conditions.
[0048] In one implementation, one or more of measured foot length
(which may have been physically measured and entered as part of the
patient information), IMU data, or reference symbols on the sock
(e.g., patterns or objects attached to a sock), or objects located
in the environment of the foot may be used to scale the data
appropriately to ensure the accuracy of the derived measurements.
In one implementation, a known shoe size of the patient may be used
to determine the foot length. For example, foot length, L_M and
L_F, for male feet and female feet, respectively, may be
determined by:
L_M = 0.82387415*S_M + 18.7012954, Eq. 1
L_F = 0.83529411*S_F + 17.5176470, Eq. 2
where S_M and S_F are male and female shoe sizes, respectively.
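For illustration, Eqs. 1 and 2 can be implemented directly. The function name is ours, and the units (U.S. shoe size in, foot length in centimeters out) are an assumption, as the text does not state them:

```python
def foot_length_cm(shoe_size, sex):
    """Estimate foot length from shoe size per Eqs. 1-2.

    `sex` is 'M' or 'F'; units are assumed, not stated in the text.
    """
    if sex == 'M':
        return 0.82387415 * shoe_size + 18.7012954  # Eq. 1
    if sex == 'F':
        return 0.83529411 * shoe_size + 17.5176470  # Eq. 2
    raise ValueError("sex must be 'M' or 'F'")
```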
[0049] In some implementations, additional parameters may be
defined according to:
arch angle = tan^-1(arch height / arch width), Eq. 3
prescription arch height = arch width * tan(arch angle + subtalar angle). Eq. 4
In one implementation, the maximum arch height may be defined to
account for limitations imposed by the maximum angle that can be
made between the arch length and the arch height. The max angle may
be determined by the patient's forefoot flexibility, according
to:
maximum arch height = (arch length / 2) * tan(0.45 + α), Eq. 5
where α = 0 for a rigid foot (maximum angle is 0.45 radians),
α = 0.05 for a semi-rigid foot (maximum angle is 0.50 radians),
and α = 0.1 for a flexible foot (maximum angle is 0.55
radians). In one implementation, if the prescription arch height is
greater than the maximum arch height, the arch height will be
modeled as the maximum arch height. Otherwise, the arch height will
be modeled as the prescription arch height. In one implementation,
if an original arch height is greater than the modeled arch height,
the modeled arch height may be set as the original arch height so
as to ensure that the orthotic device closely follows the contours
of the foot.
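The arch-height selection logic of Eqs. 3-5 and the clamping rules above can be sketched as follows; function and parameter names are illustrative, not from the patent:

```python
import math

def modeled_arch_height(arch_width, arch_height, arch_length,
                        subtalar_angle, alpha, original_arch_height):
    """Apply Eqs. 3-5 and the clamping rules described above.

    Angles are in radians; alpha is 0, 0.05, or 0.1 for rigid,
    semi-rigid, and flexible feet, respectively.
    """
    arch_angle = math.atan(arch_height / arch_width)                   # Eq. 3
    prescription = arch_width * math.tan(arch_angle + subtalar_angle)  # Eq. 4
    maximum = (arch_length / 2) * math.tan(0.45 + alpha)               # Eq. 5
    # Cap the prescription height at the maximum arch height.
    modeled = min(prescription, maximum)
    # Never model below the original arch height, so the orthotic
    # device closely follows the contours of the foot.
    return max(modeled, original_arch_height)
```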
[0050] In one implementation, ball width (in millimeters) is
determined according to:
narrow ball width=0.2224(Foot Length)+1.0567, Eq. 6
medium ball width=0.2335(Foot Length)+1.1189, Eq. 7
wide ball width=0.2380(Foot Length)+1.1403, Eq. 8
for males, and:
narrow ball width=0.2707(Foot Length)+5.7137, Eq. 9
medium ball width=0.2853(Foot Length)+6.0209, Eq. 10
wide ball width=0.2911(Foot Length)+6.1437, Eq. 11
for females. In some implementations, the ball width may be limited
based on a particular shoe for which the orthotic insert is
designed to fit.
[0051] In one implementation, heel width (in centimeters) is
determined according to:
heel width=0.1473(Foot Length)+2.3882. Eq. 12
[0052] In one implementation, arch width may be limited according
to:
heel width ≤ arch width ≤ ball width, Eq. 13
where the arch width is constrained by these limits after all other
constraints are applied to the arch width.
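Eqs. 6-13 can be sketched as below. Note the units as stated in the text (ball width in millimeters, heel width in centimeters); the clamping of Eq. 13 assumes all three widths have been converted to a common unit first. Names are illustrative:

```python
# Coefficients for Eqs. 6-11: ball width (mm) as a function of foot
# length, for narrow/medium/wide fits, male ('M') and female ('F').
BALL_WIDTH_COEFFS = {
    ('M', 'narrow'): (0.2224, 1.0567),   # Eq. 6
    ('M', 'medium'): (0.2335, 1.1189),   # Eq. 7
    ('M', 'wide'):   (0.2380, 1.1403),   # Eq. 8
    ('F', 'narrow'): (0.2707, 5.7137),   # Eq. 9
    ('F', 'medium'): (0.2853, 6.0209),   # Eq. 10
    ('F', 'wide'):   (0.2911, 6.1437),   # Eq. 11
}

def ball_width_mm(foot_length, sex, fit='medium'):
    slope, intercept = BALL_WIDTH_COEFFS[(sex, fit)]
    return slope * foot_length + intercept

def heel_width_cm(foot_length):
    return 0.1473 * foot_length + 2.3882                 # Eq. 12

def clamp_arch_width(arch_width, heel_width, ball_width):
    # Eq. 13: heel width <= arch width <= ball width, applied after
    # all other constraints (all three in the same unit).
    return min(max(arch_width, heel_width), ball_width)
```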
[0053] In one implementation, arch thickness may be determined
based on a range of the patient's weight. For example, if the
patient's weight is less than 140 pounds, then the arch
thickness is modeled as 2 millimeters. If the patient's weight is
greater than or equal to 140 pounds and less than 180 pounds, the
arch thickness is modeled as 3 millimeters. If the patient's weight
is greater than or equal to 180 pounds, the arch thickness is
modeled as 4 millimeters.
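The weight-based arch thickness rule of paragraph [0053] reduces to a small lookup (function name illustrative):

```python
def arch_thickness_mm(weight_lbs):
    """Model arch thickness (mm) from patient weight (lbs), per [0053]."""
    if weight_lbs < 140:
        return 2
    if weight_lbs < 180:
        return 3
    return 4
```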
[0054] In one implementation, the model generation component 160
utilizes the CAD model generation module 168 to generate CAD model
data (including parametric and non-parametric CAD model data) for
the orthotic device. The CAD model generation module 168 may use
various algorithms to determine optimal shapes and mechanical
properties of the orthotic device, which are described by the CAD
model data. While the CAD model data may be generated based on the
3D model data representing the foot, the data may describe an
orthotic insert that is designed to biomechanically adjust and
support the foot rather than match the surfaces of the foot from
the 3D model. The CAD model data may include information such as,
but not limited to, lattice designs and layouts, mesh perforation
properties (e.g., hole size/shapes, pitch, etc.), curve locations,
variations in lattice density, variable thicknesses, etc. In one
implementation, the 3D model data and patient information (e.g.,
age, height, weight, medical diagnosis, etc.) may be used as inputs
to a CAD model of the orthotic device (e.g., inputs into a
parametric CAD model) in order to output a customized model in a
format suitable for 3D printing on a 3D printer. It is noted that
3D printing is illustrative, and any suitable manufacturing method
may be utilized to produce the orthotic device.
[0055] In one implementation, the CAD model generation module 168
generates a series of surfaces that may be combined to form top,
bottom, and interior surfaces of a solid model of the orthotic
insert, as described by the CAD model data. In addition, lattice
and mesh structures can be generated to define the shape and
density profile of the orthotic insert, which can be placed within
target treatment zones (based on the derived measurements and
patient information). As illustrated in FIG. 4A, a heel portion 402
of the CAD model data is depicted, including an outer boundary 404,
a treatment boundary 406, and density profile 408 (represented as
an array of circles that may correspond to perforations in the
final orthotic insert). The treatment boundary 406 and the outer
boundary 404 may define a treatment zone specific for the patient's
heel. The density profile 408 may be applied to the treatment zone
to provide greater material density/stiffness at the left side of
the heel than the right side. A mapped heel portion 410 shows a
result of the mapping after it has been applied to the treatment
zone. An example of a 3D printed orthotic insert is shown in FIG.
4B, which is labeled to show relevant measurements taken into
account during generation of the CAD model data.
[0056] In one implementation, a heel stiffness is defined as:
k = 32.6*d_diam - 6.4*d_space + 95.5*d_shell - 13.4, Eq. 14
[0057] where k is the stiffness (in kN/m),
[0058] d_diam is a perforation diameter (in mm),
[0059] d_space is a spacing between perforations (in mm), and
[0060] d_shell is a shell size (in mm).
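As a direct implementation of Eq. 14 (function name ours):

```python
def heel_stiffness_kn_per_m(d_diam, d_space, d_shell):
    """Heel stiffness k (kN/m) from perforation diameter, perforation
    spacing, and shell size (all in mm), per Eq. 14."""
    return 32.6 * d_diam - 6.4 * d_space + 95.5 * d_shell - 13.4
```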
[0061] It is noted that the preceding equations are illustrative
examples of calculations used to identify measurements for
producing an orthotic insert. Other equations may also be used, and
the present implementations are not limited to the preceding
examples.
[0062] FIGS. 4C and 4D show views of CAD model data 452 of an
extruded surface generated based on 3D model data according to an
implementation of the disclosure. A 3D model 450 of a body part of
a patient (e.g., a portion of the patient's arm) may be generated
in accordance with any of the implementations described herein. The
3D model 450 has a surface 450A that corresponds to a location at
which the orthotic device represented by the CAD model data 452 is
to contact the body part. In some implementations, the surface 450A
may be identified automatically. For example, the 3D model
generation module 166 may compute various dimensions and/or analyze
curvature of the 3D model 450 and deduce that the surface 450A
corresponds to a target treatment site for the body part. In some
implementations, a clinician or technician may directly specify the
surface 450A. Such implementations may allow for a whole or partial
change of a previously generated surface (e.g., as generated by the
3D model generation module 166) and may impart some therapeutic
benefit (e.g., increased or decreased pressure on a particular area
of the body part). Once the surface 450A is identified, an
extrusion operation may be performed on the surface, resulting in
the CAD model data 452. Once produced based on the CAD model data
452, the orthotic device will have a patient-registered surface
452A and a non-patient-registered surface 452B. The
non-patient-registered surface 452B may be modified in any suitable
manner, for example, to include additional structural features
(e.g., buckles for coupling to other components, adhesives,
patient/clinician/manufacturer identifying information, etc.). FIG.
4E shows a view of CAD model data of an orthotic device generated
based on 3D model data according to an implementation of the
disclosure. 3D model 480 corresponds to a model of a patient's foot
that may have been generated in accordance with any of the
implementations described herein. 3D model 480 has a surface 480A
that corresponds to a location at which the orthotic device
represented by the CAD model data 482 is to contact the body part.
A surface 484 may be generated based on the surface 480A, and an
extrusion operation may be performed on the surface 484, resulting
in the CAD model data 482.
[0063] FIG. 5 is a flow diagram illustrating a method 500 for
producing a patient-registered orthotic device according to an
implementation of the disclosure. The method 500 may be performed
by processing logic that includes hardware (e.g., circuitry,
dedicated logic, programmable logic, microcode, etc.), software
(e.g., instructions run on a processing device to perform hardware
simulation), or a combination thereof. In one implementation,
method 500 may be performed by the model generation component 160
as described with respect to FIGS. 1A and 1B.
[0064] Referring to FIG. 5, method 500 begins at block 505 when
image data (e.g., image data 114A) captured by a client device
(e.g., client device 110A) is received (e.g., by the model
generation component 160), with the image data being representative
of a body part of the patient. In one implementation, the image
data is representative of a mold of a body part of the patient. In
an alternative implementation, the image data is representative of
an inanimate object. In one implementation, at least some of the
image data is received from an additional client device (such as a
client device of a physician) or data store (e.g., data store 140).
In one implementation, the body part is a foot, a leg, a torso, an
arm, a hand, a wrist, a neck, or a head.
[0065] At block 510, 3D model data representative of the body part
is generated based on the image data. In one implementation, the 3D
model data may be generated based on the image data, patient
information, and/or IMU data (e.g., IMU data 116A). In one
implementation, if the image data includes video, a plurality of
frames may be extracted from the video based on the IMU data and/or
optical flow. The 3D model data of the body part may be generated
at least in part from the plurality of frames. In one
implementation, the 3D model data is generated based on multiple
views of the body part that are represented by the image data. For
example, 3D model data may be generated from two or more views
selected from a front view, a side view, a top view, a bottom view,
or other physiologically-relevant view of the body part.
[0066] In one implementation, a unique visual pattern may be
identified within a captured image or frame of the image data. The
unique visual pattern may be a pattern of symbols, colors, etc.,
which, if recognized (e.g., by an image processing algorithm), may
be used to extract depth information from the captured image or
frame. Accordingly, the depth information, combined with depth
information from different views of the body part, may be used to
generate the 3D model data. In one implementation, the unique
visual pattern is present on an article worn on the body part of
the patient (e.g., a patterned sock as illustrated in FIGS. 2B and
3). In one implementation, the user device may have captured depth
information (e.g., using an infrared depth sensor), which may have
been received along with the image data. The depth information
captured by the user device may also be used to generate the 3D
model data.
[0067] At block 515, a surface defined by the 3D model data is
identified. The surface may correspond to a contact site at which
an orthotic device is to contact the body part. In some
implementations, the surface is identified (e.g., using the 3D
model generation module 166) by automatically deducing a target
treatment site of the body part based on patient information and/or
measurements computed from the 3D model data. In some
implementations, a clinician or technician may directly identify
the surface (e.g., by selecting the surface using one of the client
devices 110A) to change the surface wholly or in part (e.g., to
increase or decrease a pressure applied to a portion of the body
part when the orthotic device is applied to the body part).
[0068] In one implementation, the orthotic device may be a helmet,
body armor, sports equipment, a prosthetic, a cast, a splint, a
clothing article, a furniture piece, a vehicle seat, a vehicle or
robotic control mechanism, a physical therapy device, a glove, or a
surgical instrument.
[0069] At block 520, CAD model data is generated (e.g., using the
CAD model generation module 168), which defines a volume
corresponding to an extrusion operation performed on the identified
surface. In some implementations, the identified surface is
extruded to define the volume, and the defined volume is integrated
into a parametric CAD model to generate the CAD model data. For
example, the parametric CAD model may receive specific measurements
and patient information as input, which may be used to scale and
size the orthotic device so that it fits and secures the
patient-registered surface to the target treatment site of the
patient. In some implementations, the orthotic device produced by
the three-dimensional printer is to include a first
patient-registered surface and a second non-patient-registered
surface. In some implementations, the second non-patient-registered
surface may be generated based on the parametric CAD model
data.
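The extrusion step of block 520 can be sketched for a surface represented as a point set: each patient-registered vertex is offset along an extrusion direction to produce the non-patient-registered side of the volume. This is a simplified illustration with assumed names, not the patent's CAD pipeline (which would also stitch side walls and feed a parametric model):

```python
def extrude_surface(vertices, thickness, direction=(0.0, 0.0, -1.0)):
    """Extrude a surface point set along `direction` by `thickness`,
    returning the original (patient-registered) vertex layer and the
    offset (non-patient-registered) vertex layer."""
    dx, dy, dz = direction
    top = list(vertices)
    bottom = [(x + dx * thickness, y + dy * thickness, z + dz * thickness)
              for x, y, z in vertices]
    return top, bottom
```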
[0070] At block 525, the CAD model data is transmitted to a 3D
printer (e.g., 3D printer 132A). The 3D printer is to produce the
orthotic device based on the CAD model data. In one implementation,
the CAD model data is in a suitable format for 3D printing. In one
implementation, the CAD model data is converted into a suitable
format for 3D printing prior to transmitting the CAD model data to
the 3D printer. In one implementation, the CAD model data may
define one or more layers within the orthotic device. The layers
may then be printed and laminated together. In one implementation,
the CAD model data may describe one or more portions that form
an inverse representation (e.g., a mold cavity) of the orthotic
device, and the inverse representation may be filled with material
including, but not limited to, urethane, epoxy, wax, plastic,
metal, or foam. In one implementation, the material of the one or
more layers may include, but is not limited to, nylon,
antimicrobial nylon, ABS plastic, PLA plastic, polyurethane,
leather, foam, or combinations thereof. The printed orthotic device
may be colored by dyeing or through chemical or photo exposure. The
printed orthotic device may be covered with a material, such as
leather, neoprene, a fabric, etc. In one implementation,
information on user preferences (e.g., materials, colors, etc.) for
the orthotic device may have been received in addition to the
patient information and image data. In one implementation,
identifying information is 3D printed onto (e.g., onto the
non-patient-registered surface) or within the orthotic device. For
example, a patient's name, a physician's name, a date of
manufacture, a name of the manufacturing facility, a company name,
etc. may be printed onto or within the orthotic device.
[0071] For simplicity of explanation, the methods of this
disclosure are depicted and described as a series of acts. However,
acts in accordance with this disclosure can occur in various orders
and/or concurrently, and with other acts not presented and
described herein. Furthermore, not all illustrated acts may be
required to implement the methods in accordance with the disclosed
subject matter. In addition, those skilled in the art will
understand and appreciate that the methods could alternatively be
represented as a series of interrelated states via a state diagram
or events. Additionally, it should be appreciated that the methods
disclosed in this specification are capable of being stored on an
article of manufacture, such as a computer-readable device or
storage media, to facilitate transporting and transferring such
methods to computing devices. Accordingly, the term "article of
manufacture", as used herein, is intended to include a computer
program accessible from any computer-readable device or storage
media.
[0072] FIG. 6 illustrates a diagrammatic representation of a
machine in the exemplary form of a computer system 600 within which
a set of instructions, for causing the machine to perform any one
or more of the methodologies discussed herein, may be executed. In
alternative implementations, the machine may be connected (e.g.,
networked) to other machines in a LAN, an intranet, an extranet, or
the Internet. The machine may operate in the capacity of a server
or a client machine in a client-server network environment, or as a
peer machine in a peer-to-peer (or distributed) network
environment. The machine may be a personal computer (PC), a tablet
PC, a set-top box (STB), a Personal Digital Assistant (PDA), a
cellular telephone, a web appliance, a server, a network router,
switch or bridge, or any machine capable of executing a set of
instructions (sequential or otherwise) that specify actions to be
taken by that machine. Further, while only a single machine is
illustrated, the term "machine" shall also be taken to include any
collection of machines that individually or jointly execute a set
(or multiple sets) of instructions to perform any one or more of
the methodologies discussed herein. Some or all of the components
of the computer system 600 may be utilized by or illustrative of
any of client devices 110A-110Z, modeling server 120, production
server 130, 3D printers 132A-132Z, and data store 140.
[0073] The exemplary computer system 600 includes a processing
device (processor) 602, a main memory 604 (e.g., read-only memory
(ROM), flash memory, dynamic random access memory (DRAM) such as
synchronous DRAM (SDRAM) or Rambus DRAM (RDRAM), etc.), a static
memory 606 (e.g., flash memory, static random access memory (SRAM),
etc.), and a data storage device 618, which communicate with each
other via a bus 608.
[0074] Processor 602 represents one or more general-purpose
processing devices such as a microprocessor, central processing
unit, or the like. More particularly, the processor 602 may be a
complex instruction set computing (CISC) microprocessor, reduced
instruction set computing (RISC) microprocessor, very long
instruction word (VLIW) microprocessor, or a processor implementing
other instruction sets or processors implementing a combination of
instruction sets. The processor 602 may also be one or more
special-purpose processing devices such as an application specific
integrated circuit (ASIC), a field programmable gate array (FPGA),
a digital signal processor (DSP), network processor, or the like.
The processor 602 is configured to execute instructions 626 for
performing the operations and steps discussed herein.
[0075] The computer system 600 may further include a network
interface device 622. The computer system 600 also may include a
video display unit 610 (e.g., a liquid crystal display (LCD), a
cathode ray tube (CRT), or a touch screen), an alphanumeric input
device 612 (e.g., a keyboard), a cursor control device 614 (e.g., a
mouse), and a signal generation device 620 (e.g., a speaker). In
some implementations, the signal generation device 620 may include
a vibrational actuator (e.g., for providing haptic feedback).
[0076] The data storage device 618 may include a computer-readable
storage medium 624 on which is stored one or more sets of
instructions 626 (e.g., software) embodying any one or more of the
methodologies or functions described herein. The instructions 626
may also reside, completely or at least partially, within the main
memory 604 and/or within the processor 602 during execution thereof
by the computer system 600, the main memory 604 and the processor
602 also constituting computer-readable storage media. The
instructions 626 may further be transmitted or received over a
network 674 (e.g., the network 150) via the network interface
device 622.
[0077] In one implementation, the instructions 626 include
instructions for one or more model generation components 160, which
may correspond to the identically-named counterpart described with
respect to FIGS. 1A and 1B, and/or a software library containing
methods for performing the functions thereof. While the
computer-readable storage medium 624 is shown in an exemplary
implementation to be a single medium, the terms "computer-readable
storage medium" or "machine-readable storage medium" should be
taken to include a single medium or multiple media (e.g., a
centralized or distributed database, and/or associated caches and
servers) that store the one or more sets of instructions. The terms
"computer-readable storage medium" or "machine-readable storage
medium" shall also be taken to include any transitory or
non-transitory medium that is capable of storing, encoding or
carrying a set of instructions for execution by the machine and
that cause the machine to perform any one or more of the
methodologies of the present disclosure. The terms
"computer-readable storage medium" or "machine-readable storage
medium" shall accordingly be taken to include, but not be limited
to, solid-state memories, optical media, and magnetic media.
[0078] In the foregoing description, numerous details are set
forth. It will be apparent, however, to one of ordinary skill in
the art having the benefit of this disclosure, that the present
disclosure may be practiced without these specific details. In some
instances, well-known structures and devices are shown in block
diagram form, rather than in detail, in order to avoid obscuring
the present disclosure.
[0079] Some portions of the detailed description may have been
presented in terms of algorithms and symbolic representations of
operations on data bits within a computer memory. These algorithmic
descriptions and representations are the means used by those
skilled in the data processing arts to most effectively convey the
substance of their work to others skilled in the art. An algorithm
is herein, and generally, conceived to be a self-consistent
sequence of steps leading to a desired result. The steps are those
requiring physical manipulations of physical quantities. Usually,
though not necessarily, these quantities take the form of
electrical or magnetic signals capable of being stored,
transferred, combined, compared, and otherwise manipulated. It has
proven convenient at times, principally for reasons of common
usage, to refer to these signals as bits, values, elements,
symbols, characters, terms, numbers, or the like.
[0080] It should be borne in mind, however, that all of these and
similar terms are to be associated with the appropriate physical
quantities and are merely convenient labels applied to these
quantities. Unless specifically stated otherwise as apparent from
the foregoing discussion, it is appreciated that throughout the
description, discussions utilizing terms such as "sending",
"receiving", "transmitting", "forwarding", "caching", "causing",
"providing", "generating", "adding", "subtracting", "removing",
"analyzing", "determining", "enabling", "identifying", "modifying",
"mapping", or the like, refer to the actions and processes of a
computer system, or similar electronic computing device, that
manipulates and transforms data represented as physical (e.g.,
electronic) quantities within the computer system's registers and
memories into other data similarly represented as physical
quantities within the computer system memories or registers or
other such information storage, transmission or display
devices.
[0081] The disclosure also relates to an apparatus, device, or
system for performing the operations herein. This apparatus,
device, or system may be specially constructed for the required
purposes, or it may include a general purpose computer selectively
activated or reconfigured by a computer program stored in the
computer. Such a computer program may be stored in a computer- or
machine-readable storage medium, such as, but not limited to, any
type of disk including floppy disks, optical disks, compact disc
read-only memories (CD-ROMs), and magneto-optical disks, read-only
memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs,
magnetic or optical cards, or any other type of media suitable for
storing electronic instructions.
[0082] The words "example" or "exemplary" are used herein to mean
serving as an example, instance, or illustration. Any aspect or
design described herein as "example" or "exemplary" is not
necessarily to be construed as preferred or advantageous over other
aspects or designs. Rather, use of the words "example" or
"exemplary" is intended to present concepts in a concrete fashion.
As used in this application, the term "or" is intended to mean an
inclusive "or" rather than an exclusive "or". That is, unless
specified otherwise, or clear from context, "X includes A or B" is
intended to mean any of the natural inclusive permutations. That
is, if X includes A; X includes B; or X includes both A and B, then
"X includes A or B" is satisfied under any of the foregoing
instances. In addition, the articles "a" and "an" as used in this
application and the appended claims should generally be construed
to mean "one or more" unless specified otherwise or clear from
context to be directed to a singular form. Reference throughout
this specification to "an implementation" or "one implementation"
means that a particular feature, structure, or characteristic
described in connection with the implementation is included in at
least one implementation. Thus, the appearances of the phrase "an
implementation" or "one implementation" in various places
throughout this specification are not necessarily all referring to
the same implementation. Moreover, it is noted that the "A-Z"
notation used in reference to certain elements of the drawings is
not intended to be limiting to a particular number of elements.
Thus, "A-Z" is to be construed as indicating that one or more of
the element may be present in a particular implementation. It is to be understood that
the above description is intended to be illustrative, and not
restrictive. Many other implementations will be apparent to those
of skill in the art upon reading and understanding the above
description. The scope of the disclosure should, therefore, be
determined with reference to the appended claims, along with the
full scope of equivalents to which such claims are entitled.
* * * * *