U.S. patent application number 16/548850 was filed with the patent
office on 2019-08-23 and published on 2020-02-27 as application
20200066391 for a patient-centered system and methods for total
orthodontic care management. The applicant listed for this patent is
Rohit C. Sachdeva. The invention is credited to Takao Kubota,
Yeshwant Kumar Muthusamy, Rohit C. Sachdeva, and Jitender Vij.
Application Number: 20200066391 (16/548850)
Family ID: 69584088
Filed Date: 2019-08-23
Publication Date: 2020-02-27
[Drawing sheets D00000 through D00010 of the published application
omitted.]
United States Patent Application 20200066391
Kind Code: A1
Sachdeva; Rohit C.; et al.
February 27, 2020

PATIENT-CENTERED SYSTEM AND METHODS FOR TOTAL ORTHODONTIC CARE
MANAGEMENT
Abstract
According to one or more embodiments, a computer-implemented
system for providing an orthodontic care management solution and a
method for providing an orthodontic care management solution may be
provided. The method may include receiving user data, such as by a
user interface associated with an orthodontic care management
platform. The method may further include obtaining
authorization-related information associated with the orthodontic
care management
solution. Additionally, the method may include determining a
treatment plan for the user based on the user data and the
authorization-related information. Also, the method may include
determining a sequencing plan associated with the treatment plan
based on an arrangement of one or more stages of operations
associated with the treatment plan. The method may further include
displaying the sequencing plan to the user. Further, the method may
include receiving feedback data associated with the treatment plan.
Additionally, the method may include updating the one or more
databases with the feedback data. Further, the method may include
performing an artificial intelligence enabled operation for
providing the orthodontic care management solution.
Inventors: Sachdeva; Rohit C. (Plano, TX); Kubota; Takao (Kurume,
JP); Vij; Jitender (Trumbull, CT); Muthusamy; Yeshwant Kumar
(Allen, TX)

Applicant: Sachdeva; Rohit C., Plano, TX, US
Family ID: 69584088
Appl. No.: 16/548850
Filed: August 23, 2019
Related U.S. Patent Documents

Application Number: 62/722,319 (provisional)
Filing Date: Aug. 24, 2018
Current U.S. Class: 1/1

Current CPC Class: G16H 20/40 20180101; A61C 7/22 20130101; A61F
7/00 20130101; A61C 7/145 20130101; A61F 2002/502 20130101; A61C
19/063 20130101; A61C 8/0009 20130101; A61F 2007/0017 20130101;
A61C 7/08 20130101; A61C 7/12 20130101; G16H 30/40 20180101; A61C
7/10 20130101; G16H 50/20 20180101; A61C 5/30 20170201; A61F 2/50
20130101; A61C 7/146 20130101

International Class: G16H 20/40 20060101 G16H020/40; G16H 30/40
20060101 G16H030/40; G16H 50/20 20060101 G16H050/20
Claims
1. A computer-implemented system for providing an orthodontic care
management solution, comprising: one or more databases configured
to store data of one or more users; and a server including computer
code for providing the orthodontic care management solution,
wherein the server comprises: at least one memory configured to
store the computer code, the computer code further comprising
computer executable instructions for performing at least one of one
or more functions comprising: receiving user data; obtaining
authorization data associated with the orthodontic care management
solution; determining a treatment plan based on the user data and
the authorization data, wherein the treatment plan comprises one or
more stages of operations; determining a sequencing plan associated
with the treatment plan, based on an arrangement of the one or more
stages of operations associated with the treatment plan; displaying
the sequencing plan to the user; receiving feedback data associated
with the treatment plan; updating the one or more databases with
the feedback data; and at least one processor configured to execute
the computer code to provide the orthodontic care management
solution.
2. The system of claim 1, wherein the user is a doctor, a patient,
a prospective patient, or a third party user.
3. The system of claim 1, wherein the computer code is accessed by
using a user device.
4. The system of claim 3, wherein the at least one processor is
further configured to execute the computer code for displaying a
virtual avatar on a user interface of the user device.
5. The system of claim 1, wherein the at least one processor is
further configured to execute the computer code for manufacturing
an orthodontic appliance based on the treatment plan.
6. The system of claim 5 wherein the manufacturing comprises
generating a 3D printed orthodontic appliance.
7. The system of claim 1, wherein the at least one processor is
further configured to execute the computer code for performing an
artificial intelligence enabled operation for providing the
orthodontic care management solution.
8. The system of claim 7, wherein the artificial intelligence
enabled operation comprises a voice-to-action command.
9. The system of claim 7, wherein the artificial intelligence
enabled operation comprises an action-to-voice command.
10. The system of claim 1, wherein the at least one processor is
further configured to execute the computer code for receiving user
data by scanning a facial anatomy of the user.
11. The system of claim 10, wherein the facial anatomy comprises a
smile anatomy of the user.
12. The system of claim 1, wherein the at least one processor is
further configured to execute the computer code for obtaining
authorization data associated with the orthodontic care management
solution from a third party service provider.
13. The system of claim 12, wherein the third party service
provider comprises one or more of a bank, a financial institution,
a seller, a buyer and a manufacturer.
14. The system of claim 1, wherein the one or more databases
comprise at least one blockchain-enabled database.
15. A method for providing an orthodontic care management solution,
comprising: receiving user data; obtaining authorization data
associated with the orthodontic care management solution;
determining a treatment plan based on the user data and the
authorization data, wherein the treatment plan comprises one or
more stages of operations; determining a sequencing plan associated
with the treatment plan, based on an arrangement of the one or more
stages of operations associated with the treatment plan; displaying
the sequencing plan to a user on a user device; receiving feedback
data associated with the treatment plan; and updating one or more
databases with the feedback data.
16. The method of claim 15, further comprising displaying a virtual
avatar on a user interface of the user device.
17. The method of claim 15, further comprising sending a command to
a manufacturing device for manufacturing an orthodontic appliance
based on the treatment plan.
18. The method of claim 17, wherein the manufacturing device
comprises a 3D printer.
19. The method of claim 15, further comprising performing an
artificial intelligence enabled operation for providing the
orthodontic care management solution.
20. The method of claim 19, wherein the artificial intelligence
enabled operation comprises a voice-to-action command.
21. The method of claim 20, wherein the voice-to-action command is
provided by generating a virtual care navigator system.
22. The method of claim 19, wherein the artificial intelligence
enabled operation comprises an action-to-voice command.
23. The method of claim 22, wherein the action-to-voice command is
provided by generating a virtual care navigator system.
24. The method of claim 15, wherein receiving user data further
comprises scanning a facial anatomy of the user.
25. The method of claim 24, wherein the facial anatomy comprises a
smile anatomy of the user.
26. The method of claim 15, wherein obtaining authorization data
associated with the orthodontic care management solution further
comprises obtaining the authorization data from a third party
service provider.
27. A method for providing an orthodontic care management solution,
comprising: receiving user data; obtaining authorization data
associated with the orthodontic care management solution;
determining a treatment plan based on the user data and the
authorization data, wherein the treatment plan comprises one or
more stages of operations; determining a sequencing plan associated
with the treatment plan, based on an arrangement of the one or more
stages of operations associated with the treatment plan; displaying
the sequencing plan to a user on a user device; receiving feedback
data associated with the treatment plan; updating one or more
databases with the feedback data; and performing an artificial
intelligence enabled operation for providing the orthodontic care
management solution.
28. The method of claim 27, further comprising performing an auto
diagnosis of the user, wherein the user comprises a patient and the
auto diagnosis comprises patient directed auto diagnosis.
29. The method of claim 27, wherein the orthodontic care management
solution is provided automatically to the user based on historical
user data.
30. The method of claim 27, further comprising performing one or
more of automatic selection of the user data, automatic staging of
the treatment plan, automatic determination of the sequencing plan,
automatic designing of an orthodontic appliance, and automatic
fabrication of the orthodontic appliance.
31. The method of claim 30, wherein the orthodontic appliance
comprises one or more of fixed and removable aligners.
32. The method of claim 27 further comprising: performing automatic
risk analysis associated with the one or more stages of operations;
performing automatic tracking of progress of treatment based on the
sequencing plan; and updating the sequencing plan automatically
based on the progress of the sequencing plan.
33. The method of claim 32, wherein the sequencing plan is updated
based on evidence of the progress of the sequencing plan.
34. The method of claim 27, wherein updating one or more databases
comprises updating learning data in the one or more databases,
wherein the learning data comprises data associated with one or
more of automatic point-of-care anticipatory learning, training
data, and assessment double-loop learning data.
35. The method of claim 27 further comprising: generating a virtual
care navigator associated with the user; and automatically
providing instructions for provision of orthodontic care management
solution by the virtual care navigator.
36. The method of claim 35, wherein providing the instructions by
the virtual care navigator comprises providing the instructions
using one or more of text, speech, and graphics.
37. The method of claim 35 further comprising providing automatic
prescription generation by the virtual care navigator, based on one
or more conditional data associated with the user.
38. The method of claim 27, wherein providing the orthodontic care
management solution comprises providing multi-language support for
the provision.
39. The method of claim 27 further comprising providing automatic
reputation management for the user based on the feedback data.
40. The method of claim 27 further comprising providing a natural
user interface for providing the orthodontic care management
solution.
41. The method of claim 27 further comprising updating an
electronic health record associated with user data.
42. The method of claim 27 further comprising searching an
electronic health record associated with user data.
43. The method of claim 27 further comprising generating a
personalized checklist for the user, wherein the personalized
checklist is based on the sequencing plan.
44. The method of claim 27, further comprising sending a command to
a manufacturing device for manufacturing an orthodontic appliance
based on the treatment plan.
45. The method of claim 44, wherein the manufacturing device
comprises a 3D printer.
46. The method of claim 44, wherein the orthodontic appliance
comprises a sensor-based orthodontic appliance.
47. The method of claim 27, further comprising performing one or
more of image-based user tracking, text-based user tracking,
gesture-based user tracking, gaze-based user tracking, speech-based
user tracking, and search-based image auto-tagging and
retrieval.
48. The method of claim 27 wherein receiving user data further
comprises scanning a facial anatomy of the user.
49. The method of claim 48 wherein the facial anatomy of the user
comprises one or more user related factors selected from a group
comprising: skin aging, facial changes, wrinkles, speech training,
cosmetics, burns and facial anomalies.
50. The method of claim 27 further comprising generating a 3D avatar
in the likeness of the user to provide empathetic guidance
throughout the treatment.
51. The method of claim 27 further comprising performing UV mapping
of the user data to project 2D images of the user's teeth onto 3D
scans.
52. The method of claim 27 further comprising using a sensor based
orthodontic appliance to track and measure teeth movement of the
user.
53. A computer program product comprising at least one
non-transitory computer-readable storage medium having
computer-executable program code instructions stored therein for
providing an orthodontic care management solution, the
computer-executable program code instructions comprising program
code instructions for: receiving user data; obtaining authorization
data associated with the orthodontic care management solution;
determining a treatment plan based on the user data and the
authorization data, wherein the treatment plan comprises one or
more stages of operations; determining a sequencing plan associated
with the treatment plan, based on an arrangement of the one or more
stages of operations associated with the treatment plan; displaying
the sequencing plan to a user on a user device; receiving feedback
data associated with the treatment plan; updating one or more
databases with the feedback data; and performing an artificial
intelligence enabled operation for providing the orthodontic care
management solution.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application is a non-provisional application
corresponding to provisional application No. 62/722,319, filed
Aug. 24, 2018, which is pending. This application is also related
to the applications titled "Modular Aligner Devices and Methods for
Orthodontic Treatment," filed on Aug. 23, 2019, and "Modular
Orthodontic Devices and Methods Of Treatment," filed Aug. 23, 2019,
the entire contents of which are incorporated by reference
herein.
FIELD OF THE INVENTION
[0002] The present invention generally relates to computerized
orthodontic care management systems. More particularly, the
invention is directed towards enabling an ecosystem for total
orthodontic care management using computerized techniques with
human interaction input when desired.
BACKGROUND ART
[0003] Orthodontics is a branch of dentistry that deals with
managing of irregularities related to alignment of teeth,
malocclusions, smile correction and other structural and aesthetic
deformities related to the teeth and smile of a patient. Currently,
orthodontic care is managed by a professional care provider in a
manner that is reactive and craft-based. This process adds
substantially to the cost of care and leads to poor care outcomes.
Provision of quality
orthodontic care involves focus on certain key dimensions, such as
patient centeredness, access to care, efficient therapeutics,
patient safety, and clinical effectiveness. Reactive orthodontic
care is fraught with errors during all stages of treatment from
communication, diagnosis, planning, monitoring and therapeutics.
These may be a result of lack of knowledge, cognitive biases,
mistakes, slips and lapses, disregard of rules and are driven by
practices of omission or commission. Improper design and use of
orthodontic appliances further compounds the errors. Improper
scheduling of patients and lack of a systematic approach in
monitoring adds to the vicious cycle of poor care. These factors
singularly or in combination negatively impact the overall
treatment quality which includes patient satisfaction, and result
in increasing the duration of care, cost of care and also lead to
irreversible biological damage to the teeth and its structures.
Since the orthodontic care systems available in the art are mostly
reactive and not process-sensitive, it becomes difficult to extract
learnings from patient records and history and use them to provide
better care solutions. Currently, orthodontic treatment
processes lack application of strategic approaches for optimizing
the delivery of highly reliable care within a system that is
self-learning, resilient, and anti-fragile. Additionally, in the
current systems there is no provision to incentivize superior care
providers and/or patients. Also, the orthodontic care systems
available in the art provide for limited or zero error management,
which requires careful and precise identification of causes and
sources of orthodontic errors and then finding ways to prevent and
intercept them on a continual basis. Also, current care practices
are limited in creating a generative environment where continuous
learning feedback loops are provided to enhance operator and
patient skills. Furthermore, system-wide inefficiencies exist that
prevent the optimal and strategic sequencing of operations.
Collaborations to provide patient care in a cost-effective and
efficient way are limited. These limitations significantly burden
care delivery, result in poor patient care, and significantly
escalate costs.
[0004] Some of the causes of diagnostic and therapeutic errors in
the clinical practice of orthodontics include the "M's":
Miscommunication, Misdiagnosis, Misplanning, Misprognostics,
Misprescription, Mismanagement, Misadministration, and Misaction.
The root causes of these are grounded in the design of the system,
deficits of operator and patient knowledge, inadequate doctor
skills, and the violation of rules.
[0005] Another cause of poor care is the lack of universal measures
for reporting poor care practices and outcomes. Additionally,
corrective mechanisms to ameliorate these deficiencies are limited.
These factors lead to the prevalence of unregulated practices by
doctors and unrealistic demands for care by patients. In the
absence of requisite reporting mechanisms to measure and report
patient care outcomes and experiences, coupled with the lack of
appropriate skill enhancement for doctors and patients, the
delivery of optimal care for the patient suffers.
[0006] Currently, orthodontic treatment practices remain reactive
by nature. This model of care increases the care cycle, cost, and
pain to the patient. The extended care cycle may present safety
issues to the patient by increasing the possibility of damage to
the teeth, such as root resorption and decalcification, and by
increasing the likelihood of tooth cavities. Furthermore, since
documentation of patient care is limited, meaningful information
regarding patient response to treatment is lost, resulting in a
lack of learning and memory in the system. This further perpetuates
a craft-driven, reactive approach to care rather than a
knowledge-based, proactive, generative approach. Other deficiencies
in the care system that impact the patient include the lack of
orthodontic fee transparency, patient ignorance of evidence-based
treatment practices, and a limited understanding of quality
measures for treatment outcomes. Furthermore, the patient has
minimal involvement in defining, directing, and managing their
personal orthodontic care--for instance, in designing the smile
they desire, choosing an appliance that meets their individual
aesthetic needs, or managing their self-care.
[0007] In consideration of the deficiencies discussed earlier,
there is a need to empower patients to manage their personal care
when appropriate, and to enable doctors to practice cost-effective
care that is safety-driven, error-free, and evidence-based.
[0008] In light of the discussion above, there is a need to
overcome the current deficiencies in orthodontic care delivery at
every level of the care system for all the stakeholders in order to
achieve high reliability, high performance orthodontic care in a
science based, learning driven orthodontic care system that is
transparent and authentic. This mandates the design and
implementation of applicable tools and technology within the
framework of a total orthodontic care management ecosystem.
[0009] Any discussion of the background art throughout the
specification should in no way be considered as an admission that
such background art is prior art nor that such background art is
widely known or forms part of the common general knowledge in the
field.
SUMMARY OF THE INVENTION
[0010] The present invention aims to provide highly reliable and
cost-effective orthodontic care for a patient with minimal
utilization of resources. The present invention discloses a total
orthodontic care ecosystem that may provide complete diagnosis,
design, planning, and implementation of patient care; configuration
of therapeutic strategies; manufacture of personalized appliances;
and evaluation of care milestones and outcomes within the framework
of a self-learning, smart, and generative orthodontic care
management system through the entire life cycle of patient care.
Furthermore, the methods and systems disclosed herein are designed
to maximize patients' self-management of their entire care under
appropriate conditions.
[0011] The total orthodontic care management ecosystem disclosed
herein can empower the appropriate patient to self-manage lifelong
care and provides error-free, reliable, high-performance,
evidence-based care in a transparent and secure ecosystem for
orthodontic care management, using computerized care management
practices at every level with human input when desired. Such levels
may include: appropriateness of the care provider; self-care
management diagnostics; prognostics-based decision making;
debiasing; continuous evaluation and monitoring with pre-mortem and
post-mortem analysis; preplanning an evaluation checklist; risk
management and risk analysis; care milestone planning; patient
motivation, engagement, and marketing practices; evidence-based,
precision, targeted therapy that mostly delivers determinate,
controlled, reliable, and predictable force systems with the use of
customized, fixed and/or removable orthodontic appliances in
combination with conventional appliances; optimization of the
sequencing and staging of therapeutics and devices driven by
discrete milestones; condition- and response-based scheduling;
customized manufacturing of appliances; and root cause analysis in
a double-loop learning system complemented with cross-channel
communication and shared repositories of information and knowledge
that facilitate continuous learning for all stakeholders--patient,
doctor, device manufacturers, research teams, academia, and other
third party service providers--and that provide evidence and
performance reports to all stakeholders, such as, but not limited
to, care outcomes, patient experiences, device capabilities, and
the like. Furthermore, the system is designed to incentivize all
stakeholders.
[0012] In one or more embodiments, a computer-implemented system
for orthodontic care management may be provided for managing the
entire orthodontic care workflow, from planning through post-active
treatment management. This includes a unified, bundled,
cost-effective, combinatorial approach to care delivery with
optimization parameters considering, but not limited to, the
patient's budget, the maximum aesthetic value, and the length of
care of each therapeutic strategy and its sequencing and staging.
The computer-implemented
system may be used to provide an orthodontic care management
solution. The system may comprise one or more databases configured
to store data of one or more users. The system may further comprise
a server including computer code for providing the orthodontic care
management solution, wherein the server comprises: at least one
memory configured to store the computer code, the computer code
further comprising computer executable instructions for performing
at least one of one or more functions comprising: receiving user
data; obtaining authorization data associated with the orthodontic
care management solution; determining a treatment plan based on the
user data and the authorization data, wherein the treatment plan
comprises one or more stages of operations; determining a
sequencing plan associated with the treatment plan, based on an
arrangement of the one or more stages of operations associated with
the treatment plan; displaying the sequencing plan to the user;
receiving feedback data associated with the treatment plan;
updating the one or more databases with the feedback data; and at
least one processor configured to execute the computer code to
provide the orthodontic care management solution.
[0013] In one or more embodiments, the computer-implemented system
for orthodontic care management may be provided for managing the
entire orthodontic care workflow and associated human activities,
from planning through post-active treatment management.
[0014] In one or more embodiments, the computer-implemented system
may be fully automated and equipped with the capabilities of
interactive human use to design a targeted visual care plan based
upon consideration of patient needs and wants, aesthetics,
function, stability, and the biological and physical boundaries of
the craniofacial complex.
[0015] In one or more embodiments, the automated
computer-implemented system may be provided with the capabilities
of interactive human use to design a targeted plan based upon
consideration of the nature of tooth movement, minimizing
displacement and collision of dental, skeletal, and facial
structures, maximizing planned displacements while minimizing and
controlling unwanted displacements to achieve the targeted outcome,
and using a minimal number of therapeutic strategies, modalities,
and appliances that generate the optimal biological force systems
to achieve superior patient care.
[0016] In one or more embodiments, the automated
computer-implemented system may be provided with the capabilities
of interactive human use to design a targeted plan with
personalized appliance selection and design based upon
consideration of patient needs, patient tolerance, nature of
malocclusion, costs, aesthetics and the like.
[0017] In one or more embodiments, the automated
computer-implemented system may be equipped with the capabilities
of interactive human use to alter and update a targeted plan in
response to changing patient conditions, and to update the
personalized appliance selection, design, and subsequent care
management and workflow.
[0018] In one or more embodiments, the automated
computer-implemented system may be provided with the capabilities
of interactive human use to design a targeted plan and appliance
design, displayed in 2D or 3D modes, in AR or VR environments, or
holographically, using text, voice, haptic, or mouse-based input
and the like.
[0019] In one or more embodiments, the automated
computer-implemented system may be equipped with the capabilities
of interactive human use to display a targeted plan and appliance
design on a user avatar. The user avatar may include a 2D or 3D
model, in an AR or a VR environment, or a holographic
representation, or may use text and voice. In one or more
embodiments, a computer-implemented system may be provided for an
orthodontic care management ecosystem that guides the patient in
terms of whether they have suitable characteristics, such as, but
not limited to, the nature of their malocclusion, self-motivation,
needs, and the ability to manage their entire care process or
specific phases of treatment on their own, including, but not
limited to, post-orthodontic care retention management.
[0020] In one or more embodiments, the computer-implemented system
may provide an orthodontic care management ecosystem that guides
the patient in establishing an optimized hybrid model that provides
optimal access points for professional services in concert with the
patient managing their own care.
[0021] In one or more embodiments, the computer-implemented system
may provide an orthodontic care management ecosystem that gives the
patient access to professional care services on an as-needed basis,
for instance, but not limited to, when the patient's
self-management of care is not on track.
[0022] In one or more embodiments, a method for providing an
orthodontic care management solution is provided. The method may
include: receiving user data; obtaining authorization data
associated with the orthodontic care management solution;
determining a treatment plan based on the user data and the
authorization data, wherein the treatment plan comprises one or
more stages of operations; determining a sequencing plan associated
with the treatment plan, based on an arrangement of the one or more
stages of operations associated with the treatment plan; displaying
the sequencing plan to a user on a user device; receiving feedback
data associated with the treatment plan; and updating one or more
databases with the feedback data.
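The claimed method steps can be sketched as a minimal Python
pipeline. Every function name, field, and data shape below is a
hypothetical assumption chosen for illustration; the sketch is not
part of the disclosure and omits the display and device-interaction
details.

```python
# Illustrative sketch of the claimed method steps: authorize,
# determine a treatment plan of stages, arrange the stages into a
# sequencing plan, and record feedback in the database(s).

def determine_sequencing_plan(stages):
    """Arrange the treatment plan's stages of operations into order."""
    # A simple arrangement: order stages by a declared priority.
    return sorted(stages, key=lambda stage: stage["priority"])

def provide_care_management(user_data, authorization_data, databases):
    """Walk the claimed steps for one user."""
    # Obtain authorization data associated with the care solution.
    if not authorization_data.get("approved"):
        raise PermissionError("treatment plan not authorized")

    # Determine a treatment plan comprising one or more stages.
    treatment_plan = {
        "user": user_data["name"],
        "stages": [
            {"name": "retention", "priority": 3},
            {"name": "alignment", "priority": 1},
            {"name": "space closure", "priority": 2},
        ],
    }

    # Determine the sequencing plan from the arrangement of stages.
    sequencing_plan = determine_sequencing_plan(treatment_plan["stages"])

    # Displaying the plan and receiving feedback are stubbed here as a
    # record written back to the one or more databases.
    feedback = {"user": user_data["name"], "plan_accepted": True}
    databases.setdefault("feedback", []).append(feedback)
    return sequencing_plan

db = {}
plan = provide_care_management({"name": "patient-1"}, {"approved": True}, db)
print([stage["name"] for stage in plan])
# ['alignment', 'space closure', 'retention']
```

In a real system the treatment plan would be derived from the user
data rather than hard-coded, and the feedback loop would feed the
learning operations described later in the specification.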
[0023] In one or more embodiments, another method for providing an
orthodontic care management solution is provided. The method may
include: receiving user data; obtaining authorization data
associated with the orthodontic care management solution;
determining a treatment plan based on the user data and the
authorization data, wherein the treatment plan comprises one or
more stages of operations; determining a sequencing plan associated
with the treatment plan, based on an arrangement of the one or more
stages of operations associated with the treatment plan; displaying
the sequencing plan to a user on a user device; receiving feedback
data associated with the treatment plan; updating one or more
databases with the feedback data; and performing an artificial
intelligence enabled operation for providing the orthodontic care
management solution.
[0024] In one or more embodiments, another method and a
computer-implemented system for patient smile correction may be
provided, wherein the smile correction is based upon a list of
predetermined parameters, including, but not limited to, patient
wants, patient needs, patient facial anatomy, patient smile
anatomy, patient age characteristics, patient psychosocial profile,
patient medical and dental conditions, and dental characteristics
such as tooth shape, size, and color, gum tissue level and form,
bone characteristics, lip morphology, patient growth pattern,
functional capacity, and cost. Further, the method and the
computer-based system for patient smile correction may either be
configured to operate automatically, based on the analysis of the
patient's facial characteristics, or may be driven by, but not
limited to, feedback from a community and from patient care outcome
and experience relational databases with evidence-driven
research.
[0025] In one or more embodiments, the method and the system for
patient smile management may also be configured to provide automatic
feedback to a patient or doctor as to whether a diagnosis or
treatment plan matches standards of orthodontic care or feedback from
the professional community.
[0026] In one or more embodiments, the method and the system for
patient smile management may include features pertinent to the
designing of a smile and malocclusion correction, sequencing and
staging treatment, appliance selection, monitoring care, evaluating
care milestones and checklists, and ordering devices for
manufacturing and tracking.
[0027] In one or more embodiments, a method and computer-implemented
system to approve the planned care, automatically or through human
interaction, based upon conditional standards set by an authorizing
agency, may be provided. Such agencies may include, but are not
limited to, government, insurance, or other professional agencies.
[0028] In one or more embodiments, a method and computer-implemented
system for the provision of bids for managing treatment costs, based
on the expected cost of treatment, may be provided. The treatment
costs may be managed by various financial service providers, such as
banks, insurance companies, government agencies, doctors, product
manufacturers, and the like. In one or more embodiments, the
provision of financial services may be done in terms of
cryptocurrency.
[0029] In one or more embodiments, real-time treatment tracking for
sharing treatment-related data on community platforms may be
provided.
[0030] In one or more embodiments, a method and computer-implemented
system may be implemented to allow for competitive bidding and
aggregation of demand to provide the best pricing for the patient.
Access to the competitive bidding may be provided to, but is not
limited to, the financial sector, insurance, care providers, managed
care services organizations, and product manufacturers.
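By way of a non-limiting illustration, demand aggregation and bid selection could be reduced to totaling requested units and taking the lowest total price. The bidder names, tiered-discount structure, and scoring rule below are assumptions for illustration only.

```python
def aggregate_demand(requests):
    """Total units of a given treatment requested across patients."""
    return sum(r["units"] for r in requests)

def best_bid(bids, units):
    """Pick the bid with the lowest total price for the aggregated demand.
    Each bid has a base unit price and optional volume-discount tiers."""
    def total(bid):
        price = bid["base_price"]
        # apply the largest volume discount whose threshold is met
        for threshold, discounted in sorted(bid.get("tiers", {}).items()):
            if units >= threshold:
                price = discounted
        return price * units
    return min(bids, key=total)

requests = [{"patient": "A", "units": 2}, {"patient": "B", "units": 3}]
bids = [
    {"bidder": "insurer", "base_price": 100, "tiers": {5: 80}},
    {"bidder": "manufacturer", "base_price": 90},
]
units = aggregate_demand(requests)   # 5 units across both patients
winner = best_bid(bids, units)       # insurer wins via its volume tier
```

Aggregating demand is what unlocks the volume tier here: at 5 units the insurer's discounted price beats the manufacturer's flat rate, which neither patient would obtain bidding alone.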
[0031] In one or more embodiments, a method and a
computer-implemented system for an orthodontic care management
platform may be provided. The orthodontic care management platform
may be configured to provide context- and temporally dependent
checklists, which may be voice-based, image-based, or text-based, for
the patient or doctor in order to evaluate care progress.
[0032] In one or more embodiments, a method and a
computer-implemented system for an orthodontic care management and
display platform may be provided that allows the care provider or
patient to evaluate and monitor care with the display projected on
smart glasses or in an AR, VR, or holographic environment. The
display is context-dependent and directs the viewer in a guided,
context-driven mode to minimize change or attention blindness.
Furthermore, the data may be presented in text, speech, or video
format.
[0033] In one or more embodiments, a method and a
computer-implemented system for an orthodontic care management and
display platform may be provided where the targeted plan can be
designed through a speech, gesture, haptic, or text interface used to
create a target setup, design appliances, or manage the patient
record. The method and computer-implemented system may include
artificial intelligence capabilities for providing an action-to-voice
mapping of user commands. The method and computer-implemented system
may also include artificial intelligence capabilities for providing a
voice-to-action mapping.
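A voice-to-action mapping can be illustrated, in a non-limiting way, as a table from recognized utterances to interface actions, with an inverse action-to-voice announcement. The command table, state fields, and phrasing below are assumptions for illustration; the disclosed system would use a speech recognizer and an AI intent model rather than exact string matching.

```python
# Illustrative command table (assumed commands, not from the disclosure)
ACTIONS = {
    "rotate model": lambda state: {**state, "view": "rotated"},
    "show target setup": lambda state: {**state, "view": "target"},
    "open patient record": lambda state: {**state, "record_open": True},
}

def voice_to_action(transcript: str, state: dict) -> dict:
    """Map a recognized utterance to a UI action; ignore unknown commands."""
    handler = ACTIONS.get(transcript.strip().lower())
    return handler(state) if handler else state

def action_to_voice(action: str) -> str:
    """Inverse mapping: announce a completed action back to the user."""
    return f"Done: {action}"

state = voice_to_action("Show target setup", {"view": "initial"})
```

An AI-enabled version would replace the dictionary lookup with intent classification, so paraphrases of a command resolve to the same action.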
[0034] In one or more embodiments, a method and a
computer-implemented system for an orthodontic care management system
is provided, where the display automatically demonstrates the
pretreatment malocclusion, the target plan and appliance design, the
current condition of the patient under care, and a temporally defined
future state. The system automatically registers the various
operator-defined states by best fit in order to assess care progress
in terms of measured displacement changes, and also provides response
statistics and analytics against a comparative relational database of
historical records of similarly treated patients.
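The best-fit registration step can be illustrated in two dimensions: align two landmark sets by translation and rotation, and report the residual per-landmark displacement as the measured change. This is a minimal sketch under the assumption of 2D point landmarks; a clinical system would register full 3D scan data, but the principle (closed-form rigid alignment, here the 2D Kabsch solution) is the same.

```python
import math

def centroid(pts):
    n = len(pts)
    return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

def best_fit_displacement(before, after):
    """Register `after` onto `before` by translation + rotation, then return
    the mean residual displacement per landmark, i.e. the movement not
    explained by global repositioning of the scan."""
    cb, ca = centroid(before), centroid(after)
    b = [(x - cb[0], y - cb[1]) for x, y in before]
    a = [(x - ca[0], y - ca[1]) for x, y in after]
    # closed-form optimal 2D rotation angle (Kabsch in two dimensions)
    num = sum(by * ax - bx * ay for (bx, by), (ax, ay) in zip(b, a))
    den = sum(bx * ax + by * ay for (bx, by), (ax, ay) in zip(b, a))
    t = math.atan2(num, den)
    c, s = math.cos(t), math.sin(t)
    rot = [(c * x - s * y, s * x + c * y) for x, y in a]
    d = [math.hypot(bx - rx, by - ry) for (bx, by), (rx, ry) in zip(b, rot)]
    return sum(d) / len(d)

# A scan that is merely rotated and translated shows zero residual movement:
assert best_fit_displacement([(0, 0), (1, 0), (1, 1), (0, 1)],
                             [(5, 5), (5, 6), (4, 6), (4, 5)]) < 1e-9
```

Residuals near zero indicate the dentition only moved rigidly as a whole; nonzero residuals quantify actual tooth displacement, which the disclosed system would then compare against historical response statistics.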
[0035] In one or more embodiments, a method and a
computer-implemented system for orthodontic care management may be
provided where patient progress is tracked and all stakeholders are
automatically informed of the progress visually, by text, or by
voice.
[0036] In one or more embodiments, a method and a
computer-implemented system for orthodontic care management may be
provided where patient progress is tracked and, based upon care
progress, the monitoring schedule, patient visits, and care-related
activities are automatically reconfigured based upon an assessment of
the current situation and predicted future states, derived from the
tracked history of the patient and a universal relational database of
historical records of similarly treated patients.
[0037] In one or more embodiments, a method and a
computer-implemented system for an orthodontic care management
ecosystem may be provided where the patient target plan and appliance
design are derived from a relational database of historical records
of similarly treated patients.
[0038] In one or more embodiments, a method and a
computer-implemented system for associating a blockchain enabled
database with the orthodontic care management platform may be
provided. The blockchain-enabled database is encrypted, secured, and
tamper-proof, and provides hierarchical levels of access to various
agents and time-stamped data records, such as patient dental and
medical records in speech, text, image, or video form, for providing
one or more of, but not limited to, care outcomes, product
performance, doctor performance, patient adherence, learning,
performance analytics, and training recommendations. In one or more
embodiments, the blockchain-enabled database may be configured to
incentivize patients for use of the patient database, or any agent
contributing to the enrichment of the database.
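The tamper-proof, time-stamped property of such a database can be illustrated, in a non-limiting way, with a hash-chained record log: each record's hash covers the previous record's hash, so any later alteration is detectable. The field names and payloads are assumptions for illustration; a production system would add consensus, encryption, and access control on top of this chaining idea.

```python
import hashlib
import json
import time

def add_record(chain, payload, timestamp=None):
    """Append a record whose hash covers the previous record's hash."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    record = {
        "timestamp": timestamp if timestamp is not None else time.time(),
        "payload": payload,  # e.g. a reference to a dental/medical record
        "prev": prev,
    }
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    chain.append(record)
    return record

def verify(chain):
    """Recompute every link; returns False if any record was altered."""
    prev = "0" * 64
    for r in chain:
        body = {k: r[k] for k in ("timestamp", "payload", "prev")}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if r["prev"] != prev or r["hash"] != digest:
            return False
        prev = r["hash"]
    return True

chain = []
add_record(chain, {"patient": "P1", "type": "intraoral scan"}, timestamp=1)
add_record(chain, {"patient": "P1", "type": "progress photo"}, timestamp=2)
```

Editing any stored payload after the fact changes its recomputed hash, breaking every subsequent link, which is what makes the record chain tamper-evident.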
[0039] In one or more embodiments, an interactive user interface
for accessing the orthodontic care management platform may be
provided. The interactive user interface may be configured to provide
a context-based virtual avatar for accessing one or more concierge
services, which include, but are not limited to, personal coaching
and motivation for the patient or doctor, reminder and scheduling
services, and a personal patient care manager and advocate.
[0040] In one or more embodiments, the orthodontic care management
platform may be configured to provide a rating and feedback
management system for evaluating patient experiences of orthodontic
care.
[0041] In one or more embodiments, the orthodontic care management
platform may be configured to provide deep machine learning
capabilities for providing a self-generative learning system for
orthodontic care management, using data derived from local
relational historical databases of patient records, provider
records, research centers, manufacturer databases, insurance
databases and the like. The learning data derived from such
databases may include but not limited to best practices, optimal
scheduling, product efficiencies, doctor performance, cost
effectiveness errors, poor outcomes records and the like. In some
example embodiments, this data may be used to build an artificial
intelligence model using the learning data as input and providing
auto-recommendations as output.
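As a non-limiting sketch of such an auto-recommendation, a system could recommend the treatment used in the most similar historical case. The feature encoding (age, crowding in mm, prior-treatment flag), the records, and nearest-neighbor matching are all assumptions for illustration; the disclosed platform contemplates deep learning over far richer relational data.

```python
import math

# Assumed historical records: [age, crowding_mm, prior_treatment_flag]
HISTORY = [
    {"features": [12, 4.0, 1], "treatment": "aligners"},
    {"features": [30, 7.5, 0], "treatment": "fixed braces"},
    {"features": [14, 6.8, 0], "treatment": "fixed braces"},
]

def recommend(features):
    """Return the treatment of the nearest historical case
    (Euclidean distance over the feature vector)."""
    def dist(rec):
        return math.dist(features, rec["features"])
    return min(HISTORY, key=dist)["treatment"]
```

For example, `recommend([13, 4.2, 1])` matches the mild-crowding adolescent case; a learned model would replace raw distance with a trained similarity over outcomes.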
[0042] In one or more embodiments, the orthodontic care management
platform may be configured to automatically or interactively provide
orthodontic treatment planning, risk analysis, risk management,
treatment sequencing, staging, care monitoring, motivation,
activity-based tasking, scheduling services, and the like. In some
embodiments, the automatic orthodontic care may be provided using the
artificial intelligence model described earlier.
[0043] In one or more embodiments, the orthodontic care management
platform may be configured to design, automatically or interactively,
an optimized appliance configuration, not limited only to customized
appliances but inclusive of "off the shelf" products, such that the
generation of reliable forces is consistent with the planned stage
and sequence of treatment, cost, aesthetics, ease of use, patient
tolerance and medical and dental history, provider skills, and
evidence.
[0044] In one or more embodiments, the orthodontic care management
platform may be configured to design, automatically or interactively,
customizable thermal ice and/or heat packs that may be incorporated
within, or used in conjunction with, a customized intraoral appliance
configuration, or that may have a standard configuration incorporated
in a standard intraoral appliance or used separately, in order to
manage patient discomfort or to accelerate tooth movement through a
process of alternating hot and cold cycles. In some example
embodiments, cold and hot thermal packs for intraoral use can be
designed as well.
[0045] In one or more embodiments, the orthodontic care management
platform may be configured to allow the user, that is, the patient or
the doctor, to design an appliance configuration automatically or
interactively to maximize clinical efficiency and effectiveness,
minimize dependency on patient cooperation, and maximize aesthetics.
The design may include, but is not limited to, shape, form, artwork
motifs, color, fragrance, choice of material, or device type
(removable, fixed, labial, and/or lingual), based upon the stage,
sequence, or phase of treatment, within the bounds of the acceptable
mechanical, physical, biological, environmental, and
bio-compatibility considerations for the design of orthodontic
therapeutic devices.
[0046] In one or more embodiments, the orthodontic care management
platform may be configured, automatically or interactively, to enable
the selection and use of customized configurations of fixed
orthodontic appliances and/or removable appliances, in combination
with conventional off-the-shelf orthodontic appliances, to achieve
the target care plan.
[0047] In one or more embodiments, the orthodontic care management
platform may be configured, automatically or interactively, to enable
the alteration of the care plan to accommodate, but not limited to,
the cost of care or additional care services for reauthorization of
care, based upon unexpected treatment response or a change in patient
needs or wants.
[0048] In one or more embodiments, the orthodontic care management
platform may be configured to automatically or interactively enable
the redesign and/or re-sequencing of the use of customized fixed
and/or removable appliance configurations based upon unexpected
treatment response, a change in patient needs or behavior, doctor
preferences, or new evidence.
[0049] In one or more embodiments, the orthodontic care management
platform may be configured to automatically interact with and refresh
provider websites, patient blogs, or manufacturer websites to provide
testimonials, references, or authentic patient and doctor
experiences.
[0050] In one or more embodiments, the orthodontic care management
platform may be configured to allow for levels of access and
permission for use based upon the preferences of the patient, the
doctor, and other stakeholders, with the overarching permission key
driven by the patient or the patient's designate.
[0051] In one or more embodiments, the orthodontic care management
platform may be configured to allow for monetization by any of the
stakeholders based upon, but not limited to, the time, quality, and
value of the information sought and the number of people accessing
the information.
[0052] In one or more embodiments, the orthodontic care management
platform may be configured to allow for managing patient care
post-orthodontics and, automatically or interactively at regular
operator-defined intervals, to evaluate post-treatment changes;
select and design personalized, customizable orthodontic devices to
achieve orthodontic corrective therapy, or design and select an
appropriate stabilizing orthodontic retainer appliance; order the
appliances for manufacture remotely or onsite for fabrication; define
the optimal cycle of use; and project and notify the patient as to
the next time for self-evaluation, updating the patient's calendar
and personal motivational avatar.
[0053] In one or more embodiments, the orthodontic care management
platform may be configured to provide customized appliance
manufacturing services which may be local or at a remote site.
[0054] In one or more embodiments, customized appliance manufacture
can be accomplished by, but is not limited to, manual fabrication or
any computer-driven manufacturing technology, including but not
limited to 3D printing and/or 3D milling.
[0055] In one or more embodiments, the orthodontic care management
platform may be configured to provide combinatorial configurations of
customized orthodontic appliances with the use of "off the shelf"
products, based on, but not limited to, the stage and sequence of
planned treatment, cost, efficiency, ease of use and of installation,
maximizing value, time demands, skill of the provider, availability
of customized appliance manufacturing services, past experiences,
evidence, and guidance from data mining of relational patient
histories.
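Selecting among such combinatorial configurations can be illustrated, in a non-limiting way, as a weighted multi-criteria score over candidate configurations. The candidate names, criteria, scores, and weights below are assumptions for illustration only.

```python
# Assumed candidate configurations with per-criterion scores in [0, 1]
CANDIDATES = [
    {"name": "custom lingual + off-the-shelf retainer",
     "scores": {"cost": 0.4, "efficiency": 0.9, "ease": 0.6}},
    {"name": "off-the-shelf braces",
     "scores": {"cost": 0.9, "efficiency": 0.6, "ease": 0.8}},
]

# Assumed weights reflecting the factors listed above (cost-dominated here)
WEIGHTS = {"cost": 0.5, "efficiency": 0.3, "ease": 0.2}

def select_configuration(candidates, weights):
    """Return the configuration with the highest weighted score."""
    def score(c):
        return sum(weights[k] * c["scores"][k] for k in weights)
    return max(candidates, key=score)
```

Changing the weights (e.g., emphasizing efficiency over cost) shifts which configuration wins, which is how factors such as provider skill or manufacturing availability could be traded off in practice.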
[0056] In one or more embodiments, the orthodontic care management
platform may be configured to provide a continuous learning system
based upon tracking provider performance, and to enable the care
provider to have customized, user-specific tacit and explicit
learning experiences at the point of care or a user-specified
location, in order to enhance professional skills, measure the
adoption and implementation of the new skills, and report these to
the appropriate certifying bodies for credentialing.
[0057] In one or more embodiments, the orthodontic care management
platform may be configured to provide a learning system designed to
address context-dependent, learner-specific needs for developing
explicit and tacit skills in an AR, VR, or holographic environment
with speech, haptic, and gesture capabilities.
[0058] In one or more embodiments, the orthodontic care management
platform allows for telepresence offsite consultations for the
doctor or patient or instructor or any stakeholder.
[0059] In one or more embodiments, the orthodontic care management
platform may be configured to provide a context-dependent continuous
learning and motivation system based upon tracking patient progress
and adherence to the care protocol, to provide motivational therapy
using voice, image, text, or avatars to achieve behavioral
modification that enhances the patient's motivation and cooperation,
and to report progress to the guardian, care provider, insurance, and
other stakeholders.
[0060] In one or more embodiments, a method and computer-implemented
system may be provided for evaluating submitted patient records for
completeness and providing the user feedback to correct the
deficiencies.
[0061] In one or more embodiments, a method and computer-implemented
system may be provided for evaluating submitted patient records,
including voice, text, image, and video records, and alerting the
patient or user to, and reporting on, deficiencies.
[0062] In one or more embodiments, a method and computer-implemented
system may be provided for receiving 2D and/or 3D images of the
facial and dental structures, such as photographic, laser,
white-light, infrared, thermal, X-ray, MRI, PET scan, or ultrasound
images, or dynamic video images.
[0063] In one or more embodiments, a method and computer-implemented
system may be provided for receiving images and, automatically and/or
through human interaction, correcting for distortion, coloration,
size, and orientation prior to processing for care planning.
[0064] In one or more embodiments, a method and
computer-implemented system for receiving multiple 2D images and
automatically constructing a 3D image from these sets of images may
be provided.
[0065] In one or more embodiments, a method and
computer-implemented system for receiving a single video image and
automatically deconstructing it into multiple static images may be
provided.
[0066] In one or more embodiments, a method and
computer-implemented system for receiving and combining images from
different sources into a unified image may be provided.
[0067] In one or more embodiments, a method and computer-implemented
system may be provided for manipulating the GUI interface and objects
on display through gesture, speech, text, touch screen, and mouse.
[0068] In the context of this specification, the terms "component",
"member", "element", and "portion" are considered to be synonymous
and denote parts of an orthodontic appliance that may be constructed
as an extension, modification, or deformation of another such part,
or that may be joined with the other part through, but not limited
to, chemical adhesive bonding and/or mechanical joining such as, but
not limited to, crimping, soldering, brazing, welding,
screw-and-thread fastening, snap fitting, press-fitting,
loop-and-hook fastening, or the use of shape memory O-rings such as
Nickel-Titanium rings, crimpable onlay devices, or thermal joining
techniques. Different terms have been used for different parts only
in order to differentiate them from other such parts and to enable
clarity of discussion. Moreover, each "component", "member",
"portion", or "element" may be constructed by combining a plurality
of segments wherever modularity in the design of the orthodontic
appliances is required.
[0069] In the context of this specification, the term "deformable"
is envisaged to include all kinds of non-zero and at least
partially reversible deformations such as, but not limited to,
elastic or nonlinear recoverable deformations, such as super-elastic
or pseudo-elastic behavior.
[0070] In the context of this specification, the term "tooth", such
as a first tooth, a second tooth and a third tooth etc. is
envisaged to include one or more teeth, depending upon several
factors such as, but not limited to, specific applications,
applicability of the orthodontic appliance and strength
requirements of the attachments.
[0071] In the context of this specification, the phrase "attached
to a tooth" such as "attached to a first tooth" or "attached to a
second tooth" etc. denotes that the attachment may be obtained in a
comparatively fixed or lasting manner such as through use of dental
bonding agents, tissue impingement, bone anchoring screws and
devices and thread fastening, appliance ligation use etc. or the
attachment may be obtained in a comparatively removable manner such
as through snap fits in undercuts of the tooth/teeth, frictional
fits, such as those achieved in removable orthodontic appliances,
aligners, lip bumpers in tubes, or through attachment of permutations
of male and female attachments.
[0072] In the context of this specification, a "polymer material"
is any naturally occurring or man-made material having long chains
of organic molecules (8 or more organic molecules), with physical
and chemical properties of such organic molecules giving the
material its desired properties.
[0073] In the context of this specification, orthodontic appliances
can be defined as having a number of components: passive structural
elements that are part of an assembly used to guide and/or stabilize
teeth, and that allow for attachment of active elastic deformable
objects that generate forces to move teeth. One end of the elastic
deformable objects may not always be directly attached to a member of
the passive structural element. The active appliances produce
tooth-moving forces as a result of recovery from their elastically
deformed state to the initial, undeformed, near-zero state. The
orthodontic appliances may also include a jig or positional device
that aids in the precise location and fixation of the orthodontic
assembly. This may be a part of the entire monolithic configuration,
or a separate element that carries the orthodontic assembly together
with the jig and does not move teeth. The jig may be removed from the
mouth after placing the device, but may also be configured in a
combinatorial design to provide added functionality, such as, but not
limited to, stabilization of a passive device, and maintained in the
mouth for the duration of care.
BRIEF DESCRIPTION OF THE DRAWINGS
[0074] At least one example of the invention will be described with
reference to the accompanying drawings, in which:
[0075] FIG. 1 illustrates the architecture of a system for providing
an orthodontic care management solution, in accordance with an
embodiment of the present invention;
[0076] FIG. 2 illustrates a block diagram of the orthodontic care
management system, in accordance with an embodiment of the present
invention;
[0077] FIG. 3 illustrates a flow diagram of a method for
orthodontic care management, in accordance with an embodiment of
the present invention;
[0078] FIGS. 4A-4F illustrate exemplary user interface diagrams
for orthodontic treatment management, in accordance with an
embodiment of the present invention;
[0079] FIGS. 5A-5B illustrate exemplary user interface diagrams
for orthodontic treatment staging, in accordance with an embodiment
of the present invention;
[0080] FIGS. 6A-6D illustrate exemplary user interface diagrams
for orthodontic treatment testing, in accordance with an embodiment
of the present invention;
[0081] FIG. 7 illustrates an exemplary user interface for accessing
an orthodontic care management platform, in accordance with an
embodiment of the present invention;
[0082] FIG. 8 illustrates an exemplary method flow diagram for
orthodontic care management, in accordance with an embodiment of
the present invention;
[0083] FIG. 9 illustrates another exemplary method flow diagram for
orthodontic care management, in accordance with an embodiment of
the present invention;
[0084] FIG. 10 illustrates another exemplary method flow diagram
for orthodontic care management, in accordance with an embodiment
of the present invention;
[0085] FIG. 11 illustrates another exemplary method flow diagram
for orthodontic care management, in accordance with an embodiment
of the present invention;
[0086] FIG. 12 illustrates an exemplary user interface in the form
of an interactive avatar for voice-to-action mapping, in accordance
with an embodiment of the present invention;
[0087] FIG. 13 illustrates an exemplary user interface in the form
of an interactive avatar for action-to-voice mapping for a coaching
avatar, in accordance with an embodiment of the present
invention;
[0088] FIGS. 14A-14B illustrate exemplary methods for providing an
orthodontic care management solution to a user using AI techniques,
in accordance with an embodiment of the present invention;
[0089] FIG. 15 illustrates a virtual care navigator (VCN) system,
in accordance with an embodiment of the present invention;
[0090] FIG. 16 illustrates a method flow diagram for a doctor
workflow using VCN, in accordance with an embodiment of the present
invention;
[0091] FIG. 17 illustrates a method flow diagram for a patient
workflow using VCN, in accordance with an embodiment of the present
invention;
[0092] FIG. 18 illustrates a block diagram of an AI enabled
orthodontic care management system, in accordance with an
embodiment of the present invention;
[0093] FIG. 19 illustrates a method flow diagram for complexity
evaluation in orthodontic treatment provision, in accordance with
an embodiment of the present invention;
[0094] FIG. 20 illustrates a method flow diagram for cost
evaluation in orthodontic treatment provision, in accordance with
an embodiment of the present invention;
[0095] FIG. 21 illustrates a method flow diagram for identifying
patient financing options in orthodontic treatment provision, in
accordance with an embodiment of the present invention;
[0096] FIG. 22 illustrates a method flow diagram of an optimization
algorithm for orthodontic treatment provision, in accordance with
an embodiment of the present invention;
[0097] FIG. 23 illustrates an exemplary user interface for smile
selection, in accordance with an embodiment of the present
invention;
[0098] FIG. 24 illustrates a block diagram of a system for chair
side patient monitoring, in accordance with an embodiment of the
present invention;
[0099] FIGS. 25-27 illustrate exemplary user interfaces of an
orthodontic care management system, in accordance with an
embodiment of the present invention;
[0100] FIG. 28 illustrates an exemplary block diagram of a user
interface for superimposition of teeth, in accordance with an
embodiment of the present invention;
[0101] FIG. 29 illustrates an exemplary block diagram of a user
interface for displaying restorative care of teeth, in accordance
with an embodiment of the present invention;
[0102] FIG. 30 illustrates an exemplary block diagram of an affect
response enabling system, in accordance with an embodiment of the
present invention; and
[0103] FIG. 31 illustrates an exemplary block diagram of a
context-specific patient monitoring system, in accordance with an
embodiment of the present invention.
[0104] It should be noted that the same numeral represents the same
or similar elements throughout the drawings.
DETAILED DESCRIPTION OF THE EMBODIMENTS
[0105] Throughout this specification, unless the context requires
otherwise, the words "comprise", "comprises" and "comprising" will
be understood to imply the inclusion of a stated step or element or
group of steps or elements but not the exclusion of any other step
or element or group of steps or elements.
[0106] Any one of the terms: "including" or "which includes" or
"that includes" as used herein is also an open term that also means
including at least the elements/features that follow the term, but
not excluding others.
[0107] The present invention discloses an ecosystem for orthodontic
care management which may provide optimal care for a patient in an
efficient and effective manner. This may provide an advantage over
existing orthodontic care systems in terms of reduced patient costs,
improved care delivery, and promotion of the practice of
evidence-based orthodontics, rather than a system that is broken,
craft-based, fragile, and not generative. Thus, the methods and
systems disclosed herein provide for care being delivered to the
right person, at the right place, at the right time, with the right
therapy, by the right person, all the time.
[0108] Generally speaking, an orthodontic ecosystem may be divided
into three levels:
[0109] Micro Level--at the level of patient
[0110] Meso Level--at the orthodontist-patient level
[0111] Macro Level--at the community level, involving other
orthodontists, associated organizations, third-party providers such
as those from the financial services industry, patient communities,
manufacturers of orthodontic products, academia, research centers,
laboratories, and the like
[0112] Currently, the orthodontic ecosystem is fraught with
challenges at all levels.
[0113] For example, at the micro level, there is information
asymmetry, which is biased towards the orthodontist, doctor, or
manufacturer. Thus, the patient has little knowledge, exposure,
impact, and authority in planning, managing, and providing feedback
on the care delivery process extended to them. Also, the opportunity
for the patient to seek input from other communities on the cost,
aesthetics, and reliability of their treatment process is currently
limited.
[0114] Similarly, at the meso level, a doctor or orthodontist has
limited opportunities to market, monitor, and manage the
doctor-patient interaction and their own market reputation. This is
also due to there being no universally accepted standards for
measuring personal performance characteristics, a lack of a reporting
structure for errors in care, and a lack of access, among the
communities of doctors and patients, to patient experience and
outcome data in terms of dental and medical care records. Also, the
doctor has limited access to skill-level-based learning and training
resources, universal learning resources, care guidelines and
checklists, and options for managing pricing for patients, and is not
incentivized to share clinical data.
[0115] Further, at the macro level, professional organizations do not
have effective mechanisms to suitably rate, monitor, and recommend
exemplary doctors. Device manufacturers have limited or no access to
direct patient requirements and to the evaluation of product
performance in a real-world setting. Academia and learning
institutions have limited or no access to patient records residing
with organizations, and the insurance industry has no information to
incentivize superior performance at the practitioner level or to
recommend guidelines or best practices.
[0116] FIG. 1 illustrates architecture of a system for orthodontic
care management ecosystem, in accordance with an embodiment of the
present invention.
[0117] The system 100 illustrates a user 101 in interaction with an
orthodontic care management ecosystem 102. The user may be a
patient, a doctor, a device manufacturer, a financial service
provider, a computing service provider, a government organization,
an institutional user and the like, that may be desirous of
accessing the orthodontic care management ecosystem 102. The
orthodontic care management ecosystem 102 includes an orthodontic
care management platform 102a in communication with an orthodontic
appliance management system 102b for providing an orthodontic
appliance 103. The orthodontic care management ecosystem 102 may
also include other components, in communication with the components
102a and 102b, as will be described later in FIG. 2. The
orthodontic care management platform 102a may be configured to
provide a computing device based system for managing one or more
services provided by the orthodontic care management platform 102a.
The computing device may include, for example, a workstation, a
laptop, a mobile device, a tablet, a smartphone, a PC, a handheld
unit, a smart watch, a smart wearable, a personal digital assistant,
smart goggles, a kiosk, and the like.
[0118] The one or more services provided by the orthodontic care
management platform 102a may include, for example, orthodontic
treatment planning, orthodontic community access, orthodontic
appliance designing, orthodontic treatment scheduling and alerts,
knowledge sharing, orthodontic treatment monitoring, orthodontic
appliance testing, medical recording of patient data, treatment
sharing options with third-party providers, feedback provisioning,
treatment cost management and planning, and the like.
[0119] In some example embodiments, the orthodontic care management
ecosystem may be configured to provide data collection and care
planning services. The data collection service may include, for
example, patient data collection. The patient data may include the
patient's demographic data; patient images captured by the patient,
the doctor, or any other source; relevant current medical and dental
history retrieved from, or filled into, an electronic patient record;
patient consent data for treatment; and the like. In some example
embodiments, the patient data may be collected by a user interface
mechanism of the orthodontic care management platform 102a, such as
using a kiosk for information collection in a form, a questionnaire,
a self-help portal, a website-based patient registration, and the
like. In some
example embodiments, the user interface for patient data collection
may also include an option to capture or scan or upload a patient
image to identify patient's anatomical features, such as patient's
bone face structure, teeth geometry, gums geometry and the
like.
[0120] In some example embodiments, the user interface for patient
data collection may also provide a display of recommended care plan
for the patient based on the collected patient data. For example,
the user interface may provide recommendation for an orthodontic
care plan based on matching of patient' facial features against
similar proportioned treated patient. The data for treated patients
may be stored in a database associated with the orthodontic care
management platform 102a. In some example embodiments, the user
interface for patient data collection may also provide options for
smile customization for the patient. The smile customization may be
performed based on automatic and interactive smile recommendations
provided to the patient based on the collected patient data. In
some example embodiments, the patient may be able to specify a
smile type, such as a smile similar to that of a celebrity, based
on the patient data.
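The matching of a patient's facial features against similarly proportioned treated patients, described above, can be sketched as a nearest-neighbor lookup. This is a minimal illustration only: the feature names, the distance metric and the `recommend_care_plan` helper are assumptions, not the application's actual implementation.

```python
import math

# Illustrative sketch: each patient is reduced to a feature vector of
# dento-facial proportions and the closest previously treated case's
# care plan is recommended. Names and values are hypothetical.

def recommend_care_plan(patient_features, treated_cases):
    """Return the care plan of the treated patient whose feature
    vector is closest (Euclidean distance) to the new patient's."""
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    best = min(treated_cases,
               key=lambda c: distance(patient_features, c["features"]))
    return best["care_plan"]

treated = [
    {"features": [0.70, 1.30, 0.95], "care_plan": "aligners, 12 months"},
    {"features": [0.55, 1.10, 1.20], "care_plan": "braces, 18 months"},
]
plan = recommend_care_plan([0.68, 1.28, 0.97], treated)
```

In practice such a database lookup would run over many stored treated-patient records rather than two hard-coded entries.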
[0121] In some example embodiments, the user interface for patient
data collection may be an interactive interface, such as a virtual
avatar of the patient in one or more of a virtual reality
environment, a holographic display, an augmented reality
environment and the like. In some example embodiments, the
orthodontic care management platform 102a may provide an
interactive contextual avatar for data access.
[0122] In some example embodiments, the user interface for patient
data collection may enable the patient to input one or more queries
related to treatment cost, treatment time, number of visits,
doctor reputation and the like. The queries may be input using a
bot in one or more examples. The user interface may also enable the
patient to select an appropriate plan, seek votes from friends and
external sources as to what aesthetic look is best for the patient
and the like.
[0123] In some example embodiments, the orthodontic care management
platform 102a may be configured to provide animated images to the
patient depicting the future state of the patient's facial change
based upon the patient's historical images.
[0124] In some example embodiments, the orthodontic care management
platform 102a may be configured to provide recommendations to the
user 101, such as the patient, related to the type of orthodontic
appliance 103 best suited for the patient, such as braces, aligners
and the like, based upon the patient's and/or doctor's preferences,
as collected in the patient data. The patient data may be maintained in a
database, such as a secure database. The secure database may
include medical records that may be time dated and stamped using
secure data technologies, such as blockchain, access rights
management and the like. The data stored in the secure database may
be encrypted and user approved for use by a large community of
users, such as patients, doctors, research institutions and the
like. In some example embodiments, the secure database may also
enable monetization of patient data records, wherein a patient may
request payment per transaction, or a one-time payment, for the use
of any patient-related data by a specific organization.
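The time-dated, hash-linked medical records described above can be sketched as a simple hash chain. This is an illustrative assumption about mechanism only; the field names, SHA-256 choice and `append_record`/`verify_chain` helpers are hypothetical, not the application's schema.

```python
import hashlib
import json

# Illustrative sketch: each record carries a timestamp, a payload and
# the hash of its predecessor, so any later tampering breaks the chain.

def append_record(chain, payload, timestamp):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    record = {"timestamp": timestamp, "payload": payload,
              "prev_hash": prev_hash}
    # Deterministic serialization so the hash is reproducible.
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    chain.append(record)
    return record

def verify_chain(chain):
    """Each record must reference the hash of its predecessor."""
    return all(chain[i]["prev_hash"] == chain[i - 1]["hash"]
               for i in range(1, len(chain)))

chain = []
append_record(chain, {"patient": "P-001", "event": "consent recorded"},
              1_700_000_000)
append_record(chain, {"patient": "P-001", "event": "scan uploaded"},
              1_700_000_100)
```

A production system would add access-rights checks and encryption, as the paragraph above envisions, on top of this integrity layer.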
[0125] In some example embodiments, the orthodontic care management
platform 102a may be configured to provide treatment planning
services such as taking approvals for a patient's treatment from an
insurance or other third party provider, receiving automatic
preauthorization of a treatment plan from an authorizing agency,
providing competitive bidding for the treatment cost from multiple
doctors, integrating with various third party financial service
providers to receive financing options for patient treatment based
on a patient's savings, income, loan time, credit history and the
like.
[0126] In some example embodiments, the orthodontic care management
platform 102a may also provide payment related services to the user
101. The payment may be in a form such as cryptocurrency in
one or more embodiments.
[0127] In some example embodiments, the orthodontic care management
platform 102a may also provide transaction management related
services to the user 101, such as connecting with third party
service providers for payment transaction management, clearance,
authorization and the like.
[0128] In some example embodiments, the orthodontic care management
platform 102a may be configured to provide treatment planning
services to the user 101, such as a doctor. The treatment planning
may include generating a target treatment plan, managing the staging of
treatment, risk planning, diagnostic planning and the like. The
orthodontic care management platform 102a may provide a user
interface for treatment planning by the doctor to manage various
stages of the treatment, such as space closure, intrusion,
extrusion, expansion, constriction, alignment detailing,
restorative care and its appropriate sequencing. In some example
embodiments, the treatment planning may be done automatically or
interactively. Further, at each stage, the doctor may be able to
identify one or more milestones, progress, processes completed and
the like.
[0129] In some example embodiments, the orthodontic care management
platform 102a may be able to provide checklists to the user 101,
such as the doctor, to track their progress in delivering the
treatment to the patient. The checklist may be displayed to the
doctor using any of the display interfaces known in the art, such as
display screens, smart goggles, smart watches, tablet displays, AR or
VR displays and the like, or using voice output.
[0130] In some example embodiments, the orthodontic care management
platform 102a may be configured to provide treatment monitoring and
testing services to the user 101. For example, the doctor may be
able to monitor the patient's progress, perform comparative
analysis, monitor appliance performance, generate and receive
alerts related to the treatment and the like. On the other hand,
the patient may be able to provide feedback on the treatment and the
doctor, send reminders and the like using the orthodontic care
management platform 102a. In some example embodiments, the avatar
based interface of the orthodontic care management platform may be
configured for providing scheduling and alerting services. The data
related to the treatment monitoring may be automatically updated in
a database, such as the secure database discussed earlier.
[0131] In some example embodiments, the data about the user 101,
such as the patient or the doctor, the treatment, the appliance and
the payment transactions, may be stored in the secure database and
used for learning purposes. For example, all the data may be added
to a learning system, locally and externally, and insights may be
derived from the data using deep learning and machine learning
techniques. Such data may include data related to treatment plans
and patient-specific information such as age, sex, ethnic
background, psychosocial habits and behavior. Such insights may be
used by doctors to do performance measurement based upon target vs.
outcome, current standards of care, evidence, doctor skills and the
like. Such skill and performance measures may in turn be used for
credentialing doctors on professional websites.
[0132] In some example embodiments, the performance data may be
used to provide training recommendations to doctors, which may include
providing training resources, workshops, collaborative learning
experiences, certifications and the like to the doctors.
[0133] In some example embodiments, the orthodontic care management
platform 102a may be configured to provide reputation management
services to the user 101. For example, using the learning database,
patient treatment data, standard guidelines and doctor data,
a performance rating of the doctor based upon tacit and explicit skills
may be computed. In some example embodiments, the performance
rating may be posted on social networking platforms for wider user
access.
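The performance-rating computation described above might blend outcome measurements with patient feedback. The sketch below is an illustrative assumption: the 70/30 weighting, the 0-100 scales and the `performance_rating` helper are hypothetical, not the application's actual formula.

```python
# Illustrative sketch: a doctor's rating blends mean outcome proximity
# against treatment targets with mean patient-feedback scores.

def performance_rating(outcome_scores, feedback_scores, w_outcome=0.7):
    """Weighted blend of mean outcome proximity (0-100) and mean
    patient feedback (0-100); w_outcome is an assumed weight."""
    mean = lambda xs: sum(xs) / len(xs)
    return (w_outcome * mean(outcome_scores)
            + (1 - w_outcome) * mean(feedback_scores))

# Two treated cases with outcome-proximity scores, two feedback scores.
rating = performance_rating([90.0, 80.0], [70.0, 90.0])
```

In the system described above, the inputs would come from the learning database and the rating could then be posted to social networking platforms.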
[0134] The orthodontic care management platform 102a may be in
communication with the orthodontic appliance management system for
designing the orthodontic appliance 103. In some example
embodiments, the orthodontic appliance management system may be
configured to generate a prototype of the orthodontic appliance
103, such as using a 3D printing workflow. In some other
embodiments, the orthodontic appliance management system 102b may
be configured for directly generating the orthodontic appliance
103, such as using in-clinic 3D printers. The orthodontic appliance
management system 102b may aid in designing appropriate appliances
for each stage and sequence of treatment. In some example
embodiments, the orthodontic appliance management system may
include software interfaces specifically implemented for designing
of the orthodontic appliance 103. The design files associated with
the design of the orthodontic appliance 103 may either be stored
locally, such as on the orthodontic appliance management system
102b or may be sent to a remote system for manufacturing. The
manufacturing of the orthodontic appliance 103 may be done using
any of the technologies known in the art, such as through
subtractive machining, additive manufacturing, die casting,
assembling and the like.
[0135] In some example embodiments, the orthodontic appliance 103
may be designed interactively and/or for specific stages and may be
used in tandem with non-customized devices. The appliance design may
include removable, fixed, combination and positional devices, with
both active and passive elements and the like. In some example
embodiments, before appliance manufacture, the software interface
associated with the orthodontic appliance management system 102b
may enable appropriate risk management, wherein risk associated
with the use of each appliance type and associated tooth movement
or treatment may be automatically generated or simulated by the
doctor. The levels of risk for each appliance or sequence may be
generated and the design process driven accordingly.
[0136] In some example embodiments, the appliance 103 may be
configured to accelerate tooth movement through thermal cycling. In
other embodiments, the appliance 103 may have the ability to fit
around any appliance in the mouth. In some example embodiments, the
appliance 103 may be equipped with sensors. In some example
embodiments, the appliance 103 may be a combination of different
types of orthodontic appliances, and fixed and removable
orthodontic attachments.
[0137] The orthodontic care management ecosystem 102 may also
include other components, apart from the orthodontic care
management platform 102a and the orthodontic appliance management
system, as will be discussed in FIG. 2.
[0138] Such components may include social networking
platforms 102c, third party service providers 102d and device
manufacturers 102e. The components 102c-102e may be connected to
the components 102a-102b over a network 102f, which may be a wired
or wireless network. The network 102f may be a Local Area Network
(LAN) or a Wide Area Network (WAN). In several embodiments, the
network 102f may be the Internet. The implementation of the network
102f may be carried out using a number of protocols such as 802.x,
Bluetooth, ZigBee, HSDPA, GSM, CDMA, LTE and the like.
[0139] FIG. 2 also illustrates in detail the various components
associated with the orthodontic care management platform 102a and
the orthodontic appliance management system 102b.
[0140] The orthodontic care management platform 102a may include an
imaging unit 102a-1. The imaging unit 102a-1 may be configured to
provide an image of the user 101 to the orthodontic care management
platform 102a. The image may be provided either by real-time
capture of the image or by using a pre-captured image. For
real-time capture of the image, the orthodontic care management
platform 102a may be equipped with an image sensor, such as a
camera, a scanner, an X-ray machine, a CT scan machine and the
like. For using a pre-stored image, the imaging unit 102a-1 may be
connected to an external source or may use an image uploaded by the
user 101 at the orthodontic care management platform 102. The image
received by the imaging unit 102a-1 may be provided for further
processing to any of the other components of the orthodontic care
management ecosystem 102.
[0141] The orthodontic care management platform 102a may include a
planning unit 102a-2. The planning unit 102a-2 may be configured to
provide treatment planning service using the orthodontic care
management platform 102. The treatment planning services may
include collecting patient data, identifying a smile
template for patient smile correction, getting approvals for
treatment, sharing treatment related data among a user community
and the like. The orthodontic care management platform 102 may be
configured to provide a software interface to the user for
providing treatment planning services. The user 101, such as a
patient, may be provided an input user interface, such as a form,
for filling out their details, providing consent about treatment,
providing treatment payment related processes and approvals,
uploading their images for treatment planning, selecting options
for smile type based on the patient data and the like.
[0142] Similarly, if the user 101 is a doctor, they may be able to
access a software interface for extracting patient data and
managing smile correction or malocclusion correction for teeth
based on the patient data. FIGS. 4A-4F illustrate exemplary user
interfaces that may be provided to the doctor for planning the
orthodontic treatment of the patient. FIG. 4A illustrates that a
patient image window 400a-1 may be used to automatically extract
patient's dento-facial feature details. An image window 400a-2 may
specify a desired outcome in terms of smile type desired by the
patient. The image 400a-2 may be provided by the patient at the
time of supplying input information regarding a desired smile
template. Another image window 400a-3 may be used to specify in
greater detail, a comparison between patient's current facial
and/or teeth profile and desired facial/teeth profile. The user
interface in FIG. 4A also includes two menus 102a7-1 and 102a7-2
that illustrate various menu options for orthodontic treatment
planning and execution. Specifically, the menu 102a7-1 illustrates
one or more stages of the treatment that the user may want to
select, such as whether it is image capture using X-ray, smile
matching, 3D printing and the like as displayed in the menu. The
menu 102a7-2 illustrates the various types of appliances that may
be available for orthodontic treatment. These appliances may
include wires, pin and tube attachments, fixed or removable
retainers, aligners and the like. It may be noted that using the
menu 102a7-2, the user 101 may be able to select one or more
appliances, or a combination of appliances for orthodontic
treatment.
[0143] FIG. 4B illustrates another exemplary user interface 400b
that may be used to provide a visual representation of the
patient's bony structure in a window 400b-1 and surgical movements
for bony movements in other windows, 400b-2 and 400b-3. FIG. 4C
illustrates in greater detail, a user interface 400c that may be
configured to provide an interactive interface to the doctor for
performing surgical movements of patient's bone soft tissue, as
provided in window 400c-1. An image window 400c-2 depicts the
patient's initial teeth and bone state, while an image window 400c-3
depicts the outcome of performing the bone movements specified in
the window 400c-1. In an example embodiment, the outcome depicted in
window 400c-3 may be used to specify measurements for the manufacture
of the orthodontic appliance 103. The doctor may also be able to
manage potential risks before manufacture of the orthodontic
appliance 103 using the orthodontic care management platform 102.
The user interface in FIG. 4C also includes the menus 102a7-1 and
102a7-2 which have been discussed previously.
[0144] The orthodontic care management platform 102a may include a
staging unit 102a-3 which may be configured to provide treatment
staging and risk management services. FIG. 4D illustrates a user
interface 400d providing visualization of potential risks that may
occur due to planning of tooth movements, such as those depicted in
FIG. 4C. Window 400d-1 illustrates that front teeth that are not
completely erupted may be at risk of collisions. Using the
interactive interface windows 400d-2 and 400d-3, and the menus
102a7-1 and 102a7-2, the doctor may be able to visualize the
potential biological and displacement risks associated with placement
of an orthodontic appliance, such as braces, and can accordingly
manage tooth movements and appliance placement to avoid collision
of roots. Also, the doctor may be able to predict side effects of
their planned treatment, such as illustrated in FIGS. 4E and 4F,
which show a particular appliance placement scenario and its
potential side effects in windows 400e-1 to 400e-3, where a bite
opens up. The windows 400f-1 and 400f-2 illustrate how this risk may be
circumvented using anterior box elastics attachments. Such risks
may be predicted in advance, at the treatment planning stage
itself, using the orthodontic care management platform 102. The
prediction of these risks may be done based on likelihood of an
event, severity of an event, context, patient age, patient sex and
other such factors. In an example, these factors may be used by a
deep learning module of the orthodontic care management platform
102 to provide predictive risk assessment.
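The risk factors enumerated above (likelihood, severity, context, patient age and sex) can be combined into a simple ranking score. The sketch below is an illustrative assumption: the multiplicative formula and factor names are hypothetical, and the application envisions a deep learning module rather than this closed-form rule.

```python
# Illustrative sketch: each candidate adverse event is scored from its
# likelihood and severity, adjusted by patient-context factors, and the
# events are ranked most-risky first. Weights are hypothetical.

def risk_score(likelihood, severity, age_factor=1.0, context_factor=1.0):
    """Expected-impact style score; higher means riskier."""
    return likelihood * severity * age_factor * context_factor

def rank_risks(events):
    """Return events sorted by descending risk score."""
    return sorted(events, key=lambda e: risk_score(**e["factors"]),
                  reverse=True)

events = [
    {"name": "root collision",
     "factors": {"likelihood": 0.30, "severity": 0.9}},
    {"name": "bite opening",
     "factors": {"likelihood": 0.60, "severity": 0.3}},
]
ranked = rank_risks(events)
```

Such a ranking could then drive which tooth movements or appliance placements the doctor revisits at the planning stage.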
[0145] In some example embodiments, the staging unit 102a-3 may
also be configured to provide staging and sequencing services to
the doctor. That is to say, the doctor may be able to plan the sequence
of different stages of treatment using the software based
interfaces, such as depicted in FIGS. 5A-5C, provided by the
orthodontic care management platform 102. FIG. 5A illustrates an
exemplary user interface 500a that may be used to provide an
interactive interface to the user 101, such as a doctor, for
planning different stages of orthodontic treatment. In some example
embodiments, the staging of treatment may enable the doctor to
incorporate stage based optimizations in the design of the
orthodontic appliance 103. In the user interface 500a, the window
500a-1 depicts a model of a patient's teeth to be treated, window
500a-2 depicts the various movements planned for the teeth and
window 500a-3 provides interactive interface for adjusting the
degree of movements.
[0146] In some example embodiments, the tooth movements may be
planned automatically by defining milestones for teeth movements
and matching the progress of treatment velocity with desired
movements at each stage. This may be depicted using the user
interface 500b of FIG. 5B, in which window 500b-1 provides an
interactive interface for defining milestones of tooth movements,
window 500b-2 shows the actual movements required and window 500b-3
shows the degree of movements. In some example embodiments, the
staging unit 102a-3 of the orthodontic care management platform 102
may be configured to provide solutions to rectify unplanned events
in tooth movements by properly sequencing the treatment stages and
designing course correction with appropriate orthodontic
appliances. In some example embodiments, the staging unit 102a-3
may also be configured to provide checklists for different stages
of the treatment. These checklists may be accessed by the doctor
using a smart display, such as an AR display, VR display, smart
glasses, a smart watch and the like.
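The milestone check described above, which matches measured tooth movement against the planned movement at each stage, can be sketched as a simple tolerance test. The stage names, millimetre values and `flag_off_track` helper below are illustrative assumptions only.

```python
# Illustrative sketch: flag stages where the measured tooth movement
# lags the planned movement by more than a tolerance, so the staging
# unit can trigger course correction. Values are hypothetical.

def flag_off_track(milestones, tolerance_mm=0.5):
    """Return names of stages where measured movement lags the plan
    by more than tolerance_mm (in millimetres)."""
    return [m["stage"]
            for m in milestones
            if m["planned_mm"] - m["measured_mm"] > tolerance_mm]

milestones = [
    {"stage": "space closure", "planned_mm": 2.0, "measured_mm": 1.8},
    {"stage": "intrusion", "planned_mm": 1.5, "measured_mm": 0.6},
]
behind = flag_off_track(milestones)
```

A flagged stage could then feed the course-correction and appliance-redesign services described above.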
[0147] The user interfaces in FIGS. 5A-5C also include the menus
102a7-1 and 102a7-2 which have been discussed previously.
[0148] The orthodontic care management platform 102a may include a
testing unit 102a-4. The testing unit 102a-4 may be configured to
provide diagnostic services and monitoring services for monitoring
the progress of the orthodontic treatment against a desired
outcome. FIGS. 6A-6D illustrate exemplary user interfaces
600a-600d that may be provided to monitor the progress of the
orthodontic treatment. FIG. 6A illustrates a user interface 600a
depicting a large screen providing a diagnostic model of a
patient's teeth, a target model for the patient's teeth in the upper
right screen and the outcome of the current treatment in the lower
right screen. FIG. 6B illustrates the user interface 600b in which
the diagnostic model of the patient teeth may be compared against
the target model by cross hatching the initial diagnostic model
over the target model. In some example embodiments, the target
model may be sent to an insurance agency for approval to cover
treatment cost. The outcome of the orthodontic treatment may be
measured against the initial diagnostic model, such as depicted in
the user interface 600c depicted in FIG. 6C or the distance from
the planned target. The outcome of the orthodontic treatment may be
dependent on the ability of the doctor to achieve the desired
treatment outcome. In some example embodiments, this ability may be
measured in terms of percentage proximity of the outcome teeth
model to the target teeth model, as depicted in user interface 600d
of FIG. 6D. In the user interface 600d, any external features or
internal landmark of interest may be chosen to evaluate the
treatment outcome, such as arch forms, cusps, tips and the like.
The user interfaces in FIGS. 6A-6D also include the menus 102a7-1
and 102a7-2 which have been discussed previously.
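The "percentage proximity" outcome measure described above can be sketched by comparing corresponding landmarks (e.g. cusp tips, incisal edges) on the outcome and target teeth models. The landmark names, coordinates and 0.25 mm threshold below are illustrative assumptions, not the application's actual metric.

```python
import math

# Illustrative sketch: a landmark counts as achieved when its outcome
# position lies within a distance threshold of its target position;
# the score is the percentage of achieved landmarks.

def proximity_percent(outcome, target, threshold_mm=0.25):
    """Percentage of landmarks within threshold_mm of their target."""
    hits = 0
    for name, pos in target.items():
        if math.dist(outcome[name], pos) <= threshold_mm:
            hits += 1
    return 100.0 * hits / len(target)

target = {"cusp_16": (1.0, 2.0, 0.5), "incisal_11": (0.0, 3.1, 0.2)}
outcome = {"cusp_16": (1.1, 2.0, 0.5), "incisal_11": (0.0, 3.9, 0.2)}
score = proximity_percent(outcome, target)
```

As the paragraph above notes, the doctor may choose which external features or internal landmarks, such as arch forms or cusp tips, enter this comparison.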
[0149] The orthodontic care management platform 102a may also
include a design unit 102a-5. The design unit 102a-5 may be
configured to provide services related to the design of the
orthodontic appliance 103. For example, the design unit 102a-5 may
be configured to help a doctor in designing different types of
configurations of orthodontic appliances, and attachment coupling
mechanisms for the orthodontic appliance 103. For the purpose of
designing, the design unit 102a-5 may be configured to operate in
collaboration with the orthodontic appliance management system 102b
for the complete lifecycle, from design to manufacturing, of the
orthodontic appliance 103. The design unit 102a-5 of the
orthodontic care management platform 102a may provide user
interfaces for designing various types of devices or attachments
for the orthodontic appliance 103 based on analysis and outcomes of
all other units used in treatment planning, such as the imaging
unit 102a-1, the planning unit 102a-2, the staging unit 102a-3, and
the testing unit 102a-4.
[0150] The orthodontic care management platform 102a may include a
feedback unit 102a-6 for gathering and managing feedback related
services, such as feedback related to treatment, orthodontic
doctor, treatment cost and the like. In some example embodiments,
such feedback may be shared over social networking platforms 102c
associated with the orthodontic care management platform 102a.
Apart from the major design, planning and staging components
discussed above, the orthodontic care management platform 102a may
also include a UI unit 102a-7 to manage input/output access for the
orthodontic care management platform 102a.
[0151] In some example embodiments, the UI unit 102a-7 may be
configured for providing different interface mechanisms, including
but not limited to speech, gesture, text, eye tracking, mouse based
input, keystroke detection and the like.
[0152] In some example embodiments, the UI unit 102a-7 may be
configured to provide access to an authentication interface
including but not limited to voice recognition, password based
authentication, retinal scan, fingerprint scan, biometric
authentication, facial recognition and the like.
[0153] In some example embodiments, the UI unit 102a-7 may be
configured to provide a context based interactive avatar interface
which may take multiple roles based on user context.
[0154] FIG. 7 illustrates an exemplary visualization of the
interactive avatar. In some embodiments, the orthodontic care
management platform 102a may be accessed using a mobile device.
Thus, the avatar may be configured to provide a virtual interface
1000, in the form of an animated human figure, for an application
installed on the mobile device for accessing the orthodontic care
management platform 102a.
[0155] In some example embodiments, the avatar may be context
based. For example, the avatar may be used to provide follow-up
alerts to a patient, such as reminding them to wear their
orthodontic appliance, to brush their teeth on time, and scheduling
and alerting about orthodontist appointments and the like.
[0156] At the same time, the avatar may be configured to provide
checklists for treatment follow-up to a doctor, sending reminder
alerts on the doctor's mobile device, updating them about scheduled
calendar appointments, treatment monitoring and the like.
[0157] In some example embodiments, the avatar may be configured to
modify patient behavior. In some embodiments, avatars for doctors
and patient may be created to aid in learning.
[0158] The data related to treatment schedules, patient
information, orthodontist information and the like may be stored in
a memory unit 102a-8 of the orthodontic care management
platform.
[0159] In some example embodiments, the memory unit 102a-8 may be a
secure blockchain enabled database for maintaining medical records
related to the user 101 accessing the orthodontic care management
ecosystem 102.
[0160] The orthodontic care management ecosystem 102 also includes
the orthodontic appliance management system 102b, which further
includes the design unit 102b-1, the 3D printing unit 102b-2, and
the manufacturing unit 102b-3 for designing and manufacturing
various orthodontic appliances.
[0161] In some example embodiments, the orthodontic appliance
management system 102b may be configured to perform a 3D printing
workflow 300 as illustrated in FIG. 3.
[0162] The workflow 300 may include gathering patient-related data,
such as data about the patient's dento-facial features, various
patient scans, patient demographics, authorizations and the like. In
some example embodiments, the data gathering may be performed using
the imaging unit 102a-1, the UI unit 102a-7 and the memory unit
102a-8 of the orthodontic care management platform 102a. The gathered data
may then be used to perform treatment planning, treatment staging
and orthodontic appliance designing, such as using the planning
unit 102a-2, the staging unit 102a-3, the testing unit 102a-4 and
the design unit 102a-5 of the orthodontic care management platform
102a. Further, once the appliance design has been finalized, the
appliance may be manufactured using the orthodontic appliance
management system 102b for performing orthodontic appliance 3D
printing model generation and 3D printing the product. At each
stage of the workflow 300, feedback management and social data
sharing may also be provided by the orthodontic care management
ecosystem 102.
[0163] FIG. 8 illustrates an exemplary flow diagram of a method 800
for managing the entire orthodontic care management workflow
according to an example embodiment of the present invention. The
method 800 may be used to provide an orthodontic care management
solution to a user in the orthodontic care management ecosystem.
The user may be any of a patient, a doctor, a prospective patient
seeking treatment help, a third party user such as a vendor, a
general practitioner seeking doctor data and the like. The method
800 may be implemented, such as by the orthodontic care management
ecosystem 102 and, more particularly, by the orthodontic care
management platform 102a.
[0164] The method 800 may include, at step 801, receiving data
related to the user 101. The user 101 may be a patient accessing
the orthodontic care management ecosystem 102. In some embodiments,
the user 101 may alternately be a doctor accessing the orthodontic
care management ecosystem 102. The data related to the user may
comprise data related to the facial anatomy of the user 101.
In some example embodiments, the facial anatomy may comprise smile
anatomy of the user 101. The data related to the user 101 may be
received such as by scanning the facial anatomy of the user to
capture facial features, smile features and the like. In some
embodiments, the user 101 may have a preference for a specific type
of smile, such as smile of a celebrity, a specific orthodontic
treatment to create a specific smile. In such example, the user 101
may provide their data in the form of user preferences entered o a
user interface, such as the UI 102a7 of the orthodontic care
management platform 102a. In some other example embodiments, the
user data may be received by downloading such data from a website.
Once, the user data has been received in any such manner, the
method 800 may proceed to step 802.
[0165] The method 800 may further include, at step 802, obtaining
authorization data associated with the orthodontic care management
solution. In some example embodiments, this data may be provided by
the user, such as approval for a particular type of smile
correction, preference to a specific smile and the like. In some
other examples, the authorization data may include treatment
related approvals, such as obtained from third party providers 102d
including banks, insurance companies, government organizations and
the like. In some example embodiments, the approvals may also
relate to gathering patient consent regarding the treatment cost,
orthodontist reputation management and the like. After receiving
the desired approvals, the method 800 proceeds to step 803.
[0166] The method 800 may further include, at step 803, determining
a treatment plan. The treatment plan may be determined, such as
using the planning unit 102a-2 of the orthodontic care management
platform 102a as discussed earlier. The treatment plan may help
facilitate a care provider, such as a doctor or a specialist, to lay
out the entire treatment in terms of different stages, wherein a
stage is an operation to be performed during the course of the
treatment. For example, one of the stages may be a movement of a
tooth in a desired direction. Another stage may be placement of an
orthodontic attachment at a desired position in the patient's
mouth. In some example embodiments, the treatment plan may be
defined by the doctor, based on the smile anatomy of the patient
and the patient's desired smile anatomy. In some other example
embodiments, the treatment plan may be automatically generated
based on patient data, authorization data and some historical data
derived from the memory unit 102a-8, wherein the historical data
may be related to patient preferences, other patient treatment
results, success and failure records of treatment strategies and
the like. The treatment plan thus generated may be outlined and/or
stored in terms of a plurality of stages. The stages may be defined
in a particular order at step 804 of the method 800.
[0167] The method 800 may further include, at step 804, determining
a sequencing plan. The sequencing plan may be associated with the
treatment plan and may include an arrangement of the one or more
stages of operations associated with the treatment plan. In some
example embodiments, the sequencing plan may be generated to obtain
optimization of treatment outcomes, such as using the staging unit
102a-3 of the orthodontic care management platform 102a discussed
earlier. For example, the sequencing plan may be configured to
automatically or interactively help design a treatment management
approach that provides the greatest-value choices driven by the
patient data, such as the most aesthetic smile, the least amount of
treatment time, the fewest visits, the optimized sequencing of
treatment when care is offered between multiple doctors situated
remotely, and the reconciliation and optimization of conflicting
opinions from various sources by searching for evidence in
reference databases of research material and patient archive
records of treatment outcomes. Once the sequencing plan is
generated, the method 800 may proceed to step 805.
[0168] The method 800 may further include, at step 805, displaying
the sequencing plan to the user, such as the doctor or orthodontist
performing the treatment. The sequencing plan may be displayed such
as on a display interface of the user device used by the
doctor.
[0169] After display, the method 800 may proceed to step 806,
wherein the method 800 may include receiving feedback data
associated with the treatment plan. The feedback data may include
feedback provided by the patient about the various aspects
of the treatment, such as execution, cost, success, time taken,
comfort level, patient awareness, patient satisfaction and the
like. In some embodiments, the feedback may be received at the end
of the treatment by the doctor, by providing a feedback form to the
patient on a user interface of the user device that is used by the
doctor and connected to the orthodontic care management
platform 102a. The feedback data thus received may be used by the
feedback unit 102a-6 of the orthodontic care management platform
102a discussed earlier. In some example embodiments, this feedback
data may be used in reputation management of the doctor providing
the treatment. For this, the method 800 may include, at step 807,
updating a database, such as the learning database discussed earlier,
connected with the orthodontic care management platform 102a. The
learning database may also be used to store treatment data, apart
from the user data, doctor reputation data, treatment tutorials and
the like.
[0170] In an example embodiment, a system for performing the method
of FIG. 8 above, such as the orthodontic care management platform
may comprise a processor configured to perform some or each of the
operations (801-807) described above. The processor may, for
example, be configured to perform the operations (801-807) by
performing hardware implemented logical functions, executing stored
instructions, or executing algorithms for performing each of the
operations. Alternatively, the system may comprise means for
performing each of the operations described above.
[0171] In some example embodiments, the operations described in the
method 800 above may be further enhanced by using AI-related
techniques, as described in the method 900 of FIG. 9.
[0172] FIG. 9 illustrates another exemplary method 900 for
providing an orthodontic care management solution. The method 900
includes, at step 901, collecting user data. The user data may be
data about user preferences, user's facial anatomy, smile anatomy,
picture, scan and the like as discussed previously. Once the user
data is collected, the method 900 may include, at step 902,
identifying related user data using artificial intelligence
techniques. The related user data may include data about
other users and/or patients who have undergone similar
treatment, patients with similar facial anatomy or smile anatomy,
the success and failure of treatment involving similar patients,
patients treated by the same orthodontist and the like. In some
example embodiments, the data identification may involve performing
pattern recognition, pattern matching, natural language processing
(NLP), generating a learning model for automatic patient data
matching, providing auto-suggestions and the like. The related user
data and the user data identified in this manner may be used, at
step 903, for creating a structured database. The organization of
all the data in the form of a structured database provides the
unique advantage of ease of access and faster retrieval of data.
This data may then be modified, at step 904, such as to provide
additional details about the user, to perform pattern matching, and
to identify similar data. The modified data may further, at step
905, be encrypted for data confidentiality and security purposes.
The encrypted data may be used, at step 906, for performing
diagnostic analysis on the user data. The diagnostic analysis may
include, at step 907, identifying whether the user wants a
treatment simulation. If yes, then at step 907a1,
some user specific constraints may be identified. Further, at step
907a2, it may be identified based on patient data analysis whether
the patient needs professional intervention. If yes, then at step
907a3, a desired professional may be identified based on a
plurality of factors. Alternatively, the method 900 may proceed to
step 907b1 if it is identified at step 907 that the patient does
not need treatment simulation. In this case, at step 907b1, pattern
recognition may be performed to identify similar user profiles.
Further, at step 907b2, these similar profiles may be displayed to
the user, such as on the display interface of the user device.
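The step-907b1 pattern recognition over user profiles can be sketched as a simple nearest-neighbour match. A deployed system would use a learned embedding or classification model; the numeric feature names below are invented for illustration:

```python
# Hypothetical sketch: rank archived patient profiles by similarity
# to a new patient, using inverse Euclidean distance over numeric
# facial/smile features. Feature names are illustrative only.
import math

def similarity(a, b):
    """Inverse Euclidean distance between two feature dicts."""
    d = math.sqrt(sum((a[k] - b[k]) ** 2 for k in a))
    return 1.0 / (1.0 + d)

def most_similar(profile, archive, top_n=2):
    ranked = sorted(archive, key=lambda p: similarity(profile, p["features"]),
                    reverse=True)
    return [p["id"] for p in ranked[:top_n]]

new_patient = {"overjet_mm": 4.0, "arch_width_mm": 36.0}
archive = [
    {"id": "P1", "features": {"overjet_mm": 4.2, "arch_width_mm": 35.5}},
    {"id": "P2", "features": {"overjet_mm": 7.5, "arch_width_mm": 30.0}},
    {"id": "P3", "features": {"overjet_mm": 3.9, "arch_width_mm": 36.2}},
]
print(most_similar(new_patient, archive))  # ['P3', 'P1']
```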
[0173] In an example embodiment, a system for performing the method
of FIG. 9 above, such as the orthodontic care management platform
may comprise a processor configured to perform some or each of the
operations (901-907) described above. The processor may, for
example, be configured to perform the operations (901-907) by
performing hardware implemented logical functions, executing stored
instructions, or executing algorithms for performing each of the
operations. Alternatively, the system may comprise means for
performing each of the operations described above.
[0174] In some examples, the method 900 may be further modified as
depicted in FIG. 10.
[0175] FIG. 10 includes a method 1000 for monitoring treatment
progress for a user. The method 1000 may include, at step 1001,
collecting user data. Further, the method 1000 may include, at step
1002, creating a virtual 3D user model. The virtual 3D user model
may display a patient's facial features, along with different
measurements for different sections of the model, such as depicted
in FIG. 4. Once the model is created, the method 1000 may include,
at step 1003, creating separate objects from the virtual user
model. The method 1000 may further include, at step 1004, defining
boundary constraints for the separate objects. These boundary
constraints may include the amount of movement, the degree of
movement, and the like for various objects. For example, FIGS.
5A-5B depict various boundary constraints for a model of a user's
lower jaw. Once the boundary constraints are set, the method 1000
may include, at step 1005, sequencing the treatments using the
previously defined constraints. Further, the method 1000 may
include, at step 1006, creating checkpoints to monitor treatment
progress. As part of the treatment, the method 1000 may include,
at step 1007, defining therapeutics. Further, the method 1000 may
include, at step 1008, defining potential outliers and, at step 1009,
monitoring the treatment against checkpoints. It may be checked at
step 1010 whether the treatment is on-course. If yes, then at step
1010a, similar data may be identified and if no, then at step 1010b
a custom solution may be designed to bring the treatment back
on-course.
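Steps 1004 and 1009-1010 can be sketched as a per-object boundary constraint plus an on-course check against a checkpoint. The units, limits, and tolerance below are hypothetical assumptions, not disclosed values:

```python
# Sketch of boundary constraints (step 1004) and checkpoint monitoring
# (steps 1009-1010). All limits and tolerances are invented examples.
from dataclasses import dataclass

@dataclass
class BoundaryConstraint:
    max_translation_mm: float   # allowed amount of movement
    max_rotation_deg: float     # allowed degree of movement

def within_bounds(translation_mm, rotation_deg, c):
    """Check a proposed object movement against its boundary constraint."""
    return (abs(translation_mm) <= c.max_translation_mm
            and abs(rotation_deg) <= c.max_rotation_deg)

def on_course(measured_mm, planned_mm, tolerance_mm=0.3):
    """Step 1010: compare measured progress against the checkpoint plan."""
    return abs(measured_mm - planned_mm) <= tolerance_mm

molar = BoundaryConstraint(max_translation_mm=2.0, max_rotation_deg=15.0)
print(within_bounds(1.5, 10.0, molar))          # True
print(on_course(measured_mm=1.1, planned_mm=1.0))  # True: no custom solution needed
```

When `on_course` returns False, the flow would branch to step 1010b and a custom corrective solution.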
[0176] In an example embodiment, a system for performing the method
of FIG. 10 above, such as the orthodontic care management platform
may comprise a processor configured to perform some or each of the
operations (1001-1010) described above. The processor may, for
example, be configured to perform the operations (1001-1010) by
performing hardware implemented logical functions, executing stored
instructions, or executing algorithms for performing each of the
operations. Alternatively, the system may comprise means for
performing each of the operations described above.
[0177] In some example embodiments, the user may be able to use AI
related techniques for performing action to voice and voice to
action mapping of user data, as depicted in FIG. 11.
[0178] FIG. 11 illustrates a method 1100 for providing an
orthodontic care management solution to the user using AI related
techniques. The method 1100 may include, at step 1101, collecting
user voice data. User voice data may include, for example, voice
commands. At step 1102, this voice data may be converted to text
form, using automatic speech recognition (ASR) and natural language
processing (NLP). Further, this text may be used, at step 1103, for
creating a text-to-action mapping using AI. Further, at step 1104,
the desired action may be performed. This action may be recorded,
at step 1105, and further be used, at step 1106 for generating an
audio for performed action. Further, using the audio, at step 1107,
auto-suggestions may be provided.
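The text-to-action mapping of steps 1103-1106 can be sketched with a toy keyword matcher; real ASR and NLP (step 1102) are replaced here by pre-transcribed text, and the action names are invented for illustration:

```python
# Minimal sketch of the text-to-action mapping (step 1103) and the
# audio confirmation text (step 1106). Action names are hypothetical.
ACTION_KEYWORDS = {
    "show_treatment_plan": ("show", "plan"),
    "book_appointment": ("book", "appointment"),
    "display_progress": ("show", "progress"),
}

def text_to_action(transcript):
    """Map a recognized transcript to an action via keyword matching."""
    words = set(transcript.lower().split())
    for action, keywords in ACTION_KEYWORDS.items():
        if all(k in words for k in keywords):
            return action
    return "unknown"

def action_to_audio_text(action):
    """Step 1106: text that would be synthesized to confirm the action."""
    return f"Performed action: {action.replace('_', ' ')}."

action = text_to_action("Please show my treatment plan")
print(action)                        # show_treatment_plan
print(action_to_audio_text(action))  # Performed action: show treatment plan.
```

A production system would replace the keyword table with a trained intent model, as discussed for the VCN bot later in this description.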
[0179] In an example embodiment, a system for performing the method
of FIG. 11 above, such as the orthodontic care management platform
may comprise a processor configured to perform some or each of the
operations (1101-1107) described above. The processor may, for
example, be configured to perform the operations (1101-1107) by
performing hardware implemented logical functions, executing stored
instructions, or executing algorithms for performing each of the
operations. Alternatively, the system may comprise means for
performing each of the operations described above.
[0180] In some example embodiments, the voice-to-action and action
to-voice mappings described in FIG. 11 may be displayed to the user
in the form of an interactive avatar-based UI as depicted in
FIGS. 12-13. FIG. 12 illustrates an interactive avatar for
generating a voice to action mapping, while FIG. 13 illustrates a
coaching avatar for showing an action to voice mapping. The avatars
shown in FIGS. 12-13 may be supported by an AI enabled orthodontic
care management system and methods for enabling the same.
[0181] FIGS. 14A-14B illustrate exemplary methods for providing
the orthodontic care management solution to the user using AI
techniques. The method 1400a in FIG. 14A illustrates the patient's
side of the implementation of the orthodontic care management
solution, while FIG. 14B illustrates the doctor's side of the
implementation of the orthodontic care management solution.
[0182] The method 1400a illustrates the overall patient workflow
from the start of treatment to its conclusion. In some example
embodiments, the method 1400a may be implemented by a Virtual Care
Navigator (VCN), which may be an essential part of this workflow;
however, the VCN is not explicitly shown in this diagram and will be
described separately in greater detail in FIGS. 15-17 discussed
later.
[0183] The method 1400a illustrates, at step 1401a, capturing
patient data, such as by using an input capturing device. The input
capturing device may include, for example, a scanner, an image capturing
device like a camera, an input capture mechanism available on a
mobile device, and other similar technologies available for
capturing multiple 2D images of the user. In an example, the image
capturing technology may include capturing a patient's facial
morphology using computer vision technology, such as using free
open source tools for facial recognition and facial orientation
detection. In some example embodiments, computer vision may be used
in combination with the image capturing technology to ensure that
the user takes pictures in all appropriate orientations (facing
front, right-side, left-side) required by a system, such as the
orthodontic care management platform 102a. Some open source
computer vision technologies, such as OpenCV, also include a
statistical machine learning library that may be used to learn the
user's facial features. Once learned, the system can automatically
detect the user the next time around. Thus, the captured user data
may also be used as part of a biometric authentication system for
access to the patient's dental/medical records, which may be stored
in a database. Once patient data is successfully captured, the
method 1400a may proceed, at step 1402a, to analyze patient
morphology.
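The guided multi-orientation capture described above can be sketched as follows. The actual face and orientation detection would come from a computer-vision library such as OpenCV; here the detector's output is passed in as labels so the control logic stays self-contained:

```python
# Sketch of guided capture: track which required orientations the user
# has photographed and which remain. The detector is external (e.g.
# OpenCV face/orientation detection); this sketch consumes its labels.
REQUIRED_ORIENTATIONS = {"front", "left", "right"}

def remaining_orientations(captured):
    """Return the orientations the user still needs to photograph."""
    return REQUIRED_ORIENTATIONS - set(captured)

def capture_session(detections):
    """detections: orientation labels reported by the (external) detector."""
    captured = []
    for orientation in detections:
        if orientation in remaining_orientations(captured):
            captured.append(orientation)
    return sorted(captured), sorted(remaining_orientations(captured))

done, missing = capture_session(["front", "front", "right"])
print(done, missing)   # ['front', 'right'] ['left']
```

In practice the app would prompt the user with the `missing` list until all three orientations are captured.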
[0184] Once the patient's details are thus successfully captured
and analyzed, the method 1400a may include, at step 1403a,
modifying patient data, such as by using patient affect. In some
example embodiments, the modification to the patient data may
include morphological analysis and 2D-to-3D conversion, such as by
building 3D models using image conversion tools known in the art.
For example, OpenCV may be used to perform morphological analysis
of the patient's face by extracting pertinent features from the 3D
model and showing the user how their face is likely to change with
the proposed orthodontic treatment. Further, after conversion, the
image may be subjected to deep learning using tools known in the
art, such as TensorFlow (www.tensorflow.org), to provide the
machine learning capabilities required for such analysis and
projection. Convolutional neural networks (CNN), well-known to
those skilled in the art, are one example of several deep
learning algorithms that can be applied to this task. The machine
learning techniques such as CNN may include analysis of various
factors, such as a patient's view of their own attractiveness,
which may be used to determine their attractiveness preference
(affective sense) prior to the start of treatment, so that the
treatment can be patterned to meet their objectives. In this
regard, emotion detection (from the patient's images and models)
plays a significant role in determining how happy (or unhappy) a
patient is with their looks. Many patients go in for orthodontic
treatment because they are not happy with how they look. Their
level of dissatisfaction may be more reliably captured by emotion
detection tools, rather than their verbal exposition. For example,
some common tools known in the art like Emotient
(www.emotient.com), Affectiva (www.affectiva.com), and EmoVu
(http://emovu.com) may be used for emotion detection.
[0185] Once patient's preferences are factored in, the method 1400a
may include, at step 1404a, generating a personalized care plan and
precision therapy for the patient. This may be achieved, for
example, by using built-in algorithms, such as within a processing
module in the orthodontic care management platform 102a, wherein
the built-in algorithms may be configured to provide an augmented
reality toolkit, a plan generator, and other built-in functions suitable for
orthodontic care processing. The personalized care plan generated
in this manner may be presented to the patient, such as using the
UI unit 102a-7 of the orthodontic care management platform 102a.
The plan and its effectiveness may be monitored, at step 1405a, to
provide personalized monitoring to the patient. This may include
providing built-in app-based monitoring, wherein the app may be
used to provide reminders, alerts for medicine intake, due-date
reminders for patient visits to the clinic, and the like. Further,
the orthodontic care management platform 102a may be configured for
performing custom analytics on patient activity, providing
doctor-patient feedback monitoring, and providing visualization of
the current treatment plan and any deviation from it using data
visualization tools, design tools, and a 3D plan generator. The
personalized monitoring may enable the patient
to be fully aware and in-control of their treatment progress and
raise red flags as and when required to get the best possible
treatment. Thus, the orthodontic care management solution provided
to the patient promises to be patient-centric, highly advantageous,
and cost-efficient. Further,
the patient workflow in method 1400a also includes, at step 1406a,
providing 24/7 patient support from top orthodontists. The patient
support may also include activities such as a built-in scheduler
for notifications, an app development module for finding nearby top
orthodontists, 24/7 chat support and call facility, and the like.
The method 1400a may also be configured to, at step 1407a, gather
patient feedback for obtaining information about the patient's reported
experience and outcomes. This may be done using a built-in module
for patient/doctor feedback collection and evaluation, such as
using the feedback unit 102a-6 of the orthodontic care management
platform 102a. The feedback unit 102a-6 may also be configured for
doctor-patient feedback collection and visualization and built-in
smile feedback support and rating. The feedback data and other
patient data gathered in this manner may be subjected to, at step
1408a, further data visualization, monetization, and analysis.
This may be done using easy-to-understand dashboards (like those
generated by tools such as Power BI (https://powerbi.microsoft.com)
and Tableau (https://www.tableau.com), well-known to those skilled
in the art). The dashboards generated in this way may be configured
for leveraging the data analytics and mining created by the user,
such as the patient or the doctor using the cloud databases. In
some example embodiments, all patient and treatment data may be
securely stored on a permissioned blockchain network that is
accessible to patients, the doctor, and other entities (pharmacies)
who are authorized to see that data. The techniques for the
creation of, and addition of records to, a permissioned blockchain
network are well understood by those skilled in the art. The
advantages of using a blockchain for patient record storage in this
invention are twofold: immutability and transparency. Once stored
on the blockchain, patient records cannot be tampered with by
anyone (immutability) and the patient has access to it at all times
without being at the mercy of the doctor, or having to go through
any third-parties to access the data (transparency). Further, since
the patient is in control of their data at all times, they can, at
their discretion, choose to monetize it by making it available to
other doctors and health-related organizations for a fee. Other
specific advantages provided by the method 1400a are illustrated by
step 1409a, where a learning database is updated with reputation
management data and learning data to provide a continuously
learning, AI-enabled orthodontic care management solution to
the patient. In this age of instantaneous social media reviews,
doctors need to be very cognizant of negative reviews by patients
adversely affecting their reputation and practice. There is
currently no mechanism for patients' reputation to be logged and
evaluated, and no way to determine if their negative reviews are
justified or written out of sheer malice. By virtue of patient and
treatment records being maintained on a blockchain network, the
disclosed methods and systems of the invention ensure that there is
complete transparency on both sides. Patient reviews are also
tracked, so if a patient is in the habit of providing negative
reviews as a matter of course for no valid reason, that patient's
reputation suffers, and the doctor will be made aware of this when
the patient signs up for treatment with the doctor. Further,
patients are less likely to claim malpractice or bad treatment
since the records are accessible to both patient and the doctor
from the start of treatment all the way till the end.
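The immutability property claimed above can be illustrated with a minimal hash-chained record log. This is only a stand-in sketch for a real permissioned blockchain network (e.g. Hyperledger Fabric), not the disclosed implementation:

```python
# Illustrative sketch of immutability: each record stores the hash of
# its predecessor, so tampering with any earlier record invalidates
# every later hash during verification.
import hashlib, json

def add_record(chain, payload):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"payload": payload, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})

def verify(chain):
    """Recompute every hash and check the chain linkage."""
    prev = "0" * 64
    for block in chain:
        body = {"payload": block["payload"], "prev": block["prev"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev"] != prev or block["hash"] != digest:
            return False
        prev = block["hash"]
    return True

chain = []
add_record(chain, {"patient": "P1", "note": "aligner stage 1 fitted"})
add_record(chain, {"patient": "P1", "note": "checkpoint 1 on course"})
print(verify(chain))                     # True
chain[0]["payload"]["note"] = "edited"   # attempted tampering
print(verify(chain))                     # False
```

A permissioned network adds access control and distributed consensus on top of this basic linkage, which is what provides the transparency described above.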
[0186] As discussed above, the method 1400a is oriented towards a
patient and the services provided to the patient by the orthodontic
care management platform 102a of the orthodontic care management
ecosystem. Similarly, the method 1400b discussed in the
following description is oriented towards a doctor and the services
provided to a doctor by the orthodontic care management platform
102a. FIG. 14B illustrates the method 1400b showing the overall
doctor workflow from the start of treatment to its conclusion at a
high-level. Again, the Virtual Care Navigator (VCN), also an
essential part of this workflow, is not explicitly shown in this
diagram, because it is all-pervasive. It is used to guide the
doctor through every step of this workflow. Given its importance,
the VCN is described separately in greater detail in FIGS. 15, 16
and 17.
[0187] As illustrated in the method 1400b, the steps 1401a, 1402a,
1404a, 1405a, 1406a, 1407a, and 1409a of the patient workflow in
the method 1400a are also present in the doctor workflow of method
1400b as the steps 1401b, 1402b, 1403b, 1404b, 1405b, 1406b, and
1408b respectively. However, the focus of the steps will be
different in each of the workflows. For example, doctor-specific
activities like data mining and data analysis (of patient and
treatment data) are part of step 1407b while step 1408a (data
monetization and visualization) pertains to patients consuming the
output of the analytics done in step 1407b as well as monetizing it
by selling their data to doctors and other health organizations
that can benefit from it. Similarly, reputation management for
doctors in step 1408b (doctors need to be constantly aware of their
reputations as it has direct bearing on the viability of their
practice) is shown as part of the doctor workflow. These steps are
described in greater detail below.
[0188] The method 1400b includes, at step 1407b, performing data
mining and analysis on patient data for use by the doctor during
the provision of the orthodontic care management solution. The data
analysis may be done using data analysis tools known to those of
ordinary skill in the art, such as Matplotlib, infographics, and
Google Analytics, to compute data analytics. Further, the data
mined in this manner may be used for tracking and visualizing the
user journey in applications using a custom Google Analytics
module.
[0189] The method 1400b further includes, at step 1408b, updating
reputation management data for the doctor and using continuous
learning techniques to apply this data to further services, such as
providing an inbuilt system for patient/doctor ranking based on
collected feedback using machine learning, providing
recommendations about the doctor using big data and machine
learning technologies, and providing inbuilt voice-to-text action
support using ASR (automatic speech recognition) and NLP (natural
language processing) techniques, well-known to those skilled in the
art.
[0190] Throughout the workflows 1400a and 1400b, the virtual care
navigator (VCN) plays an important role, as described in the
following description in conjunction with FIGS. 15-17.
[0191] FIG. 15 illustrates a virtual care navigator (VCN) system
1500 which may be implemented as a cloud-based AI bot that may have
access to all the patient, treatment, and doctor data. Its main
function is to provide on-demand and context-dependent guidance to
the patient and the doctor in navigating through the various steps
of the treatment process. "On-demand" means that the VCN system
1500 (hereinafter also interchangeably referred to as the VCN bot
1500 or the VCN 1504) may be invoked at any time by the user
(either patient or doctor), and "context-dependent" means that the
bot is smart enough to know the current status of the user (for
example, whether the user is a first-time patient, is close to the
end of his/her treatment plan, or is a doctor setting up a
practice) and patterns its responses to user
queries accordingly. Further, the bot is customizable to individual
users, based on their role. As shown in the figure, the VCN 1504
has access to individual patients' data and has built-in security
mechanisms to ensure that a specific patient's data is shared only
with that user and the doctor. The bot instance invoked by a doctor
has access to all the patients' data under the care of that
doctor.
[0192] The VCN bot system 1500 illustrated in FIG. 15 may include a
user 1501 providing input to the system 1500 using an input
mechanism 1502. The input mechanism 1502 may include a variety of
input modalities, including but not limited to: text, via keyboard
or pen input (handwriting recognition); speech (automatic speech
recognition); gestures (finger and hand gestures) and gaze
tracking. After receiving the input, the VCN bot 1500 may provide
its output to users using a variety of output modalities, including
but not limited to: Text (displayed on the screen); rich media
(pictures, photographs, videos); speech (text-to-speech output);
audio (beeps, user-selectable tones/music for alerts); haptic
feedback and the like. The system 1500 may also include other input
and output modalities well-known to those skilled in the art.
[0193] In some example embodiments, the haptic feedback may be the
preferred output modality as it may be useful to highlight specific
teeth positions on the device display. For example, when the doctor
moves individual teeth on the screen to various positions during
treatment planning, such as by formulating tooth movement as a
planning problem using Constraint Logic Programming over Finite
Domains (CLP(FD)), haptic feedback can be used to indicate the
boundaries or limits of movement, or could also be used to indicate
alignment with the surrounding teeth. Such tactile feedback may
improve the overall efficiency of the treatment planning process,
since the doctor does not have to depend on any displays or speech
or text for feedback. Instead, the vibration felt by his finger
(haptic feedback) conveys the same information quickly and
succinctly. In some example embodiments, this may be implemented
using cloud software using well-understood methodologies such as
Software-as-a-Service (SaaS). Thus, the haptic input mapping
generated this way may be available on multiple channels
(including, but not limited to: mobile app, desktop app, website,
smart glass app, etc.). In one or more embodiments, the VCN bot
1504 could be implemented using one or more popular cloud
platforms, including but not limited to: Microsoft Azure
(https://azure.microsoft.com/en-us/), Amazon Web Services
(https://aws.amazon.com), and Google Cloud Platform
(https://cloud.google.com).
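The haptic cue described above can be sketched as a mapping from drag position to vibration output: intensity rises as the tooth approaches its movement limit, and reaching the limit triggers a distinct boundary pulse. The thresholds and the returned pattern names are assumptions; a real app would forward these to the device vibration API:

```python
# Hypothetical sketch: map a tooth-drag position to a haptic response.
# Thresholds (80% warning zone) and pattern names are invented.
def haptic_feedback(position_mm, limit_mm):
    """Return the vibration pattern and intensity for a drag position."""
    ratio = min(abs(position_mm) / limit_mm, 1.0)
    if ratio >= 1.0:
        return {"pattern": "boundary", "intensity": 1.0}
    if ratio >= 0.8:
        return {"pattern": "warning", "intensity": round(ratio, 2)}
    return {"pattern": "none", "intensity": 0.0}

print(haptic_feedback(0.5, limit_mm=2.0))   # no vibration yet
print(haptic_feedback(1.8, limit_mm=2.0))   # warning pulse near limit
print(haptic_feedback(2.3, limit_mm=2.0))   # boundary reached
```

A second lookup of the same kind could drive the alignment cue mentioned above, pulsing when the tooth snaps into line with its neighbors.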
[0194] In some example embodiments, the VCN bot 1504 may be
configured to use NLP (natural language processing) algorithms to
decipher the intent from user input and convert it into database
queries to satisfy the user request. The NLP algorithms may use
machine learning to train a language model in the domain of patient
care (e.g., orthodontics) to ensure that the system can understand
user queries within that domain. An example of a widely used
cloud-based NLP system is LUIS--Language Understanding Intelligent
Service from Microsoft (www.luis.ai). Further, the VCN bot 1504
leverages the dynamic learning (also known as `active learning` by
those skilled in the art) capabilities of the NLP system to
dynamically update the language models based on user input. This
dynamic learning may ensure that the bot adapts itself to user
input during use.
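The intent-to-query step can be sketched with a toy keyword classifier standing in for a trained NLP service such as LUIS, mapping the detected intent to a parameterized database query. The intent names and query schema below are invented for illustration:

```python
# Sketch: decipher intent from user input (here, naive keyword
# matching in place of a trained language model) and convert it into
# a parameterized database query. Intents and schema are hypothetical.
import re

INTENTS = {
    "next_appointment": {
        "keywords": {"next", "appointment"},
        "query": "SELECT date FROM visits WHERE patient=? "
                 "AND date > CURRENT_DATE ORDER BY date LIMIT 1",
    },
    "treatment_progress": {
        "keywords": {"progress"},
        "query": "SELECT stage, status FROM progress WHERE patient=?",
    },
}

def to_query(utterance, patient_id):
    """Map an utterance to (intent, SQL text, bound parameters)."""
    words = set(re.findall(r"[a-z]+", utterance.lower()))
    for intent, spec in INTENTS.items():
        if spec["keywords"] <= words:
            return intent, spec["query"], (patient_id,)
    return "fallback", None, ()

intent, sql, params = to_query("When is my next appointment?", "P1")
print(intent)  # next_appointment
```

The dynamic-learning capability described above corresponds to retraining the language model from such utterances, which the keyword table here cannot do.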
[0195] In some example embodiments, speech may be used as the input
modality. The user's input speech may be first converted into text
(using automatic speech recognition) before NLP is applied to the
text to extract the user intent. The algorithms used for
recognizing large vocabulary continuous speech are well-known to
those skilled in the art. Popular cloud-based ASR systems include
Microsoft Speech
(https://azure.microsoft.com/en-us/services/cognitive-services/speech-to--
text/) and Google Speech
(https://cloud.google.com/speech-to-text/), which are hereinafter
incorporated in their entirety by reference.
[0196] In some example embodiments, gestures may be used as the
input modality. If gestures are used as the input modality, the
gestures may be first converted into text using gesture recognition
techniques known in the art, such as those described in
https://arxiv.org/ftp/arxiv/papers/1811/1811.11997.pdf), which is
incorporated in its entirety herein by reference, before NLP is
applied to the text to extract the user intent.
[0197] After receiving input in this manner, the VCN system 1500
may be configured to generate a user-specific 3D virtual avatar
1503. For example, the user-specific 3D virtual avatar may include
a male patient avatar, a female patient avatar or a doctor avatar.
The 3D virtual avatar may be automatically generated using 2D
images taken by the user (as described in the descriptions for
FIGS. 1400a and 1400b). The 3D avatar, in the likeness of the user,
customizes the system for that user. However, mindful of the fact
that not all users would like to interact with a virtual digital
likeness of themselves, the creation and use of the avatar is optional
and is predicated on the user explicitly opting in for it.
Regardless of the presence or absence of the avatar, users will
still be able to interact with the VCN bot 1504 in the cloud. In
case the 3D avatar 1503 is enabled, the 3D avatar 1503 is connected
to the bot via an "Avatar-VCN AI bridge". This AI bridge is a
software communication channel between the 3D avatar and the VCN
1504 that allows the VCN bot 1504 to manipulate the facial
expressions and emotions of the 3D avatar 1503 in response to user
input and in synchronization with the VCN speech output. The result
is a virtual 3D avatar 1503 that interacts empathetically with the
user, enriching his or her experience with the VCN. The VCN 1504
may have access to a plurality of databases 1505 including male
patient data, female patient data and doctor data.
[0198] The operation and access methodologies for the VCN bot 1504
have been described in conjunction with the method flow diagrams
illustrated in FIGS. 16-17.
[0199] FIG. 16 illustrates a method 1600 used by a doctor for
accessing the VCN 1500 on their user device. The method 1600 may
include, at step 1601, downloading a VCN app on a user device, such
as a mobile device, a laptop, a tablet, a PC and the like. Once the
app is downloaded, the user, such as the doctor in this scenario,
may need to, at step 1602, sign up for using the VCN 1500. The sign
up may allow the user to access the VCN 1500, which guides the user
(here, the doctor) through the entire doctor workflow, such as
discussed in the method 1400b earlier. After signing up for VCN access,
the method 1600 may include, at step 1603, providing the user with
an option to opt-in for a personal avatar. If the user chooses to
opt-in for the personal avatar, then the method 1600 proceeds to
step 1604, wherein multiple pictures of the user are taken from
multiple angles, to be used for creating the user's personalized
avatar. Once the pictures have been satisfactorily captured, the
method 1600 includes, at step 1605, using the user's pictures for
creating a realistic 3D avatar of the user. The 3D avatar so
created is then linked to the VCN bot 1500 using the `avatar-VCN`
AI bridge discussed in FIG. 15. From here on, the VCN bot 1500 and
all its interactions with the user take place through the user's
realistic personalized 3D avatar. However, if the user chooses not
to have an avatar in step 1603, then the method 1600 proceeds to
step 1606.
[0200] At step 1606, the doctor may be asked to specify the
preferred mode of interaction. The various modes of interaction may
include smart glasses, PC, smartphone, tablet, AR headset, and the
like. Once the doctor has specified the
preferred modes of interaction, the method 1600 may include, at
step 1607, asking the doctor to specify social media platforms that
she/he may want to advertise their practice on. For example, the
doctor may specify one of the various available social media
platforms including Instagram, Facebook, Snapchat, Twitter, and the like.
Once the social media platform has been specified, the method 1600
may proceed, at step 1608, to the VCN bot 1500 guiding the entire
process of information collection from the doctor to populate
available system databases, such as the learning database discussed
earlier. Along with this, some policy and other related documents
may be uploaded to the VCN system 1500. Such documents may include
documents on practice, policies, fee structure, and the like. In
some example embodiments, the entire doctor workflow may be a
speech-driven process. In some other example embodiments, the
doctor workflow may include a combination of input modalities such
as text, speech, gesture and the like.
[0201] Once all the inputs have been provided, the VCN bot 1500 is
ready to add new patients and their corresponding avatars. It may
be noted that though the VCN bot 1500 has been discussed in
conjunction with the doctor and the patient workflows, the VCN
bot 1500 may be invoked at any time by the doctor. Additionally, at
step 1609, the VCN bot 1500 may be set up for push notifications
for activities including but not limited to new patient sign-ups,
patient appointments or cancellations, insurance claim payment or
modifications, patient feedback on social media pertaining to the
doctor's practice, patient questions/comments directed to the
doctor and the like.
[0202] The method 1600 is the doctor's side of the workflow for VCN
1500 set-up, and in a similar way, method 1700 illustrated in FIG.
17 shows a patient's side of the workflow for VCN set-up. The
method 1700 includes, at step 1701, downloading the app, such as an
app for providing access to the orthodontic care management
platform 102a, on a user device. The user device may be the patient's personal device and may include a mobile phone, a laptop, a desktop PC, a tablet, and the like. Further, the method
1700 may include, at step 1702, signing up for VCN access. After sign-up, at step 1703, the patient may be asked whether or not to opt in for a personal avatar. If the patient chooses yes, then
at step 1704, necessary pictures of the patient are taken using
their personal devices. The pictures may be used, at step 1705, to
create the patient's personal 3D avatar and link it to the VCN AI bridge. Further, at step 1706 (which is also reached if the patient declines a personalized avatar), the method 1700 may proceed to asking the user to specify the preferred mode(s) of interaction, such as smart glasses, a PC, a smartphone, a tablet, an AR or VR headset, and the like. Further, at step 1707, the method 1700 may
include specifying social media platform(s) and at step 1708,
specifying family and friends that need to be updated on treatment
progress of the patient. This information may be collected along
with a series of questions under control of the VCN bot 1500, and
may be used, at step 1709, to populate a system database. Further,
as already mentioned previously, this entire process of information
collection may be completely speech-driven or may be a combination
of both text and speech input modalities. Apart from guiding
collection of patient data, the VCN bot 1500 may also be configured
to provide an ability to upload files, charts, images, treatment
history, and the like from previous doctors. Further, at step 1710,
the VCN system 1500 may be set up for push notifications whenever
new relevant information is sent to the user, that is, the patient. Such information may include doctor reports,
consultation appointment confirmation and reminder notifications,
doctor messages/responses to patient questions/comments,
explanation of benefits statements from insurance for this doctor
and the like. After all this information collection is done, at
step 1711, the VCN system may be configured to send push
notifications to the doctor that the new patient has now signed-up.
Further, additional messages or responses to patient questions and comments, as well as explanation of benefits statements, may also be sent for the new patient. Once set up, the VCN bot 1500 may be invoked
at any time by the user.
[0203] Once the patient sets up the VCN (as shown in FIG. 17), and
the doctor has had a chance to evaluate the patient and his/her
treatment objectives, a treatment roadmap is created and uploaded
to the cloud based VCN bot 1500. The VCN bot 1500 then informs the
patient that a treatment roadmap is ready for their review, using
the push notification mechanism on the app. The next time the
patient logs into the app, the VCN bot 1500 guides the user through
the treatment roadmap step-by-step using a combination of text,
speech, and image/video simulations of the orthodontic treatment
(i.e. showing how the dental malocclusions will be fixed). This
process is called `onboarding` and helps to set and manage the
patient's expectations regarding the orthodontic treatment. For
many patients, this may also have the effect of reducing the
anxiety associated with any dental work. At the end of the
onboarding, the patient may have a crystal-clear idea of what to
expect over the course of the treatment. This onboarding sequence will be stored in one or more databases associated with the orthodontic care management platform 102a (and updated as required), and the patient is able to return to it as many times as
needed to refresh his/her memory regarding the treatment
roadmap.
[0204] In some example embodiments, the orthodontic treatment
provided in accordance with the orthodontic care management
solution provided by the orthodontic care management platform 102a,
might have to be altered depending on how the teeth of the patient
respond to the originally planned treatment. Such course
corrections are common in orthodontic treatments. When such course
corrections occur, the doctor may update the treatment roadmap that
was part of the initial onboarding, such that the roadmap always
reflects the current course of treatment. When an update is
registered, the VCN system 1500 may detect the update and send push
notification to the app alerting the patient to a change in the
roadmap. When the patient logs in to the app the next time, such as
by using their user device, the roadmap updates are presented to
him or her, again using a combination of speech, text, and
images/video.
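The update-detection and alerting behavior described above may be sketched as a simple publish/subscribe pattern; the class and method names below (TreatmentRoadmap, subscribe) are illustrative assumptions, not elements of the disclosed platform:

```python
from dataclasses import dataclass, field

@dataclass
class TreatmentRoadmap:
    """Cloud-side roadmap whose updates trigger patient push notifications."""
    version: int = 0
    steps: list = field(default_factory=list)
    _subscribers: list = field(default_factory=list)

    def subscribe(self, callback):
        # e.g., the patient app registers its push-notification handler
        self._subscribers.append(callback)

    def update(self, new_steps):
        # The doctor registers a course correction; the VCN detects the
        # new version and alerts every subscribed device.
        self.version += 1
        self.steps = list(new_steps)
        for notify in self._subscribers:
            notify(f"Roadmap updated to version {self.version}")

received = []
roadmap = TreatmentRoadmap()
roadmap.subscribe(received.append)
roadmap.update(["align incisors", "close gap between tooth#7 and tooth#8"])
```

In this sketch, the callback stands in for the app's push-notification channel; the next login would then render the updated `steps`.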
[0205] In some example embodiments, the VCN bot 1500 may help the
doctor in appointment scheduling. Generally, orthodontists see
their patients at fixed intervals (4 to 6 weeks apart), with the
next appointment being scheduled during the current visit. Such
equally-spaced visits help the operational efficiency of the
doctor's back-office staff, but do not necessarily fit the needs of
individual patients. In such cases, it would be beneficial if the
patients could custom design their schedule, such as seeing the
doctor again within 2 weeks (or even earlier) depending on the
current appliance, their degree of malocclusion, and other factors.
Similarly, some patients could afford to wait for a longer interval
like 6 to 8 weeks for their treatment. Thus, the VCN bot 1500 may
be trained with enough orthodontic knowledge to know when the next
visit needs to be and will guide the doctor accordingly. Further,
once the appointment date is determined, the VCN bot 1500 may be
configured to auto-populate an on-device calendar for the doctor
and patient, assuming that the users have given permission to the
VCN 1500 during the set-up process. In some example embodiments,
the VCN bot 1500 may be configured to provide demand-based device manufacturing. Currently, patients are at the mercy of the
orthodontic device manufacturer that the doctor has contracted with
for manufacturing of the orthodontic device. If that manufacturer
is backed up or otherwise unavailable, the patient's orthodontic
treatment just gets delayed. However, using the orthodontic care
management platform 102a disclosed in the invention, a repository
of world-wide device manufacturers in multiple time zones may be
available, all of whom may be well-versed with the technology
needed to 3D print the device at short notice. By leveraging this
world-wide network, modern communication channels, and different
time zones, the system may be able to determine the best fit and ship the drawings electronically to a currently available 3D printer for manufacturing the device, such as the manufacturing
unit 102b-3. Further, the distribution of workload across multiple
3D print manufacturers ensures that no single manufacturer is
swamped with work while others wait for new orders. Thus, the orthodontic care management platform 102a may provide a dual advantage: the patient gains a plethora of manufacturing choices, and the workload is spread across different manufacturers, enabling more efficient responses to customer demand.
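The best-fit selection across the world-wide manufacturer repository might, under stated assumptions, reduce to filtering by availability and local working hours across time zones, then minimizing backlog; the fields and criteria below are hypothetical stand-ins for the platform's actual matching logic:

```python
from dataclasses import dataclass

@dataclass
class Manufacturer:
    name: str
    utc_offset: int      # time zone of the 3D-print facility, hours from UTC
    queued_jobs: int     # current backlog
    available: bool      # currently accepting orders

def best_fit(manufacturers, now_utc_hour, open_hours=(8, 18)):
    """Pick the available manufacturer whose local time falls inside
    working hours and whose backlog is smallest; returns None when no
    manufacturer currently qualifies."""
    def local_hour(m):
        return (now_utc_hour + m.utc_offset) % 24
    candidates = [
        m for m in manufacturers
        if m.available and open_hours[0] <= local_hour(m) < open_hours[1]
    ]
    if not candidates:
        return None
    return min(candidates, key=lambda m: m.queued_jobs)
```

Minimizing `queued_jobs` is what spreads the workload so that no single manufacturer is swamped while others idle.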
[0206] In some example embodiments, the orthodontic care management
platform 102a along with VCN bot 1500 may be configured to provide
numerous services as outlined in the table below:
[0207] VCN Services Table
[0208] The following table provides an overview of all the services
provided by the cloud-based VCN 1500. The presence of the optional
3D avatar affects how some of the services might be perceived by
the user and enhances them superficially, but it is to be noted that the core functionality of all the services is invariant to the presence or absence of the avatar.
TABLE-US-00001
Service 1: On-demand context-dependent help.
Core functionality (without 3D avatar): Text or voice output announcing VCN availability to help.
With 3D avatar: Avatar pops up when the user asks for help, stating that it is ready to help. Text output on the display in addition to speech.
Comments: The VCN is smart enough to determine the current state of the user and does not repeat itself unnecessarily.
Service 2: Guidance in filling out forms.
Core functionality (without 3D avatar): Text and speech guidance.
With 3D avatar: No avatar-specific enhancements.
Service 3: Setting up appointments.
Core functionality (without 3D avatar): Step-by-step guidance.
With 3D avatar: No avatar-specific enhancements.
Service 4: Help with various steps of the treatment process.
Core functionality (without 3D avatar): Step-by-step guidance. Leverages sentiment analysis of user responses (<provide section and references here>) and gaze tracking to determine the patient's reaction to their look in the images.
With 3D avatar: Based on the sentiment analysis, the avatar could alter its facial expressions as well to empathize with the patient's mood. Tone of voice will be altered in the VCN voice responses to empathize with the patient's mood.
Service 5: Notifications of new information (appointments, report availability, payments, etc.). See FIGS. 3 and 4 for notification details.
Core functionality (without 3D avatar): If so configured, the VCN will pop up a window as well as speak out the notification.
With 3D avatar: No avatar-specific enhancements.
Service 6: Speech dictation for note-taking.
Core functionality (without 3D avatar): The dictation mode can be invoked on demand. The VCN will guide the doctor through the dictation process and ensure that the notes are stored in the appropriate location within the patient's treatment records.
With 3D avatar: No avatar-specific enhancements.
Service 7: Speech-enabled teeth movement.
Core functionality (without 3D avatar): In this mode, the VCN provides voice commands to move specific teeth in the 3D simulations of the patient's teeth (speech-to-action). It also provides speech output describing teeth movement accomplished through other means (action-to-speech).
With 3D avatar: No avatar-specific enhancements.
[0209] In some example embodiments, the orthodontic care management
platform 102a may provide telepresence services. For example,
during the orthodontic treatment, there may be certain milestones
in the orthodontic treatment process that might require physical
visits to the doctor's office and others where the doctor does not
need to be present at the same geographical location as the patient
for all steps in the treatment process. Telepresence technologies
are well known in the art and may be used to provide the patient
with standard care as if he or she were at the doctor's office. It
may be understood that, since mobile devices have advanced, two-way communication technologies are already built into them, and most patients will not need anything more sophisticated than their existing mobile device and its built-in camera to avail themselves of the additional advantages of the orthodontic care management platform 102a. For
example, the doctor at a remote location, can virtually examine the
patient's treatment progress in real time by having them show the
doctor their teeth, typically by pointing and orienting the mobile
device appropriately, under voice guidance either from the doctor
who is online, or by the VCN 1500.
[0210] In some example embodiments, the VCN 1500 may support a
dictation mode for chair-side notes. One of the more serious issues in orthodontic treatment is infection management. A doctor who
examines multiple patients during the course of a single day has to
be careful to change gloves and follow other appropriate
sanitization procedures with the tools, etc., as they move from
patient to patient. Further, a doctor who has to type in his notes
into the computer beside the patient during his examination of the
patient's mouth has to either remove his gloves or change gloves to
avoid contaminating the keyboard and mouse--an onerous process that
often gets omitted to save time and complete the patient
consultation as expeditiously as possible. Unless explicit care is
taken to sanitize the keyboard and mouse after each patient
consultation, there is a strong possibility of cross-contamination
between patients if that keyboard and mouse are touched again by
the doctor while examining another patient. To avoid such
inadvertent contamination, the methods and systems disclosed herein
provide the capability to leverage existing speech dictation
technologies, well-known to those skilled in the art, to enable the
doctor to dictate his notes directly into the system, such as the
orthodontic care management ecosystem 102. The VCN bot 1500 has a
dictation mode and will guide the doctor through the dictation
process as well as ensure that the notes are recorded in the
correct location within that patient's treatment record. For
example, the dictation mode may be used by the coaching avatar 1300
discussed in FIG. 13 for performing action-to-voice mapping and
providing instructions on sanitation management to the doctor.
[0211] In some example embodiments, the VCN system 1500 may also
support speech-enabled 3D object movement. For example, during the
treatment planning phase, the orthodontist may typically evaluate
multiple approaches to resolving the patient's malocclusions, given
the constraints, discussed later in conjunction with FIG. 22. This
process may be expedited considerably if the doctor can use speech
to direct the movement of teeth, for example, providing precise
instructions like "move tooth#8 1 mm forward", "rotate tooth#7 45
degrees") in the software simulations. This may also provide the
advantage of freeing up his hands for other tasks and also allow
for multiple keyboard and mouse steps to be completed by a single
voice command, for example, "move tooth#8 1 mm back and rotate it
10 degrees", significantly improving efficiency. The
speech-to-action mode discussed in FIG. 12 earlier may be
implemented using the VCN bot 1500, by having a specialized
speech-to-action mode built into it. In addition, it may also be useful to capture the description of the teeth movement and convert
it to speech output (action-to-speech mode), either for playback to
the patient if the doctor is doing the simulation interactively in
the presence of the patient, or for record-keeping (for later
retrieval).
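A minimal sketch of the speech-to-action mapping, assuming the VCN's speech recognizer has already produced a text transcript; the command grammar below covers only the example utterances quoted above, and the regular expressions and tuple format are illustrative assumptions:

```python
import re

# Hypothetical grammar for the quoted commands, e.g.
# "move tooth#8 1 mm forward" and "rotate tooth#7 45 degrees".
MOVE = re.compile(r"move (tooth#\d+|it) (\d+(?:\.\d+)?) ?mm (forward|back|left|right)")
ROTATE = re.compile(r"rotate (tooth#\d+|it) (\d+(?:\.\d+)?) ?degrees")

def parse_command(utterance):
    """Translate one utterance into (tooth, action, amount, unit) tuples,
    resolving the pronoun 'it' to the last-mentioned tooth so that a
    single voice command can chain several simulation steps."""
    actions, last_tooth = [], None
    for clause in utterance.lower().split(" and "):
        clause = clause.strip()
        if m := MOVE.match(clause):
            tooth = last_tooth if m.group(1) == "it" else m.group(1)
            actions.append((tooth, "move_" + m.group(3), float(m.group(2)), "mm"))
        elif m := ROTATE.match(clause):
            tooth = last_tooth if m.group(1) == "it" else m.group(1)
            actions.append((tooth, "rotate", float(m.group(2)), "deg"))
        else:
            raise ValueError(f"unrecognized clause: {clause!r}")
        last_tooth = tooth
    return actions
```

Each resulting tuple would then drive one tooth transformation in the 3D simulation, and the same tuples could be rendered back to speech for the action-to-speech mode.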
[0212] Thus, the orthodontic care management ecosystem 102, in
combination with (and inclusive of) the VCN bot 1500 may provide
various advanced technologies for providing efficient, robust,
automated, AI-enabled orthodontic care management and malocclusion treatment options to patients, in a fair, transparent, beneficial, participative, and cost-efficient manner.
[0213] FIG. 18 illustrates such an embodiment of an orthodontic care
management system 1800 supporting AI-enabled malocclusion
treatment. The system 1800 includes three main components, SC1
1801, SC2 1802, and SC3 1803, which may be configured to provide
different functionalities for providing an orthodontic care
management solution for malocclusion correction for a patient. The
modules SC1 1801, SC2 1802, and SC3 1803 may be implemented as a
combination of dedicated software modules, microprocessing units,
hardwired logic systems, micro-programmed systems, applications or
apps, built-in special purpose software units and the like. The
modules 1801-1803 may be connected to a database 1804 which may
include patient data, doctor data, vendor data, third party service
provider data, learning data and the like. The database 1804 may be
implemented as a cloud-based database, a relational database, a web server, a blockchain-enabled database, a memory unit, and like technologies that may be well understood by those of ordinary skill
in the art.
[0214] The system 1800 may be configured to completely automate the
process of correcting malocclusions in an orthodontic patient. The
functionalities of the system 1800 and the various modules SC1
1801-SC3 1803, and all communication between these components and the
user may be accomplished with the help of the VCN bot 1500 that is
capable of deploying one of several input and output modalities
such as speech, text, and images wherever appropriate, as discussed
in FIG. 15.
[0215] The system 1800 includes the software component SC1 1801
that may be configured to accept as input a frontal image of a
human face with a smile, and output an aesthetically pleasing
picture that is the closest match to the patient's needs. This may
be accomplished by prior training of SC1 1801 on one or more
databases containing thousands of images of smiling human faces
showing teeth, using deep learning techniques such as convolutional
neural networks, well-known to those skilled in the art. For this
purpose, the module SC1 1801 may include a training module 1801a
that may be configured to analyse the data from multiple databases
1804, wherein the databases 1804 may be designed such that they
contain only those faces that have been judged as aesthetically
pleasing with straight (i.e., not maloccluded) teeth, and use the
data to train the module SC1 1801. The more variety of faces in
this database 1804 (i.e., different demographics such as race,
color, age, etc.) the more robust the training for the module SC1
1801. Once trained in this fashion, SC1 1801 may be able to accept
an image only if it is that of a smiling human with teeth showing,
and will reject an image if the human is not smiling or if the image is that of a toy or an animal. In such cases, the user will be prompted to upload another image. This may be done by proper recognition of human faces.
[0216] When first invoked by the patient, SC1 1801 prompts the
patient to upload their best facial image with a smile in which
teeth show. SC1 1801 then analyses the picture to verify that it is
indeed a human face in which teeth are showing. Once that
verification is complete, the module SC1 1801 matches the input
facial image with one from the dataset that has the closest match
to the patient's facial features, skin color, tooth size and color,
and the patient's stated goals for orthodontic treatment. The area
corresponding to the straight teeth from this closest matching
picture is identified. This is accomplished by using a standard
graphics algorithm implemented within SC1 1801 that finds a
bounding box for the area corresponding to the teeth. After this,
the patient may select the best smile. This may be done by using
the area captured by the bounding box, extracting it, and
superimposing on the patient's picture. A specialized algorithm
within SC1 1801 is included to ensure that the superimposed teeth
are blended smoothly into the picture so that the patient picture
looks natural. An example of superimposition of teeth with a picture is shown in FIG. 28. FIGS. 2801-2803 show how the superimposition of a reference region may be performed using the different planes illustrated in FIG. 2802. FIG. 2803 shows how the blending of the area of interest with the replaced teeth takes place gradually in FIGS. 2803a-2803c. This blended picture of the
patient with the teeth replaced is shown to the patient. The
patient either accepts it or rejects it. If the patient rejects it,
SC1 1801 retrieves the next closest matching facial image, extracts
the teeth area, superimposes it onto patient's teeth while blending
them with their surroundings, and presents it to the patient. This
process of presenting alternatives is repeated until the patient
accepts one picture. Alternatively, the software component SC1 1801 can find the top five matches and then use those five closest-matching headshots to create five versions of the patient's picture with his/her teeth superimposed and blended. These five pictures will be
presented to the patient, who can then select the best one among
them. For example, FIG. 23 illustrates how various smile options
based on possible teeth superimpositions may be presented to the
patient. The patient may show their interest in these pictures
using various options shown in FIG. 2302. Based on their selection,
the final image of the patient using their selected teeth
replacement and smile may be presented to them as shown in FIG.
2303. In some embodiments, the system 1800, upon the patient's
request (e.g., "I'd like to have a smile like Julia Roberts") can
present celebrity faces, and so on and so forth.
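The closest-match retrieval and top-five ranking could be sketched as a nearest-neighbour search over feature vectors; the flat Euclidean feature space below is an assumption standing in for whatever representation the trained SC1 1801 network actually produces:

```python
import math

def top_matches(patient_features, database, k=5):
    """Rank reference headshots by Euclidean distance in a hypothetical
    feature space (facial proportions, skin tone, tooth size/color,
    etc.); the k best entries are returned so their teeth regions can
    be superimposed and presented as alternative smiles."""
    def distance(entry):
        return math.dist(patient_features, entry["features"])
    return sorted(database, key=distance)[:k]
```

With `k=1` this yields the single closest match for the accept/reject loop; with `k=5` it yields the five candidate pictures presented together.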
[0217] The module SC1 1801 may be configured to perform 2D-to-3D
image conversion. Further, SC1 1801 may also provide an option to
the patient to visualize a 3D image of their superimposed, blended
picture. This may be achieved by incorporating well-known
techniques to convert 2D dento-facial images to a 3D dento-facial
image (example: www.selva3d.com). The 3D image may then be
presented to the patient for review.
[0218] In some example embodiments, the module SC1 1801 may use
affect response and immersion techniques for selecting the best
smiles for presenting to the patient. One of the ways to do this is by using the Facial Action Coding System (FACS). FACS is a commonly used approach to measuring an individual's emotional response by studying facial behavior. The contraction of
specific facial muscles described by action units (AU) can be
related to the patient's affective response. The system 1800 may be
configured to incorporate FACS to measure and analyze a patient's
response to various simulations of their corrected smile at the
start of the orthodontic treatment. The system 1800 may be designed
to achieve the following objectives:
1. Evaluate patient's likability of a smile based upon action unit analysis.
2. Patient's immersion behavior: how interested are they in evaluating their smiles?
3. Patient's eye movements/gaze analysis: what particular aspects of the smile do they like or dislike?
4. Provide facial muscle training and feedback.
5. Perform a sentiment analysis based upon facial expressions.
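Objectives 1 and 5 above might be sketched as a rule-based mapping from action-unit intensities to emotional states; the AU numbers follow standard FACS conventions (e.g., AU6 cheek raiser, AU12 lip corner puller), but the thresholds and the likability scoring are illustrative assumptions, not values from the disclosure:

```python
# Illustrative FACS action-unit combinations for a few basic emotions.
EMOTION_RULES = {
    "happiness": {6, 12},
    "sadness": {1, 4, 15},
    "surprise": {1, 2, 5, 26},
    "disgust": {9, 15},
}

def classify_expression(active_aus, threshold=0.5):
    """Return the emotion whose defining AUs are all active above the
    intensity threshold, preferring the rule matching the most AUs."""
    active = {au for au, intensity in active_aus.items() if intensity >= threshold}
    best, best_size = "neutral", 0
    for emotion, required in EMOTION_RULES.items():
        if required <= active and len(required) > best_size:
            best, best_size = emotion, len(required)
    return best

def likability(active_aus):
    """Crude likability proxy: happiness counts for, other negative
    emotions count against, neutral sits in between."""
    emotion = classify_expression(active_aus)
    return {"happiness": 1.0, "neutral": 0.5}.get(emotion, 0.0)
```

A production system would replace this rule table with the machine-learned classifiers cited in the text (Bartlett et al., Yan Tong et al.).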
[0219] As already mentioned above, the module SC1 1801 may be
configured for converting 2D images of the patient to 3D. Further,
augmented reality techniques may be used to simulate the various
corrected smiles of the patient. The displacement of the
individual's facial action units in response to various simulations
of the corrected smile and facial changes will be recorded,
measured, and analyzed. The displacement of the pertinent action
units is related to emotional states including, but not limited to,
happiness, sadness, surprise, disgust, fear, and anger. These
responses will be related to the likability of the smile. Machine
learning techniques, such as those described in Bartlett et al and
Yan Tong et al., which are herein incorporated in their entirety by
reference, will be applied to classify the facial expressions of
the patient. The system 1800 is trained on a dataset of facial
expressions from several hundred people. FIG. 23 shows an example
of a patient's attractiveness preference. FACS can also be used to
track the sentiment of the patient during the entire treatment
cycle by measuring their facial expressions as they respond to a
series of survey questions. Their facial expressions will provide a
better indicator of how they really feel about the progress of
treatment, regardless of their text or speech responses to the
questions. FACS can also be used to address a problem that some
orthodontic patients encounter during treatment--pain and stiffness
in their facial muscles. Based on the identification of muscle
fatigue by tracking facial expressions as described in Uchida et
al., which is herein incorporated in their entirety by reference,
the system 1800 can be trained using machine learning techniques to
provide customized muscle training exercises to the patient to
alleviate this pain and stiffness. Customized training exercises
are needed, since each patient might have different muscle
fatigue/stiffness symptoms in different muscles of the face. The
VCN bot 1500 can present the training exercises to the patient
visually using simulated images via their user device display and
guide them through the exercise routines at regular intervals set
by the patient or orthodontist. These muscle exercises and their
presentation to the patient via the VCN 1500 are useful not just
for orthodontic patients but for those people afflicted with Bell's
palsy
(https://www.facialparalysisinstitute.com/physical-therapy/exercises-for-bells-palsy/). Thus, using FACS in combination with machine
learning to customize muscle training exercises and present them
via a VCN 1500 may find applicability beyond just orthodontics.
[0220] In some example embodiments, eye movement and gaze may be
evaluated to understand what features the patient focuses on or
avoids--avoidance implies dislike. Eye gaze detection and tracking
techniques described in A. Perez et al (2015), Selker et al (2001),
and Cuong et al (2010) which are herein incorporated in their
entirety by reference are well-known to those skilled in the art,
with some techniques requiring nothing more than an inexpensive
webcam to capture images. The system 1800 may be configured to use
such eye-gaze detection techniques to determine where the patient
is focusing their attention on their simulated smiles.
[0221] In some example embodiments, immersion measures the interest
level of the patient. Wearable biosensors (worn on the forearm, or
other convenient location on the body) capture neural signals
associated with attention (such as increases in heart rate and
electrodermal activity) and vagal tone (increases in heart rate
variability). Software associated with these sensors measures these
signals, analyses them, and quantifies them on a 0-10 scale, with a
higher score signifying greater immersion. Such physiological
sensor and software combinations are well-known to those skilled in
the art such as in Zak and Barraza (2018), which are herein
incorporated in their entirety by reference. The VCN 1500 may be
configured to guide the patient and doctor by a combination of
speech, text, and image modalities, as described in FIG. 15, making
it easier for the doctor to make the requisite measurements. The
patient's measures are tracked, measured and analyzed in the
learning database and used for demonstrating to the patient
improvements in their facial musculature as a result of exercises.
Furthermore, the sentiment index is used to present patient-reported experience measures on a continuous basis to the doctor and to update the doctor's reputation metrics. Recommendations to the doctor to improve their patient service, driven by the sentiment index, are provided through the VCN 1500 using scenario-based learning. Further, the learning trajectory within the practice is measured, and feedback is provided to the appropriate institution. All of these capabilities may be implemented using the module SC1 1801 of the system 1800.
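The 0-10 immersion quantification could be sketched as a normalization of the attention-related signal rises over a resting baseline; the equal weighting and scaling factor below are assumptions, since commercial sensor suites use proprietary models:

```python
def immersion_score(heart_rate_bpm, baseline_bpm, eda_microsiemens, baseline_eda):
    """Quantify immersion on a 0-10 scale from attention-related signals:
    heart-rate and electrodermal-activity increases over a resting
    baseline, blended with equal (illustrative) weights."""
    hr_rise = max(0.0, (heart_rate_bpm - baseline_bpm) / baseline_bpm)
    eda_rise = max(0.0, (eda_microsiemens - baseline_eda) / baseline_eda)
    raw = 0.5 * hr_rise + 0.5 * eda_rise   # equal-weight blend of signal rises
    return round(min(10.0, raw * 20.0), 1)  # scale and clamp to the 0-10 range
```

A higher score signifies greater immersion, matching the convention described above.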
[0222] Further, the module SC1 1801 may provide a plurality of care
management options. The care management options may include
screening the patient's ability to self-manage their own care based
upon factors that include but are not limited to the patient's
desires, severity of malocclusions, and cost. Furthermore, SC1 1801
may also provide the patient with other care management approaches
that may include: a hybrid approach involving limited professional
supervision at identified points of care, or comprehensive
professional management through the entire care cycle, involving
regular doctor visits. The determination of the appropriate care
management path can also be accomplished by machine learning
techniques. For example, a database of dento-facial images may be
compiled. Each image may then be labelled based upon the
appropriate care management approaches including but not limited to
self-care management, hybrid or total professional management.
Further, a neural network may be trained using labelled data to
classify the dento-facial images based on a care management path.
When the patient's dento-facial image is presented to such a
trained network, it may be able to determine the recommended care
management path, with a reasonably high accuracy.
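As a sketch of such a classifier, a k-nearest-neighbour model over labelled feature vectors can play the role the trained neural network plays in the text; the two-dimensional feature space (e.g., severity and crowding scores extracted from a dento-facial image) is purely illustrative:

```python
import math
from collections import Counter

def recommend_path(features, labeled_examples, k=3):
    """k-nearest-neighbour stand-in for the trained classifier described
    in the text; labeled_examples is a list of (feature_vector, label)
    pairs derived from the labelled dento-facial image database, with
    labels such as 'self-care', 'hybrid', or 'professional'."""
    nearest = sorted(labeled_examples,
                     key=lambda ex: math.dist(features, ex[0]))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]
```

Presenting a new patient's feature vector to the model then yields the recommended care management path by majority vote among the nearest labelled examples.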
[0223] The module SC1 1801 may also be linked to VCN 1500 and thus,
it may be configured to leverage the interactivity and multiple
input/output modalities of the VCN 1500. The VCN 1500 may be used
for a variety of tasks such as collection of additional demographic
information of the patient as well as information from the patient
to establish their personal profile, persona, and treatment needs.
This data may then be used to provide additional smart information
to the patient by connecting to specific services, including but
not limited to support groups, patient learning aids, and patient
decision aids. SC1 1801 may also be configured to provide
generation of a 3D avatar of the patient, with the patient's consent
as already discussed in FIG. 15. The animated 3D avatar, if chosen,
then becomes the face of the VCN 1500 as it guides the patient
through the entire treatment process, providing context-sensitive,
on-demand guidance. Further, SC1 1801 may be configured to provide
options for generation of a treatment plan for the patient. If the
VCN 1500 determines that the patient's malocclusions can be
corrected by self-management, it automatically generates the
sequence of orthodontic tooth movements using module SSC2 1802b
described later. Once the sequence of correcting the orthodontic
malocclusions has been obtained, the VCN 1500 then analyses the
output, and performs the appropriate sequencing, staging and
selection tasks to generate a complete plan for the patient. It may also automatically use other services, including but not limited to appropriate clinical pathway guidelines, care milestone checklists, appliance systems, self-care management aids, and motivational aids; anticipate the problems that the patient may incur; identify the most cost-effective source for manufacture of the orthodontic appliances; and engage in competitive bidding services on behalf of the patient to provide the most cost-effective care solutions.
Further, SC1 1801 may also provide treatment visualization for the
patient. The VCN 1500 also applies graphic transformation to the 2D
image of the dentition, to reflect the anticipated orthodontic
response to the orthodontic appliance that is recommended for the
patient. This image demonstrates to the patient what he/she may
expect to see in a temporal sequence. The picture, in 2D and 3D, is shown to the patient on demand based upon the starting date of treatment, or can be displayed periodically at critical junctures in the treatment of the patient. The VCN 1500 may also be
configured to instruct the patient to take appropriate dento-facial
images with a camera or scanner. It analyses the image and compares
it with the 2D image above, to determine the progress of treatment
and also informs the patient on how to manage any midcourse
corrections if needed.
[0224] The system 1800 may also include a module SC2 1802 that may
be configured for automated setup of malocclusions to establish a
target occlusion. For this, SC2 1802 may be configured to
automatically suggest the steps to be taken to treat a
malocclusion. This is accomplished by using a series of 2D images of the teeth or face and/or a 3D scan of the teeth, and creating an output that provides a sequence of orthodontic steps that must be taken to treat the malocclusion. Thus, the module SC2
1802 may enable correction of the malocclusion from its initial
state to its target state using a sequence of steps described in
conjunction with the optimization algorithm 2200 based on a
constraint logic problem illustrated in FIG. 22.
[0225] The constraint logic problem described in FIG. 22 includes,
at 2201, performing information extraction from dentition and at
2202, calling search predicate to find all fixtures recursively.
Further, at 2203, the shortest list of fixtures is extracted, and at 2204-2206, the appropriate tooth is fixed, using the constraints defined in 2207. Further, at 2208-2210, appropriate function calls are made, and at 2211 and 2212, using blocking information, the exact fixture for the tooth is identified.
[0226] For instance, consider a tooth that needs to be moved 1 mm to
the right but is blocked by a second tooth; to accomplish the
movement of the first tooth, the second tooth has to be moved first,
and so on. The problem of treating a malocclusion
is termed a planning problem. A planning problem has the following
components: an initial state, a final state, and the "move or
displacement". The relation between the two states informs us if
one state can be reached from another state by a single move (for
example, moving a tooth 1 mm to the right). The planning problem
thus seeks to find the sequence of moves or displacements that will
take us from the initial state to the final or target state.
Planning problems can be modelled as a constraint satisfaction
problem (CSP). Constraint Logic Programming over Finite Domains
(CLP(FD)), a technology that has been widely used for solving CSPs,
is employed here. Essentially, each tooth that needs to be moved is
constrained by the two teeth on either side. The teeth to be moved
are driven by the patient's desire and/or the doctor's diagnosis and
are defined by boundary conditions that include, but are not limited
to, the facial midline, arch-form, the class of occlusion, and the
occlusal plane level.
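As an illustrative sketch only, the planning problem described above may be modelled as a search over tooth positions. The one-dimensional model, the 1 mm move granularity, and all names below are assumptions for exposition, not the claimed CLP(FD) implementation:

```python
from collections import deque

# Minimal sketch of the planning problem: each state is a tuple of tooth
# positions in mm along one axis (teeth listed left-to-right), a move shifts
# a single tooth by 1 mm, and a move is legal only if neighbouring teeth
# keep a minimum gap (no collision). All of this is an illustrative model.

def legal(state, min_gap=1):
    # Neighbouring teeth must stay at least min_gap mm apart.
    return all(b - a >= min_gap for a, b in zip(state, state[1:]))

def plan(initial, target):
    """Breadth-first search for the shortest sequence of 1 mm moves."""
    initial, target = tuple(initial), tuple(target)
    frontier = deque([(initial, [])])
    seen = {initial}
    while frontier:
        state, moves = frontier.popleft()
        if state == target:
            return moves
        for i in range(len(state)):
            for delta in (-1, 1):
                nxt = state[:i] + (state[i] + delta,) + state[i + 1:]
                if legal(nxt) and nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, moves + [(i, delta)]))
    return None  # no plan within the modelled move set

# Tooth 0 must move 1 mm right but tooth 1 blocks it, so tooth 1 moves first.
print(plan([0, 1], [1, 2]))  # [(1, 1), (0, 1)]
```

In practice the CLP(FD), Integer Linear Programming, or SAT formulations described in this disclosure would replace this brute-force search.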
[0227] SC2 1802 may include two major components, SSC1 1802a and
SSC2 1802b. SSC1 1802a takes the input, for instance, the 3D scan
in an STL file format and computes the bounding box for each tooth,
using off-the-shelf tools such as MeshLab and Blender. SSC1 1802a
uses the coordinates of the bounding box to generate the input for
SSC2 1802b. SSC2 1802b takes this input and plans a sequence of
steps to correct the malocclusions. The SSC2 1802b subcomponent takes
as input the configuration of each tooth. For each tooth, the input
should describe the entirety of the malocclusions that the patient
has. This is captured by SSC2 1802b through six values:
1. Crown Tipping (variable 2213 in FIG. 22): rotation of the tooth
with its pivot at the root, in either the sagittal plane or the
frontal plane.
2. Root Tipping (variable 2214 in FIG. 22): rotation of the tooth
with its pivot at the crown, in either the sagittal plane or the
frontal plane.
3. Torqueing (variable 2215 in FIG. 22): rotation of the centre of
the pivot along the transverse plane of the tooth.
4. Rotation (variable 2216 in FIG. 22): the tooth is rotated about
its centre (both crown tipping and root tipping).
5. Translation (variable 2217 in FIG. 22): the degree to which the
tooth is translated from the standard jaw curve along the X axis or
the Y axis of the transverse plane of the tooth.
6. Intrusion/Extrusion (variable 2218 in FIG. 22): how much the tooth
is intruded or extruded compared to the desired location of the tooth
on the Z-axis.
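The six per-tooth values above might be represented, as a minimal sketch, by a record such as the following (the field names are illustrative assumptions, not part of the disclosed implementation):

```python
from dataclasses import dataclass

# Illustrative record for the six per-tooth malocclusion values listed
# above; the class and field names are assumptions for exposition.
@dataclass
class ToothConfig:
    crown_tipping: float = 0.0        # 2213: rotation, pivot at the root
    root_tipping: float = 0.0         # 2214: rotation, pivot at the crown
    torque: float = 0.0               # 2215: rotation along the transverse plane
    rotation: float = 0.0             # 2216: rotation about the tooth's centre
    translation: float = 0.0          # 2217: offset from the standard jaw curve
    intrusion_extrusion: float = 0.0  # 2218: offset along the Z-axis

    def is_normal(self):
        # A normally positioned tooth has all six values equal to zero.
        return not any((self.crown_tipping, self.root_tipping, self.torque,
                        self.rotation, self.translation,
                        self.intrusion_extrusion))

print(ToothConfig().is_normal())              # True
print(ToothConfig(rotation=5.0).is_normal())  # False
```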
[0228] The input also consists of a pair of Boolean values (0 or 1)
that tell us whether the tooth is blocked from the left or the right
with respect to movement. SSC1 1802a generates these values for each
tooth. A normal tooth will have all values as 0. These inputs are
then used by SSC2 1802b to generate the constraints, which are then
solved using the CLP(FD) implementations, such as the algorithm 2200,
found in most Prolog systems and well-known to those skilled in the
art. The output of the constraint solver consists of a sequence of
moves or displacements that indicates the sequence of orthodontic
actions to be taken. The possible moves are as follows: crownTip,
rootTip, rotate, and translate. If the constraint solver finds no
solution, i.e., there are collisions, a subsequent analysis is
performed to identify whether expansion or flaring of the tooth
(moving it forward) can accomplish the correction to the tooth
position (note that all of these movements are constrained within the
aforementioned boundary conditions such as midline, arch-form, etc.).
If these alternatives fail, then either the boundary conditions have
to be changed systematically, or more aggressive invasive strategies
such as interproximal reduction (shaving the tooth to reduce its
size) or extraction to create space and remove collisions need to be
considered. The input data is updated with these changes, and the
CLP(FD) program, such as the algorithm 2200, is run again until a
sequence and staged approach of the displacements is calculated; this
defines the target. The sequence of movements can be temporally
matched to set milestones. Other equivalent constraint solving
technologies such as Integer Linear Programming or SAT solving can
also be used instead of CLP(FD).
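The escalation logic described in this paragraph (solve, then try expansion or flaring, then interproximal reduction, then re-run the solver) can be sketched as follows; every function name here is a hypothetical stand-in, and the toy "space" model is an assumption for illustration only:

```python
# Sketch of the escalation loop: solve directly; on collision, try the
# less invasive alternative first, then more invasive space creation.
# solve / try_expansion_or_flaring / apply_interproximal_reduction are
# hypothetical stand-ins for the CLP(FD) solver and fallback analyses.
def plan_with_escalation(case, solve, try_expansion_or_flaring,
                         apply_interproximal_reduction):
    moves = solve(case)
    if moves is not None:
        return moves, "direct"
    # Collisions: see whether expansion or flaring frees the tooth.
    relaxed = try_expansion_or_flaring(case)
    if relaxed is not None:
        moves = solve(relaxed)
        if moves is not None:
            return moves, "expansion/flaring"
    # Otherwise create space more invasively and re-run the solver.
    reduced = apply_interproximal_reduction(case)
    return solve(reduced), "interproximal reduction"

# Toy stand-ins: the solver succeeds once at least 1 unit of space exists.
solve = lambda c: ["translate"] if c["space"] >= 1 else None
expand = lambda c: None  # expansion not possible in this toy case
ipr = lambda c: {**c, "space": c["space"] + 1}

print(plan_with_escalation({"space": 0}, solve, expand, ipr))
# (['translate'], 'interproximal reduction')
```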
[0229] The system 1800 also includes a module SC3 1803 that may be
configured for training and feedback provision for orthodontists.
In some example embodiments, the system 1800 may be configured for
providing a comprehensive automated orthodontic analysis of each
orthodontist's performance based on the treatment provided to
patients referred to them by using the orthodontic care management
platform 102a. The module SC3 1803 may be specifically developed
for this purpose. SC3 1803 may be accessed by a patient.
Periodically, the patient is asked to take an image or scan their
teeth and upload it to SC3 1803. They may also be asked to provide
a 3D scan of their teeth taken initially at the beginning of the
treatment. SC3 1803 may then compute the optimal configuration at
the current time based on the 3D scan. This optimal configuration
may further be compared to the current pictures of the teeth
uploaded by the patient. In case any major deviations are identified
and, for a given orthodontist, the same error is observed repeatedly,
the orthodontist is informed. Any other errors found during the
comparison will also be flagged, and the orthodontist and (possibly)
the patient may be informed of the same.
[0230] In some embodiments, the system 1800 may also be configured
for providing a patient affect response enabling system, as
illustrated in FIG. 30.
[0231] FIG. 30 illustrates an exemplary block diagram of an affect
response enabling system 3000, in accordance with one embodiment.
The system 3000 comprises an image capture module 3001, 2D to 3D
image conversion module 3002, an augmented reality smile simulation
module 3003, a display module 3004, and an eye-gaze tracking and
emotion recognition module 3005.
[0232] The image capture module 3001 may be configured to scan the
user's facial image or take multiple photos of the user, such as
the patient. The photos may then be converted to 3D by the 2D-to-3D
image conversion module 3002, and the patient's smile may be
simulated using AR techniques by the augmented reality smile
simulation module 3003. The simulated smile may be displayed to the
user using multiple photos of corrected smiles, by the display module
3004.
The patient's liking for specific smiles may be assessed by tracking
their gaze with the eye-gaze tracking and emotion recognition module
3005. In some embodiments, a plurality of photos may be
flashed in front of the patient showing them different arrangements
of teeth. These arrangements may be reflected on the images of the
patient themselves or on another subject. The module 3005 may then
track patients' emotion and follow their eye-gaze to understand
their likes or dislikes.
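One possible sketch of inferring likes from eye-gaze, assuming fixation events of the form (smile image, dwell time), ranks the candidate smiles by total dwell time; the data format, names, and dwell-time heuristic are illustrative assumptions, not the disclosed emotion-recognition method:

```python
from collections import defaultdict

# Illustrative sketch: rank candidate smile images by total gaze dwell
# time, a crude proxy for the patient's preference.
def rank_by_dwell(gaze_samples):
    """gaze_samples: iterable of (smile_id, dwell_ms) fixation events."""
    totals = defaultdict(int)
    for smile_id, dwell_ms in gaze_samples:
        totals[smile_id] += dwell_ms
    # Longest total dwell first.
    return sorted(totals, key=totals.get, reverse=True)

# Made-up example fixation data.
samples = [("smile_A", 300), ("smile_B", 900), ("smile_A", 250),
           ("smile_C", 120), ("smile_B", 400)]
print(rank_by_dwell(samples))  # ['smile_B', 'smile_A', 'smile_C']
```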
[0233] In some embodiments, the system 1800 may comprise a
context-specific patient monitoring enablement system 3100, as
illustrated in FIG. 31.
[0234] FIG. 31 illustrates an exemplary block diagram of the
context-specific patient monitoring enablement system 3100, in
accordance with one embodiment. The system 3100 comprises a user
3101, a 2D-to-3D image conversion module 3102, a patient database
3103, an augmented reality superimposition module 3104 and a
virtual care navigator 3105. The user 3101 may be any of a doctor
or a patient.
[0235] Thus, the system 1800, comprising the modules 1801-1803,
may be configured for complete, learning-enabled, constraint-based,
and feedback-oriented orthodontic treatment provision to the
patients of the orthodontic care management ecosystem 102. The
learning-based system 1800 may also be configured for complexity
evaluation of the orthodontic treatment process, as illustrated in
FIG. 19.
[0236] FIG. 19 illustrates a method 1900 for complexity evaluation
in orthodontic treatment provision according to one example
embodiment. The method 1900 for complexity evaluation may include,
at step 1901, capturing a facial image, and at step 1902, creating
a personalized avatar, such as the avatar 1903 described in the
context of the VCN bot 1500 disclosed in FIG. 15. Further, the method
1900 may include, at step 1903, generating an intelligent animated
avatar which may provide image capturing related instructions, and
thus, at step 1904, the image is captured. Further, the image may be
analysed, at step 1905, to check a number of constraints such as the
number of teeth captured, image contrast, tooth features, and the like.
Based on this analysis, at step 1906, the image may be rated on a
scale of 1-10, and at step 1907, the image quality may be checked
for adequacy. If the image quality is satisfactory, then at step
1908, complexity of malocclusion may be evaluated. However, if the
image quality is not adequate, then through steps 1913-1915, the
image information is updated in the knowledge database and a new
image is captured based on voice, text or video instructions
provided by the intelligent avatar.
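The capture-rate-retake loop of steps 1904-1907 and 1913-1915 could be sketched as follows, with the scoring function, the adequacy threshold of 7, and the retry limit all being assumptions for illustration:

```python
# Sketch of the capture-rate-retake loop in method 1900: capture an
# image, rate it on the 1-10 scale, and retake (with avatar guidance)
# until the rating meets an assumed adequacy threshold.
def capture_until_adequate(capture, rate, threshold=7, max_attempts=5):
    """Keep capturing until the 1-10 quality rating meets the threshold."""
    for attempt in range(1, max_attempts + 1):
        image = capture(attempt)
        score = rate(image)          # steps 1905-1906: analyse and rate
        if score >= threshold:       # step 1907: adequacy check
            return image, score, attempt
    return None, None, max_attempts  # give up after repeated retakes

# Toy stand-ins: quality improves with each guided retake.
images = {1: "blurry", 2: "partial", 3: "sharp"}
scores = {"blurry": 3, "partial": 6, "sharp": 9}
print(capture_until_adequate(images.get, scores.get))  # ('sharp', 9, 3)
```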
[0237] On the other hand, for a satisfactory image, after complexity
evaluation, at step 1909, pattern recognition is performed to
identify a numeric figure outlining the deviations in the teeth.
Further, at step 1910, similar images are identified using
image-matching algorithms; simultaneously, the number of deviations
is compared against the number of occlusions. Finally, at step 1912,
the complexity of the malocclusion is identified on a scale of 1-10.
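As a sketch of step 1912, a deviation count could be mapped onto the 1-10 complexity scale; the linear mapping and the cap of 20 deviations are assumptions for illustration, not the disclosed pattern-recognition method:

```python
# Illustrative mapping from a deviation count to the 1-10 complexity
# scale; the linear form and the max_deviations cap are assumptions.
def complexity_score(num_deviations, max_deviations=20):
    capped = min(num_deviations, max_deviations)
    # Scale linearly to 1-10, never dropping below 1.
    return max(1, round(10 * capped / max_deviations))

print(complexity_score(0), complexity_score(10), complexity_score(40))
# 1 5 10
```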
[0238] The complexity evaluation may further be supplemented by
evaluating cost of orthodontic treatment based on complexity of
malocclusion, as illustrated in the method 2000 of FIG. 20.
[0239] FIG. 20 illustrates a method 2000 for evaluating cost of
orthodontic treatment based on complexity of malocclusion, in
accordance with an example embodiment.
[0240] The method 2000 may include, at step 2001, identifying the
nature of the complexity of the malocclusion and, based on that, at
step 2002, identifying an estimated cost of treatment. Further, at
step 2003, the real cost of treatment is identified; at step 2004,
the two costs are checked to see if they are equal, and at step 2006,
the knowledge database is updated. However, if the estimated cost is
not the same as the real cost at step 2005, then at step 2007, the
percentage of difference is identified on a scale of 1-10 and put in
the knowledge base at step 2006. The difference is also used to
calculate assurance, at step 2009, which may be provided to the
patient 2011, the doctor 2012, the insurance agents 2013, or banks
2014. In some embodiments, the knowledge database may be queried,
such as at 2008, to identify the root cause of the difference between
the estimated and real costs, and based on that, at step 2010,
various factors such as increased treatment visits, appliance
breakage, and the like may be evaluated and provided to the various
stakeholders mentioned earlier, that is, the patient 2011, the doctor
2012, the insurance agents 2013, or banks 2014.
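Steps 2004-2007 might be sketched as the following comparison of estimated versus real cost, with the knowledge base modelled as a simple list (an assumption for illustration):

```python
# Illustrative sketch of steps 2004-2007: compare estimated vs. real
# cost and log the percentage difference into a knowledge base (the
# knowledge base is modelled here as a plain list of records).
def reconcile_cost(estimated, real, knowledge_base):
    if estimated == real:
        diff_pct = 0.0               # step 2004: costs match
    else:
        # step 2007: percentage difference relative to the estimate
        diff_pct = round(abs(real - estimated) / estimated * 100, 1)
    # step 2006: update the knowledge base with the outcome
    knowledge_base.append({"estimated": estimated, "real": real,
                           "diff_pct": diff_pct})
    return diff_pct

kb = []
print(reconcile_cost(4000, 4600, kb))  # 15.0
print(kb[-1]["diff_pct"])              # 15.0
```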
[0241] The cost estimation may be used to identify the patient's
treatment financing options, as illustrated in the method 2100
of FIG. 21.
[0242] FIG. 21 illustrates a method 2100 for evaluating patient's
treatment financing options, in accordance with an example
embodiment.
[0243] The method 2100 may include, at step 2101, identifying the
patient's estimated cost and, at step 2102, identifying the patient's
credit history and credit score. Further, the two are provided as
inputs to the intelligent care navigator, that is, the VCN 1500, at
step 2103, and at step 2104, the VCN 1500 may provide this data as
input to the competing financing institutions. The competing
financing institutions also keep receiving information about the
patient's personal reputation, at step 2110. Further, at step 2105,
the VCN 1500 provides the information supplied by the competing
financing institutions, which may include bids for financing the
patient's treatment cost, to a best financing option selection
algorithm, using which, at step 2106, the best financing option is
selected. Further, this option may be presented to the patient, who
may, at step 2107, select the best option. Further, the VCN 1500
performs continuous monitoring of the patient's finances and payment
schedules and provides feedback on managing finances to pay on time
and maintain a credit score. This information is also stored, at step
2109, in one or more learning databases. These databases may be
configured for providing the patient's financial information, such as
their credit score, payment history, payment dues, and the like, from
time to time to requesting modules of the overall system. These
modules may include the doctor-side interface 2111, the cost
estimation algorithm 2112, the patient credit score based module
2113, and the like.
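A minimal sketch of the best financing option selection of steps 2105-2106 follows, assuming each institution's bid carries a principal, an annual rate, and a term, and using lowest total repayment under a simple-interest toy model as the selection criterion (the bid fields and criterion are all assumptions):

```python
# Illustrative sketch of steps 2105-2106: pick the bid with the lowest
# total repayment under a simple-interest toy model.
def best_financing_option(bids):
    """bids: list of dicts with 'institution', 'principal',
    'annual_rate', and 'term_months' keys (assumed schema)."""
    def total_repayment(bid):
        years = bid["term_months"] / 12
        return bid["principal"] * (1 + bid["annual_rate"] * years)
    return min(bids, key=total_repayment)

# Made-up example bids from two competing institutions.
bids = [
    {"institution": "BankA", "principal": 5000, "annual_rate": 0.08,
     "term_months": 24},
    {"institution": "BankB", "principal": 5000, "annual_rate": 0.06,
     "term_months": 36},
]
print(best_financing_option(bids)["institution"])  # BankA
```

BankA totals 5800 versus BankB's 5900 in this toy model, so BankA wins even with the higher rate because of the shorter term.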
[0244] Thus, the orthodontic care management ecosystem 102 may be
configured for providing treatment cost estimation and financing
options for patients, based on the algorithms outlined in the
methods 2000 and 2100.
[0245] In some example embodiments, the orthodontic care management
ecosystem 102a may also be configured for providing chairside
context-specific patient monitoring, as disclosed in FIG. 24.
[0246] FIG. 24 illustrates an example of the chairside
context-specific patient monitoring embodiment 2400 of the
invention. During treatment, when the orthodontist is examining the
patient in his office, it would be very helpful if the doctor were
able to see the difference between the current state of the patient's
teeth and the expected state, based on the treatment plan. The
orthodontic care management ecosystem 102 may be
configured to provide for this, with the help of augmented reality
(AR) technology. For example, if the orthodontist is wearing smart
glasses with a microphone and speakers, as shown at 2401 in FIG. 24,
the orthodontist may be reminded by the VCN 1500 of the treatment
notes from the last visit or any updates from the patient's parallel
notes. In addition, a checklist, which may be image-based, text-based,
or delivered through speech, may be prepared and presented to the
doctor. This may help to guide the doctor to examine the region of
interest. Further, an image outlining the expected state of the teeth
at the current point of time in the treatment plan may be digitally
superimposed on the orthodontist's view of the patient's teeth, as
shown at 2402 and 2403 in FIG. 24. This may help the orthodontist to
efficiently determine the progress of the treatment and any
deviations caused by the patient's physiological, mechanical, or
other factors affecting treatment. In some example embodiments, the
same superimposed image can also be shown on a monitor visible to the
patient so that he/she can also see what the doctor is viewing. This
is shown at 2401 in FIG. 24. The cloud-based VCN 1500 will be
responsible for
retrieving the requisite images from the cloud storage. Further,
since the doctor will likely have his/her hands busy with the
examination of the patient's teeth, the speech input
(speech-to-text) and speech output (text-to-speech) capabilities of
the VCN 1500, described in conjunction with FIG. 15 will be
leveraged to interact with the VCN 1500 by voice to retrieve the
requisite teeth images and manipulate them, without having to use a
keyboard, mouse, or other tactile input modalities. The AR-based
superimposition provides the patient with compelling visual evidence
of the progress of their treatment (or lack thereof), and the speech
interaction with the VCN 1500 enables the orthodontist to efficiently
and quickly complete the consultation and move on to the next
patient. The doctor may also activate a search through the VCN 1500
as a query to seek additional information to assist her decision
making at the chairside and electronically retrieve and prescribe
patient-specific learning aids. The doctor may also share her
findings in real-time with the patient's parents or guardians in an
interactive mode through image, text, or speech. Similarly, the
doctor may consult interactively in real-time with a colleague to
seek additional support or convey instructions. Also, the doctor's
view area and/or conversation with the patient may be captured by a
camera in video or single-frame mode and stored as an image in the
patient's electronic health record. Any instructions for the patient
are recorded and provided to the patient's personal VCN 1500 for
action. Furthermore, any updated information may be stored via speech
or text as a part of the electronic record. All
data gathered becomes a part of the reinforcement learning database
that assesses the doctor's performance based upon matching the
predicted outcome versus the current state and when patterns or
random events are detected, the same system can be used to provide
feedback for point of care learning to the doctor. Furthermore, the
synthesized data is used to update the progress of patient care and
modify the estimated treatment time, the next scheduled appointment,
and the treatment plan if necessary. The VCN 1500 may inform the
patient or the guardian regarding the progress of care.
[0247] In some example embodiments, the orthodontic care management
platform 102a may be configured for providing as an output, the
orthodontic appliance 103, which may be equipped with sensors, as
illustrated in FIG. 27. FIG. 27 illustrates orthodontic
appliances 2700 which may be augmented with one or more multi-sensor
devices. For example, FIG. 27 illustrates a multi-sensor device
attached to a patient's tooth, at 2701. Similarly, there may be
a plurality of multi-sensor orthodontic appliances 2702 attached to
the tooth. In some embodiments of the invention, these multi-sensor
devices 2702 will have a gyroscope sensor for sensing the rotation
about the x, y, and z axes, and a Bluetooth low-energy (BLE) chip
for communicating the gyroscope readings to a smartphone or other
BLE-enabled device. The multi-sensor device 2702 will be shielded
and fully enclosed within the appliance to protect it from the
saliva and food and drink in the mouth of the patient, without
compromising its communication capabilities. The multi-sensor
device 2702 will be powered by a miniature power cell that will also
be adequately shielded from moisture and food in the patient's mouth.
Any heat dissipation from this multi-sensor device 2702 will be
minuscule and will not pose a danger to the patient's mouth, as
illustrated at 2703. Also, in some example embodiments, the
multi-sensor devices may have a flat surface, as illustrated at 2704.
By placing these multi-sensor devices 2702 at various points on the
appliance, the doctor can monitor the movement of teeth using the
readings from the multiple gyroscope sensors, as shown at 2702. For
example, if the gyroscope readings suggest
more-than-expected rotation of the teeth or rotation in an
unanticipated direction, the doctor will be notified (the BLE chip in
the patient's mouth communicates with the patient's smartphone that
has the orthodontic treatment app on it, which in turn transmits this
information to the cloud backend, and the cloud backend pushes a
notification to the app on the doctor's smart device). Based on the
analysis of the sensor data, the doctor can get the patient in for
a visit at his/her earliest convenience, to fix the rotation. The
VCN 1500 will help both the doctor and patient with the
communication with the sensors, appointment setup, and other
related tasks.
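The notification rule described above (more rotation than expected, or rotation in an unanticipated direction) could be sketched as follows; the reading format, tolerance, and wrong-direction test are illustrative assumptions, not the claimed sensor-processing pipeline:

```python
# Illustrative sketch: flag gyroscope readings that exceed the expected
# rotation or point in an unanticipated direction.
def check_rotation(reading, expected_deg, tolerance_deg=2.0):
    """reading: per-axis rotation in degrees, e.g. {'x': 1.0, 'y': -3.5}.
    Returns the list of axes that should trigger a doctor notification."""
    alerts = []
    for axis, measured in reading.items():
        exp = expected_deg.get(axis, 0.0)
        if abs(measured - exp) > tolerance_deg:
            alerts.append(axis)                # rotated too far off plan
        elif exp and (measured * exp) < 0:
            alerts.append(axis)                # rotated the wrong way
    return alerts

# Made-up example: the y-axis rotation is far from plan, so only 'y' alerts.
reading = {"x": 1.0, "y": -3.5, "z": 0.2}
expected = {"x": 1.5, "y": 3.0, "z": 0.0}
print(check_rotation(reading, expected))  # ['y']
```

In the described deployment, such an alert would travel over BLE to the patient's smartphone app and on to the cloud backend, which pushes the notification to the doctor's device.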
[0248] Further, the result of placing multi-sensor devices along
with various attachments may be to achieve desired tooth movements,
as depicted at 2705, where shaded regions show tooth displacements
achieved as a result of placing multi-sensor devices and attachments
on a patient's tooth. For example, the view at 2706
illustrates that teeth 2706d-2706e may be provided with various
attachments, like the aligner 2706f, along with a pin and tube
attachment 2706a-2706c, and elastics 2706g-2706i. Additionally,
sensors may also be placed on the teeth.
[0249] The gyroscope sensors, BLE chips, and shielding enclosures
for such multi-sensor devices are well-known to those skilled in
the art. The use of such multi-sensor devices to measure teeth
movement during orthodontic treatment, as described in this
invention, is indeed novel.
[0250] In some example embodiments, the orthodontic care management
platform 102a may be configured for context-driven patient support
and marketing. The orthodontic care management
platform 102a may be configured to customize and personalize
orthodontic care for individual patients by tailoring the doctor's
websites that are accessed from the patient's smartphones to their
specific needs and treatment plan. This is possible since the
patient's data, treatment plan, and treatment history will be
available to the cloud backend once the patient completes
registration with that doctor's practice. Since the website is
owned by the doctor and the patient will almost always access it
from the app on their smartphones, the website can be customized
for each patient by populating key sections of it with information
specific to the patient's treatment. For example, if the patient
has a specific type of malocclusion, then the website might contain
videos and other information pertaining to that type of
malocclusion only. As another example, the customized website might
contain information on the insurance coverage and payment plan
(installments, lump-sum, etc.) specific to that patient. Yet
another example would be language customization. Or yet another may
connect the patient to disease-specific support groups. A patient
who is not well-versed in English might find the website in their
native language a lot easier to understand and navigate. For such a
patient, the entire website may be served up in their native
language, with an option to switch to English. Such customizations
may not only make it easier for patients to get useful information
about their treatment but also create stickiness--they are more
likely to come back to that website or to seek treatment with a
doctor who has such customizable websites.
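The per-patient customization described above might be sketched as a simple content selection by malocclusion type and language; the content catalogue, field names, and schema are made-up examples, not the disclosed backend:

```python
# Illustrative content catalogue keyed by malocclusion type (made-up data).
CONTENT = {
    "crowding": ["crowding_overview_video", "aligner_care_guide"],
    "open_bite": ["open_bite_overview_video"],
}

def build_patient_site(patient):
    """Assemble the personalized sections for one patient's view of the
    doctor's website (schema is an illustrative assumption)."""
    return {
        # Only content relevant to this patient's malocclusion is served.
        "treatment_info": CONTENT.get(patient["malocclusion"], []),
        "payment_plan": patient["payment_plan"],
        # Serve the patient's preferred language, defaulting to English.
        "language": patient.get("language", "en"),
    }

patient = {"malocclusion": "crowding", "payment_plan": "installments",
           "language": "es"}
site = build_patient_site(patient)
print(site["language"], len(site["treatment_info"]))  # es 2
```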
[0251] In some example embodiments, the orthodontic care management
platform 102a may be configured for automatic treatment planning
for the patient based on the design of the target plan, for example
as provided by the design unit 102a-5. The automatic planning is
also illustrated by the exemplary user interface 2500 snapshots
provided in FIG. 25.
[0252] FIG. 25 illustrates an exemplary user interface 2500 that may
be used for automatic treatment planning, such as for automatic
malocclusion treatment. The user interface 2501 shows automatic
planning of malocclusion treatment using an aligner device with
different stages according to a target teeth configuration. The
stages in interface 2501 represent the planned outcome of stage 1 of
treatment at 6 weeks. Further, interfaces 2502 and 2503 represent
different stages such as lower teeth stabilization and lower arch
stabilization. Further, interfaces 2504 and 2505 represent how
individual tooth movements can be achieved using aligners and
elastics. The user interface 2500 also includes various menu
options as illustrated in menu 102a7-1 which may help a user that
may be an orthodontist to choose a particular view for display. The
user interface 2500 also includes the menu 102a7-2 which helps the
orthodontist to choose a suitable appliance for treatment. Further,
the use of different types of orthodontic appliances and even a mix
and match of orthodontic appliances may be possible for treatment,
as illustrated in FIG. 26.
[0253] FIG. 26 illustrates a plurality of different types of
orthodontic devices 2601-2605 that may be used for malocclusion
treatment. These may include a pin and tube attachment, aligners,
wires, tubes, pivots and the like. The outcome of the treatment may
be used to provide restorative care options to an orthodontist,
such as illustrated in FIG. 29. The various tooth configuration
displays 2901-2903 may be used to identify what restorative
movements are required to further match the treatment's planned
outcome to the desired outcome.
[0254] In some example embodiments, the orthodontic care management
platform 102a may be configured for providing automatic treatment
options by considering the patient's wants or needs, patient's
economic stipulations, patient's treatment time considerations, the
doctor's skills and preferences, and diagnosis based upon the soft
tissue, skeletal, dental, functional, medical and dental history,
biological and physiological status, psychosocial profile, and
current evidence. Each of these elements can be considered as a
constraint logic problem. So, the methodology described earlier in
FIG. 22 for correcting malocclusions using AI may be applied to
each one of the variables described above to establish a care
solution.
[0255] In some example embodiments, establishing the target setup
is accomplished automatically with the objective of maximizing
aesthetics, function and stability, cost-effectiveness of care,
efficiency of tooth movement, and patient safety. In some example
embodiments, this may be accomplished by designing minimal
interventional and invasive treatment care approaches that may
include, but are not limited to, minimizing tooth movement or
displacements, designing personalized orthodontic appliances,
minimizing disruptions of the patient's lifestyle such as the
frequency of visits to the doctor or pain associated with the use
of appliances and minimizing time in treatment. The order of
staging orthodontic treatment, for instance, "does space closure
precede alignment?", and the sequence of tooth movement, that is,
"which tooth or teeth should move first?", is performed
automatically. In addition, the sequence of tooth movement is
designed strategically and automatically with the goal of
establishing the shortest path towards the target position and with
a minimal number of collisions. The nature of tooth movement to
achieve maximum efficiency and stability such as tipping versus
translation are also considered in the design of the plan. In
addition, for each major milestone in treatment, timelines are
developed automatically. Further, the appliance systems to achieve
specific care goals in orthodontic treatment are also automatically
designed. These appliances are generally designed to create
statically determinate force systems to apply controlled and
predictable forces to the teeth. Common combinations of fixed
appliances, aligners, and removable appliances with temporary
anchorage devices and elastics may be considered to achieve these
goals. Other features considered in the design of these appliances
include achieving maximum safety and reliability in performance,
modularity, minimal adjustments or changes, maximum patient comfort,
ease of installation and use by the operator, and cost-effectiveness.
The design and manufacture of these appliances
is done strategically in terms of defining the shortest path of
tooth movement to achieve the desired target. Also considered is
the path that involves the least number of tooth collisions, and
the most effective type of tooth displacement such as tipping
versus translation. The time-bounded treatment milestones are
provided to the patient to allow the doctor and patient to follow
the progress of care. Further, in some embodiments, appropriate
checklists for each milestone to be accomplished during the course
of treatment are developed. Additionally, key performance
indicators for the individual patient and doctor to measure
progress in care are shared with the doctor.
[0256] In some example embodiments, the system 1800 may also be
configured to provide alternative treatment approaches based upon
reprioritizing the input parameters discussed above and to allow for
complete interactivity. Furthermore, the system 1800 allows for
planning interdisciplinary care such as combining restorative care
with orthodontic care driven again by the principles of minimal
invasive and interventional therapy to achieve care in the shortest
period of time with maximum aesthetics, stability, and
function.
[0257] In some example embodiments, the AI enabled system 1800 may
allow for automatic design of appropriate therapeutic devices or
appliances that are automatically designed to achieve each of the
staged or sequenced events such as for alignment, space closure,
root correction, jaw repositioning and orthopedics and
stabilization, finishing, and retention. The preferred features
considered in the design of the appliances include, but are not
limited to, features that generate controlled, predictable and
known force systems whose force magnitude and line of action and
direction of forces can be controlled, are statically determinate,
create consistent force systems, are compliant with current
evidence in terms of effectiveness and efficiency, require minimal
changes, are operationally safe, can fit in the mouth and do not
impinge on the patient's oral tissue, are highly reliable, not
prone to failure, require minimal compliance from the patient in
terms of wear, cause the least amount of discomfort to the patient,
are easily installed by the operator, are self-limiting in action,
can be easily adjusted, are modular to allow for chair side
modification if the need arises to perform concurrent tooth
movements, allow for maximum aesthetics, and are made of the
appropriate material from both a biosafety perspective and
efficiency of force delivery and cost.
[0258] Furthermore, in some example embodiments, the AI-enabled
system 1800 may also allow for the fully interactive design of the
appliances such as design of novel male-female fixed and removable
attachments, for brackets, aligners, removable or fixed appliances.
Bracket dimensions and slot may be changed, additional hooks or
slots or auxiliaries can be designed. Aligners with special
features such as internal attachments within the aligner shell or
external attachments on the outside of the shell can be designed.
Furthermore, tooth attachments for aligners are automatically
designed and their shape and form can be modified. Also, special
features such as tubes, brackets, posts, telescopic structures,
hooks, and buttons can be designed into the aligner. These features
can be designed into the appliance or as separate parts that may be
attached or fixed to the aligner at a later stage. Structural
features in the aligners may be modified; these include, but are
not limited to, sectional aligners, cutout aligners, space between
aligners and tooth or any contact surface hybrid aligners with
springs, aligners with bite blocks, jaw repositioning aligners,
monobloc aligners and the like. Also, the internal or external
topography of the aligners can be designed by creating flat
surfaces with or without inclines or even contoured surfaces.
Honeycomb, lattice or corrugated structures of the aligner can also
be designed. Additionally, removable appliances retainers and
wires, indirect bonding trays to position to bond the attachments,
appliances or guide implants in the mouth may also be designed. The
capability to design polymer- or composite-based archwires that may
be directly printed using additive printing technology is also
provided. Furthermore, the appliances can be designed to generate
active tooth movement forces or as passive devices that generate
near-zero forces based upon the treatment stage and sequence of
treatment. Additionally, the design and use of the fixed
appliances, aligners, and removable appliances are automatically
staged and sequenced to optimize control and predictability of
tooth movement, safety and cost when used in combination with the
methods and systems disclosed in this invention. The software
provides instructions to the doctor in the installation, use, and
management of the appliances. These can be communicated via the VCN
1500 as images, text, and/or speech. Furthermore, if the doctor is
not familiar with the appliances, they can be trained in their use
through AR and VR tools prior to installation and certified for the
appropriate skills before using the new devices. Similarly,
the software automatically designs personalized instructions to the
patients for use and management of the appliances and activates the
VCN 1500 to support the care of the patient. The patient can also
be given instructions in the use and management of the appliances,
safety instructions, and managing emergencies using AR-VR tools
already discussed.
[0259] In some example embodiments, using the system 1800, the
patient or doctor may also enhance the aesthetics of the devices in
use. This may be accomplished by creating veneers that are attached
or built into the various orthodontic appliances to cover or mask
the underlying malocclusion as it is being corrected. Tooth pontics
can also be designed into or attached to the orthodontic devices to
hide the extraction site(s) during space closure. Furthermore,
fashion-item features such as pop-culture motifs (movie and cartoon
characters, emojis, etc.), colors, shapes, materials, coatings, and
jewelry can be designed into the orthodontic appliance created
using the system 1800. Various pharmaceutical substrates may be
designed into the appliances, using a variety of carrier mechanisms
such as thin films and microspheres, to minimize tooth
decalcification, pain, and inflammation, to whiten the teeth, and to
provide flavors that mask the metallic or plastic taste, freshen the
mouth, or reduce halitosis. Additionally, probiotic bacteria may be
layered into the appliances or carriers to manage oral halitosis or
gingival inflammation. Furthermore, coatings to
minimize friction in the devices can also be designed into the
appliance. Sensors may also be incorporated in the devices to
manage reminders for adherence, loss of appliance, time to change
the appliance and loss of force delivery.
[0260] In some example embodiments, the system 1800, such as the
orthodontic care management platform 102a, may allow appropriate
choice of materials based upon the design features, functionality,
and use case of the orthodontic appliance. These include, but are
not limited to, commonly accepted orthodontic biocompatible
materials such as titanium-based alloys, stainless steel,
chrome-cobalt, titanium-niobium, plastics, acrylics, elastomers,
fiber-reinforced composites, ceramics, polymers, and silicones.
Furthermore, combinations of materials of different mechanical
characteristics may be blended to optimize the function of the
orthodontic device design. In some example embodiments, the
appliances may be fabricated using either additive or subtractive
computer-aided manufacturing processes. The system 1800 may
automatically provide real-time feedback on product pricing and the
most cost-effective discounted source for the fabrication or
purchase of the appliance, the shipping costs, and the ability for
aggregate purchase.
[0261] In some example embodiments, the system 1800 may provide the
doctor the ability to use off-the-shelf products when appropriate
and design them into the care plan. As described earlier,
the personalized patient-specific VCN 1500 may be used to help,
coach, and motivate the patient and provide appropriate behavioral
nudges to enhance patient motivation through the course of
treatment. The doctor may also modify the prescription of the VCN
1500 in terms of additional needs such as wearing elastics at a
specific time period. Further, the system 1800 may be designed
around training on a cohort database and using it to identify rules
to automate this functionality.
[0262] In some example embodiments, the system 1800 may be
configured to provide prognosis, anticipatory, and therapeutic risk
management. For this, the temporal nature of tooth displacement for
both the active and reactive units of the dentition in response to
the applied force system may be predicted. This may be accomplished
by using the principles of static mechanics such as equilibrium
diagrams and free-body analysis when statically determinate force
systems are applied. Additional risk factors may be identified and
their impact on orthodontic tooth movement considered. Such
factors may include, but are not limited to, the nature of the
malocclusion, the spatial relationships of the teeth with respect
to the point of force application, the anatomical and biological
constraints, the condition of the periodontium, collisions between
the teeth, the mechanical, material, and physical characteristics of
the appliances considered for use, and patient cooperation.
complex force systems, dynamic finite element methods and Beam
Theory may be used to better comprehend the nature of applied
forces and to predict the response. These methods are well known to
a person skilled in the art. Using the methods
and systems disclosed in the invention, any unwanted tooth
movements are managed in advance by designing appliance systems
that generate consistent force systems or the use of adjunctive
appliances that may minimize or negate the reactive forces that may
cause unwanted tooth displacement. These may include, but are not
limited to, the use of appliances such as directional elastics or
lingual arches or temporary anchorage devices, etc. Further,
checklists are also created. These may be image, text and/or
speech-based. The checklists serve as reminders to both the patient
and doctor to recognize side effects early in treatment. The
checklists are also accompanied with instructions to manage
unwanted tooth movement. A weighted prioritized risk analysis of
the likelihood of any spurious tooth movement occurring at any
point in time is calculated and provided to the practitioner and
patient in advance. These are shared with the VCN 1500 as well who
takes the role both of a therapeutic nurse and a patient coach to
remind the patient periodically to check for spurious tooth
movement and/or wear appliances such as elastics to check against
the unwanted tooth movement. As the risk level of the patient
increases, the monitoring intensity becomes more intense. This
automatically triggers a change in the frequency of visits to the
doctor and both the doctor's and patient's appointment calendars
are automatically updated to reflect this need. Additionally, the
involvement of the VCN 1500 care services intensifies.
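The free-body and equilibrium reasoning described above can be illustrated with a minimal sketch; the force magnitudes, units, and reference point below are hypothetical examples, not values from the disclosure.

```python
import numpy as np

def net_force_and_moment(forces, points, origin):
    """Free-body summary of a statically determinate force system.

    forces : (N, 3) array of force vectors (cN)
    points : (N, 3) array of their application points (mm)
    origin : (3,) reference point (e.g., an estimated center of resistance)

    Returns the net force and the net moment about `origin`; for a
    passive (near-zero-force) stage, both should be close to zero.
    """
    forces = np.asarray(forces, dtype=float)
    points = np.asarray(points, dtype=float)
    net_f = forces.sum(axis=0)
    # Moment of each force about the reference point: r x F
    net_m = np.cross(points - np.asarray(origin, dtype=float), forces).sum(axis=0)
    return net_f, net_m

# Hypothetical two-force couple: equal and opposite forces 10 mm apart
forces = [[0.0, 100.0, 0.0], [0.0, -100.0, 0.0]]   # cN
points = [[0.0, 0.0, 0.0], [10.0, 0.0, 0.0]]       # mm
f, m = net_force_and_moment(forces, points, origin=[5.0, 0.0, 0.0])
# Net force is zero, but a pure moment of 1000 cN*mm remains about z,
# signaling a rotational (reactive) tendency that must be managed.
```

The same summation can flag an unbalanced reactive moment in advance, which is the condition the adjunctive appliances above are meant to counter.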
[0263] Further, besides mechanical factors, the patient's risk
profile includes consideration of biological factors such as the
presence of bone loss, root resorption, oral hygiene condition,
decalcification, the type and magnitude of planned orthodontic
treatment, medical history, educational level, and psychosocial
profile. The risk profile of the patient is compared against a
cohort sample of patients to further delineate the potential nature
and extent to which the patient treatment prognosis is favorable or
not in response to the intervention. Additionally, learning from
the cohort patient group is shared with the doctor to better manage
patient care and augment patient safety.
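One way to realize the weighted, prioritized risk analysis over the biological factors listed above is a weighted severity score that drives monitoring intensity; the factor names, weights, severity values, and thresholds here are all hypothetical.

```python
# Minimal sketch of a weighted, prioritized risk score; weights and
# thresholds are illustrative assumptions, not clinical values.
RISK_WEIGHTS = {            # higher weight = higher clinical priority
    "bone_loss": 0.30,
    "root_resorption": 0.25,
    "poor_oral_hygiene": 0.20,
    "decalcification": 0.15,
    "treatment_complexity": 0.10,
}

def risk_score(severities: dict) -> float:
    """Weighted sum of severity values (each clamped to [0, 1])."""
    return sum(RISK_WEIGHTS[k] * min(max(v, 0.0), 1.0)
               for k, v in severities.items())

def monitoring_interval_weeks(score: float) -> int:
    """Map the risk score to a visit frequency (hypothetical thresholds)."""
    if score >= 0.6:
        return 2    # intensive monitoring
    if score >= 0.3:
        return 4
    return 8        # routine monitoring

patient = {"bone_loss": 0.5, "root_resorption": 0.2,
           "poor_oral_hygiene": 0.8, "decalcification": 0.1,
           "treatment_complexity": 0.4}
score = risk_score(patient)                # 0.415 with these example values
weeks = monitoring_interval_weeks(score)   # -> 4-week visit interval
```

As the score rises, the shorter interval would automatically update both calendars, as described above.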
[0264] Further, in some embodiments, the orthodontic care
management platform 102a, that is to say, the system 1800 may
provide for patient monitoring and management. Patient care
management may have both a patient-directed and a doctor-directed
component facilitated with the participation of the VCN 1500. Thus,
using the methods and systems disclosed in the invention, treatment
progress may be measured at five levels: the patient's current
status against the initial state, the current status against the
planned final outcome, the current status against the predicted
status at this stage in the treatment plan, the current status
against the status at the previous appointment, and the current
status against a similar cohort of patients. Input data from the
patient at any point in treatment includes a series of 2D images of
the dentition. These images are transformed into 3D models using
either photogrammetric or convolutional neural network (CNN)
techniques, as already discussed. These are well-known approaches
for reconstructing 3D geometry from 2D raster images. This requires training
the CNNs with 2D images from large databases consisting of images
from several hundreds or thousands of patients. The CNNs are able
to automatically extract features of interest from these images to
create the 3D models. In other embodiments of the invention, 3D
scans of the dentition may be taken as well, wherever feasible. It
is now possible to capture 3D images with a smartphone (such as
those running Apple iOS and Android operating systems). Therefore,
in some embodiments of the invention, the patients will be expected
to take 3D images with their smartphones during the entire
treatment process. The current progress scans are then automatically
superimposed over the initial scans, the planned outcome, or the
images captured at the last appointment using a variety of approaches.
These include the least-mean-square or best-fit method,
superimposition over relatively fixed anatomical landmarks such as
the mid-palatine rugae in the maxilla, or implants that may be
present or placed in the patient's mouth. The mid-palatine rugae
and a small area dorsal to them are unique to each person and do not
positionally change or remodel during the course of treatment.
Therefore, they collectively serve as a unique personal signature
to superimpose upon as a reference to measure and analyze the relative
displacement of the teeth accurately, reproducibly, and with
precision for each patient, as disclosed in Vasilakos et al., Sivaraj
A, and Dong-Soon Choi, which are herein incorporated in their
entirety by reference. As such, by using the rugae as points of
reference, it is straightforward to map the current state of the
patient's teeth to the original 2D image taken at the beginning of
the treatment cycle by aligning the 2D images with respect to the
mid-palatine rugae to determine progress (or lack thereof).
Similarly, in the mandible, the mandibular tori, or the
maxillary-mandibular mask when using cone-beam data, have been shown
to be relatively stable reference points for superimposition, as
disclosed in An, K. and Ruellas, A. C. et al., which are herein
incorporated in their entirety by reference. Alternatively, in some
other embodiments, relative tooth movement in 3D space can also be
calculated with respect to the reference arch, and a tracking
history of the nature of tooth movement to this point in time is
graphically created. The tracking history is used to determine whether
treatment is on course and to refine the predictive analytics
regarding the time to completion of the care cycle or the
prediction of the anticipated treatment response at the next
visit.
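The least-mean-square or best-fit superimposition mentioned above can be sketched with the standard Kabsch rigid-registration algorithm; the landmark coordinates and the perturbing rotation below are hypothetical stand-ins for points sampled on a stable reference region such as the mid-palatine rugae.

```python
import numpy as np

def best_fit_superimpose(moving, fixed):
    """Rigid least-squares (Kabsch) alignment of two landmark sets.

    moving, fixed : (N, 3) arrays of corresponding landmark points.
    Returns rotation R and translation t such that R @ p + t maps the
    moving landmarks onto the fixed ones in the least-squares sense.
    """
    moving = np.asarray(moving, float)
    fixed = np.asarray(fixed, float)
    mc, fc = moving.mean(axis=0), fixed.mean(axis=0)
    H = (moving - mc).T @ (fixed - fc)          # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = fc - R @ mc
    return R, t

# Hypothetical stable landmarks at two time points: the progress scan
# is the initial scan rotated 5 degrees and translated.
initial = np.array([[0, 0, 0], [10, 0, 0], [0, 8, 0], [4, 4, 2]], float)
theta = np.radians(5.0)
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0, 0, 1]])
progress = initial @ Rz.T + np.array([1.0, -2.0, 0.5])

R, t = best_fit_superimpose(progress, initial)
aligned = progress @ R.T + t
residual = np.linalg.norm(aligned - initial, axis=1).max()
# Residuals at stable landmarks are ~0 after superimposition; residuals
# at tooth landmarks (not shown) would quantify true tooth displacement.
```

In this framing, only the stable reference region drives the fit, so any remaining discrepancy at tooth landmarks reflects genuine tooth movement rather than scan pose.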
[0265] Generally, 3D image capture capability may not be widely
available yet to all patients, since not everyone can afford the
latest smartphones with 3D image capture technology. For such
patients, scanning their mouth to create a 3D model of the teeth is
a time-consuming process typically done in the doctor's office with
expensive equipment (not currently available to them). Therefore,
2D images of the patient's teeth are more commonly used during the
various stages of the treatment cycle, after the initial 3D scan.
This is more so in cases where the patient is remote and can easily
take 2D images of their teeth themselves and transmit the 2D images
to the doctor at regular intervals. For this reason, in some
embodiments of our invention, a variant of a technique called `UV
mapping` well-known to those skilled in the art of 3D graphics and
gaming, can be used to project the 2D images of the patient's teeth
onto the original 3D scan. While UV mapping is used in the 3D
graphics field to project a 2D image onto a 3D model for texture
mapping, we use it in our invention to identify and measure
displacement of the teeth. In order to measure the displacement, a
number of reference points in the 3D scan and the 2D image (that
have not moved due to teeth displacement) are used. Examples of
such reference points include, but are not limited to, the
mid-palatine rugae, points on teeth that are not part of the
orthodontic treatment and therefore have not moved since the
original pre-treatment 3D scan, and other stationary points, as
determined by the doctor's expertise. Once these reference points are matched,
then the discrepancies between the 2D projection and the original
3D model can be accurately quantified to determine the teeth
movement. This is a novel application of UV mapping to the field of
orthodontic care.
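The reference-point matching step described above can be sketched in 2D: fit a similarity transform (scale, rotation, translation) from stable reference points in the photo to the corresponding points in the projected 3D scan, then measure the residual at tooth landmarks. All coordinates below are hypothetical pixel values, and the closed-form fit is the standard Umeyama-style least-squares solution, not the disclosure's own implementation.

```python
import numpy as np

def fit_similarity_2d(src, dst):
    """Least-squares 2D similarity transform (scale * rotation + translation)
    mapping src points onto dst points, via the complex-number closed form."""
    src = np.asarray(src, float); dst = np.asarray(dst, float)
    sc, dc = src.mean(axis=0), dst.mean(axis=0)
    s0, d0 = src - sc, dst - dc
    a = s0[:, 0] + 1j * s0[:, 1]              # source points as x + iy
    b = d0[:, 0] + 1j * d0[:, 1]
    z = (np.conj(a) @ b) / (np.conj(a) @ a)   # z = scale * e^{i*theta}
    def apply(p):
        p = np.asarray(p, float)
        c = (p[:, 0] - sc[0]) + 1j * (p[:, 1] - sc[1])
        w = z * c
        return np.stack([w.real + dc[0], w.imag + dc[1]], axis=1)
    return apply

# Hypothetical coordinates (pixels). Stable references: e.g., projected
# mid-palatine rugae points and untreated teeth that have not moved.
ref_photo = np.array([[100, 100], [300, 110], [210, 260]], float)
ref_scan  = np.array([[10, 10], [30, 11], [21, 26]], float)   # projected 3D scan

apply = fit_similarity_2d(ref_photo, ref_scan)

# Tooth landmark seen in the photo vs. its position in the projected scan
tooth_photo = np.array([[220, 150]], float)
tooth_scan  = np.array([[21.5, 15.0]], float)
displacement = np.linalg.norm(apply(tooth_photo) - tooth_scan, axis=1)
# -> displacement of 0.5 (scan units): nonzero residual at the tooth,
#    after the references align, indicates movement.
```

A full pipeline would also handle the camera projection itself; this sketch only shows the measurement step once corresponding points are identified.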
[0266] Regardless of which progress tracking method described above
is used, the tracking history is also compared automatically to a
cohort database of patients to detect any deviations. Furthermore,
the doctor can select any region of interest to better understand
treatment response. The VCN 1500 recognizes the changes and may
share these with the patient's guardian or other professionals
involved in the patient's care. When treatment does not track the
planned events, an automatic root-cause analysis is triggered and
the doctor is informed of the possible causes of the problem and
potential solutions.
Additionally, the doctor may activate the VCN 1500 to search the
library sources for reports on similar anomalous behavior.
Midcourse correction in the treatment plan is automatically
generated and so are the associated appliances to correct for the
anomalous response. In the case of remote patients who are largely
monitoring their own care (with limited periodic doctor visits),
when the doctor sees any deviations, he is able to proactively
order the appropriate revised aligners and/or appliances to arrive
in time for the next patient visit, shaving considerable time off
the entire treatment process. This can result in several weeks of
time saved for the patient. All data is captured, classified,
archived, and synthesized for future use to establish doctor
performance and reputation.
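The automatic comparison of a patient's tracking history against a cohort database can be sketched as a simple outlier test; the tracked quantity, cohort values, and z-score threshold below are hypothetical.

```python
# Minimal sketch of flagging deviation from a cohort, assuming the
# tracked quantity is cumulative tooth displacement (mm) at a given
# treatment week. Threshold and data are illustrative assumptions.
import statistics

def deviates_from_cohort(patient_value, cohort_values, z_threshold=2.0):
    """Flag the patient if their progress is more than z_threshold
    standard deviations from the cohort mean at the same week."""
    mean = statistics.fmean(cohort_values)
    sd = statistics.stdev(cohort_values)
    z = (patient_value - mean) / sd
    return abs(z) > z_threshold, z

# Hypothetical cohort displacements at week 8 of comparable cases
cohort = [2.1, 2.4, 1.9, 2.2, 2.0, 2.3, 2.5, 1.8]
flag, z = deviates_from_cohort(0.9, cohort)   # patient well behind cohort
# flag -> True, which would trigger the automatic root-cause analysis
# and doctor notification described above.
```

More sophisticated deviation detectors could replace the z-score, but the trigger-and-notify flow would be the same.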
[0267] In some example embodiments, patterns in treatment
modalities and responses are tracked and doctor-specific learning
aids are automatically created to enhance the skills of the doctor
through point-of-care learning. Similarly, patients are educated
throughout the treatment process by visual aids and learning
materials designed to help them better understand their treatment.
Thus, the care management platform 102a, with its associated
software described above is novel in the sense that it is
discovery-driven, proactive and therefore dynamic, unlike the
current static reactive care management patient care models.
Through a combination of effective communication links between the
patient and doctor, use of an AI-based VCN that guides both patient
and doctor, targeted and personalized therapeutics, novel
application of techniques like UV mapping to track patient care and
knowledge management repositories to manage point-of-care learning
for both doctors and patients, gains in treatment effectiveness,
efficiency, treatment duration, safety, and cost of care are
realized for both the patient and the doctor.
[0268] In some example embodiments, the system 1800 may provide
statistics on manpower utilization and resource utilization to
achieve maximum effectiveness, efficiencies, and safety in the
practice environment.
[0269] A number of AI tool sets known to those skilled in the art
will be used. These will include, but are not limited to, the
Neuro-Symbolic Concept Learner (NSCL), which uses neural networks to
extract features from images (i.e., compose the symbols) and then
uses rule-based programs to respond to and solve problems based on
those symbols. Furthermore, to answer questions about the
objects/elements in an image (i.e., visual question answering (VQA)),
AI tools such as those trained on CLEVR may be used. Few-shot AI
training tools may be used to create the talking virtual care
navigator 1500. To detect changes in images, AI tools such as RepMet
may also be considered. Since much of the patient data is sensitive,
fragmented, and received from multiple sources, and its privacy
needs to be maintained during training, it is envisioned that
generative adversarial networks will be used for data
synthesis.
[0270] In some example embodiments, the orthodontic care management
platform 102a may also be configured for eye gaze detection,
personalized hologram generation and the like.
[0271] In some example embodiments, the orthodontic care management
ecosystem 102 of the present invention may help to develop
orthodontic appliances that include a basic framework with
extensions that can be used for space closure, intrusion, root
correction, distal movement, expansion, extrusion, alignment,
retention and stabilization. These movements can be performed
concurrently or in tandem. In several embodiments, the basic
framework is envisaged to include longitudinal, arch shaped and
transverse components that may be attached to teeth of a patient
using relatively fixed or removable means as defined above. The
components may include bent portions and be combined with each
other using several attachment mechanisms as discussed earlier. In
such scenarios, the framework will act as a passive (non-force
applying) structure and will be adapted to receive other active
elements/members that are capable of force application. These
active elements may include structures that are capable of at least
partial elastic or elastomeric deformation. Alternately, the
framework itself may have components or portions such as aligners
and segmental aligners that may be capable of applying a non-zero
force. However, in such a scenario as well, the framework would
still be able to receive active elements/members. It is to be noted
here that the orthodontic appliances are envisaged to include one
or more of the framework and the additional active components. Also,
it is envisaged that an orthodontic appliance designed for one
region of the teeth (such as buccal or labial) may be implemented in
another region with very basic modifications without departing
from the scope of the invention.
[0272] The key advantages offered by the orthodontic appliances and
their embodiments discussed above include the ability to generate
statically determinate forces and moments. Another advantage is
that the setups may be modified to generate different kinds of
tooth movements and may be implemented as labial or lingual
appliances without any significant design changes. Moreover, the
appliances may be easily produced using materials well known in the
art. Additional deformable members, elements, or portions, including
springs, elastic bands, and chains, may also be deployed for
additional axial forces and moments. Also, the attachments may be
chosen to be easily removable or long-lasting.
[0273] It is further an objective of the invention to develop a
computer implemented method and a computer system to enable a
collaborative environment in which a patient, in collaboration with
a doctor and several third-party services, is able to plan their
orthodontic treatment, receive training, design and build
customized orthodontic appliances and manage their entire care
lifecycle on their own. Further, all the data that is retrieved,
received or generated is stored in a secure database and
distributed ledger/blockchain technologies may be leveraged to
safeguard especially the privacy, authenticity and financial
aspects of the orthodontic treatment. Advantages of such an
approach include customized care corresponding to the specific needs
of the patient, such as allergens, blood type, body type and
pre-existing conditions/ailments, verification of authenticity from
time to time, ability to conduct audits whenever needed and ability
to perform philanthropic research without compromising the identity
of the patient.
[0274] In some embodiments, the computer-implemented orthodontic
care management system 102 may be configured to allow access to
archived research papers in repositories and automatically provide
a synthesis of patient-specific information. The system 102 may also
provide updates on any new findings related to specific treatment
within the local database at the doctor's site and provide the
doctor with an alert message to make them aware of the updated
information.
[0275] Further, the system 102 may be configured to provide
value-driven plans that may be designed automatically and/or
interactively with the doctor, considering choices defined by the
doctor such as, but not limited to, optimal stability, optimal
function, optimal aesthetics, and the like. The system 102 may also
be designed to automatically provide financial, functional, and
operational metrics and analytics for the practice based upon the
user's choice of metrics.
[0276] In some embodiments, the system 102 may be configured to
optimize the supply chain and delivery of customized or conventional
orthodontic appliances in line with the patient's scheduled visits
to maintain a just-in-time inventory.
[0277] In some embodiments, the system may provide a backup of all
data, including patient data such as demographic data, disease
history, and the like, to help the patient develop a better
understanding of their malocclusion and treatment.
[0278] The features can be implemented in a computer system that
includes a back-end component, such as a data server, or that
includes a middleware component, such as an application server or
an Internet server, or that includes a front-end component, such as
a client computer having a graphical user interface or an Internet
browser, or any combination of them. The components of the system
can be connected by any form or medium of digital data
communication such as a communication network. Examples of
communication networks include a LAN, a WAN and the computers and
networks forming the Internet.
[0279] The computer system can include clients and servers. A
client and server are generally remote from each other and
typically interact through a network. The relationship of client
and server arises by virtue of computer programs running on the
respective computers and having a client-server relationship to
each other.
[0280] One or more features or steps of the disclosed embodiments
can be implemented using an Application Programming Interface
(API). An API can define one or more parameters that are passed
between a calling application and other software code (e.g., an
operating system, library routine, function) that provides a
service, that provides data, or that performs an operation or a
computation.
[0281] The API can be implemented as one or more calls in program
code that send or receive one or more parameters through a
parameter list or other structure based on a call convention
defined in an API specification document. A parameter can be a
constant, a key, a data structure, an object, an object class, a
variable, a data type, a pointer, an array, a list, or another
call. API calls and parameters can be implemented in any
programming language. The programming language can define the
vocabulary and calling convention that a programmer will employ to
access functions supporting the API.
[0282] In some embodiments, an API call can report to an
application the capabilities of a device running the application,
such as input capability, output capability, processing capability,
power capability, communications capability, etc.
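The capability-reporting API call described above can be sketched as follows; every name, field, and value here is a hypothetical illustration of such an interface, not an API defined by the disclosure.

```python
# Minimal sketch of a capability-reporting API call (names hypothetical).
from dataclasses import dataclass, asdict

@dataclass
class DeviceCapabilities:
    """Parameters an API call might return to a calling application."""
    input: list      # e.g., ["touch", "camera2d", "camera3d"]
    output: list     # e.g., ["display", "speaker"]
    processing: str  # e.g., "gpu" or "cpu"
    battery_pct: int
    network: list    # e.g., ["wifi", "lte"]

def get_device_capabilities() -> DeviceCapabilities:
    # A real implementation would query the operating system; here we
    # return fixed example values.
    return DeviceCapabilities(
        input=["touch", "camera2d"], output=["display", "speaker"],
        processing="cpu", battery_pct=80, network=["wifi"])

caps = asdict(get_device_capabilities())
# The calling application can then gate features on the reported
# capabilities, e.g., only offer 3D scan capture when supported:
can_scan_3d = "camera3d" in caps["input"]
```

Such a call would let the platform decide, for example, whether to request 3D scans or fall back to 2D images on a given patient device.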
[0283] It should be understood that the techniques of the present
disclosure might be implemented using a variety of technologies.
For example, the methods described herein may be implemented by a
series of computer executable instructions residing on a suitable
computer readable medium. Suitable computer readable media may
include volatile (e.g. RAM) and/or non-volatile (e.g. ROM, disk)
memory, carrier waves and transmission media. Exemplary carrier
waves may take the form of electrical, electromagnetic, or optical
signals conveying digital data streams along a local network or a
publicly accessible network such as the Internet.
[0284] It should also be understood that, unless specifically
stated otherwise as apparent from the discussion, throughout the
description, discussions utilizing terms such as "controlling" or
"obtaining" or "computing" or "storing" or "receiving" or
"determining" or the like refer to the
action and processes of a computer system, or similar electronic
computing device, that processes and transforms data represented as
physical (electronic) quantities within the computer system's
registers and memories into other data similarly represented as
physical quantities within the computer-system memories or
registers or other such information storage, transmission or
display devices.
[0285] It should be noted that where the terms "server", "secure
server" or similar terms are used herein, a communication device is
described that may be used in a communication system, unless the
context otherwise requires, and should not be construed to limit
the present invention to any particular communication device type.
Thus, a communication device may include, without limitation, a
bridge, router, bridge-router (router), switch, node, or other
communication device, which may or may not be secure.
[0286] It should also be noted that where a flowchart is used
herein to demonstrate various aspects of the invention, it should
not be construed to limit the present invention to any particular
logic flow or logic implementation. The described logic may be
partitioned into different logic blocks (e.g., programs, modules,
functions, or subroutines) without changing the overall results or
otherwise departing from the true scope of the invention. Often,
logic elements may be added, modified, omitted, performed in a
different order, or implemented using different logic constructs
(e.g., logic gates, looping primitives, conditional logic, and
other logic constructs) without changing the overall results or
otherwise departing from the true scope of the invention.
[0287] A number of embodiments have been described. Nevertheless,
it will be understood that various modifications may be made.
Elements of one or more embodiments may be combined, deleted,
modified, or supplemented to form further embodiments. As yet
another example, the logic flows depicted in the figures do not
require the particular order shown, or sequential order, to achieve
desirable results. In addition, other steps may be provided, or
steps may be eliminated, from the described flows, and other
components may be added to, or removed from, the described systems.
Accordingly, other embodiments are within the scope of the
following claims.
[0288] The terms and descriptions used herein are set forth by way
of illustration only and are not meant as limitations. The examples
disclosed herein are intended to be non-limiting in any manner, and
modifications may be made without departing from the spirit of the
present disclosure. Those skilled in the art will
recognize that many variations are possible within the spirit and
scope of the disclosure, and their equivalents, in which all terms
are to be understood in their broadest possible sense unless
otherwise indicated.
[0289] Various modifications to these embodiments are apparent to
those skilled in the art from the description and the accompanying
drawings. The principles associated with the various embodiments
described herein may be applied to other embodiments. Therefore,
the description is not intended to be limited to the embodiments
shown in the accompanying drawings but is to be accorded the
broadest scope consistent with the principles and the novel and
inventive features disclosed or suggested herein. Accordingly, the
disclosure is intended to embrace all such alternatives,
modifications, and variations that fall within the scope of the
present disclosure and the appended claims.
* * * * *
References