U.S. patent application number 13/675411, for an interactive mobile learning (IML) platform, was published by the patent office on 2013-05-16.
The applicant listed for this patent is Lachina Publishing Services, Inc. The invention is credited to Amanda Susanne Almon, Anthony Victor Anselmo, Cory Adam Hughart, Aaron Thomas Kantor, Andrew Kuhar, Jeffrey August Lachina, Mikala Alexander Little, Jacklyn Marie Watson, and Jesse Andrew Werner.
United States Patent Application 20130122980
Kind Code: A1
LACHINA, Jeffrey August; et al.
May 16, 2013

Application Number: 13/675411
Family ID: 48281139
Publication Date: 2013-05-16
INTERACTIVE MOBILE LEARNING (IML) PLATFORM
Abstract
A computer implemented learning and assessment apparatus
includes a database having at least one set of educational game
parameters, a processor operable to receive input signals from a
user of the apparatus, and a human readable display. Game logic of
the apparatus is operable to generate a game-based learning
experience by presenting to the user a virtual game on the display
in accordance with a selected first one of the set of educational
game parameters. Measurement logic generates measurement data
representative of actions of the user during interaction by the
user with the virtual game. Assessment logic is operable to
generate assessment data representative of gameplay results wherein
a failure of the user to produce predetermined expected learning
results is weighted in accordance with predetermined game-based
learning parameters relative to experiential exercise of the
virtual game by the user. A result signal is selectively rendered
on the human readable display.
Inventors: LACHINA, Jeffrey August (Wickliffe, OH); Anselmo, Anthony Victor (South Euclid, OH); Almon, Amanda Susanne (University Heights, OH); Hughart, Cory Adam (Mayfield Heights, OH); Watson, Jacklyn Marie (Mayfield Heights, OH); Kuhar, Andrew (Wickliffe, OH); Little, Mikala Alexander (Shoreline, WA); Werner, Jesse Andrew (Avon, OH); Kantor, Aaron Thomas (Cleveland Heights, OH)

Applicant: Lachina Publishing Services, Inc. (Cleveland Heights, OH, US)
Family ID: 48281139
Appl. No.: 13/675411
Filed: November 13, 2012

Related U.S. Patent Documents

Application Number: 61/558,818 (provisional), filed Nov 11, 2011

Current U.S. Class: 463/9
Current CPC Class: G09B 5/00 (20130101); A63F 9/24 (20130101)
Class at Publication: 463/9
International Class: A63F 9/24 (20060101)
Claims
1. A computer implemented learning and assessment apparatus
comprising: a non-transient memory; a database stored in the
memory, the database having at least one set of educational game
parameters; a processor operable to receive input signals from a
user of the apparatus; a human readable display; game logic
operable to generate a game-based learning experience by presenting
to the user a virtual game on the display in accordance with a
selected first one of the set of educational game parameters;
measurement logic operable to generate measurement data based on
the input signals received from a user by the processor, the
measurement data being representative of actions of the user during
interaction by the user with the virtual game; assessment logic
operable to generate assessment data based on the measurement data,
the assessment data being representative of gameplay results
wherein a failure of the user to produce predetermined expected
learning results based on the selected first one of the set of
educational game parameters is weighted in accordance with
predetermined game-based learning parameters relative to
experiential exercise of the virtual game by the user; and, result
logic operable to generate in accordance with the assessment data,
a result signal for selective rendering on the human readable
display.
2. The apparatus according to claim 1 wherein: the measurement
logic is operable to generate, as the measurement data, score data
representative of a score of the user during interaction by the
user with the virtual game as measured against a set of learning
objectives; time data representative of an amount of time consumed
by the user during interaction by the user with the virtual game;
attempts data representative of a number of re-tries
pursued by the user during interaction by the user with the virtual
game; interactions data representative of a quantity of
interactions by the user with the game logic, wherein the
interactions comprise key strokes, screen touches of the display,
pages viewed, and exercise of control by the user over the
apparatus during interaction by the user with the virtual game;
and, options data representative of an amount of utilization by the
user of a range of game options made available to the user by the
game logic.
3. The apparatus according to claim 2 wherein: the assessment logic
is operable to generate, as the assessment data, retention data
representative of average score accuracy during interaction by the
user with the virtual game and comprising a factor for any changes
in score over multiple attempts.
4. The apparatus according to claim 3 wherein: the assessment logic
is operable to generate, as the assessment data, engagement data
representative of an average of the retention data, a rate of
interaction by the user with the virtual game, and exploration by
the user with the virtual game.
5. The apparatus according to claim 2 wherein: the assessment logic
is operable to generate, as the assessment data, perseverance data
representative of an increased weighting to multiple attempts by
the user interacting with the virtual game, and comprising a factor
in accordance with precision by the user interacting with the
virtual game and an improvement of scores by the user interacting
with the virtual game.
6. The apparatus according to claim 3 wherein: the assessment logic
is operable to generate, as the assessment data, comprehension data
representative of an average of the retention data, improvements in
time by the user interacting with the virtual game, and score
accuracy over multiple attempts by the user interacting with the
virtual game.
7. The apparatus according to claim 2 wherein: the assessment logic
is operable to generate, as the assessment data, skill data
representative of an average score, an average time, and an average
interaction rate by the user interacting with the virtual game.
8. The apparatus according to claim 4 wherein: the assessment logic
is operable to generate, as the assessment data, interest data
representative of an average of the retention data, the engagement
data, and comprehension data representative of an average of the
retention data, improvements in time by the user interacting with
the virtual game, and score accuracy over multiple attempts by the
user interacting with the virtual game.
9. The apparatus according to claim 2 wherein the assessment logic
is operable to generate, as the assessment data: retention data
representative of average score accuracy during interaction by the
user with the virtual game and comprising a factor for any changes
in score over multiple attempts; engagement data representative of
an average of the retention data, a rate of interaction by the user
with the virtual game, and exploration by the user with the virtual
game; perseverance data representative of an increased weighting to
multiple attempts by the user interacting with the virtual game,
and comprising a factor in accordance with precision by the user
interacting with the virtual game and an improvement of scores by
the user interacting with the virtual game; comprehension data
representative of an average of the retention data, improvements in
time by the user interacting with the virtual game, and score
accuracy over multiple attempts by the user interacting with the
virtual game; skill data representative of an average score, an
average time, and an average interaction rate by the user
interacting with the virtual game; and, interest data
representative of an average of the retention data, the engagement
data, and the comprehension data.
10. The apparatus according to claim 9 further comprising: result
logic configured to receive the retention data, the engagement
data, the perseverance data, the comprehension data, the skill
data, and the interest data, and being operable to generate a
result signal in accordance with the retention data, the engagement
data, the perseverance data, the comprehension data, the skill
data, and the interest data, wherein the result signal is
representative of a learning assessment of the user interacting
with the virtual game.
11. The apparatus according to claim 1 wherein: the result logic
is operable to generate the result signal as a line on a radar chart
for selective rendering on the human readable display.
12. A learning and assessment method in an apparatus comprising a
non-transient memory, a database stored in the memory, the database
having at least one set of educational game parameters, a processor
operable to receive input signals from a user of the apparatus, and
a human readable display, the method comprising: generating, by
game logic of the apparatus, a game-based learning experience by
presenting to the user a virtual game on the display in accordance
with a selected first one of the set of educational game
parameters; generating, by measurement logic of the apparatus,
measurement data based on the input signals received from a user by
the processor, the measurement data being representative of actions
of the user during interaction by the user with the virtual game;
generating, by assessment logic of the apparatus, assessment data
based on the measurement data, the assessment data being
representative of gameplay results wherein a failure of the user to
produce predetermined expected learning results based on the
selected first one of the set of educational game parameters is
weighted in accordance with predetermined game-based learning
parameters relative to experiential exercise of the virtual game by
the user; and, generating, by result logic of the apparatus in
accordance with the assessment data, a result signal for selective
rendering on the human readable display.
13. The learning and assessment method according to claim 12,
wherein the generating the measurement data by the measurement
logic comprises: generating score data representative of a score of
the user during interaction by the user with the virtual game as
measured against a set of learning objectives; generating time data
representative of an amount of time consumed by the user during
interaction by the user with the virtual game; generating
attempts data representative of a number of re-tries pursued by
the user during interaction by the user with the virtual game;
generating interactions data representative of a quantity of
interactions by the user with the game logic, wherein the
interactions comprise key strokes, screen touches of the display,
pages viewed, and exercise of control by the user over the
apparatus during interaction by the user with the virtual game;
and, generating options data representative of an amount of
utilization by the user of a range of game options made available to
the user by the game logic.
14. The learning and assessment method according to claim 13,
wherein the generating the assessment data by the assessment logic
comprises: generating, by retention logic of the assessment logic,
retention data representative of average score accuracy during
interaction by the user with the virtual game and comprising a
factor for any changes in score over multiple attempts; generating,
by engagement logic of the assessment logic, engagement data
representative of an average of the retention data, a rate of
interaction by the user with the virtual game, and exploration by
the user with the virtual game; generating, by perseverance logic
of the assessment logic, perseverance data representative of an
increased weighting to multiple attempts by the user interacting
with the virtual game, and comprising a factor in accordance with
precision by the user interacting with the virtual game and an
improvement of scores by the user interacting with the virtual
game; generating, by comprehension logic of the assessment logic,
comprehension data representative of an average of the retention
data, improvements in time by the user interacting with the virtual
game, and score accuracy over multiple attempts by the user
interacting with the virtual game; generating, by skill logic of
the assessment logic, skill data representative of an average
score, an average time, and an average interaction rate by the user
interacting with the virtual game; and, generating, by interest
logic of the assessment logic, interest data representative of an
average of the retention data, the engagement data, and the
comprehension data.
15. The learning and assessment method according to claim 14,
further comprising: receiving, by the result logic, the retention
data, the engagement data, the perseverance data, the comprehension
data, the skill data, and the interest data, and generating by the
result logic, a result signal in accordance with the retention
data, the engagement data, the perseverance data, the comprehension
data, the skill data, and the interest data, wherein the result
signal is representative of a learning assessment of the user
interacting with the virtual game.
16. The learning and assessment method according to claim 15,
further comprising: generating, by the result logic, the result
signal as a line on a radar chart and selectively rendering the
line on the radar chart on the human readable display of the
apparatus.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority to U.S. provisional
application Ser. No. 61/558,818, filed on Nov. 11, 2011, which is
incorporated herein in its entirety.
BACKGROUND
[0002] 1. Field
[0003] Embodiments herein relate to electronic books incorporating
multimedia and, more particularly, to computer-assisted learning
methods and systems using gaming techniques for comprehension
assessment.
[0004] 2. Description of Related Art
[0005] Modern students are less inclined toward the linear, textual
learning mode of traditional printed and electronic textbooks. High
school students are now accustomed to a new, non-linear style of
discovering information made possible by the Internet. Current
electronic textbook trends, however, follow the traditional linear
modus operandi with added media components. This does not address
the method of learning and engaging with content that the Internet
has made popular.
[0006] Total breaks from traditional linear modes of learning are
often unsuccessful, however. Some students do still learn best by
reading traditional textbooks and, although videos and animations
can be more engaging than text, they are often better used as
introductions to material than as references proper.
[0007] Educational games have been used as a method of learning but
have a history of failure. Most such systems lack the comprehensive
content of traditional textbooks. Also, some educational games used
as a method of learning are too complicated, not complex enough, or
generally not fun for the end user students. Further, cultural
attitudes toward games as lacking "seriousness" are an obstacle for
adoption within the educational community. Overall, therefore,
implementation and use of educational learning games has remained
substantially underfunded in most school systems and elsewhere, and
the return on investment for the school systems has been too low or
uncertain for any substantial adoption.
[0008] However, there remains no simple, standard way of assessing
learning comprehension. Textbooks have a mostly standardized and
accepted way to assess comprehension through end-of-chapter reviews
and instructor materials. Different learning methods often teach
distinct aspects of a subject and thus require separate assessment
methods. It is difficult therefore to determine what
learning/assessment method is objectively better for providing fair
and accurate assessment results.
SUMMARY
[0009] The following presents a simplified summary of the example
embodiments in order to provide a basic understanding of some
aspects of the example embodiments. This summary is not an
extensive overview of the example embodiments. It is intended to
neither identify key or critical elements of the invention nor
delineate the scope of the invention. Its sole purpose is to
present some concepts of the example embodiments in a simplified
form as a prelude to the more detailed description that is
presented later.
[0010] In an example embodiment, there is disclosed herein a
computer implemented learning and assessment apparatus, comprising
a non-transient memory, a database stored in the memory, the
database having at least one set of educational game parameters, a
processor operable to receive input signals from a user of the
apparatus, and a human readable display. Game logic of the
apparatus is operable to generate a game-based learning experience
by presenting to the user a virtual game on the display in
accordance with a selected first one of the set of educational game
parameters. Measurement logic of the apparatus is operable to
generate measurement data based on the input signals received from
a user by the processor, the measurement data being representative
of actions of the user during interaction by the user with the
virtual game. Assessment logic of the apparatus is operable to
generate assessment data based on the measurement data, the
assessment data being representative of gameplay results wherein a
failure of the user to produce predetermined expected learning
results based on the selected first one of the set of educational
game parameters is weighted in accordance with predetermined
game-based learning parameters relative to experiential exercise of
the virtual game by the user. Result logic of the apparatus is
operable to generate in accordance with the assessment data, a
result signal for selective rendering on the human readable
display.
[0011] In a further example embodiment, there is disclosed herein a
learning and assessment method in an apparatus comprising a
non-transient memory, a database stored in the memory, the database
having at least one set of educational game parameters, a processor
operable to receive input signals from a user of the apparatus, and
a human readable display. The method comprises generating, by game
logic of the apparatus, a game-based learning experience by
presenting to the user a virtual game on the display in accordance
with a selected first one of the set of educational game
parameters. The method further comprises generating, by measurement
logic of the apparatus, measurement data based on the input signals
received from a user by the processor, the measurement data being
representative of actions of the user during interaction by the
user with the virtual game. The method further comprises
generating, by assessment logic of the apparatus, assessment data
based on the measurement data, the assessment data being
representative of gameplay results wherein a failure of the user to
produce predetermined expected learning results based on the
selected first one of the set of educational game parameters is
weighted in accordance with predetermined game-based learning
parameters relative to experiential exercise of the virtual game by
the user. The method further comprises generating, by result logic
of the apparatus in accordance with the assessment data, a result
signal for selective rendering on the human readable display.
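The assessment quantities defined in the claims (retention, engagement, perseverance, comprehension, skill, and interest) can be sketched in code. The following is a minimal JavaScript model, assuming all measurement inputs are pre-normalized to the range 0-1 (1 being best) and that the "weighting" and "factor" language of the claims reduces to simple averaging; the actual weightings are left open by the disclosure, and all names here are illustrative.

```javascript
// Average of any number of 0-1 normalized values.
const avg = (...xs) => xs.reduce((a, b) => a + b, 0) / xs.length;

// m: measurement data -- scores is per-attempt accuracy; the remaining
// fields are assumed to be normalized 0-1 values, 1 being best.
function assess(m) {
  const retention = avg(...m.scores); // average score accuracy over attempts
  const engagement = avg(retention, m.interactionRate, m.exploration);
  // Heavier weight for multiple attempts, factored by precision and
  // score improvement (the attempt cap of 3 is an arbitrary assumption).
  const perseverance =
    Math.min(1, m.scores.length / 3) * avg(m.precision, m.scoreImprovement);
  const comprehension = avg(retention, m.timeImprovement, avg(...m.scores));
  const skill = avg(avg(...m.scores), m.timeScore, m.interactionRate);
  const interest = avg(retention, engagement, comprehension);
  return { retention, engagement, perseverance, comprehension, skill, interest };
}
```

A result signal such as the radar chart of the claimed embodiments would then plot these six values as the chart's axes.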
[0012] An Interactive Mobile Learning Platform (IMLP) system in
accordance with an example embodiment comprises methods and
apparatus providing an integrated source for textual, graphical,
auditory, and interactive information. It provides a familiar
interface for modern learning techniques based on the production
qualities and traditions of print media. In general, the IMLP
system consists of electronic pages of multimedia content that
present information in a structured, linear manner, but that also
provide access to a 3D scriptable rendering engine, which enables
greater levels of experiential modes of learning.
[0013] The learning system of the example embodiment provides a
computer interface with similarities to printed books. This allows
for intuitive use for most users during this period of transition
between printed and electronic books. Most education currently in
practice utilizes textbooks, though practical methods of digitizing
them have been available since the 1990s. Most of these methods
have been met with critiques concerning formatting, comfort, and
quality. High school students today are accustomed to discovering
and participating in content via the Internet, and with mobile
computing devices. The IMLP system in accordance with the example
embodiments described herein assists in bridging the gap between
current educational practices and modern information sharing and
discovery.
[0014] An interface of the learning system of the example
embodiment consists of a view of one of numerous pages formatted in
the likeness of a printed publication. The page has no set limits
in terms of width and height, but the preferred implementation
fixes the width to the horizontal size of the viewing device
screen, while the length varies depending on the amount of content
(but generally does not exceed the length of twice the vertical
size of the device screen, held in portrait orientation). The
preferred implementation of the page employs an HTML5 rendering
engine, allowing the embedding and streaming of any web-enabled
content. HTML pages can be dynamically generated or provided by a
local or remote database; the current invention utilizes pages
created by hand to more fully emulate the process and quality of
print production.
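The sizing rule just described can be captured in a few lines. This is a sketch under stated assumptions: width fixed to the device's horizontal size, height following content but capped at twice the vertical size in portrait orientation (treated here as a hard cap, although the text says only "generally"). The function name is illustrative.

```javascript
// Compute page dimensions per the layout rule above.
// Viewport sizes are in pixels; portrait orientation is assumed.
function pageSize(viewportWidth, viewportHeight, contentHeight) {
  return {
    width: viewportWidth,                                // fixed to device width
    height: Math.min(contentHeight, 2 * viewportHeight), // capped at 2x screen
  };
}
```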
[0015] In the learning system of the example embodiment, contextual
multimedia objects supplement user comprehension of the textual and
graphical information. Video, audio, text (such as RSS feeds),
interactive objects, and other media are presented in proximity to
textual information that is difficult to grasp through reading
alone. Video, audio, and text objects are selectively stored
locally on the device or streamed from the Internet, and are
displayed via HTML5 in the current invention. Interactive objects
are selectively presented in one of at least two modes including,
for example, in-line objects and separate full-screen views.
In-line objects are presented and interacted with directly on the
page next to other textual and graphical information. The preferred
implementation employs HTML5 and JavaScript to facilitate custom
in-line interactive objects. These objects can include, but are not
limited to, pan/zoom-able images confined behind a fixed frame on
the page, which can also be viewed full-screen; image slideshows
confined behind a fixed frame on the page, with an indication of
the number of images and the current image being viewed, which can
also be viewed full-screen; images with overlaid button elements
that provide access to called out text or images; one or more boxes
with multiple tabs that hide and show text or multimedia, providing
a variety of contexts for a single concept; and charts, graphs, and
tables with variable ways of displaying data.
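As one concrete illustration, the image-slideshow object above can be modeled as a small state machine that tracks the current image and the "current of total" indicator. This sketch omits the DOM and gesture handling; the factory name and API are assumptions, not part of the disclosure.

```javascript
// State model for an in-line slideshow confined behind a fixed frame.
function createSlideshow(imageUrls) {
  let index = 0; // zero-based position of the image being viewed
  return {
    next() { index = (index + 1) % imageUrls.length; },              // wrap forward
    prev() { index = (index - 1 + imageUrls.length) % imageUrls.length; },
    current() { return imageUrls[index]; },
    indicator() { return `${index + 1} of ${imageUrls.length}`; },   // e.g. "2 of 5"
  };
}
```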
[0016] More complex interactive objects are selectively displayed
as separate full-screen views activated via a button, a link, or an
interactive page element. The current invention displays the
full-screen view by means of a 3D scriptable rendering engine.
Scripted 3D content may take the form of a simple method for
viewing a 3D model or more complex instances of simulations and
games.
[0017] Interactive objects also facilitate user comprehension
assessment. In-line testing modules of the example embodiment
selectively provide traditional multiple choice, matching, or other
simple tests directly on the page. Additionally, assessments are
selectively accomplished by means of the 3D scriptable rendering
engine. Game scenarios custom-built for specific, experience-based
learning objectives can be embedded as necessary. In the preferred
example embodiment, the game mechanics are highly related to the
learning objective so that experiential knowledge gained
successfully transfers beyond the scope of the textbook and
classroom. Assessment data is stored and displayed locally, and can
also optionally be stored on a cloud-computing server or
transferred to various learning management and content management
systems.
[0018] Interactive objects are preferably displayed in the example
embodiment in the context of a "Learn, Interact, Test" learning
method. In the embodiment described, a page or other section of
content is selectively delineated into three (3) segments related
to a specific learning outcome including "Learn," "Interact," and
"Test." Learn comprises experiences in textual, graphical,
auditory, or other non-interactive informational media covering the
subject in detail. Interact comprises an in-line or full-screen
interactive object, game, or simulation based on criteria
previously defined. Test comprises a link initiating a separate
full-screen view of an assessment module, based on criteria
previously defined.
[0019] With some subjects, it may be possible to chain several of
these triads together to form a series of "levels" in a
comprehensive curriculum.
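The chaining of "Learn, Interact, Test" triads into levels can be represented with a simple data structure. The shape of the objects is an assumption (the disclosure fixes no data format), and the content identifiers below are hypothetical.

```javascript
// One triad per level: non-interactive material, an interactive object,
// and an assessment module, each tied to a specific learning outcome.
const level = (learn, interact, test) => ({ learn, interact, test });

// A two-level chain forming a small curriculum (identifiers are made up).
const curriculum = [
  level('cell-structure.html', 'organelle-drag-drop', 'cell-quiz'),
  level('mitosis.html', 'mitosis-simulation', 'mitosis-quiz'),
];

// Advance to the next level once the current triad is complete.
function nextLevel(levels, i) {
  return i + 1 < levels.length ? levels[i + 1] : null;
}
```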
[0020] The pages of the subject learning system of the example
embodiment are navigated via a user interface consisting of three
distinct parts, or "views," within which different tiers of
navigation are accessible comprising in the example embodiment a
main content view, a local navigation view, and a global navigation
view. The main content view displays the current page and provides
access to adjacent pages via swiping gestures (from left to right
or right to left) on a touch-sensitive surface or by use of graphic
button elements. In the example embodiment described herein this
view fills the bounds of the device screen except where overlapped
by any "toolbars" or other graphical navigation elements.
[0021] The local navigation view provides access to a range of
pages in the vicinity of the current page via thumbnails, whereby
activating a thumbnail results in the specified page appearing in
the main content view. The thumbnails are arranged horizontally in
a sequential manner, and can be scrolled through via swiping
gestures or graphic button elements. Other graphic interface
elements may provide additional navigation via buttons, sliders,
and/or touch-based input areas; the current invention provides
buttons for accessing previous/next page in the user's history and
a slider for quickly scrolling to specific pages of the book. The
local navigation view slides up from the bottom of the screen in
the current invention, ideally obscuring as little of the page as
possible, and can be shown or hidden with swiping gestures.
[0022] The global navigation view provides access to a table of
contents and other global application features such as bookmarks,
notes, search, index, and options. Each global navigation view
feature is accessed via labeled tabs that, when activated, display
the selected feature in the view. The table of contents feature
displays a list, textual or graphical, of chapters, sections, or
other forms of content groupings. This list preferably provides an
interactive hierarchy of the contents of the book, whereby
top-level groupings (such as chapters) are shown, and the user may
"drill down" to child nodes (such as chapter sections). When a
child node is activated, the first page of the specified section is
centered in the local navigation view and displayed in the main
content view. Nodes may display more than just titles of sections;
they may include links to multimedia objects and important content.
The global navigation view slides in from the left in the current
invention, and can be hidden or shown with swiping gestures or by
activating one of the labeled tabs that stays visible along the
left side of the screen when the view is hidden.
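The previous/next page-history buttons of the local navigation view imply browser-style history behavior. A minimal sketch of that behavior (with assumed names, since the disclosure describes only the buttons) is:

```javascript
// Back/forward page history: visiting a new page clears the forward
// stack, as in a web browser.
function createHistory(startPage) {
  const back = [];
  const forward = [];
  let current = startPage;
  return {
    visit(page) { back.push(current); current = page; forward.length = 0; },
    previous() {
      if (back.length) { forward.push(current); current = back.pop(); }
      return current;
    },
    next() {
      if (forward.length) { back.push(current); current = forward.pop(); }
      return current;
    },
    page() { return current; },
  };
}
```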
[0023] Along with methods for navigating and displaying content,
the subject interactive mobile learning platform system of the
example embodiment provides methods for marking and notating
textual content. Selecting text by means of device-specific
standards (such as touch-and-hold on iOS devices) prompts the user
with a number of options, including "highlight" and "note." In the
preferred implementation, choosing the highlight option injects
HTML tags around the selected text with JavaScript, which are
styled to appear highlighted with CSS. The highlighted text, along
with the page number, position, and other data, is stored in the
user settings database. Choosing the note option opens the notepad
view, which slides down from the top and contains an editable text
body. This text is also stored and retrieved from the user settings
database.
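The highlight mechanic described above — inject tags around the selected text and persist the selection with its page and position — can be sketched over raw character offsets. A real implementation would use DOM Range objects and device-specific selection APIs rather than string slicing; the function name and record fields here are assumptions.

```javascript
// Wrap a selected character range of the page markup in a span that CSS
// styles to appear highlighted, and build the record to be stored in the
// user settings database.
function highlight(pageHtml, start, end, pageNumber) {
  const selected = pageHtml.slice(start, end);
  const wrapped =
    pageHtml.slice(0, start) +
    '<span class="highlight">' + selected + '</span>' +
    pageHtml.slice(end);
  const record = { pageNumber, start, end, text: selected };
  return { wrapped, record };
}
```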
[0024] The subject interactive mobile learning platform system of
the example embodiment is a customizable codebase that can be
extended to fit the particular needs of a client. One such
embodiment concerns collaborative learning. At any point in the
current invention, social networking APIs can be utilized to enable
social sharing of mediated content. APIs can also be integrated
from content management or learning management systems to enable
data collection and secure transfer of information, such as test
scores and usage.
BRIEF DESCRIPTION OF THE DRAWINGS
[0025] The accompanying drawings incorporated herein and forming a
part of the specification illustrate the example embodiments. In
the drawings:
[0026] FIG. 1 is a block diagram illustrating an example of a
device for implementing an example embodiment.
[0027] FIG. 2 is a block diagram illustrating a computer system
upon which an example embodiment may be implemented.
[0028] FIG. 3 illustrates a schematic of an application framework
and execution flow in accordance with an embodiment of the present
invention.
[0029] FIG. 4 is a flowchart illustrating operations performed by
the main loop controlling the user interface of FIG. 3.
[0030] FIGS. 5a-5c illustrate example user interface screens by
which main views are displayed in accordance with the example
embodiment.
[0031] FIG. 6 illustrates a user interface screen including a
hierarchy in accordance with the example embodiment.
[0032] FIG. 7 illustrates a flowchart of a method of selecting and
interacting with electronic book content of the subject learning
and assessment system in accordance with the example
embodiment.
[0033] FIG. 8 illustrates a flowchart of a method of interactive
learning and assessment in accordance with the example
embodiment.
[0034] FIG. 9 is a block diagram illustrating logic components of
an interactive learning and assessment system in accordance with
the example embodiment.
[0035] FIG. 10 is a functional block diagram illustrating
measurement logic operable to provide learning and assessment
measurements in accordance with the example embodiment.
[0036] FIG. 11 is a functional block diagram illustrating data
derivation logic operable to provide derived learning and
assessment measurements in accordance with the example
embodiment.
[0037] FIGS. 12a-12f are block diagrams illustrating logic modules
of the assessment logic of the example embodiment.
[0038] FIG. 13 is a block diagram illustrating the result logic of
the example embodiment.
[0039] FIG. 14 is a block diagram illustrating the presentation
logic of the example embodiment.
[0040] FIG. 15 is a radar chart displaying an assessment result of
a first example user of the system of the example embodiment.
[0041] FIG. 16 is a radar chart displaying an assessment result of
a second example user of the system of the example embodiment.
[0042] FIG. 17 is a radar chart displaying multiple assessment
results of the first example user of the system of the example
embodiment.
DETAILED DESCRIPTION OF THE EXAMPLE EMBODIMENTS
[0043] The following presents a simplified overview of the example
embodiments in order to provide a basic understanding of some
aspects of the example embodiments. This overview is not an
extensive overview of the example embodiments. It is intended to
neither identify key or critical elements of the example
embodiments nor delineate the scope of the appended claims. Its
sole purpose is to present some concepts of the example embodiments
in a simplified form as a prelude to the more detailed description
that is presented later.
[0044] This description provides examples not intended to limit the
scope of the appended claims. The figures generally indicate the
features of the examples, where it is understood and appreciated
that like reference numerals are used to refer to like elements.
Reference in the specification to "one embodiment" or "an
embodiment" or "an example embodiment" means that a particular
feature, structure, or characteristic described is included in at
least one embodiment described herein and does not imply that the
feature, structure, or characteristic is present in all embodiments
described herein.
[0045] Referring now to FIG. 1, there is illustrated an example of
a device 100 for implementing an interactive mobile learning and
assessment system 10 of an example embodiment. Device 100 comprises
a transceiver 102 suitable for sending and/or receiving data on a
link 104. Link 104 may be a wired or wireless link, and transceiver
102 may be a wired or wireless transceiver. Logic 106 is coupled to
transceiver 102 and configured to send and/or receive data from
link 104 via transceiver 102. "Logic", as used herein, includes but
is not limited to hardware, firmware, software and/or combinations
of each to perform a function(s) or an action(s), and/or to cause a
function or action from another component. For example, based on a
desired application or need, logic may include a software
controlled microprocessor, discrete logic such as an application
specific integrated circuit (ASIC), a programmable/programmed logic
device, memory device containing instructions, or the like, or
combinational logic embodied in hardware. Logic may also be fully
embodied as software.
[0046] In an example embodiment, logic 106 is configured to receive
data corresponding to educational, entertainment, gaming or any
other material derived or sourced from any associated external
system in operative communication with a predefined group via link
104. The data may be received in-band (via transceiver 102) or
out-of-band (for example, manually entered data, data `burned in` at
the factory, or data received by some means other than transceiver
102).
[0047] In an example embodiment, logic 106 is responsive to a
signal received from a device (not shown) via the transceiver 102
for communicating the data relating to the educational,
entertainment, gaming or any other material.
[0048] In the example embodiment, the logic 106 is operable to
provide interactive learning and assessment to end users.
[0049] In a further example embodiment, the logic 106 is operable
to engage users in interactive learning using game-based learning
experiences and assessments.
[0050] FIG. 2 illustrates a computer system 200 upon which an
example embodiment may be implemented. Computer system 200 includes
a bus 202 or other communication mechanism for communicating
information and a processor 204 coupled with bus 202 for processing
information. Computer system 200 also includes a main memory 206,
such as random access memory (RAM) or other dynamic storage device
coupled to bus 202 for storing information and instructions to be
executed by processor 204. Main memory 206 also may be used for
storing a temporary variable or other intermediate information
during execution of instructions to be executed by processor 204.
Computer system 200 further includes a read only memory (ROM) 208
or other static storage device coupled to bus 202 for storing
static information and instructions for processor 204. A storage
device 210, such as a magnetic disk or optical disk, is provided
and coupled to bus 202 for storing information and
instructions.
[0051] Computer system 200 may be coupled via bus 202 to a display
212 such as a cathode ray tube (CRT) or liquid crystal display
(LCD), for displaying information to a computer user. An input
device 214, such as a keyboard including alphanumeric and other
keys, is coupled to bus 202 for communicating information and
command selections to processor 204. Another type of user input
device is cursor control 216, such as a mouse, a trackball, or
cursor direction keys for communicating direction information and
command selections to processor 204 and for controlling cursor
movement on display 212. This input device typically has two
degrees of freedom in two axes, a first axis (e.g. x) and a second
axis (e.g. y) that allows the device to specify positions in a
plane. Input device 214 may be employed for manually entering
keying data.
[0052] An aspect of the example embodiment is related to the use of
computer system 200 for interactive learning and learning
assessment. According to an example embodiment, data corresponding
to learning and assessment is provided by computer system 200 in
response to processor 204 executing one or more sequences of one or
more instructions contained in main memory 206. Such instructions
may be read into main memory 206 from another computer-readable
medium, such as storage device 210. Execution of the sequence of
instructions contained in main memory 206 causes processor 204 to
perform the process steps described herein. One or more processors
in a multi-processing arrangement may also be employed to execute
the sequences of instructions contained in main memory 206. In
alternative embodiments, hard-wired circuitry may be used in place
of or in combination with software instructions to implement an
example embodiment. Thus, embodiments described herein are not
limited to any specific combination of hardware circuitry and
software.
[0053] The term "computer-readable medium" as used herein refers to
any medium that participates in providing instructions to processor
204 for execution. Such a medium may take many forms, including but
not limited to non-volatile media and volatile media. Non-volatile
media include for example optical or magnetic disks, such as
storage device 210. Volatile media include dynamic memory such as
main memory 206. Common forms of computer-readable media include,
for example, a floppy disk, a flexible disk, a hard disk, magnetic
cards, paper tape, any other physical medium with patterns of
holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, a CD, a DVD, or any
other memory chip or cartridge, or any other medium from which a
computer can read.
[0054] Computer system 200 also includes a communication interface
218 coupled to bus 202. Communication interface 218 provides a
two-way data communication coupling computer system 200 to a
communication link 220 that is employed for communicating with
other devices belonging to a predefined group. Computer system 200
can send messages and receive data, including program codes,
through a network via communication link 220, and communication
interface 218.
[0055] FIG. 3 illustrates a schematic of the application framework
and execution flow in accordance with an embodiment of the present
invention. The native application container 301 contains a
scriptable 3D game engine 302, a user interface 303, and a local
storage database 304. In the present invention, the 3D game engine
302 handles initialization with a startup scene 306, which passes a
request to the native binding helper class 305 to switch to the
user interface 303 after loading. The 3D game engine 302 runs any
3D games and any other 3D models or scenes that may be required or
used to supplement the learning components. One preferred form of a
3D game engine 302 suitable for use in the subject system is
Unity.TM. available from Unity Technologies
(http://www.unity3d.com). In addition, the preferred 3D game engine
302 renders 3D content and is also a scriptable 3D game engine
wherein customized models in accordance with the example embodiment
may be selectively embedded and exercised by the 3D game engine to
provide the desired teaching presentations and learning assessments
in accordance with the embodiment.
[0056] The global navigation controller 307 loads options 308,
bookmarks 309, notes 310, and other possible data from user
settings 311, and also loads a content list 312 from the page
database 313, which includes page entries 314 with data such as
page number 315, page chapter 316, and page section 317. The
content list 312 includes an outline with links to sections that,
when activated, send a message to the local navigation
controller's 318 section navigation 319 to fetch and display the
appropriate page thumbnails from the page database 313. The section
navigation 319 in turn messages the main content controller's 320
page view 321 to display the appropriate page 322. The page 322
pulls in page data 323, such as textual content in the form of HTML
in the present invention, from the page database 313. The page view
321 renders the page 322 and pulls in any necessary media content
324 from the local storage database 304 and/or from the Internet
325 or other computer network. The page data 323 may also contain
links specifically formatted so as to be interpreted by the page
view 321 as a request to the native binding helper class 305 to
switch to the 3D game engine 302 and display one of a plurality of
3D scripted scenes 326. The 3D scripted scene 326 must contain a
link or automated request to the native binding helper class 305 to
switch back to the user interface 303 displaying the last page 322
viewed. Arrows connecting the 3D scene module and the page module
322 illustrate that links in the page module 322 of the user
interface module 303 may be activated to transition the application
to the one or more 3D scenes for learning in accordance with the
embodiment.
[0057] FIG. 4 is a flowchart illustrating a method 400 comprising
operations performed by the main loop controlling the user
interface 303 of FIG. 3 in accordance with an example embodiment.
Following an initialization step 410, a check is performed at step
412 on the user settings 311 for the last page number 315 viewed.
If no record exists, the first page is retrieved 414 and displayed
as a page 521 in a page view 502 such as shown, for example in FIG.
5a. In addition to retrieving the first page at 414, adjacent pages
are retrieved as well wherein the adjacent pages 542b, 542c for
example are displayed adjacent to the first page 542a in the
thumbnail pages view 542 as shown for example in FIG. 5b. If a
record does exist, the indicated last page viewed is retrieved at
416 from the page database 313 for example, and displayed as a page
521 in a page view 502 such as shown, for example in FIG. 5a. In
addition to retrieving at 416 the last page viewed, adjacent pages
are retrieved as well wherein the adjacent pages 542b, 542c for
example are displayed adjacent to the first page 542a in the
thumbnail pages view 542 as shown for example in FIG. 5b. This
action causes at step 420 a message to be sent to the local
navigation controller 318 to display adjacent pages such as shown
for example at 542 in FIG. 5b.
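The last-page lookup of steps 412-420 can be sketched as follows. The dictionary-based user settings and the list of page numbers are illustrative assumptions only; the specification stores this state in the user settings 311 and page database 313.

```python
# Illustrative sketch of the startup page lookup of FIG. 4 (steps 412-420).
# The dict and list structures are assumptions, not recited structures.

def resolve_startup_page(user_settings, page_numbers):
    """Return the page number to display plus its neighbors for thumbnails."""
    last = user_settings.get("last_page_number")   # check record, step 412
    if last in page_numbers:
        current = last                             # last page viewed, step 416
    else:
        current = page_numbers[0]                  # no record: first page, step 414
    idx = page_numbers.index(current)
    # Adjacent pages feed the thumbnail pages view 542 of FIG. 5b (step 420).
    neighbors = page_numbers[max(0, idx - 1):idx + 2]
    return current, neighbors
```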
[0058] Main content logic 320 in the system 300 is executed at 422 to
provide a main content view 521 of the display page such as shown
in FIG. 5a for example.
[0059] A global navigation action 432 by the global UI logic 307 of
FIG. 3, a local navigation action 433 by the local UI 318 logic of
FIG. 3, or main content navigation action 434 by the main content
logic 320 of FIG. 3 sends one or more appropriate messages to
retrieve the appropriate page data from the local storage 304.
Selection of the global navigation view such as at 432 causes the
global UI logic 307 to display a global view 546 as shown in the
example at FIG. 5c. The content list 312 enables page navigation
within the global view 546. Selection of the local navigation view
such as at 433 causes the local UI logic 318 to display a set of
local navigation views of adjacent pages 540-543 as shown in the
example at FIG. 5b. Selection of the main content navigation view
such as at 434 causes the main content logic 320 to display a set
of local navigation views of adjacent pages 540-543 as shown in the
example at FIG. 5b. When the user swipes left or right with a hand
or finger, the page area 542 scrolls accordingly to present page
selection options to the user for viewing.
[0060] When specially formatted links 522 such as shown for example
in FIG. 5a occur in a page 421, they may be selectively activated
at 435 to display full-screen media 436. In the full-screen media
view, the 3D scene logic 326 and the page view logic 321 are
operable to present full-screen images and the like to the user by
drawing suitably selected data from the media storage 324. A link
is provided enabling the user to selectively exit at 438 the
full-screen view wherein the logic returns the user to the main
content view 422.
[0061] FIGS. 5a-5c illustrate example user interface screens 502,
504, and 506 by which the three main views 538, 540, and 544 are
displayed in accordance with the example embodiment. Overall, the
three main views comprise a main content view 538, a local
navigation view 540, and a global navigation view 544. Upon
initialization of the application such as at step 410 shown in FIG.
4, the system is operable to present the user with the main content
view 538 which, in the example embodiment, is comprised of the page
view 521 and a set of virtual tabs 539 that, when initiated or
activated by the user, are operable to open sections of the global
navigation view 544 as shown for example in FIG. 5c. The local
navigation view 540 slides up from the bottom of the screen 541 in
an embodiment of the current invention and consists of a horizontal
slider of page thumbnails 542 and a slider bar 543 that displays
and navigates the user's absolute position in the book. The global
navigation view 544 as shown in FIG. 5c slides in from the left of
the screen 545 when a tab 539 is activated or dragged in an
embodiment of the current invention. The default view of the global
navigation view consists of an outline of the book content 546 in
the form of chapter titles that, when activated, expand to show a
list of subsections 547.
[0062] FIG. 6 illustrates a user interface screen 600 including a
hierarchy in one embodiment of the subject learning and assessment
system by which the electronic book is navigated. The global
navigation view 644 (shown as 544 in FIG. 5c) affects the rendering
of the local navigation view 640 (shown as 540 in FIG. 5b), which
can cause a specific page 622 to be rendered in the main content
view 638 (shown as 538 in FIG. 5a). Also illustrated is the notion
that all navigation and content views can be displayed
simultaneously. The subject system is operable to update each of
the local navigation view 640 and the specific page 622 to be
rendered in the main content view 638 based on a selection by the
system user of one of the items presented in the global navigation
view 644. The subject system is operable to update each of the
global navigation view 644 and the main content view 638 based on a
selection by the system user of one of the items presented in the
local navigation view 640. Yet still further, the subject system is
operable to update each of the global navigation view 644 and the
local navigation view 640 based on a selection by the system user
of one of the items presented in the main content view 638.
[0063] FIG. 7 illustrates a flowchart of a method 700 whereby a
user of one embodiment of the present invention selects and
interacts with electronic book content of the learning and
assessment system. Initially, a startup program is initiated 702
whereby the user is provided with a view at 704 of one or more
pages in accordance with the method described above in connection
with FIG. 4 and, further, the user is provided with one or more
choices at 706 including, in the example embodiment, enabling the
user either to choose which content to display 712 through various
navigation views or to begin directly reading and displaying
content at 714. On any given page, the user may choose at 716 to
interact with certain multimedia objects from the page database 313
and presented on the screen, including manipulating interactive
in-line page objects 720 and activating links to full-screen
previews of static media 722. The system further provides the user
with the selectable option to view three dimensional data such as
the media data 324 by activating at 724 the 3D game engine module
302, which may consist of a 3D object to manipulate 730. A selection
by the user to play a learning game activates the game module at
750, followed in the example embodiment by a learning assessment
performed by the assessment module 756 and further followed by
generation of assessment results by the assessment result module
760.
[0064] In addition, in accordance with the example embodiment, the
user may make a selection at 770 to replay the learning game. In
the flowchart illustrated, an election to replay the game returns
control to the game module step 750.
[0065] After activating and completing a game assessment module,
any score or other measurement of comprehension is stored and
displayed as necessary or desired, and possibly submitted to a
learning management system or other internet or network
database.
[0066] FIG. 8 illustrates a flow diagram of an overall method 800
for interactive learning and assessment in accordance with the
example embodiment. Learning occurs in the subject system during
the presentation by the system to the user of information which may
be absorbed or otherwise understood by the user. The information
may be textual, illustrative, video, audio or of any other form as
necessary or desired. In any case, for purposes of explanation, in
the example shown, the course material is divided into separate
"units" wherein the user may learn and be assessed with respect to
the learning within these separate learning units. In the example,
the learning and assessment occurs in first 810, second 812 and nth
814 learning stages. The system of the example embodiment executes
logic within each of the learning stages comprising a sequence of
learn presentation 859, interact logic 860, test logic 861. The
learning and assessment system is operable to enable the user to
enter the sequence at the "Learn" logic module 859, which is
comprised of non-interactive information, including textual,
graphical, auditory, and video media. In accordance with the
example embodiment, the learn logic 859 is operable while the user
is actively viewing content such as, for example, in steps 702-714
of FIG. 7. Examples include reading the electronic book contents or
viewing pictures of the electronic book, for example. It is to be
appreciated that in accordance with the example embodiment, the
measurement logic 932 may selectively log these learning activities
for purposes of developing measurement data 1000 as will be
discussed below in greater detail.
[0067] When ready, the system is operable to enable the user to
move or otherwise enter into the "Interact" logic module 860 which,
in the example embodiment, illuminates specific and focused
elements of the learning material presented during execution of the
"Learn" logic module 859 and including custom-designed interactive
experiences, such as manipulable 2D or 3D objects, simulations,
and games. The system of the example embodiment is operable to
engage the user in the interact logic module and challenge the user
to apply the knowledge absorbed in the learning phase such as by
presenting layered information and by presentation of
problem/solution scenarios, questions and answers, or the like. In
accordance with the example embodiment, the interact logic 860 is
operable while the user is actively interacting with the content
such as, for example, while the user is manipulating interactive
objects, previewing full screen media objects or items, and while
the user is activating the 3D engine module such as for example in
steps 720-724 of FIG. 7. It is to be appreciated that in accordance
with the example embodiment, the measurement logic 932 selectively
logs these interaction activities for purposes of developing
measurement data 1000 as will be discussed below in greater
detail.
[0068] After completing the learning and interactive modules 859,
860 or otherwise exiting, the system is operable to enable the user
to move or otherwise enter the "Test" logic module 861. This module
can consist of traditional assessment methods such as
multiple-choice questions, or, preferably such as in the example
embodiment, it consists of an interactive assessment based on the
"Interact" logic module 860. The interactive assessment is
custom-designed to focus on specific learning objectives derived
from the "Learn" 859 and "Interact" 860 logic modules, and scores
the user based on those learning objectives and not on the user's
skill at manipulating the virtual space. The "Learn, Interact,
Test" triads 858, 858', 858'' are, in the example embodiment, an
ongoing cycle, moving on to different learning objectives after
each cycle is completed. This model can be extrapolated as
comprising the "Learn" module of an overarching cumulative triad
862; after completing a series of cycles, a cumulative interactive
module and cumulative assessment module can be selectively
presented to the user.
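The Learn, Interact, Test triads 858, 858', 858'' can be sketched as a simple driver loop. The three callables standing in for the logic modules 859-861, and their signatures, are assumptions made for illustration only.

```python
# Minimal sketch of the Learn-Interact-Test cycle of FIG. 8; the three
# callables are placeholders for the logic modules 859-861.

def run_triads(units, learn, interact, test):
    """Run each learning unit through the triad and collect test scores."""
    scores = []
    for unit in units:
        learn(unit)                 # module 859: non-interactive presentation
        interact(unit)              # module 860: interactive exercises
        scores.append(test(unit))   # module 861: interactive assessment
    return scores
```

A cumulative triad such as 862 could then wrap the collected per-unit scores in a further cumulative interactive and assessment pass.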
[0069] In accordance with the example embodiment, the test logic
861 is operable while the user is actively interacting with game
module 750 such as, for example, while the user is playing a
learning game within the game module in step 750 and further, while
the user operates the assessment logic and assessment result logic
of steps 756 and 760 of FIG. 7. Essentially, in accordance with the
example embodiment, the knowledge gained in the learn and interact
phases presented to the user by the system is applied by the system
against the user in the game module 750. It is to be appreciated
that in accordance with the example embodiment, the measurement
logic 932 selectively logs these learning game activities occurring
during execution of the game module 750 such as, for example,
user's scores relative to one or more sets of learning objectives,
for purposes of developing measurement data 1000 as will be
discussed below in greater detail.
[0070] As noted above, the subject interactive learning and
assessment system advantageously adopts a gaming assessment
philosophy, in part because game-based learning
experiences within the system of the example embodiment provide
users with an engaging way of learning information while providing
educators with an accurate assessment model for confirming the
learning experience. Games appeal to users of all ages and genders,
and games motivate users to do better, to navigate through levels
towards a targeted goal, and to achieve successes through
experiential learning. The system of the example embodiment enables
users to experiment through gameplay in a safe environment where
failure to produce an expected result does not negatively impact
the final "grade" or "score" within set parameters. This
experimentation inherent to game-based learning helps users develop
critical thinking skills, heightens engagement in learning
experiences, and ultimately increases comprehension of a subject
matter. Accordingly, in the example embodiment, the assessment
module 756 (FIG. 7) comprises logic modules operable to provide
assessment measures in selected learning areas including for
example knowledge retention, engagement, perseverance,
comprehension, skill, and interest. The game module pits the users
against a set of learning objectives and the assessment module
tests the user's learning. In accordance with the embodiment, the
game module is engaging, visually interesting and intellectually
compelling so that the user feels comfortable for enhanced
learning. The learning objectives are based on the particular
subject matter of the course work and are selectively drawn from
the local storage 304 as needed and in accordance with a selected
curriculum. In an embodiment, the learning objectives based on
selected subject matter of the course work may be selectively drawn
from external sources such as the Internet 325 alone or in
combination with the local storage 304 as needed and in accordance
with a selected curriculum.
[0071] Accordingly and with reference now to FIG. 9, the assessment
module 756 of the example embodiment includes assessment logic 910
operable to generate learning assessment metrics including metrics
for the set of learning areas identified above. To this end, the
assessment logic 910 includes knowledge retention logic 912
operable to generate a knowledge retention learning metric,
engagement logic 914 operable to generate an engagement learning
metric, perseverance logic 916 operable to generate a perseverance
learning metric, comprehension logic 918 operable to generate a
comprehension learning metric, skill logic 920 operable to generate
a skill learning metric, and interest logic 922 operable to
generate an interest learning metric. Although the example
embodiment provides learning assessment in learning areas of
knowledge retention, engagement, perseverance, comprehension,
skill, and interest, the embodiment is not so limited and the
system may be extended to other areas of learning assessment as
desired.
[0072] In addition to the above and with continued reference to
FIG. 9, the assessment module 756 of the example embodiment further
includes improvement logic 930 operable to generate an improvement
metric based on one or more selected learning assessment
areas listed above, measurement logic 932 operable to generate
selected learning parameter measurement data, and data derivation
logic 934 operable to derive selected composite data from the
selected learning parameter measurement data obtained from the
measurement logic 932.
[0073] As noted above, the knowledge retention logic 912 is
operable to generate a knowledge retention learning metric. In
accordance with the example embodiment, "Retention" is measured by
averaging score accuracy and factoring in any change in score over
multiple attempts. Retention generally reflects the average score,
impacted positively or negatively depending on an increase or
decrease in scores.
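One plausible reading of this Retention calculation is sketched below. The specification does not fix exact weights, so the 0.5 trend factor and the clamping to the range [0, 1] are assumptions.

```python
# Hedged sketch of the Retention metric of [0073]: average score accuracy,
# adjusted up or down by the change in score over multiple attempts.
# The 0.5 weighting and [0, 1] clamp are assumptions.

def retention(scores, goal_score):
    """Average score accuracy, adjusted by the score trend across attempts."""
    accuracies = [s / goal_score for s in scores]
    base = sum(accuracies) / len(accuracies)   # average score accuracy
    # A rising score trend lifts Retention; a falling trend lowers it.
    trend = accuracies[-1] - accuracies[0] if len(accuracies) > 1 else 0.0
    return max(0.0, min(1.0, base + 0.5 * trend))
```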
[0074] In accordance with the example embodiment, the engagement
logic 914 is operable to generate an engagement learning metric
wherein "Engagement" is measured by averaging Retention, the rate
of interaction, and exploration. Retention is used in calculating
Engagement because it provides a value of improvement in score over
all attempts, and if there is little improvement it is unlikely
that the learner is engaged. "Interactions" are defined in
accordance with the example embodiment as any input events received
by the computer from the player, and each game has a unique, ideal
interaction rate based on user testing. Exploration is a percentage
of the number of choices made vs. an ideal number of choices based
on data from user testing; high exploration indicates a broader
range of learning opportunities aside from the singular game
objective.
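The Engagement average can be sketched as below. Normalizing the interaction rate and exploration against their ideal values from user testing, and capping each at 1.0, are assumptions not spelled out in the specification.

```python
# Sketch of the Engagement metric of [0074]: the average of Retention, the
# interaction rate, and exploration. Normalization against ideal values is
# an assumption.

def engagement(retention_value, interactions, ideal_interactions,
               choices_made, ideal_choices):
    """Average of Retention, interaction rate, and exploration."""
    interaction_rate = min(interactions / ideal_interactions, 1.0)
    exploration = min(choices_made / ideal_choices, 1.0)  # choices vs. ideal
    return (retention_value + interaction_rate + exploration) / 3.0
```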
[0075] Further in accordance with the example embodiment, the
perseverance logic 916 is operable to generate a perseverance learning
metric, wherein "Perseverance" is measured by giving increased
weight to multiple attempts, and factoring in the precision and
improvement of scores. The number of attempts is significant in
developing the perseverance learning metric wherein a high
Perseverance measure may indicate that the learner is trying but
not comprehending well, whereas a low Perseverance measure may
indicate that the learner is just not trying.
[0076] The comprehension logic 918 in accordance with the example
embodiment is operable to generate a comprehension learning metric.
In the embodiment, "Comprehension" is measured by averaging
Retention, improvement in time, and score accuracy over the number
of attempts. Comprehension decreases with each successive attempt;
otherwise, it will generally reflect Retention. If multiple
attempts show improvement in time, in accordance with the example
embodiment, Comprehension is maintained, but if there is a decline
in time, Comprehension may drop drastically.
[0077] Lastly with regard to the assessment logic 910 of FIG. 9,
skill logic 920 is operable to generate a skill learning metric,
and interest logic 922 is operable to generate an interest learning
metric. In the embodiment, "Skill" is measured by averaging score,
time, and interaction rate. Skill measures the player's ability to
manipulate the game environment and does not necessarily reflect
how much the player has learned about the content. Further in the
embodiment, "Interest" is measured by averaging Retention,
Comprehension, and Engagement. Interest is weighted in favor of
Engagement, accounting for players who may be interested but are
having difficulty retaining or comprehending.
[0078] The improvement logic 930 is operable to generate an
improvement metric based on one or more selected learning
assessment areas listed above. In this regard, in accordance with
the example embodiment, the improvement logic 930 specifies a
percentage of improvement in relation to the input values and the
expected goal value over multiple attempts.
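Read as a gap-closing percentage, the improvement metric might look like the following. This particular formula is an assumption, since the specification states only that improvement is expressed as a percentage in relation to the input values and the expected goal value over multiple attempts.

```python
# Hedged sketch of the improvement metric of [0079]: how much of the gap
# between the first attempt and the goal the latest attempt has closed.

def improvement(values, goal):
    """Percentage of the first-attempt-to-goal gap closed by the last attempt."""
    if len(values) < 2 or values[0] == goal:
        return 0.0   # no repeat attempts (or nothing to improve) yet
    return 100.0 * (values[-1] - values[0]) / (goal - values[0])
```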
[0079] As noted above, in the example embodiment, the measurement
logic 932 is operable to generate selected learning parameter
measurement data. With reference now to FIG. 10, the measurement
logic 932 is in operative communication with the 3D game engine 302
(FIG. 3), the user interface 303, and the local storage 304 to
generate the learning parameter measurement data 1000 in accordance
with the user's interaction with the interactive learning system of
the example embodiment. Preferably, the learning parameter
measurement data 1000 includes Score data 1002, Time data 1004,
Attempts data 1006, Number of Interactions data 1008, and Options
Chosen data 1010. The Score data 1002 is derived in the example
embodiment as the user collects points or other units of score
measurement from the game module 750 and possibly other areas of
the 3D game engine 302 and user interface 303, which is then
compared to a total score or goal score data derived from local
storage 304. Similarly, the Time data 1004 is derived in the
example embodiment as the user is timed from the initialization to
the completion of the game module 750 and possibly other areas of
the 3D game engine 302 and user interface 303, which is also
compared to a goal time data derived from local storage 304. The
Attempts data 1006 is derived in the example embodiment as the user
replays the singular game module 750. The Number of Interactions
data 1008 is derived in the example embodiment as the user clicks
with a computer mouse, touches a touch-sensitive surface, enters
input from a keyboard, or other methods of user interaction with
the computer system 200 input device 214 or cursor control 216. The
Options Chosen data 1010 is derived in the example embodiment as
the user makes decisions based on choices presented in the game
module 750, wherein the number of choices made is compared with a
goal number of choices data derived from local storage 304.
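The learning parameter measurement data 1000 maps naturally onto a simple record type. The field names below mirror the data items 1002-1010, but the record itself and its method are illustrative assumptions, not structures recited in the specification.

```python
# Illustrative record for the measurement data 1000 of FIG. 10.
from dataclasses import dataclass, field

@dataclass
class MeasurementData:
    score: float = 0.0                 # 1002: points collected vs. a goal score
    time_seconds: float = 0.0          # 1004: initialization to completion
    attempts: int = 0                  # 1006: replays of the game module 750
    interactions: int = 0              # 1008: clicks, touches, key presses
    options_chosen: list = field(default_factory=list)  # 1010: decisions made

    def record_interaction(self):
        """Count one input event (mouse click, touch, or key press)."""
        self.interactions += 1
```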
[0080] In accordance with the example embodiment, the data
derivation logic 934 is operable to derive selected composite data
from the selected learning parameter measurement data 1000 obtained
from the measurement logic 932. With reference next to FIG. 11, the
data derivation logic 934 is operable to receive selected data from
among the learning parameter measurement data 1000 (FIG. 10) and
generate Accuracy data 1102 based on the score data 1002 and a
predefined goal score, Precision data 1104 based on closeness of
scores data derived from attempt data 1006 and Exploration data
1108 based on the options chosen data 1010 and predefined goal
options chosen data. More particularly, the Accuracy data 1102 is
derived based on score data divided by goal score data. The
Precision data 1104 is derived based on the standard deviation of
the score data from the goal score data. The Exploration data 1108
is derived based on the average of the options chosen data 1010
divided by the goal options chosen data.
[0081] With reference next to FIG. 10, the measurement logic 932 is
operative to generate measurement data 1000. In the example
embodiment, the measurement data comprises score data 1002, time
data 1004, attempt data 1006, interactions data 1008, and options
data 1010. More particularly, measurement logic 932 is operative to
receive raw data from the 3D game engine 302, the user interface
module 303, and the local storage 304 and to convert the raw data
into the measurement data 1000. In the example embodiment, the
score data 1002 is representative of the score of a user as
measured against a set of learning objectives, and the time data
1004 is a measure of the amount of time the user consumed during
the learning 859 and interacting 860 phases of the subject system. The
attempt data 1006 is representative of the number of re-tries 770
pursued by the user during the learning. The attempts are also
represented by learn, interact and test loops within any of the
units 858, 858' as shown in FIG. 8. The interactions data 1008 is
representative of a quantity of interactions by the user with the
subject system such as, for example, key strokes, screen touches,
pages viewed, or any other form of control by the user over the
system during a course of learning using the system. The options
data 1010 is representative of the amount of utilization by the
user of the range of game options available to the user by the game
module 750. Users who completely exercise the game options develop
a high utilization score and users who have only simple interaction
develop a low utilization score.
[0082] With reference next to FIG. 11, the data derivation logic
934 is operative to generate a set of derived data 1100 including,
in the example embodiment, accuracy data 1102, precision data 1104,
and exploration data 1108. As shown, the data derivation logic 934
is configured to receive selected items of the measurement data
1000 generated by the measurement logic 932 in a manner described
above.
[0083] For ease of understanding, the functional operation of the
data derivation logic 934 will be described below through use of
example pseudocode as follows:
TABLE-US-00001
Set times to array of time measured per attempt
Set goal_time to time expected for success
Set scores to array of scores recorded per attempt
Set goal_score to total possible score OR expected score value
Set attempt to current attempt value (between 0 and attempts)
Set attempts to number of attempts/replays of the game (length of scores)
Set interactions to array of interactions measured per attempt
Set goal_interactions to expected interactions value
Set rank to number based on score vs global or local score database
Set options to array of the number of options chosen or choices made per attempt
Set goal_options to total possible options OR expected number of options chosen
[0084] In the example embodiment, the data derivation logic 934 is
operative to generate the accuracy data 1102 of the set of derived
data 1100 as follows:
TABLE-US-00002
Function Accuracy(score, goal_score)
    If score is greater than or equal to goal_score
        Return 1
    Else
        Return score divided by goal_score
    End If
End Function

Function Average(array)
    Return Sum_of_all(array) divided by length of array
End Function

Function Sum_of_all(array)
    Set sum to 0
    For each item in array
        Add item to sum
    End For
    Return sum
End Function

Function WeightedAverage(array)
    Set total to 0
    Set weight to 1
    For each item in array
        Add product of item & weight to total
        Increment weight by 1
    End For
    Set weight_total to weight * (weight - 1) / 2
    Return total divided by weight_total
End Function

Function AverageAccuracy(scores, goal_score)
    Set accuracies to array of Accuracy(score, goal_score) for each score in scores
    Return Average(accuracies)
End Function

Function InteractionRatio(interactions, goal_interactions, times, goal_time)
    Set ratio to Sum_of_all(interactions) / Sum_of_all(times) * goal_time / goal_interactions
    Return Constrain(ratio)
End Function

Function ImprovementHigher(array)
    Set total_improve to 0
    If array length is greater than 1
        For each item in array (except last item)
            Set partial_imp to next item minus item, divided by item
            Add partial_imp to total_improve
        End For
        Return total_improve divided by (array length minus 1)
    Else
        Return 0
    End If
End Function

Function ImprovementLower(array)
    Set total_improve to 0
    If length of array is greater than 1
        For each item in array (except last item)
            Set partial_imp to item minus next item, divided by item
            Add partial_imp to total_improve
        End For
        Return total_improve divided by (array length minus 1)
    Else
        Return 0
    End If
End Function

Function Variance(scores)
    Set total_var to 0
    For each score in scores
        Add score minus Average(scores), squared, to total_var
    End For
    Return total_var divided by length of scores
End Function
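Rendered in Python, the helper functions above might look like the following sketch. This is an illustrative translation, not code from the specification: names are lower-cased, and the weighted-average divisor is written explicitly as the sum of the weights 1..n, which reproduces the worked results given later in the example:

```python
def constrain(number):
    """Clamp a value to the range [0, 1] (Function Constrain)."""
    return max(0.0, min(1.0, number))

def accuracy(score, goal_score):
    """Function Accuracy: fraction of the goal score achieved, capped at 1."""
    return 1.0 if score >= goal_score else score / goal_score

def average(array):
    """Function Average over Function Sum_of_all."""
    return sum(array) / len(array)

def weighted_average(array):
    """Function WeightedAverage: later attempts count more (weights 1..n)."""
    n = len(array)
    total = sum(item * w for w, item in enumerate(array, start=1))
    return total / (n * (n + 1) / 2)  # divisor = sum of the weights 1..n

def average_accuracy(scores, goal_score):
    """Function AverageAccuracy: mean per-attempt accuracy."""
    return average([accuracy(s, goal_score) for s in scores])

def interaction_ratio(interactions, goal_interactions, times, goal_time):
    """Function InteractionRatio: observed interactions per unit time,
    normalized by the goal interactions per goal time."""
    ratio = sum(interactions) / sum(times) * goal_time / goal_interactions
    return constrain(ratio)

def improvement_higher(array):
    """Function ImprovementHigher: mean relative increase between attempts."""
    if len(array) <= 1:
        return 0.0
    return sum((b - a) / a for a, b in zip(array, array[1:])) / (len(array) - 1)

def improvement_lower(array):
    """Function ImprovementLower: mean relative decrease (e.g. for times)."""
    if len(array) <= 1:
        return 0.0
    return sum((a - b) / a for a, b in zip(array, array[1:])) / (len(array) - 1)

def variance(scores):
    """Function Variance: population variance of the attempt scores."""
    mean = average(scores)
    return sum((s - mean) ** 2 for s in scores) / len(scores)
```

For the two-attempt score series (85, 90) used in the User A example below, accuracy(85, 100) is 0.85 and weighted_average([85, 90]) is about 88.33.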
[0085] In the example embodiment, the data derivation logic 934 is
operative to generate the precision data 1104 of the set of derived
data 1100 as follows:
TABLE-US-00003
Function Precision(scores, goal_score)
    Set standard_deviation to the square root of Variance(scores)
    Return 1 minus (the reciprocal of goal_score) multiplied by standard_deviation
End Function

Function Constrain(number)
    If number is less than 0, set to 0
    If number is greater than 1, set to 1
    Return number
End Function
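A self-contained Python sketch of the precision calculation (illustrative translation; the consistency of scores across attempts is penalized by their standard deviation, scaled against the goal score):

```python
import math

def precision(scores, goal_score):
    """Function Precision: 1 minus the population standard deviation of the
    attempt scores, scaled by the reciprocal of the goal score."""
    mean = sum(scores) / len(scores)
    var = sum((s - mean) ** 2 for s in scores) / len(scores)
    return 1 - math.sqrt(var) / goal_score
```

For the User A scores (85, 90) against a goal score of 100, the standard deviation is 2.5, giving a precision of 0.975.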
[0086] In the example embodiment, the data derivation logic 934 is
operative to generate the exploration data 1108 of the set of
derived data 1100 as follows:
TABLE-US-00004
Function Exploration(options, goal_options)
    Return Average(options) divided by goal_options
End Function
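In Python the exploration calculation is a one-liner (illustrative translation):

```python
def exploration(options, goal_options):
    """Function Exploration: average number of options chosen per attempt,
    as a fraction of the goal number of options."""
    return sum(options) / len(options) / goal_options
```

For the User A options (80, 90) against a goal of 100 options, this yields 0.85.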
[0087] With reference next to FIG. 12a, the retention logic 912 is
configured to receive selected items of the measurement data 1000
generated by the measurement logic 932 in a manner described above
and also to receive selected items of the derived data 1100
generated by the data derivation logic 934 in a manner described
above. The received data includes score data 1002, accuracy data
1102, and attempts data 1006. The retention logic 912 of the example
embodiment is operative to generate retention data 1202.
[0088] For ease of understanding, the functional operation of the
retention logic 912 will be described below through use of example
pseudocode as follows:
TABLE-US-00005
Function Retention(scores, goal_score)
    Set avg_diff to 0
    If length of scores is greater than 1
        For each score in scores (except last score)
            Add Accuracy(next score, goal_score) minus Accuracy(score, goal_score) to avg_diff
        End For
        Divide avg_diff by length of scores - 1
    End If
    Set total_ret to sum of avg_diff & AverageAccuracy(scores, goal_score)
    Return Constrain(total_ret)
End Function
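A self-contained Python sketch of the retention calculation (illustrative translation, with the accuracy helper inlined): retention is the average accuracy across attempts plus the average change in accuracy between consecutive attempts, clamped to [0, 1].

```python
def accuracy(score, goal_score):
    """Fraction of the goal score achieved, capped at 1."""
    return 1.0 if score >= goal_score else score / goal_score

def retention(scores, goal_score):
    """Function Retention: average accuracy plus the average attempt-to-attempt
    change in accuracy, constrained to [0, 1]."""
    accs = [accuracy(s, goal_score) for s in scores]
    avg_diff = 0.0
    if len(accs) > 1:
        avg_diff = sum(b - a for a, b in zip(accs, accs[1:])) / (len(accs) - 1)
    total = avg_diff + sum(accs) / len(accs)
    return max(0.0, min(1.0, total))
```

For the User A scores (85, 90) against a goal of 100, the accuracies are 0.85 and 0.90, the average change is 0.05, and the average accuracy is 0.875, so retention is 0.925, matching the 92.5% reported for User A in the worked example below.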
[0089] With reference next to FIG. 12b, the engagement logic 914 is
configured to receive selected items of the measurement data 1000
generated by the measurement logic 932 in a manner described above
and also to receive selected items of the derived data 1100
generated by the data derivation logic 934 in a manner described
above. The received data includes time data 1004, exploration data
1108, attempts data 1006, and interactions data 1008. The
engagement logic 914 of the example embodiment is operative to
generate engagement data 1204.
[0090] For ease of understanding, the functional operation of the
engagement logic 914 will be described below through use of example
pseudocode as follows:
TABLE-US-00006
Function Engagement(interactions, goal_interactions, times, goal_time, retention_value, exploration_value)
    Set interaction_weight to InteractionRatio(interactions, goal_interactions, times, goal_time)
    Set total_eng to the sum of interaction_weight & exploration_value & retention_value, divided by 3
    Return Constrain(total_eng)
End Function
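A self-contained Python sketch of the engagement calculation (illustrative translation, with the interaction-ratio helper inlined): engagement is the mean of the interaction ratio, the exploration value, and the retention value.

```python
def constrain(x):
    """Clamp a value to the range [0, 1]."""
    return max(0.0, min(1.0, x))

def interaction_ratio(interactions, goal_interactions, times, goal_time):
    """Observed interactions per unit time, normalized to the goal rate."""
    return constrain(sum(interactions) / sum(times) * goal_time / goal_interactions)

def engagement(interactions, goal_interactions, times, goal_time,
               retention_value, exploration_value):
    """Function Engagement: mean of interaction ratio, exploration, and
    retention, constrained to [0, 1]."""
    w = interaction_ratio(interactions, goal_interactions, times, goal_time)
    return constrain((w + exploration_value + retention_value) / 3)
```

For User A (interactions 110 and 105, times 120 and 110, goals of 100 each, retention 0.925, exploration 0.85) this gives about 0.9033, matching the 90.33% in the worked example below.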
[0091] With reference next to FIG. 12c, the perseverance logic 916
is configured to receive selected items of the measurement data
1000 generated by the measurement logic 932 in a manner described
above and also to receive selected items of the derived data 1100
generated by the data derivation logic 934 in a manner described
above. The received data includes time data 1004, attempts data
1006, and score data 1002. The perseverance logic 916 of the
example embodiment is operative to generate perseverance data
1206.
[0092] For ease of understanding, the functional operation of the
perseverance logic 916 will be described below through use of
example pseudocode as follows:
TABLE-US-00007
Function Perseverance(scores, goal_score, times)
    Set attempt_factor to 1 minus the reciprocal of the length of scores
    Set precision_factor to Precision(scores, goal_score)
    Set improvement_factor to ImprovementHigher(scores)
    Set total_pers to attempt_factor + (1 - attempt_factor) * improvement_factor * precision_factor
    Return Constrain(total_pers)
End Function
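A self-contained Python sketch of the perseverance calculation (illustrative translation; the unused times argument from the pseudocode signature is omitted, and the precision and improvement helpers are inlined):

```python
import math

def constrain(x):
    """Clamp a value to the range [0, 1]."""
    return max(0.0, min(1.0, x))

def precision(scores, goal_score):
    """1 minus the population standard deviation scaled by 1/goal_score."""
    mean = sum(scores) / len(scores)
    var = sum((s - mean) ** 2 for s in scores) / len(scores)
    return 1 - math.sqrt(var) / goal_score

def improvement_higher(array):
    """Mean relative increase between consecutive attempts."""
    if len(array) <= 1:
        return 0.0
    return sum((b - a) / a for a, b in zip(array, array[1:])) / (len(array) - 1)

def perseverance(scores, goal_score):
    """Function Perseverance: grows with the number of attempts, with the
    remainder weighted by score improvement and precision."""
    attempt_factor = 1 - 1 / len(scores)
    return constrain(attempt_factor
                     + (1 - attempt_factor)
                     * improvement_higher(scores)
                     * precision(scores, goal_score))
```

For the User A scores (85, 90) against a goal of 100: the attempt factor is 0.5, improvement is about 0.0588, and precision is 0.975, giving about 0.5287 and matching the 52.87% in the worked example below.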
[0093] With reference next to FIG. 12d, the comprehension logic 918
is configured to receive selected items of the measurement data
1000 generated by the measurement logic 932 in a manner described
above and also to receive selected items of the derived data 1100
generated by the data derivation logic 934 in a manner described
above. The received data includes time data 1004, attempts data
1006, score data 1002, and accuracy data 1102. The comprehension
logic 918 of the example embodiment is operative to generate
comprehension data 1208.
[0094] For ease of understanding, the functional operation of the
comprehension logic 918 will be described below through use of
example pseudocode as follows:
TABLE-US-00008
Function Comprehension(scores, goal_score, times, goal_time, retention_value)
    Set reduction_rate to 0.98
    Set improvement_factor to 0.02 * Constrain(ImprovementLower(times))
    Set total_comp to retention_value * ((reduction_rate + improvement_factor) to the power of (the length of scores - 1))
    Return total_comp
End Function
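A self-contained Python sketch of the comprehension calculation (illustrative translation; the unused goal_score and goal_time arguments from the pseudocode signature are omitted): retention decays slightly with each additional attempt, and the decay is offset by improvement in completion times.

```python
def constrain(x):
    """Clamp a value to the range [0, 1]."""
    return max(0.0, min(1.0, x))

def improvement_lower(array):
    """Mean relative decrease between consecutive attempts (e.g. times)."""
    if len(array) <= 1:
        return 0.0
    return sum((a - b) / a for a, b in zip(array, array[1:])) / (len(array) - 1)

def comprehension(scores, times, retention_value, reduction_rate=0.98):
    """Function Comprehension: retention multiplied by a per-extra-attempt
    decay of 0.98, raised toward 1.0 by faster completion times."""
    improvement_factor = 0.02 * constrain(improvement_lower(times))
    return retention_value * (reduction_rate + improvement_factor) ** (len(scores) - 1)
```

For User A (scores 85 and 90, times 120 and 110, retention 0.925): the time improvement is about 0.0833, so the per-attempt base is about 0.9817, giving about 0.908 and matching the 90.8% in the worked example below.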
[0095] With reference next to FIG. 12e, the skill logic 920 is
configured to receive selected items of the measurement data 1000
generated by the measurement logic 932 in a manner described above
and also to receive selected items of the derived data 1100
generated by the data derivation logic 934 in a manner described
above. The received data includes time data 1004, attempts data
1006, score data 1002, accuracy data 1102, and interactions data
1008. The skill logic 920 of the example embodiment is operative to
generate skill data 1210.
[0096] For ease of understanding, the functional operation of the
skill logic 920 will be described below through use of example
pseudocode as follows:
TABLE-US-00009
Function Skill(scores, goal_score, times, goal_time, interactions, goal_interactions, [rank])
    Set score_factor to WeightedAverage(scores) divided by goal_score
    Set time_factor to Constrain(goal_time divided by WeightedAverage(times))
    Set interaction_factor to Constrain(1 - (WeightedAverage(interactions) - goal_interactions) / goal_interactions)
    Return product of score_factor, time_factor, & interaction_factor
End Function
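A self-contained Python sketch of the skill calculation (illustrative translation; the optional, unused [rank] argument is omitted and the weighted-average helper is inlined with weights 1..n): skill is the product of score, time, and interaction factors, each built from a recency-weighted average of the attempts.

```python
def constrain(x):
    """Clamp a value to the range [0, 1]."""
    return max(0.0, min(1.0, x))

def weighted_average(array):
    """Recency-weighted average: attempt i gets weight i (1..n)."""
    n = len(array)
    total = sum(item * w for w, item in enumerate(array, start=1))
    return total / (n * (n + 1) / 2)

def skill(scores, goal_score, times, goal_time, interactions, goal_interactions):
    """Function Skill: product of score, time, and interaction factors."""
    score_factor = weighted_average(scores) / goal_score
    time_factor = constrain(goal_time / weighted_average(times))
    interaction_factor = constrain(
        1 - (weighted_average(interactions) - goal_interactions) / goal_interactions)
    return score_factor * time_factor * interaction_factor
```

For User A (scores 85 and 90, times 120 and 110, interactions 110 and 105, goals of 100 each) the factors are roughly 0.883, 0.882, and 0.933, giving about 0.7275 and matching the 72.75% in the worked example below.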
[0097] With reference next to FIG. 12f, the interest logic 922 is
configured to receive selected retention data 1202 from the
retention logic 912, engagement data 1204 from the engagement logic
914, and comprehension data 1208 from the comprehension logic 918.
The interest logic 922 of the example embodiment is operative to
generate interest data 1212.
[0098] For ease of understanding, the functional operation of the
interest logic 922 will be described below through use of example
pseudocode as follows:
TABLE-US-00010
Function Interest(retention_value, comprehension_value, engagement_value)
    Return (retention_value + engagement_value * 2 + comprehension_value) / 4
End Function
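In Python the interest calculation is a weighted mean in which engagement is counted twice (illustrative translation):

```python
def interest(retention_value, comprehension_value, engagement_value):
    """Function Interest: weighted mean of retention, engagement (x2), and
    comprehension."""
    return (retention_value + engagement_value * 2 + comprehension_value) / 4
```

For User A (retention 0.925, comprehension 0.908, engagement 0.9033) this gives about 0.9099, matching the 90.99% in the worked example below.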
[0099] With reference next to FIG. 13, the result logic 936 of the
assessment module 756 (FIGS. 7, 9) is operative to receive the
outputs of the logic modules 912-922, perform one or more
operations on the data 1202-1212, and generate result data 1302
suitable for use by users such as educators or the like in making
learning assessments in accordance with the example embodiment.
[0100] FIG. 14 illustrates a block diagram of the presentation
logic 938 of the assessment module 756 (FIGS. 7, 9), wherein the
presentation logic 938 is operative to receive the one or more
outputs of the result logic 936, perform one or more operations on
the data, and generate presentation data 1402 suitable for use in
the example embodiment for displaying on a display screen 212 (FIG.
2) or the like the learning assessment results in a simple and
easily comprehensible way. In the example embodiment, the
presentation data 1402 is configured to be used in generating a
radar chart such as shown, for example, in FIGS. 15-16 to be
described below.
[0101] With reference now to those drawings, FIG. 15 shows a radar
chart 1500 of a "User A" in an example to be described below and
FIG. 16 shows a radar chart 1600 of a "User D" in the example. In
the example embodiment, the presentation logic 938 is operative to
generate the presentation data 1402 in a manner to be suitable for
use by the system of the example embodiment for presentation on the
display 212 (FIG. 2).
[0102] In the example, the ideal scores and each user's behavior are as follows:
Ideal
TABLE-US-00011
[0103]
  Attempt #       1
  Scores        100
  Times         100
  Interactions  100
  Options       100
  Retention:      100%
  Engagement:     100%
  Perseverance:     0%
  Comprehension:  100%
  Skill:          100%
  Interest:       100%
User A
TABLE-US-00012
[0104]
  Attempt #       1     2
  Scores         85    90
  Times         120   110
  Interactions  110   105
  Options        80    90
  Retention:      92.5%
  Engagement:     90.33%
  Perseverance:   52.87%
  Comprehension:  90.8%
  Skill:          72.75%
  Interest:       90.99%
[0105] The results of the learning assessment of User A in the
above example are represented by the radar line 1502 presented in
the radar chart 1500 of FIG. 15.
User B
TABLE-US-00013
[0106]
  Attempt #       1     2     3     4
  Scores         25    50    60    90
  Times         200   160   170   110
  Interactions  170   120   110   120
  Options        80    70    85    90
  Retention:      77.92%
  Engagement:     80.14%
  Perseverance:   85.87%
  Comprehension:  74.07%
  Skill:          35.29%
  Interest:       78.07%
User C
TABLE-US-00014
[0107]
  Attempt #       1     2     3     4     5     6
  Scores         80    85    92    96    97    99
  Times         130   132   120   110   104   102
  Interactions  120   114   110   105    98    99
  Options        92    95    97    97    99   100
  Retention:      95.3%
  Engagement:     94.84%
  Perseverance:   84.01%
  Comprehension:  86.55%
  Skill:          82.17%
  Interest:       92.88%
User D
TABLE-US-00015
[0108]
  Attempt #       1     2     3
  Scores         50    60    20
  Times         140   120    80
  Interactions   70    80    40
  Options        60    50    30
  Retention:      28.33%
  Engagement:     43.63%
  Perseverance:   60.21%
  Comprehension:  27.48%
  Skill:          37.1%
  Interest:       35.77%
[0109] The results of the learning assessment of User D in the
above example are represented by the radar line 1602 presented in
the radar chart 1600 of FIG. 16.
[0110] FIG. 17 is a radar chart 1700 of an example user using the
system of the example embodiment to execute multiple learning and
assessment cycles 858, 858', 862 such as shown diagrammatically in
FIG. 8. For a first unit of study or learning (unit 1), the system
generates a first radar line 1702; for a second unit (unit 2), a
second radar line 1704; for a third unit, a third radar line 1706;
for a fourth unit, a fourth radar line 1708; and for the average
learning and assessment, an average radar line 1710.
[0111] Described above are example embodiments. It is, of course,
not possible to describe every conceivable combination of
components or methodologies, but one of ordinary skill in the art
will recognize that many further combinations and permutations of
the example embodiments are possible. Accordingly, this application
is intended to embrace all such alterations, modifications and
variations that fall within the spirit and scope of the appended
claims interpreted in accordance with the breadth to which they are
fairly, legally and equitably entitled.
* * * * *