U.S. patent application number 13/835721 was filed with the patent office on 2013-03-15 for animation transitions and effects in a spreadsheet application.
This patent application is currently assigned to MICROSOFT CORPORATION. The applicant listed for this patent is MICROSOFT CORPORATION. Invention is credited to Alexandre da Veiga, Steven Drucker, Jonathan Edgar Fay, Michael Kallay, John Alfred Payne, Igor Borisov Peev, Ehab Sobhy, Curtis G. Wong.
Publication Number | 20140043340 |
Application Number | 13/835721 |
Family ID | 50065859 |
Filed Date | 2013-03-15 |
United States Patent Application | 20140043340 |
Kind Code | A1 |
Sobhy; Ehab; et al. | February 13, 2014 |
Animation Transitions and Effects in a Spreadsheet Application
Abstract
Concepts and technologies are described herein for animation
transitions and effects in a spreadsheet application. In accordance
with the concepts and technologies disclosed herein, a computer
system can execute a visualization component. The computer system
can detect selection of a scene included in a visualization of
spreadsheet data. The computer system also can generate an effect
for the scene selected. In some embodiments, the computer system
identifies another scene and generates a transition between the
scenes. The computer system can output the effect animation and the
transition animation.
Inventors: | Sobhy; Ehab; (Redmond, WA) ; Drucker; Steven; (Bellevue, WA) ;
Kallay; Michael; (Bellevue, WA) ; da Veiga; Alexandre; (Bellevue, WA) ;
Payne; John Alfred; (Seattle, WA) ; Wong; Curtis G.; (Medina, WA) ;
Fay; Jonathan Edgar; (Woodinville, WA) ; Peev; Igor Borisov; (Arlington, WA) |
Applicant: |
Name | City | State | Country | Type |
MICROSOFT CORPORATION | Redmond | WA | US | |
Assignee: | MICROSOFT CORPORATION, Redmond, WA |
Family ID: | 50065859 |
Appl. No.: | 13/835721 |
Filed: | March 15, 2013 |
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number |
61681851 | Aug 10, 2012 | |
Current U.S. Class: | 345/473 |
Current CPC Class: | G06F 3/0488 20130101;
G06F 16/9537 20190101; G06T 11/206 20130101; G06F 3/04842 20130101;
G06F 16/26 20190101; G06F 16/9038 20190101; G06F 16/29 20190101;
G06F 40/18 20200101; G06T 15/10 20130101; G06F 3/01 20130101; G06F
16/248 20190101; G06F 40/169 20200101; G06F 40/166 20200101; G06F
16/50 20190101; G06F 3/04815 20130101; G06F 16/4393 20190101; G06F
16/444 20190101; G06F 3/04847 20130101; G06F 16/2477 20190101; G06F
3/048 20130101; G06T 19/00 20130101; G06F 3/0482 20130101; G06T
13/00 20130101; G06T 15/00 20130101; G06F 3/0481 20130101 |
Class at Publication: | 345/473 |
International Class: | G06T 13/00 20110101 G06T013/00 |
Claims
1. A computer-implemented method for generating an animation in a
spreadsheet application, the computer-implemented method comprising
performing computer-implemented operations for: detecting, at a
computer system executing a visualization component, selection of a
scene included in a visualization of spreadsheet data; determining,
by the computer system, a duration of the scene based upon a start
time of the scene and an end time of the scene; receiving, by the
computer system, selection of an effect comprising a visual effect
applied during rendering of the scene from a viewpoint from which
the scene is rendered; generating, by the computer system, the
scene based upon the duration and the effect for the scene; and
outputting, by the computer system, an effect animation
corresponding to the effect applied to the scene.
2. The method of claim 1, wherein generating the effect comprises
determining an effect type for the scene, determining a duration of
the effect and a speed or magnitude of the effect, and generating
the effect animation based upon the effect type and the duration of
the scene, as well as positioning of the viewpoint.
3. The method of claim 2, wherein generating the effect further
comprises determining a camera distance for the effect, the camera
distance comprising a distance between the viewpoint for the effect
animation and a center point of data included in the scene.
4. The method of claim 3, wherein the effect type is selected from
a group comprising an orbit effect, a figure eight effect, a back
and forth effect, a line effect, and a stationary effect.
5. The method of claim 1, further comprising: identifying a further
scene included in the visualization; generating a transition
between the scene and the further scene; and outputting a transition
animation corresponding to the transition applied to the scene and
the further scene.
6. The method of claim 5, wherein generating the transition further
comprises determining a transition type for the transition,
determining a distance between a start location that corresponds to
the scene and an end location corresponding to the further scene,
determining a duration of the transition, and generating the
transition animation based upon the transition type, the distance,
and the duration.
7. The method of claim 6, wherein determining the distance between
the scene and the further scene comprises identifying a first
geographic location associated with data in the scene, identifying
a second geographic location associated with data in the further
scene, determining a distance between the first geographic location
and the second geographic location, and defining the distance
between the scene and the further scene as the distance between the
first geographic location and the second geographic location.
8. The method of claim 7, wherein the transition type is selected
from a group comprising a linear transition type, an arc transition
type, or a zoom-in/zoom-out transition type.
9. The method of claim 6, wherein generating the transition further
comprises identifying the transition type as one of a cut
transition type or a fade transition type, and generating the
transition animation based upon the transition type.
10. The method of claim 5, further comprising: generating an effect
for the further scene; and outputting a further effect animation
corresponding to the further scene and the effect applied to the further
scene.
11. The method of claim 1, further comprising: generating a user
interface comprising a representation of the scene, a further
representation of a further scene of the visualization, and
controls for specifying an effect type, a duration of the effect, a
speed of the effect, a magnitude of the effect, a transition type
for a transition between the scene and the further scene, and a
duration for the transition; and determining, based upon
interactions with the user interface, the effect type, the duration
of the effect, the speed of the effect or the magnitude of the
effect, the transition type, and the duration.
12. A computer storage medium having computer readable instructions
stored thereon that, when executed by a computer, cause the
computer to: detect selection of a scene included in a
visualization of spreadsheet data; determine a duration of the
scene, the duration comprising an amount of time between a start
time of the scene and an end time of the scene; receive data
indicating a selection of an effect to be applied to the scene, the
effect comprising a visual effect applied during rendering of the
scene from a viewpoint from which the scene is rendered; identify a
further scene included in the visualization; generate a transition
between the scene and the further scene; and output an effect
animation and a transition animation.
13. The computer storage medium of claim 12, wherein generating the
effect comprises determining an effect type for the scene,
determining a speed of the effect or a magnitude of the effect,
determining a camera distance for the effect, the camera distance
comprising a distance between the viewpoint for the effect
animation and a center point of data included in the scene, and
generating the effect animation based upon the effect type, the
duration, the speed or the magnitude, and the camera distance.
14-20. (canceled)
21. The computer storage medium of claim 12, wherein generating the
transition further comprises determining a transition type for the
transition, determining a duration of the transition, determining a
distance between the scene and the further scene,
determining a path and an orientation
of a viewpoint from which the transition is to be rendered, and
generating the transition animation based upon the transition type,
the distance, and the duration.
22. The computer storage medium of claim 21, further comprising
computer readable instructions that, when executed by the computer,
cause the computer to: generate a user interface comprising
representations of the scene and the further scene, and controls
for specifying an effect type, a duration of the effect, a speed of
the effect or a magnitude of the effect, a transition type for a
transition between the scene and the further scene, and a duration
of the transition; and determine, based upon interactions with
the user interface, the effect type, the duration of the effect,
the speed or magnitude of the effect, the transition type, and the
duration.
23. A computer storage medium having computer readable instructions
stored thereon that, when executed by a computer, cause the
computer to: generate a user interface comprising representations
of a scene and a further scene included in a visualization of
spreadsheet data, and controls for specifying an effect type, a
duration of the effect based upon a start time of the scene and an
end time of the scene, a speed of the effect or a magnitude of the
effect, a transition type for a transition between the scene and
the further scene, and a duration of the transition; detect
selection of one of the representations corresponding to the scene;
determine a duration of the scene based upon the start time and the
end time; identify, based upon a selection received via the user
interface, an effect to be applied to the scene selected, the
effect comprising a visual effect applied during rendering of the
scene from a viewpoint from which the scene is rendered and being
based on the effect type, the duration of the effect, and the speed
or magnitude of the effect; identify a further scene included in
the visualization; generate a transition between the scene and the
further scene, the transition being based on the transition type
and the duration; and output an effect animation and a transition
animation.
24. The computer storage medium of claim 23, wherein the transition
type comprises one transition type selected from a group comprising a cut
transition type, a fade transition type, a linear transition type,
an arc transition type, or a zoom-in/zoom-out transition type.
25. The computer storage medium of claim 23, wherein the effect
type is selected from a group comprising an orbit effect, a figure
eight effect, a back and forth effect, a line effect, and a
stationary effect.
26. The computer storage medium of claim 23, wherein outputting the
effect animation and the transition animation comprises generating
a preview of a visualization comprising the effect animation and
the transition animation, and presenting the preview in the user
interface.
Description
CROSS-REFERENCE TO RELATED APPLICATION
[0001] This application claims priority to U.S. Provisional Patent
Application No. 61/681,851 entitled "3D Visualization of Data in
Geographical and Temporal Contexts," filed Aug. 10, 2012, which is
incorporated herein by reference in its entirety.
BACKGROUND
[0002] A spreadsheet application, reporting application, or other
data presentation application may support presentation of data in
visual representations. The visual representations of the data can
include two-dimensional and/or three-dimensional pie charts, line
graphs, bar graphs, charts, or the like. Users may generate the
visual representations of the data to attempt to gain insight into
the data, relationships among data points, trends, or the like.
Some data, however, may not be readily susceptible to graphing
and/or charting according to these approaches.
[0003] In particular, some data may include geographical and/or
temporal components. Spreadsheet applications may present these
data in charts, graphs, or other visual representations, but the
display of this type of information may be limited to color codes,
data labels, or other formats that may not impart meaning to the
data being presented. Furthermore, the presentation of these data
may not provide the data in a visually appealing manner and
therefore may not be appreciated by viewers.
[0004] It is with respect to these and other considerations that
the disclosure made herein is presented.
SUMMARY
[0005] Concepts and technologies are described herein for animation
transitions and effects in a spreadsheet application. In accordance
with the concepts and technologies disclosed herein, a computer
system can execute a visualization component. The visualization
component can be included in a spreadsheet application and/or can
be configured to present visualizations of spreadsheet data. As
used herein, a "visualization" can include an animated rendering of
spreadsheet data over time on a map, globe, or other surface that
can provide geographical context. According to various embodiments,
the visualization can include one or more scenes. The
visualizations can be based, at least in part, upon geographical
information and/or time values, timestamps, and/or other temporal
information included in the data. The visualization can include a
rendered globe or map that shows the data in corresponding
locations on the map or globe, based upon geographical information
and/or other location data included in the data.
[0006] Additionally, embodiments of the concepts and technologies
disclosed herein can be used to add camera effects and/or camera
transitions to and/or between scenes. As used herein, a "camera"
can refer to a virtual camera that corresponds to a viewpoint for a
particular scene and/or scenes. The viewpoint, path, orientation,
and/or other aspects of the camera can be determined based upon an
associated effect, as well as a scene start time and end time. The
start time and end time for a scene can be specified or selected by
a user or determined by the visualization component. The
visualization component also can receive a selection of an effect
to apply to the scene. The visualization component can generate
effects for scenes that animate the camera during rendering of the
scene. The effects can include movement of the camera around or
past a center point or other focus of the camera during the
scene.
[0007] The visualization component also can generate transitions
between the scenes that animate the camera between the scenes. The
visualization component can receive a duration of the transition,
receive a start location and an end location, and determine a path
between the locations. The visualization component also can
determine an orientation of the camera during the transition. The
transitions can include movement of the camera along a flight path
and/or varying zoom levels of the camera. The transition and the
effects can be displayed by the visualization component and/or
output by the visualization component in other ways.
[0008] According to one aspect, a computer system executing a
visualization component receives spreadsheet data. The spreadsheet
data can include a visualization and/or the visualization can be
generated by the visualization component and/or other devices. The
computer system can detect selection of a scene of the
visualization, for example via a user interface presenting the
visualization. The computer system can generate an effect for the
scene. The effect can be generated based upon a known or selected
effect type, effect duration, effect speed, and/or a camera
distance for the effect. The computer system also can output an
effect animation.
[0009] According to another aspect, the computer system can
identify two scenes in the visualization and generate a transition
between the scenes. The computer system can generate the transition
based upon a known or selected transition type, a distance between
geographic locations depicted in the scenes, and a transition time
or duration. The computer system also can output a transition
animation. In some embodiments, the computer system outputs the
effect animation and the transition animation in a preview screen
included in a user interface.
[0010] It should be appreciated that the above-described subject
matter may be implemented as a computer-controlled apparatus, a
computer process, a computing system, or as an article of
manufacture such as a computer-readable storage medium. These and
various other features will be apparent from a reading of the
following Detailed Description and a review of the associated
drawings.
[0011] This Summary is provided to introduce a selection of
concepts in a simplified form that are further described below in
the Detailed Description. This Summary is not intended to identify
key features or essential features of the claimed subject matter,
nor is it intended that this Summary be used to limit the scope of
the claimed subject matter. Furthermore, the claimed subject matter
is not limited to implementations that solve any or all
disadvantages noted in any part of this disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] FIG. 1 is a system diagram illustrating an illustrative
operating environment for the various embodiments disclosed
herein.
[0013] FIG. 2 is a block diagram showing additional aspects of the
visualization component described herein, according to an
illustrative embodiment.
[0014] FIG. 3 is a flow diagram showing aspects of a method for
generating animation effects and transitions in a spreadsheet
application, according to an illustrative embodiment.
[0015] FIG. 4 is a flow diagram showing aspects of a method for
generating animation effects in a spreadsheet application,
according to an illustrative embodiment.
[0016] FIG. 5 is a flow diagram showing aspects of a method for
generating animation transitions in a spreadsheet application,
according to an illustrative embodiment.
[0017] FIG. 6 is a line diagram illustrating additional aspects of
the concepts and technologies disclosed herein for configuring
animation effects, according to an illustrative embodiment.
[0018] FIGS. 7A-7H are line drawings illustrating some aspects of
several example animation effects and transitions, according to
some illustrative embodiments.
[0019] FIGS. 8A-8B are UI diagrams showing example UIs for use in
configuring and outputting animation transitions and effects in a
spreadsheet application, according to some illustrative
embodiments.
[0020] FIG. 9 is a computer architecture diagram illustrating an
illustrative computer hardware and software architecture for a
computing system capable of implementing aspects of the embodiments
presented herein.
[0021] FIG. 10 is a diagram illustrating a distributed computing
environment capable of implementing aspects of the embodiments
presented herein.
[0022] FIG. 11 is a computer architecture diagram illustrating a
computer system architecture capable of implementing aspects of the
embodiments presented herein.
DETAILED DESCRIPTION
[0023] The following detailed description is directed to concepts
and technologies for animation transitions and effects in a
spreadsheet application. According to the concepts and technologies
described herein, a computer system can execute a visualization
component. The visualization component can obtain spreadsheet data
that includes, or can be used to generate, a visualization of
spreadsheet data. The visualization can include one or more scenes
and can show the spreadsheet data in geographic and temporal
contexts based upon geographical information and/or time
information included in the spreadsheet data. The visualization
component can be configured to add camera effects to scenes, and/or
to add camera transitions between two or more scenes. The effects
can include movement of the camera around or past a center point or
other focus of the camera during the scene, and the transitions can
include movement of the camera along a flight path and/or varying
zoom levels of the camera.
[0024] The visualization component can be configured to detect
selection of a scene of the visualization, for example via a user
interface presenting the visualization. The visualization component
can be configured to generate an effect for the scene based, at
least in part, upon a known or selected effect type, effect
duration, effect speed, and/or a camera distance for the effect.
The visualization component also can be configured to identify two
scenes in the visualization and generate a transition between the
scenes. The visualization component can be configured to generate
the transition based upon a known or selected transition type, a
distance between geographic locations depicted in the scenes, and a
duration of the transition. The visualization component can be
configured to output an effect animation and a transition
animation.
[0025] While the subject matter described herein is presented in
the general context of program modules that execute in conjunction
with the execution of an operating system and application programs
on a computer system, those skilled in the art will recognize that
other implementations may be performed in combination with other
types of program modules. Generally, program modules include
routines, programs, components, data structures, and other types of
structures that perform particular tasks or implement particular
abstract data types. Moreover, those skilled in the art will
appreciate that the subject matter described herein may be
practiced with other computer system configurations, including
hand-held devices, multiprocessor systems, microprocessor-based or
programmable consumer electronics, minicomputers, mainframe
computers, and the like.
[0026] In the following detailed description, references are made
to the accompanying drawings that form a part hereof, and in which
are shown by way of illustration specific embodiments or examples.
Referring now to the drawings, in which like numerals represent
like elements throughout the several figures, aspects of a
computing system, computer-readable storage medium, and
computer-implemented methodology for animation transitions and
effects in a spreadsheet application will be presented.
[0027] Referring now to FIG. 1, aspects of one operating
environment 100 for the various embodiments presented herein will
be described. The operating environment 100 shown in FIG. 1
includes a computer system 102 operating as a part of and/or in
communication with a communications network ("network") 104.
According to various implementations of the concepts and
technologies disclosed herein, the functionality of the computer
system 102 can be provided by a cloud-based computing platform that
can be provided by one or more application servers, Web servers,
data storage systems, network appliances, dedicated hardware
devices, and/or other server computers or computing devices.
[0028] According to some other embodiments, the computer system 102
can include a user computing device, such as a tablet computing
device, a personal computer ("PC"), a desktop computer, a laptop
computer, a notebook computer, a cellular phone or smartphone,
other mobile computing devices, a personal digital assistant
("PDA"), or the like. Some example architectures of the computer
system 102 are illustrated and described below with reference to
FIGS. 9-11. For purposes of illustrating and describing the concepts
and technologies disclosed herein, the functionality of the
computer system 102 is described herein as being provided by a
server computer. In light of the alternative embodiments of
the computer system 102 described above, it should be understood
that this example is illustrative, and should not be construed as
being limiting in any way.
[0029] The computer system 102 can be configured to execute an
operating system 106 and one or more application programs such as,
for example, a spreadsheet application 108, a visualization
component 110, and/or other application programs. The operating
system 106 is a computer program for controlling the operation of
the computer system 102. The application programs are executable
programs configured to execute on top of the operating system 106
to provide the functionality described herein for displaying
temporal information in a spreadsheet application.
[0030] In particular, the spreadsheet application 108 can be
configured to create, manipulate, store, and/or otherwise interact
with tabular or other structured data such as spreadsheets.
According to some embodiments of the concepts and technologies
disclosed herein, the functionality of the spreadsheet application
108 can be provided by a member of the MICROSOFT EXCEL family of
spreadsheet applications from Microsoft Corporation of Redmond,
Washington. In some other embodiments, the functionality of the
spreadsheet application 108 can be provided by a database
application, a data reporting application, a data presentation
application, combinations thereof, or the like.
[0031] According to some implementations, the spreadsheet
application 108 can be executed by one or more server computers in
the computer system 102, such as application servers and/or Web
servers. Thus, the functionality of the spreadsheet application 108
can be accessed by other computing devices and/or accessed at the
computer system 102. In the illustrated embodiment, the
functionality of the spreadsheet application 108 can be accessed
and/or interacted with by a user computing device 112. The
functionality of the user computing device 112 can be provided by,
for example, a tablet computing device, a smartphone, a laptop
computer, a desktop computer, other computing devices, combinations
thereof, or the like. The user computing device 112 can communicate
with the computer system 102 over one or more links or networks
such as, for example, the network 104, a private network, a direct
wireless or wired connection, the Internet, and/or combinations of
these and other networks and/or communication links.
[0032] Although not visible in FIG. 1, the user computing device
112 can execute one or more client applications. The client
applications can include Web browser applications and/or other
applications for accessing the spreadsheet application 108
executing on the computer system 102. In some embodiments, the
spreadsheet application 108 can be executed locally on the user
computing device 112 or other devices that can include the
functionality of the computer system 102 described herein. The
spreadsheet application 108 can be implemented as hardware,
software, and/or a combination of the two. Furthermore, the
spreadsheet application 108 can include one or more application
program modules and other components on the user computing device
112, the computer system 102, and/or other computing platforms. As
will be explained in more detail herein, the user computing device
112 can generate one or more user interfaces ("UIs") 114 to present
temporal information to a user 116.
[0033] According to various embodiments, the spreadsheet
application 108 can be configured to generate, manipulate, and/or
store tabular or other structured data that can be included in
spreadsheet data 118. The spreadsheet data 118 also can be stored
in tables of a database, objects stored in an object store, or the
like. Because the functionality of the spreadsheet application 108
is generally understood, the spreadsheet application 108 will not
be described in additional detail herein.
[0034] According to various implementations, the spreadsheet data
118 can be obtained by the computer system 102 from a local or
remote data source 120. In some embodiments, the data source 120
can include a memory, disk drive, or other data storage element of
or associated with the computer system 102. In some other
embodiments such as the embodiment illustrated in FIG. 1, the data
source 120 can include a network drive, a server computer operating
as a part of and/or in communication with the network 104, a
database or other real or virtual data storage elements, and/or
other data storage devices. As such, it should be understood that
the data source 120 can include almost any type of data storage
device that is local to and/or remote from the computer system
102.
[0035] The visualization component 110 can be executed by the
computer system 102 to provide the functionality described herein
for displaying temporal information in a spreadsheet application.
In particular, the visualization component 110 can be configured to
obtain the spreadsheet data 118 from the spreadsheet application
108 and/or directly from the data source 120, and to generate,
based upon the spreadsheet data 118, three-dimensional
visualizations of the spreadsheet data 118 in a geographical and/or
temporal context. In some embodiments, the visualization component
110 can be implemented as a component of the spreadsheet
application 108, and in some embodiments, the visualization
component 110 can be implemented as a component separate from the
spreadsheet application. Thus, while the spreadsheet application
108 and the visualization component 110 are illustrated as
components of the computer system 102, it should be understood that
each of these components, or combinations thereof, may be embodied
as or in stand-alone devices or components thereof operating on or
in communication with the network 104 and/or the computer system
102. Thus, the illustrated embodiment is illustrative, and should
not be construed as being limiting in any way.
[0036] In some embodiments, the visualization component 110 may be
implemented as a plugin or add-in for the spreadsheet application
108. In some other embodiments, the visualization component 110 can
include a service and/or set of application programming interfaces
("APIs") that can provide the functionality described herein. Thus,
it should be appreciated that the visualization component 110 can
be implemented as hardware, software, or a combination thereof.
[0037] According to various embodiments of the concepts and
technologies disclosed herein, the visualization component 110 can
be configured to access one or more geocoding services 122. The
geocoding services 122 can be configured to map geographical data
included in the spreadsheet data 118 to geographic information.
Thus, for example, the visualization component 110 can provide
geographical data included in the spreadsheet data 118 such as, for
example, a street address, a city, a state, a ZIP code, or the
like, to the geocoding services 122. The geocoding services 122 can
map this geographical data to latitude and longitude information
and/or other geocoded location data. Thus, it can be appreciated
that the geocoding services 122 can be called by the computer
system 102 via one or more APIs exposed by the geocoding services
122, though this is not necessarily the case. Furthermore, the
geocoding services 122 can be configured to provide geographic
mapping data 124 representing mappings of the geographical data to
the geocoded location data to the computer system 102, though this
is not necessarily the case.
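For illustration only, the round trip described above might look like the following minimal Python sketch. The `geocode` callable standing in for the geocoding services 122, the row format, and the returned fields are assumptions, not the actual API exposed by the services.

```python
from dataclasses import dataclass

@dataclass
class GeocodedLocation:
    query: str        # raw geographical data, e.g. "Redmond, WA"
    latitude: float
    longitude: float

def geocode_rows(rows, geocode):
    """Map each row's location string to latitude/longitude.

    `geocode` is a caller-supplied callable standing in for the
    geocoding services 122; it returns a (latitude, longitude) pair.
    """
    mappings = []
    for row in rows:
        lat, lon = geocode(row["location"])
        mappings.append(GeocodedLocation(row["location"], lat, lon))
    return mappings
```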
[0038] In some embodiments, the visualization component 110 can
access the geocoding services 122 via one or more networks such as,
for example, the network 104, the Internet, other networks, and/or
a combination thereof. In some other embodiments, the geocoding
services 122 can be implemented on the computer system 102. In one
contemplated embodiment, the geocoding services 122 are implemented
as a component of the visualization component 110. It should be
understood that this embodiment is illustrative, and should not be
construed as being limiting in any way.
[0039] The visualization component 110 also can be configured to
obtain and/or access map data 126. The map data 126 can be used to
provide geolocation and/or graphical data for the creation of the
three-dimensional geographical maps as described herein. The
visualization component 110 may be configured to obtain or access
the map data 126 from or at a computing device such as, for
example, a map server 128. In some embodiments, the functionality
of the map server 128 can be provided by a mapping application
executed by a search engine such as the BING search engine from
Microsoft Corporation in Redmond, Washington. Because the
functionality of the map server 128 can be provided by additional
and/or other devices and/or applications, it should be understood
that this embodiment is illustrative, and should not be construed
as being limiting in any way.
[0040] The computer system 102 can access the map server 128 via
one or more networks such as, for example, the network 104. In some
embodiments, the visualization component 110 can be configured to
access map tiles from the map data 126, and to stitch the map tiles
together over a three-dimensional globe armature to create a
three-dimensional geographic globe. The visualization component 110
can be configured to use geocoded location data such as latitude
and longitude data from the geocoding services 122 to place
visualizations of data included in the spreadsheet data 118 on the
three-dimensional geographic globe. As such, various embodiments of
the visualization component 110 can be configured to generate
displays of geographic data.
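The placement step described above reduces to converting geocoded latitude and longitude into a point on the rendered globe. A minimal sketch of that standard conversion, assuming a spherical globe with the +Z axis through the north pole:

```python
import math

def latlon_to_globe(lat_deg, lon_deg, radius=1.0):
    """Convert latitude/longitude in degrees to Cartesian coordinates
    on a sphere of the given radius (+Z through the north pole)."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    x = radius * math.cos(lat) * math.cos(lon)
    y = radius * math.cos(lat) * math.sin(lon)
    z = radius * math.sin(lat)
    return (x, y, z)
```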
[0041] As used herein, a "visualization" can include an animation,
scene, and/or a tour of multiple scenes. The animation, scene,
and/or tour of scenes can represent the spreadsheet data 118 on a
globe, map, or other representation of a geographic location
associated with the spreadsheet data 118. In particular, the
spreadsheet data 118 can be displayed on a visual representation of
the globe, a map, or other surface at points corresponding to
geographic location data included in the spreadsheet data 118
and/or mapped to locations as described above. The visualization
also can show data changes over time.
[0042] The user 116 may interact with the spreadsheet application
108 and the visualization component 110 to create and/or navigate
the visualization of the spreadsheet data 118 through a display of
the user computing device 112. In some embodiments, the user 116
may use one or more input devices of the user computing device 112
such as a touchscreen, a keyboard, a mouse, a game controller,
combinations thereof, or the like. The UIs 114 can be presented on
the touchscreen, a monitor, a display, other display surfaces or
devices, combinations thereof, or the like.
[0043] The visualization component 110 also can be executed by the
computer system 102 to provide the functionality described herein
for animation transitions and effects in a spreadsheet application.
In particular, the visualization component 110 can be configured to
obtain the spreadsheet data 118 and to generate the visualization
of the spreadsheet data 118 as described above. According to
various embodiments of the concepts and technologies disclosed
herein, the visualization component 110 can be configured to
generate and/or present various user interfaces for creating,
modifying, and/or saving scenes. In particular, the visualization
component 110 can be configured to arrange scenes in a "tour,"
which is used herein to refer to a sequence of multiple scenes.
[0044] The scenes of the tour can be rendered by the visualization
component 110 from a viewpoint or perspective referred to herein as
a "camera." It should be understood that there is no physical
"camera" in the scenes described herein, which are rendered by the
visualization component 110. Thus, as used herein, a "camera" can refer to
a viewpoint associated with a virtual camera. Other aspects of the
camera can be set by options, user settings, combinations thereof,
or the like. These aspects can include, but are not limited to, a
location of the camera, a field of view of the camera, a focal
length and/or view angle of the camera, a line of sight and/or
tilt, skew, or other orientation of the camera, movement of the
camera, or the like. Thus, the visualization component 110 can be
configured to generate the scenes described herein as if filmed
with a real camera.
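One way to picture the camera aspects enumerated above is as a single parameter structure; the field names and types below are illustrative assumptions rather than the visualization component's actual model.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class VirtualCamera:
    position: Tuple[float, float, float]  # location of the viewpoint
    target: Tuple[float, float, float]    # point on the line of sight
    field_of_view_deg: float              # view angle
    tilt_deg: float = 0.0                 # orientation about the line of sight
```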
[0045] Furthermore, various embodiments of the visualization
component 110 can be configured to apply and/or modify various
camera effects in a scene and/or to apply and/or modify various
camera transitions between scenes. As will be described herein in
detail, particularly with reference to FIGS. 3-8B, the
visualization component 110 can present the tour in a user
interface and detect selection of a scene in the tour. The
visualization component 110 can generate one or more effects for
the selected scene.
[0046] In some embodiments, the visualization component 110 can
generate the effects by determining an effect type, determining
timing for the effect such as duration and speed, determining a
camera distance from the rendered data, and generating the effect
animation. These and other aspects of the effects can be specified
by a user or other entity via interactions with a user interface;
specified by settings, options, or the like; and/or otherwise
determined by the visualization component 110 via analysis of the
scene. The effects can include, but are not limited to, an orbit
effect, a stationary effect, a fly by effect, a figure eight
effect, a line effect, other effects, or the like.
[0047] The effects also can be tailored to represent a desired
speed and/or magnitude. Thus, for example, a user can increase a
magnitude of an effect such as an orbit effect. In such an example,
the speed of the effect can be increased and/or the effect can be
repeated in a scene. In some embodiments, a viewing distance
associated with an effect, e.g., a radius associated with an orbit
effect, can be determined automatically by the visualization
component 110 and/or specified by a user or other entity. In the
case of a linear effect, an increase in magnitude of an effect can
cause an increase in a viewing angle and/or a trajectory on which
the camera travels. Thus, the magnitude of an effect can affect the
speed of the camera, wherein the speed can be bound to the
magnitude and/or automatically calculated based upon the magnitude,
if desired. These effects are illustrated and described in more
detail below, particularly with reference to FIGS. 4 and 6-7E. The
visualization component 110 can apply the selected effect(s),
timing, and camera distance to generate the effect animation.
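As a sketch of how an orbit effect might combine scene duration, camera distance (the orbit radius), and magnitude, consider the following; the frame-sampling approach and the use of `revolutions` as the magnitude control are assumptions for illustration, not the disclosed implementation.

```python
import math

def orbit_keyframes(center, radius, height, duration_s, fps=30, revolutions=1.0):
    """Sample camera positions for an orbit effect: the viewpoint circles
    the scene's center point at a fixed radius. A larger `revolutions`
    value repeats the orbit within the same scene duration, mimicking an
    increase in the effect's magnitude or speed."""
    frames = []
    n = max(1, int(duration_s * fps))
    for i in range(n):
        angle = 2.0 * math.pi * revolutions * (i / n)
        x = center[0] + radius * math.cos(angle)
        y = center[1] + radius * math.sin(angle)
        frames.append((x, y, center[2] + height))
    return frames
```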
[0048] The visualization component 110 also can generate
transitions to be applied between two or more scenes of a tour. The
visualization component 110 can apply the transitions by
determining a transition type, determining a distance between the
scenes, determining a transition time or a duration of the
transition, and generating the transition animation. The
visualization component 110 also can receive a duration of the
transition or determine the duration of the transition. Because the
scenes can be associated with geographic locations, determining a
distance between the scenes can correspond to determining a
distance between the geographic locations associated with the
scenes. In particular, the visualization component 110 can receive
information indicating, for a particular transition, a start
location of the camera at the beginning of a transition and an end
location of the camera at the end of the transition. The
visualization component 110 can determine a path and orientation of
the camera between the two locations and over the determined or
received duration. The visualization component 110 also can be
configured to analyze geographic information to determine the
distance between the scenes of a tour and use that information in
generating the transition.
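Because the distance between scenes reduces to the distance between their geographic locations, a standard great-circle (haversine) computation suffices; the sketch below assumes latitude/longitude in degrees and is offered for illustration, not as the component's actual method.

```python
import math

def great_circle_km(lat1, lon1, lat2, lon2, earth_radius_km=6371.0):
    """Haversine distance between two scenes' geographic locations."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2.0 * earth_radius_km * math.asin(math.sqrt(a))
```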
[0049] The visualization component 110 can determine the duration
and/or time of a transition by determining or receiving a duration
of the transition via interactions of a user or other entity with a
user interface; settings, options, or the like associated with the
visualization component 110 and/or the user; and/or otherwise
determined by the visualization component 110 via analysis of the
scene. In some embodiments, the visualization component 110
receives selections or indications of the duration of the
transition, though this is not necessarily the case. The
transitions can include, but are not limited to, a cut transition
type; a cross-fade transition type; a linear or direct transition
type; an arc, jump, or curved transition type; a zoom-out/zoom-in
transition type; combinations thereof; or the like.
[0050] The generated effects and transitions can be added to the
visualization by the visualization component 110. In some
embodiments, the visualization component 110 can provide a preview
of a scene, tour, and/or specific transitions and/or effects in a
user interface having a preview portion or screen. Thus, a user or
other entity can generate and/or modify the effects and/or
transitions and preview the scenes or tour via the visualization
component 110. In some embodiments, a user or other entity can set
a duration of a transition. If a transition time is reduced to zero
(0), the visualization component 110 can be configured to change a
transition type to a cut-type transition automatically. Similarly,
if a duration of a cut-type transition is increased to a time
greater than zero (0), the visualization component can be
configured to produce another type of transition and/or to prompt a
user or other entity to select a different transition type. In some
embodiments, transitions can be configured to encourage camera
travel in an efficient and/or most efficient vector path to a
destination. Similarly, the cameras can be controlled by the
transitions to encourage the cameras to orient toward a final viewpoint
with the least amount of turns. Further, additional smoothing of the
paths between transitions and effects can be applied, which may
result in movement of the camera along a round curve as opposed to
along sharp angles. These and other aspects of the concepts and
technologies disclosed herein are described in more detail
below.
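The duration rule described above can be expressed as a small decision function; this sketch is illustrative, and the fallback to a fade when a cut-type transition is given a positive duration is an assumed default rather than behavior the disclosure specifies.

```python
def resolve_transition_type(requested_type, duration_s):
    """Collapse a zero-length transition to a cut; give a cut with a
    positive duration some other type (a fade is assumed here), or
    prompt the user to select a different transition type."""
    if duration_s <= 0:
        return "cut"
    if requested_type == "cut":
        return "fade"  # assumed default; could instead prompt the user
    return requested_type
```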
[0051] In some embodiments of the concepts and technologies
disclosed herein, a "fly to" type of transition may produce turning
of the camera because the camera may be configured to orient toward
a next camera target. While this type of motion may sometimes be
desirable, this type of motion may sometimes not be desirable. As
such, some embodiments of the concepts and technologies disclosed
herein include a "move to" transition, in which the globe can move
under the camera in an efficient and/or most efficient manner. For
example, if a next scene is
located behind the camera relative to a view angle, the camera can
move backward and/or reverse, while if a next scene is forward or
in front of the camera, the camera can be moved forward. The camera
also can be moved side-to-side to take the fastest and most
efficient way to the next target. In some embodiments, the "move
to" transition may be desirable because this transition may be a
most efficient way to move the camera from one spot to another, and
with the least amount of turning. In some embodiments, the "move to"
transition can have an arc shape and/or can avoid flying at low
altitude relative to the globe to avoid producing blurry tiles. It
should be understood that this embodiment is illustrative, and
should not be construed as being limiting in any way.
[0052] In some embodiments, linear transitions can cause the camera
to travel in a linear trajectory. According to some embodiments, as
a user controls speed and/or duration of a scene, the visualization
component 110 can adjust a length of the trajectory of the camera
as the camera may end up at a location far from a desired viewpoint
or subject of a scene. According to various embodiments,
controlling speed and scene duration can result in additional loops
of the camera through the transition. In some embodiments, the
visualization component can change or allow a user to change a
speed control for linear effects to control the extent of the
effect, namely, the user may control the length of the trajectory
instead of controlling the speed. In some embodiments, the speed
can be automatically determined based upon the scene duration
and the length of the trajectory or the extent. It should be
understood that these embodiments are illustrative, and should not
be construed as being limiting in any way.
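The automatic speed determination mentioned above reduces to dividing the trajectory length (the extent) by the scene duration, e.g.:

```python
def camera_speed(trajectory_length, scene_duration_s):
    """Derive camera speed for a linear effect from the length of the
    trajectory (the extent) and the scene duration."""
    if scene_duration_s <= 0:
        raise ValueError("scene duration must be positive")
    return trajectory_length / scene_duration_s
```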
[0053] FIG. 1 illustrates one computer system 102, one network 104,
one user computing device 112, one data source 120, one instance of
geocoding services 122, and one map server 128. It should be
understood, however, that some implementations of the operating
environment 100 can include multiple computer systems 102, multiple
networks 104, multiple user computing devices 112, multiple data
sources 120, multiple instances of the geocoding services 122, and/or
multiple map servers 128. As such, the illustrated embodiment of
the operating environment should be understood as being
illustrative, and should not be construed as being limiting in any
way.
[0054] Turning now to FIG. 2, additional aspects of the
visualization component 110 will be presented, according to one
illustrative embodiment. In particular, FIG. 2 provides further
details regarding architecture and subcomponents of the
visualization component 110, according to some embodiments. The
visualization component 110 can include a number of components
and/or subsystems including, but not limited to, a visualization
control 200, a visualization engine 202, a spreadsheet plugin core
204, and/or other components and/or subsystems.
[0055] The visualization control 200 can include functionality for
representing data, performing searches and/or providing search
services, a globe control for visualizing and/or presenting
representations of the globe, video recording functionality for
recording animations and/or videos of illustrated tours, and a
client. The visualization engine 202 can include functionality for
generating a tour including multiple scenes, images, and/or
animation sequences; functionality for measuring and/or
representing time in the visualization space; an engine core for
providing the visualization component functionality described
herein; annotations functionality for generating and/or rendering
two-dimensional and/or three-dimensional annotations; spatial
indexing functionality; and camera functionality. The visualization
engine 202 also can include globe models and/or functionality for
representing the globe; input and touch modules for interpreting
touch and/or multi-touch commands as input; visual layers
functionality for representing and/or interacting with layers of a
visualization space; a tile cache for storing map tiles; a
three-dimensional graphics module for generating and/or rendering
three-dimensional visualizations; and shaders for providing shading
of generated and/or rendered three-dimensional objects.
[0056] In some embodiments, the shaders can include or implement a
number of algorithms to facilitate the rendering of the
three-dimensional geographical visualizations of data described
herein. For example, the visualization component 110 can implement
a dark aura effect for disambiguating visualization of a number of
similarly colored objects. A dark aura effect can include a visual
treatment that allows a viewer, for example the user 116, to
differentiate between items in a three-dimensional visualization
space. When there are multiple, similarly colored columns in a
three-dimensional visualization or view, some of these columns may
be next to and/or behind one another in the three-dimensional view.
Thus, the multiple columns may appear to be grouped together and/or
may look like a single polygon. In some embodiments of the concepts
and technologies disclosed herein, the dark aura effect can be
added around one or more of the columns, thereby allowing the one
or more columns to appear to stand out from one another. Because
other visual effects are possible and are contemplated, it should
be understood that this example is illustrative, and should not be
construed as being limiting in any way.
[0057] In another example, the visualization component 110 may
implement a GPU-based framework for asynchronous hit testing for
a large number of arbitrary three-dimensional elements. This may
comprise adding "out-of-channel" color information to pixels of the
objects rendered in the three-dimensional visualization that may be
invisible to the viewer, but can contain information identifying
the object. Thus, if a user taps, clicks, or otherwise interacts
with a point in the three-dimensional visualization, the identity
of the object represented by the selected pixel can be known
without deconstructing the three-dimensional visualization and
determining the object rendered at the selected location. This may
be implemented in the GPU.
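The "out-of-channel" color information described above is commonly implemented by packing each object's identifier into the color written during an off-screen picking pass and unpacking it from the pixel under the pointer; a minimal sketch of that packing, illustrative rather than the patent's actual GPU code:

```python
def id_to_rgb(object_id):
    """Pack a 24-bit object identifier into an RGB triple for the
    picking pass."""
    return ((object_id >> 16) & 0xFF, (object_id >> 8) & 0xFF, object_id & 0xFF)

def rgb_to_id(r, g, b):
    """Recover the identifier from the pixel under a tap or click."""
    return (r << 16) | (g << 8) | b
```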
[0058] The spreadsheet plugin core 204 can include functionality
for storing workbook state information, as well as a query engine
for generating and/or executing queries against various data
sources. In some embodiments, the query engine can be configured to
generate a query based upon data stored in the spreadsheet data
118, and to submit the queries to a search engine. It should be
understood that this embodiment is illustrative, and should not be
construed as being limiting in any way.
[0059] The visualization component 110 also can include various
other components and/or subsystems such as, for example, a
spreadsheet program native plugin and a spreadsheet program API
such as, for example, a command object model ("COM") API, a Java
API, and/or other technologies such as Perl, Apple Cocoa framework,
various server and/or client-side script execution environments or
the like. The visualization component 110 also can include various
graphics plugins and/or APIs such as the illustrated DIRECTX APIs,
API call emulators such as the illustrated DIRECTX WRAPPER, a
WINDOWS Presentation Foundation ("WPF") subsystem, combinations
thereof, or the like. The visualization component 110 also can
include analytics engines such as the illustrated VERTIPAQ engine
and/or modules associated with other data providers, if desired. It
should be appreciated that the visualization component 110 can
include additional and/or alternative functionality not shown in
FIG. 2. As such, the embodiment illustrated in FIG. 2 should be
understood as being illustrative and should not be construed as
being limiting in any way.
[0060] Turning now to FIG. 3, aspects of a method 300 for
generating animation effects and transitions in a spreadsheet
application will be described in detail. It should be understood
that the operations of the methods disclosed herein are not
necessarily presented in any particular order and that performance
of some or all of the operations in an alternative order(s) is
possible and is contemplated. The operations have been presented in
the demonstrated order for ease of description and illustration.
Operations may be added, omitted, and/or performed simultaneously,
without departing from the scope of the appended claims.
[0061] It also should be understood that the illustrated methods
disclosed herein can be ended at any time and need not be performed
in their respective (or collective) entireties. Some or all
operations of the methods disclosed herein, and/or substantially
equivalent operations, can be performed by execution of
computer-readable instructions included on a computer-storage
medium, as defined herein. The term "computer-readable
instructions," and variants thereof, as used in the description and
claims, is used expansively herein to include routines,
applications, application modules, program modules, programs,
components, data structures, algorithms, and the like.
Computer-readable instructions can be implemented on various system
configurations, including single-processor or multiprocessor
systems, minicomputers, mainframe computers, personal computers,
hand-held computer systems, microprocessor-based or programmable
consumer electronics, combinations thereof, and the like.
[0062] Thus, it should be appreciated that the logical operations
described herein are implemented (1) as a sequence of computer
implemented acts or program modules running on a computing system
and/or (2) as interconnected machine logic circuits or circuit
modules within the computing system. The implementation is a matter
of choice dependent on the performance and other requirements of
the computing system. Accordingly, the logical operations described
herein are referred to variously as states, operations, structural
devices, acts, or modules. These operations, structural devices,
acts, and modules may be implemented in software, in firmware, in
special purpose digital logic, and any combination thereof.
[0063] For purposes of illustrating and describing the concepts of
the present disclosure, the methods disclosed herein are described
as being performed by the computer system 102 via execution of one
or more software modules such as, for example, the visualization
component 110. It should be understood that additional and/or
alternative devices and/or network nodes can provide the
functionality described herein via execution of one or more
modules, applications, and/or other software including, but not
limited to, the visualization component 110. Thus, the illustrated
embodiments are illustrative, and should not be viewed as being
limiting in any way.
[0064] The method 300 begins at operation 302, wherein the computer
system 102 obtains spreadsheet data 118. As explained above, the
spreadsheet data 118 can include various types of information or
content such as, for example, spreadsheet files, database
application data, and/or other types of information. In one
contemplated embodiment, the spreadsheet data 118 corresponds to a
spreadsheet file such as a file generated by a member of the
MICROSOFT EXCEL family of spreadsheet application software products
from Microsoft Corporation in Redmond, Washington. Other
contemplated spreadsheet applications include, but are not limited
to, a member of the GOOGLE DOCS family of programs, a member of the
OPENOFFICE family of programs, a member of the APPLE IWORK NUMBERS
family of programs, and/or other spreadsheet, table, and/or
database programs. The spreadsheet data 118 can be obtained from a
data storage device or component associated with the computer
system 102. Some examples of data storage devices are described in
more detail below with reference to FIGS. 9-11.
[0065] In some other embodiments, the spreadsheet data 118 can be
stored at or hosted by a remote storage device or resource such as
the data source 120 described herein. Thus, the spreadsheet data
118 can be obtained by the computer system 102 via communications
with the data source 120. As such, it should be understood that the
spreadsheet data 118 can be obtained from any real or virtual
device via a direct connection, via one or more networks, and/or
via other nodes, devices, and/or device components.
[0066] In the embodiments illustrated in FIG. 3, the spreadsheet
data 118 obtained by the computer system 102 in operation 302 can
include a visualization. Thus, it should be understood that the
computer system 102 and/or another device or application can
generate a visualization and store the visualization as or in the
spreadsheet data 118 obtained in operation 302. Similarly, the
computer system 102 can generate and output the UI 114 to the user
computing device 112, and the user computing device 112 can present
the UI 114 at a display associated with user computing device 112.
Thus, it should be understood that operation 302 also can include
receiving spreadsheet data 118 and generating the visualization at
the computer system 102. Thus, while the methods described herein
are illustrated and described as occurring at the computer system
102, it should be understood that user input can occur via a web
browser or other program executing at the user computing device 112
and/or other devices or systems remote from the computer system
102.
[0067] Additionally, as will be illustrated and described in more
detail herein, the visualization, which can be obtained as or with
the spreadsheet data 118 and/or generated based upon spreadsheet
data 118 obtained in operation 302, can include one or more scenes.
"Scenes" can include animation sequences that alone or together can
correspond to the visualization. For example, a first scene of a
visualization may include visualization of a data set associated
with a location such as New York City. Thus, the scene can include
an animated sequence showing the data associated with New York City
over some time. A next scene may include visualization of another
data set associated with Washington, D.C. Thus, it can be
appreciated that scenes can include animation sequences that are
included in a visualization and/or a tour of multiple scenes.
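By way of example, and not limitation, a scene of the kind described above could be represented by a simple data structure. The following Python sketch is illustrative only; the `Scene` type and its field names are assumptions and do not appear in the figures.

```python
from dataclasses import dataclass

@dataclass
class Scene:
    """One animation sequence within a tour (illustrative sketch only)."""
    name: str                # e.g., "New York City"
    latitude: float          # geographic center of the visualized data
    longitude: float
    start_time: float        # seconds from the start of the tour
    end_time: float
    effect: str = "orbit"    # effect applied while the scene plays
    transition: str = "cut"  # transition into the next scene
```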
[0068] From operation 302, the method 300 proceeds to operation
304, wherein the computer system 102 selects a scene. In
particular, the computer system 102 can select a scene included in
the visualization of the spreadsheet data 118 as described herein.
In some embodiments, the computer system 102 selects a scene based
upon input from a user or the user computing device 112. Thus, for
example, a user may select a first scene to configure the
transitions and effects associated with the scene, and this
selection can be detected by the computer system 102 in operation
304. In some other embodiments, the computer system 102 can create
a scene in operation 304, and the created scene can be selected
automatically. An example user interface for selecting a scene will
be illustrated and described in more detail below with reference to
FIG. 8A.
[0069] From operation 304, the method 300 proceeds to operation
306. In operation 306, the computer system 102 can determine a
duration of the scene. Though not explicitly shown in FIG. 3, the
computer system 102 can obtain a selection of a start time and an
end time for the scene and/or receive data specifying a duration of
the scene. The computer system 102 also can be configured to
analyze the scene and determine, based upon the analysis, a
duration of the scene. In some other embodiments, the computer
system 102 can determine the duration by receiving input, for
example, input specifying a start time and an end time via one or
more user interfaces, and/or the duration can be determined
automatically by the computer system 102.
Because the duration of the scene can be determined in additional
and/or alternative ways, it should be understood that these
embodiments are illustrative, and should not be construed as being
limiting in any way.
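For purposes of illustrating the duration determination of operation 306, a minimal sketch follows. The function name, the optional direct input, and the fallback value are assumptions made only for this example.

```python
def determine_duration(scene, supplied_duration=None):
    """Resolve a scene duration in seconds (illustrative sketch)."""
    if supplied_duration is not None:             # duration received directly
        return supplied_duration
    if scene.start_time is not None and scene.end_time is not None:
        return scene.end_time - scene.start_time  # derived from start and end times
    return 10.0                                   # assumed default for this sketch
```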
[0070] From operation 306, the method 300 proceeds to operation
308, wherein the computer system 102 generates effects for the
scene selected in operation 304. As is explained in more detail
below, particularly with reference to FIG. 4, the computer system
102 can generate effects for the scene selected in operation 304.
The effects can include camera effects for animating the scene.
"Camera effects" can include simulation of camera movement in a
visualization to animate the visualization. As noted above, there
is no physical "camera" in the scenes. Thus, the computer system
102 and/or the visualization component 110 can be configured to
draw the visualization from a viewpoint associated with a virtual
camera. As such, a "camera" as used herein can refer to a viewpoint
such as a virtual camera location, field of view, and/or line of
sight in accordance with which the visualization is drawn and/or
animated to mimic a scene filmed with a real camera. In operation
308, the computer system 102 also can determine one or more effects
for a scene and timing for the scene, which may be based, at least
partially, upon the duration of the scene determined in operation
306. It should be understood that this embodiment is illustrative,
and should not be construed as being limiting in any way.
[0071] From operation 308, the method 300 proceeds to operation
310, wherein the computer system 102 determines if another scene is
to be configured. According to various embodiments of the concepts
and technologies disclosed herein, a visualization can include
multiple scenes, and in addition to effects, the computer system
102 can be configured to generate transitions between the scenes.
Thus, the computer system 102 can determine, in operation 310, if
another scene exists to configure with effects and/or transitions.
According to various implementations of the method 300, the
determination made in operation 310 is made in the affirmative at
least once so that a transition between two scenes is configured.
It should be understood that this embodiment is
illustrative, and should not be construed as being limiting in any
way.
[0072] If the computer system 102 determines, in operation 310,
that another scene is to be configured, the method 300 can return
to operation 304, and the computer system 102 can select another
scene for configuration. Thus, it can be appreciated that
operations 304-310 can be repeated by the computer system 102 any
number of times. In some embodiments, the computer system 102 is
configured to repeat operations 304-310 until the computer system
102 determines, in any iteration of operation 310, that another
scene is not to be configured. If the computer system 102
determines, in operation 310, that another scene is not to be
configured, the method 300 proceeds to operation 312.
[0073] In operation 312, the computer system 102 generates one or
more transitions for the scenes. Thus, the computer system 102 can
generate a transition between at least two scenes selected in the
multiple iterations of operation 304 as explained above. The
transitions can be generated to provide animation between scenes in
addition to, or instead of, the effects described herein. Thus, in
the above example of two scenes corresponding to New York City and
Washington, D.C., the computer system 102 can generate, in
operation 312, an animated transition between the scene in New York
City and Washington, D.C. Additional details of generating the
transitions are provided below with reference to FIGS. 5-8B.
[0074] From operation 312, the method 300 proceeds to operation
314, wherein the computer system 102 outputs the effects and
transitions generated in operations 308 and 312. According to
various implementations, the computer system 102 can output a user
interface 114 and/or instructions for generating the user interface
114. The user interface 114 can include a visualization of the
spreadsheet data 118. The visualization can include at least two
scenes and a transition between the at least two scenes. The
visualization also can include effects for at least one scene of
the visualization, in some embodiments. From operation 314, the
method 300 proceeds to operation 316. The method 300 ends at
operation 316.
[0075] Turning now to FIG. 4, aspects of a method 400 for
generating animation effects in a spreadsheet application will be
described in detail. As mentioned above, the functionality
described herein with respect to the method 400 can be performed by
the computer system 102 in operation 308 of the method 300 shown in
FIG. 3, though this is not necessarily the case.
[0076] The method 400 begins at operation 402, wherein the computer
system 102 determines an effect for the scene selected in operation
304 of the method 300 illustrated in FIG. 3. As mentioned above,
"effects" can include animations applied to a camera or other
animation viewpoint in a scene. Some example effects are
illustrated and described in more detail herein with respect to
FIGS. 6-7E. Briefly, the effects can include an orbit effect, where
a camera or other animation viewpoint circles a center point or
focus of the camera; a stationary effect, where the camera or other
animation viewpoint views the center point or focus from a
stationary position; a fly by effect, where the camera moves along
a path perpendicular to a viewing vector between the camera and the
center point or focus of the camera; a figure eight effect, where
the camera moves around the center point or focus of the camera in
a figure eight shape; a line effect, wherein the camera moves back
and forth along a path perpendicular to the viewing vector between
the camera and the center point or focus of the camera; other
effects; combinations thereof; or the like.
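The effect types listed above can be collected, purely for illustration, in an enumeration such as the following; the enumeration and its member names are assumptions, not part of the disclosure.

```python
from enum import Enum

class EffectType(Enum):
    """Camera effects described herein (illustrative enumeration)."""
    NONE = "none"
    ORBIT = "orbit"                # camera circles the center point or focus
    STATIONARY = "stationary"      # camera views the focus from a fixed position
    FLY_BY = "fly_by"              # camera moves along a perpendicular path
    FIGURE_EIGHT = "figure_eight"  # camera traces a figure eight around the focus
    LINE = "line"                  # camera moves back and forth along a path
```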
[0077] In operation 402, the computer system 102 can identify an
effect to be applied to the scene. In particular, the computer
system 102 can detect an option, setting, or other preferences to
determine the effect. In some embodiments, the computer system 102
can detect user input for selecting the effect. Thus, the computer
system 102 can detect, in association with operation 402, selection
of an effect by a user or other entity, for example, via a user
interface associated with the computer system 102. An example user
interface that can be used to select the effect is illustrated and
described in more detail below with reference to FIGS. 8A-8B.
[0078] From operation 402, the method 400 proceeds to operation
404, wherein the computer system 102 determines timing for the
effect. As used herein, the "timing" of an effect can include a
time duration of the effect and/or a speed of the effect. As
mentioned above, timing of the effect can correspond to a time
duration of the scene to which the effect is applied. Thus, the
computer system 102 can obtain or determine the duration and/or can
determine the duration based upon other values determined or
received by the computer system 102 such as, for example, a start
time of the scene, an end time of the scene, or the like. In some
other embodiments, the time duration of the effect may not
correspond to a time duration of the scene, and as such, the effect
applied to the camera in a particular scene may be visible in a
portion of the scene. In various embodiments of the concepts and
technologies disclosed herein, the time duration of the effect can
be a user-settable or configurable option. The time duration can be
provided in a user interface, as shown in FIG. 8B, or otherwise set
by a user or other entity. Thus, in operation 404, the computer
system 102 can determine a time duration and/or detect input from a
user or other entity for setting the time duration.
[0079] Similarly, the speed of the effect can correspond to a rate
at which the animation is applied to the camera or other viewpoint
from which the animation is drawn. Thus, for example, the speed of
an orbit effect can correspond to a speed or rate at which the
camera orbits the focal point or center point of the scene. Thus,
the computer system 102 can determine, in operation 404, the speed
of the effect and/or detect user input for specifying the speed of
the effect. Because the speed and/or time duration of an effect can
be determined in additional and/or alternative ways, it should be
understood that these examples are illustrative, and should not be
construed as being limiting in any way.
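A minimal sketch of the timing determination of operation 404 follows; the preset rate values are assumptions, since the description leaves the specific speeds to user input or configuration.

```python
def effect_timing(scene_duration, speed_setting="medium", user_duration=None):
    """Resolve an effect's time duration and speed (illustrative sketch)."""
    rates = {"slow": 0.5, "medium": 1.0, "fast": 2.0}  # assumed rate multipliers
    duration = user_duration if user_duration is not None else scene_duration
    return duration, rates.get(speed_setting, 1.0)
```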
[0080] From operation 404, the method 400 proceeds to operation
406, wherein the computer system 102 determines a camera
orientation, path, and/or distance. The computer system 102 can be
configured to automatically determine these and other aspects of
the camera according to various embodiments of the concepts and
technologies disclosed herein. The camera orientation can define a
viewing angle associated with the camera such as, for example, a
tilt, pitch, yaw, or the like associated with the camera.
[0081] The camera path can include, but is not limited to, a path
along which the camera moves during a scene. The camera distance
can correspond to a distance from which the camera (or other
viewpoint from which the animation is drawn) views the center point
or other focal point of the scene. According to various
embodiments, the camera distance can be a default value, can be set
by user preference and/or configuration settings, and/or can be set
by a program setting. In some other embodiments, the camera
distance can be determined by the computer system 102 based upon a
particular scene. For example, the computer system 102 can identify
data points or other representations of data to be viewed in a
scene, an angle of view of the camera, an effect applied to the
camera, combinations thereof, or the like.
[0082] In some embodiments, the computer system 102 can calculate,
based upon these and/or other data, the camera distance that will
be used to ensure that all data fits in the animated scene. One
example of a camera distance and how the camera distance may be set
is illustrated and described below with reference to FIG. 6.
Because the camera distance can be determined by settings, user
input, and/or analysis as described above, it should be understood
that the computer system 102 can determine the camera distance in a
variety of ways and that operation 406 therefore can include
additional and/or alternative operations for obtaining the data to
be considered to determine the camera distance.
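As one illustrative sketch of the calculation described above, the camera distance that frames all data can be derived from the camera's angle of view and the radius of the data area around the center point. The formula below is a simple geometric assumption, not the claimed computation.

```python
import math

def camera_distance(data_radius, fov_degrees):
    """Distance at which a camera with the given field of view can frame
    a data area of the given radius (illustrative sketch)."""
    half_fov = math.radians(fov_degrees) / 2.0
    return data_radius / math.tan(half_fov)  # wider data needs more backoff
```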
[0083] From operation 406, the method 400 proceeds to operation
408, wherein the computer system 102 generates the scene animation
based upon the determinations made in operations 402-406. Thus, the
computer system 102 can apply the effect, the time duration, the
speed, the camera orientation, the camera path, and the camera
distance to the scene and output animation frames corresponding to
the animated scene. It should be understood that if the effect
determined in operation 402 corresponds to a stationary camera
effect, the output animation frames may be substantially identical,
though this is not necessarily the case. From operation 408, the
method 400 proceeds to operation 410. The method 400 ends at
operation 410.
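One contemplated, purely illustrative form of the frame generation of operation 408 follows, using the orbit effect as an example; the frame rate and revolution count are assumed values.

```python
import math

def orbit_frames(center, radius, height, duration, fps=30, revolutions=1.0):
    """Yield camera positions that circle the center point (illustrative)."""
    total = max(1, int(duration * fps))
    for i in range(total):
        angle = 2.0 * math.pi * revolutions * i / total
        x = center[0] + radius * math.cos(angle)
        y = center[1] + radius * math.sin(angle)
        yield (x, y, height)  # each position keeps the center point in focus
```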
[0084] Turning now to FIG. 5, aspects of a method 500 for
generating animation transitions in a spreadsheet application will
be described in detail. As mentioned above, the functionality
described herein with respect to the method 500 can be performed by
the computer system 102 in operation 312 of the method 300 shown in
FIG. 3, though this is not necessarily the case.
[0085] The method 500 begins at operation 502, wherein the computer
system 102 determines a transition type for the scenes selected in
the method 300 illustrated in FIG. 3. As mentioned above,
"transition types" can include animations applied to the camera or
other animation viewpoint in a scene as it moves from a position
associated with a first scene, also referred to herein as a "start
point," to a position associated with a second scene, also referred
to herein as an "end point." Some example transition types are
illustrated and described in more detail herein with respect to
FIGS. 7F-7H.
[0086] Briefly, the transition types can include, but are not
limited to, a cut transition type, where a camera or other
animation viewpoint moves instantly from a first viewpoint
associated with a first scene to a second viewpoint associated with
a second scene; a cross-fade transition type where the camera at
the first viewpoint fades out or dissolves away a view of the first
viewpoint and fades in or dissolves in a view of the second
viewpoint; a linear or direct transition type, where the camera
moves along a flight path of constant height (relative to the
ground or surface on which the data is displayed) from a first
viewpoint associated with a first scene to a second viewpoint
associated with a second scene; an arc, jump, or curved transition
type, where the camera moves along a curved flight path of varying
heights that can mimic a flight path of an aircraft from a first
viewpoint associated with a first scene to a second viewpoint
associated with a second scene; a zoom-out/zoom-in transition type,
where the camera moves away or zooms out from a first viewpoint
associated with a first scene, to a second viewpoint at which the
first viewpoint and a third viewpoint associated with a second
scene are both visible, and finally to the third viewpoint;
combinations thereof; or the like.
[0087] In some embodiments, the computer system 102 can use a smart
transition type that determines the transition based upon a
distance between the two viewpoints. For distances that are farther
apart, an arc or linear transition type may be chosen, while for
closer distances, the linear or zoom-in/zoom-out transition types
may be chosen. Because other transition types are contemplated, and
because the computer system 102 can choose the transition types
based upon various considerations, it should be understood that
these examples are illustrative, and should not be construed as
being limiting in any way.
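A minimal sketch of such a smart transition selection follows; the distance thresholds are assumptions, since the description states only that farther scenes favor arc or linear transitions and closer scenes favor linear or zoom transitions.

```python
def smart_transition(distance_miles):
    """Choose a transition type from the distance between viewpoints
    (illustrative sketch with assumed thresholds)."""
    if distance_miles > 100.0:  # hypothetical "far" threshold
        return "arc"
    if distance_miles > 10.0:   # hypothetical "near" threshold
        return "linear"
    return "zoom"
```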
[0088] In operation 502, the computer system 102 can identify a
transition type to be applied to the scene in various ways. In
particular, the computer system 102 can detect an option, setting,
or other preferences to determine the transition type. In some
embodiments, the computer system 102 can detect user input for
selecting the transition type. An example user interface that can
be used to select the transition type is illustrated and described
in more detail below with reference to FIGS. 8A-8B.
[0089] From operation 502, the method 500 proceeds to operation
504, wherein the computer system 102 determines if the transition
type determined in operation 502 corresponds to a "cut" type of
transition. As mentioned above, the "cut" type of transition can
include a transition in which the camera instantly moves from a
first viewpoint to a second viewpoint without any animation between
the two viewpoints. Thus, the cut transition can be effectively
equivalent to no transition between the scenes.
[0090] If the computer system 102 determines, in operation 504,
that the transition type is not a "cut" type transition, the method
500 proceeds to operation 506. In operation 506, the computer
system 102 determines if the transition type determined in
operation 502 corresponds to a "fade" type of transition. As noted
above, the "fade" type of transition can include fading out or
dissolving a view at the first viewpoint and fading in or
dissolving in a view at the second viewpoint.
[0091] If the computer system 102 determines, in operation 506,
that the transition type is not a "fade" type transition, the
method 500 proceeds to operation 508. In operation 508, the
computer system 102 can determine a distance between the scenes
selected in the multiple iterations of operation 304 described
above with reference to FIG. 3. In some embodiments, the distance
determined in operation 508 can correspond to an actual distance
between geographic locations associated with each of the first
scene and the second scene. According to various implementations of
the concepts and technologies disclosed herein, the geographic
location of a scene can correspond to a center of mass or center
point of data points as calculated with respect to the geographic
mapping data 124 obtained from the geocoding services 122 and/or
other data. Thus, in the example above of two scenes in New York
City and Washington, D.C., respectively, the computer system can
determine the distance as being equivalent to an actual geographic
distance between these two cities, for example, two hundred
twenty-nine miles. Because the distance between the scenes can be
determined in other ways, it should be understood that the above
example is illustrative and should not be construed as being
limiting in any way.
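The description does not specify a distance formula, so by way of example only, the geographic distance between two scene center points could be computed with a standard haversine calculation such as the following.

```python
import math

def scene_distance_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two scene centers (illustrative)."""
    r = 3958.8  # mean Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))
```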
[0092] From operation 508, the method 500 proceeds to operation 510
wherein the computer system 102 determines a duration for the
transition being applied to the scene. The duration can be defined
as a time duration of the transition to be animated. The duration
can be determined by receiving a duration from users or other
sources, or determined automatically by the computer system 102
based upon preferences or the like. Thus, for example, if the
duration is defined as one second, the computer system 102 can
animate the transition in one second. According to some
embodiments, the duration can be determined based upon a
user-settable or configurable option or setting.
[0093] The duration also can be determined based upon the
determined distance between the scenes. Thus, for example, the
computer system 102 can be configured to determine a rate of speed
for the camera movement animated in the transition and divide the
distance between the scenes by the determined speed, thereby
determining the duration. An example user interface for setting the
duration is illustrated and described below with reference to FIG.
8B. Because the duration can be determined in additional and/or
alternative ways, it should be understood that these examples are
illustrative, and should not be construed as being limiting in any
way.
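Combining the preset durations described below with reference to FIG. 8B and the distance-divided-by-speed computation above, one illustrative sketch of the duration determination follows; the function name and the camera speed parameter are assumptions.

```python
def transition_duration(distance_miles, camera_speed_mph=None, preset=None):
    """Resolve a transition duration in seconds (illustrative sketch)."""
    presets = {"short": 4.0, "medium": 6.0, "long": 8.0}  # per FIG. 8B description
    if preset in presets:
        return presets[preset]
    if camera_speed_mph:
        return distance_miles / camera_speed_mph * 3600.0  # hours to seconds
    return presets["medium"]  # fall back to the medium duration
```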
[0094] From operation 510, the method 500 proceeds to operation
512, wherein the computer system 102 generates the transition
animation. In particular, the computer system 102 can generate the
transition animation based upon the determinations made in
operations 502-510. Thus, the computer system 102 can generate the
transition animation based upon the determined transition type, the
determined distance between the scenes, and the determined
duration. The computer system 102 can generate the transition
animation and output the animation frames for the visualization
and/or store the frames for these and/or other uses.
[0095] From operation 512, the method 500 proceeds to operation
514. The method 500 also can proceed to operation 514 in response
to determining, in operation 504, that the transition type
corresponds to a cut transition type. The method 500 also can
proceed to operation 514 in response to determining, in operation
506, that the transition type corresponds to a fade transition
type. The method 500 ends at operation 514.
[0096] Turning now to FIG. 6, additional aspects of the concepts
and technologies disclosed herein for animation transitions and
effects in a spreadsheet application will be described. FIG. 6 is a
line drawing illustrating camera placement in an example scene 600.
The scene 600 is illustrative and is provided solely for purposes
of illustrating and describing various aspects of the concepts and
technologies mentioned herein. Thus, the illustrated scene 600
should not be construed as being limiting in any way.
[0097] The scene 600 includes a camera 602. The camera
602 can correspond to a viewpoint from which the animation
associated with the scene 600 is illustrated. Thus, it should be
understood that the camera 602 does not necessarily correspond to a
real camera. The camera 602 focuses along a line of sight 604 on a
viewpoint or center point C. As explained above, the center point C
can correspond to a center of mass or focal point of the scene 600
and can be determined by the computer system 102 based upon the
spreadsheet data 118. The calculation of the center point C can be
completed based upon distribution of data points 606 and/or their
associated values. As shown in FIG. 6, the data points 606 can be
shown as columns, though other types of representations are
contemplated and are possible.
[0098] The camera 602 can travel along a flight path 608. The
flight path 608 shown in FIG. 6 can correspond to an orbit effect,
as explained above. Thus, the camera 602 can orbit the center point
C along the flight path 608 at a height h above the "ground" or
other surface on which the data points 606 are displayed. Thus, in
the orbit effect example shown in FIG. 6, it can be appreciated
that the flight path 608 can have a radius equal to the camera
distance from the center point C if the center point C were moved to
the height h of the flight path 608.
[0099] As shown in FIG. 6, the scene 600 also can include a data
area 610, which can correspond to a boundary of the data shown in
the scene. The data area 610 also can define the visible parts of
the scene 600, for example, the bounds of the scene 600 that
preferably are captured by the camera 602 when animating the scene
600.
[0100] Turning now to FIGS. 7A-7H, some example effects and
transitions are illustrated, according to various embodiments of
the concepts and technologies disclosed herein. FIG. 7A shows an
example of the orbit effect. As described herein, the orbit effect
can include a circular flight path 608 placed above the geo-spatial
data used to represent the ground and/or other surface for the data
illustrated in a particular scene. As explained with reference to FIG. 6, the
flight path 608 can be placed at a vertical position, relative to
the ground or other surface at a height h, though this is not
necessarily the case. The camera 602 can circle the center point C
along the flight path 608 while keeping the center point C in its
focus. It should be understood that this embodiment is
illustrative, and should not be construed as being limiting in any
way.
[0101] FIG. 7B shows a stationary camera effect. Thus, as shown,
the camera 602 can focus on the center point C along a line of
sight from a camera distance d. FIG. 7C shows a fly by effect. As
shown in FIG. 7C, the camera 602 can fly along a flight path 702
that passes over the center point C. According to various
embodiments, the camera 602 can move along a flight path 702 that
is perpendicular to a view line (not visible in FIG. 7C). It should
be understood that the flight path 702 can be offset to a side of
the center point C at a camera distance, and/or can fly directly
"over" the center point C, wherein the camera distance can be
substantially equal to the height of the flight path 702. It should
be understood that this embodiment is illustrative, and should not
be construed as being limiting in any way.
[0102] FIG. 7D illustrates a line effect, wherein the camera 602
flies back and forth between two positions along a flight path 704.
It should be understood that the flight path 704 can be offset to a
side of the center point C at a camera distance, and/or can fly
directly "over" the center point C, wherein the camera distance can
be substantially equal to the height of the flight path 704. It
should be understood that this embodiment is illustrative, and
should not be construed as being limiting in any way.
[0103] FIG. 7E illustrates a figure eight effect. The camera 602
can move along a flight path 706 while focusing on the center point
C. Thus, it can be appreciated that the camera distance can vary in
the figure eight effect. It should be understood that this
embodiment is illustrative, and should not be construed as being
limiting in any way.
[0104] FIG. 7F illustrates a linear or direct transition type. As
mentioned above, in the linear or direct transition, the camera 602
moves along a flight path 708 from one scene to a next scene. Thus,
the camera 602 can move from a first viewpoint VP.sub.1 associated
with a first scene and/or center point C.sub.1 to a second
viewpoint VP.sub.2 associated with a second scene and/or center
point C.sub.2. As mentioned above, the height of the camera
relative to the ground 710 and therefore the height of the flight
path 708 can be constant and can correspond to the height h shown
in FIG. 7F. It should be understood that this embodiment is
illustrative, and should not be construed as being limiting in any
way.
[0105] FIG. 7G illustrates an arc, jump, or curved transition type.
As mentioned above, in the arc transition, the camera 602 moves along a
flight path 712 from one scene to a next scene. Thus, the camera
602 can move from a first viewpoint VP.sub.1 associated with a
first scene and/or center point C.sub.1 to a second viewpoint
VP.sub.2 associated with a second scene and/or center point
C.sub.2. As mentioned above, the height of the camera relative to
the ground 710, and therefore the height of the flight path 712,
can grow from a first height h.sub.1 at a first viewpoint VP.sub.1
associated with a first scene and/or center point C.sub.1, to a
height h.sub.2 at an apex 714 of the flight path 712, and then can
be reduced back to the height h.sub.1 at a second viewpoint
VP.sub.2 associated with a second scene and/or center point
C.sub.2. Thus, it can be appreciated that the arc, jump, or curved
transition type can simulate an aircraft flight between the two
scenes, though this is not necessarily the case. It should be
understood that this embodiment is illustrative, and should not be
construed as being limiting in any way.
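The arc transition of FIG. 7G can be sketched, for illustration only, as a linear interpolation between the two viewpoints with a parabolic height profile that peaks at the apex; the parabolic profile is an assumption, as the description requires only that the height grows to h.sub.2 and returns to h.sub.1.

```python
def arc_path(start, end, h1, h2, steps=60):
    """Yield camera positions for an arc transition (illustrative sketch)."""
    for i in range(steps + 1):
        t = i / steps
        x = start[0] + t * (end[0] - start[0])
        y = start[1] + t * (end[1] - start[1])
        height = h1 + (h2 - h1) * 4.0 * t * (1.0 - t)  # peaks at h2 when t = 0.5
        yield (x, y, height)
```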
[0106] FIG. 7H illustrates a zoom-in/zoom-out transition type. As
mentioned above, in the zoom-in/zoom-out transition, the camera 602
may or may not move. In some embodiments, the camera 602 zooms out
from a first zoom level at which the first scene and/or center
point C.sub.1 is visible to a second zoom level at which the first
viewpoint VP.sub.1 and a second viewpoint VP.sub.2 associated with
a second scene and/or center point C.sub.2 are both visible. The
camera 602 can move from a first line of sight 716A to a second
line of sight 716B, and then can zoom into the scene until only the
second viewpoint VP.sub.2 associated with a second scene and/or
center point C.sub.2 is visible. It should be understood that this
embodiment is illustrative, and should not be construed as being
limiting in any way.
[0107] Turning now to FIGS. 8A-8B, UI diagrams showing various
aspects of the concepts and technologies disclosed herein for
animation transitions and effects in a spreadsheet application will
be described according to various illustrative embodiments. FIG. 8A
shows an illustrative screen display 800A generated by a device
such as the computer system 102 and/or the user computing device
112. In some embodiments, the screen display 800A can correspond to
the UI 114 displayed by the user computing device 112, as shown in
FIG. 1, though this is not necessarily the case. It should be
appreciated that the UI diagram illustrated in FIG. 8A is
illustrative of one contemplated example, and therefore should not
be construed as being limited in any way.
[0108] As shown in FIG. 8A, the screen display 800A can include a
tour display screen 802 for viewing scenes associated with a tour
of a three-dimensional visualization of data such as the
spreadsheet data 118 described herein. In the illustrated
embodiment, the screen display 800A includes UI controls 804A-C,
the selection of which can cause the computer system 102 and/or the
user computing device 112 to open various options and/or settings
associated with the scene in a scene properties bar 806. As shown
in FIG. 8A, a user or other entity can select the UI controls
804A-C using a touch input such as a tap with a finger or hand.
Because the UI controls 804A-C can be selected using other input
mechanisms and/or devices, it should be understood that this
embodiment is illustrative, and should not be construed as being
limiting in any way.
[0109] The screen display 800A also includes a time control window
808 for moving an animation corresponding to the tour with respect
to temporal data included in the spreadsheet data 118 represented
in the tour. The use of the time control window 808 is discussed in
additional detail below with reference to FIG. 8B. For purposes of
illustrating and describing the embodiments of the concepts and
technologies disclosed herein, it is assumed that a user or other
entity selects the UI control 804A to view and/or change various
properties of the effects and/or transitions associated with scene
one. It can be appreciated from the description of FIG. 3 that the
transition may apply to a transition from scene one to a next scene
of the tour and/or from a scene before scene one to scene one.
Thus, it should be understood that this embodiment is illustrative,
and should not be construed as being limiting in any way.
[0110] FIG. 8B shows an illustrative screen display 800B generated
by a device such as the computer system 102 and/or the user
computing device 112. In some embodiments, the screen display 800B
can correspond to the UI 114 displayed by the user computing device
112, as shown in FIG. 1, though this is not necessarily the case.
As mentioned above, the screen display 800B can be, but is not
necessarily, displayed in response to detecting a command and/or
input for adjusting settings associated with scene one such as a
tap on the UI control 804A. Because the screen display 800B can be
shown at additional and/or alternative times, it should be
understood that this example is illustrative and should not be
construed as being limiting in any way.
[0111] As shown in FIG. 8B, the screen display 800B can include the
properties bar 806 shown in FIG. 8A, but the properties bar 806 can
display various settings and/or properties associated with scene
one. The properties bar 806 can include an indicator 810 of the
scene for which properties and/or settings are displayed. The
properties bar 806 also can include a UI control 812 for setting a
duration of the scene. Thus, the UI control 812 can be used to set
a time duration for an animation of scene one. While the UI control
812 is illustrated as showing a duration of two minutes, thirty
seconds, it should be understood that this embodiment is
illustrative, and should not be construed as being limiting in any
way. It also can be appreciated from the description of FIG. 4,
that the duration set via the UI control 812 also can correspond to
the duration of the effect applied to the scene, though this is not
necessarily the case.
[0112] The properties bar 806 also can include a transition menu
814 for setting various aspects of the transition between two
scenes. Because scene one is selected and/or is being displayed in
the properties bar 806, the transition configured via the
transition menu 814 can correspond to a transition between scene
one and scene two. It should be understood that this embodiment is
illustrative, and should not be construed as being limiting in any
way. The transition menu 814 can include a type control 816 for
setting the transition type. It can be appreciated that the
computer system 102 can be configured to detect input in the type
control 816 to perform the functionality described herein with
respect to operation 502 of the method 500, though this is not
necessarily the case.
[0113] The transition menu 814 also can include a camera direction
control 818 for specifying a camera view direction during the
transition. In the illustrated embodiment, the camera direction
control 818 is shown as being set to "next," which can indicate
that the camera is to focus on the next scene during the
transition. According to various embodiments, the camera direction
can be set to "previous," "next," and/or "interpolate" to focus on
a previous scene, a next scene, and/or an interpolated direction,
respectively. Thus, it should be understood that the illustrated
setting is illustrative and should not be construed as being
limiting in any way.
[0114] The transition menu 814 also includes a duration control 820
for setting the duration of the transition. In the illustrated
embodiment, the duration control 820 is shown as being set to
"medium," which can indicate that the duration of the transition is
to be a medium length time. The length of a medium duration can be
set by a user or a program developer and can correspond, in various
embodiments, to six seconds and/or other times. According to
various embodiments, the duration can be set to "short," "medium,"
"long" and/or "custom," which can correspond to a four second
duration, a six second duration, an eight second duration, and/or a
custom duration, respectively. Thus, it should be understood that
the illustrated setting is illustrative and should not be construed
as being limiting in any way.
[0115] The properties bar 806 also can include an effect menu 822
for setting various aspects of the effect to be applied to the
selected scene. Because scene one is selected and/or is being
displayed in the properties bar 806, the effect configured via the
effect menu 822 can correspond to an effect applied to scene one.
The effect menu 822 can include an effect type control 824 for
setting the effect type. It can be appreciated that the computer
system 102 can be configured to detect input in the effect type
control 824 to perform the functionality described herein with
respect to operation 402 of the method 400, though this is not
necessarily the case.
[0116] The effect type control 824 is illustrated as displaying an
orbit type effect. It should be understood that this indication is
illustrative. In particular, the effect type control 824 can
display other effect types including, but not limited to, no
effect, orbit, fly by, figure eight, a line effect, and/or other
effects. The effect menu 822 also can display an effect speed
control 826. The effect speed control can be used to set a speed of
the effect shown in the effect type control 824. According to
various embodiments, the computer system 102 can be configured to
detect input in the effect speed control 826 to perform the
functionality described herein with respect to operation 404 of the
method 400, though this is not necessarily the case.
[0117] The effect speed control 826 can be populated based upon the
selection in the effect type control 824, in some embodiments. In
particular, if the "none" option in the effect type control 824 is
selected, the effect speed control 826 can be empty. If the "orbit"
option or the "figure eight" option is selected in the effect type
control 824, the effect speed control 826 can be populated with
short, medium, long, and custom options. If the "fly by" option or
the "line" option is selected in the effect type control 824, the
effect speed control 826 can be populated with short, medium, and
long options. Because the effect speed control 826 can be populated
with additional and/or alternative options, it should be understood
that these embodiments are illustrative, and should not be
construed as being limiting in any way.
[0118] The screen display 800B also includes the time control
window 808 shown in FIG. 8A. The time control window 808 can be
used to play and/or scrub through the scene and/or tour. The
computer system 102 can be configured to display a preview screen
828 of the scene and/or tour on the screen display 800B, in some
embodiments. The time control window 808 can include a scrubber
control 830 that can display a time of the tour or scene shown in
the preview screen 828. The time control window 808 can include
additional and/or alternative controls. As such, it should be
understood that this embodiment is illustrative, and should not be
construed as being limiting in any way.
[0119] According to various embodiments of the concepts and
technologies disclosed herein, the visualization component 110 can
support a simplified flow in creating tours. In some embodiments, a
tour and/or a scene can be created automatically by the computer
system 102 as soon as geocoding starts. According to some
embodiments, a globe or other display surface may not show any data
on the globe before geocoding is initiated.
[0120] According to various embodiments, the computer system 102
can preserve state by creating scenes and tours. Tours also can be
deleted, and in some instances, if a last tour is deleted, the
computer system 102 can delete metadata from the spreadsheet data
118. As such, embodiments of the concepts and technologies
disclosed herein can enable management of tours and scenes by
users.
[0121] According to embodiments of the concepts and technologies
disclosed herein, the visualization component 110 can be configured
to animate data time over a duration of a scene. A user or other
entity can choose or specify a start time and an end time of the
data. The duration of the scene can be used to determine how fast
data time is animated during playback of a scene or tour
corresponding to the data. In some instances, if the duration of
the scene increases, data time can be played slower. Similarly, if
the speed of playback of data time increases, the duration of the
scene can be decreased. As such, two controls, playback speed and
duration can collectively define a speed of playback of data
time.
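For illustration, the relationship between scene duration and the rate at which data time is animated can be expressed as follows; the function is an assumption that simply restates the division implied above.

```python
def data_time_rate(data_start, data_end, scene_duration):
    """Units of data time animated per second of playback (illustrative).

    Lengthening the scene lowers the rate, and a higher rate implies a
    shorter scene, so the two controls jointly fix the playback speed."""
    return (data_end - data_start) / scene_duration
```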
[0122] According to some embodiments, transitions can correspond to
animations between two or more scenes. Transitions can have a type
and a duration. A camera path and its associated turning, angles,
speed, acceleration, height of the arc, and/or other aspects of
camera movement associated with a transition can be adjusted and
calculated by the visualization component 110 automatically. An
algorithm can be tuned to minimize a number of turns of the camera
during a transition and/or effect, to optimize a viewing angle
and/or framing of data in scenes, and the like. In some
implementations, transitions can be assigned to each scene of a
tour automatically. In some embodiments, transitions can be
disabled for a first scene.
[0123] Effects can be enabled and assigned automatically to scenes.
In some instances, a default effect can be defined. A user can
choose a type of effect and a speed at which an effect is applied.
The effect may last for an entire duration of a scene and can be
matched to the scene duration. Also, loop effects such as orbit and
figure eight can be looped multiple times if the duration of the
scene is longer than duration of the effect. In the case of linear
effects, the visualization component 110 can be configured to
locate a focus of a scene in the middle of the linear trajectory
for some effects. For some other effects, a capture spot can be
adjusted as the camera or other viewpoint moves along a path. The
line of travel or trajectory can get shorter or longer depending on
the speed of the effect, which may be adjusted by a user or other
entity. In addition, the effect may change depending on a height at
which the scene is captured. Effects also can change direction
automatically so that less turning is applied inside a scene and so
that the effect in the scene appears smooth.
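For illustration, the looping of effects such as orbit and figure eight to cover a longer scene can be sketched as follows; rounding up is an assumption, as the description says only that the effect can be looped multiple times.

```python
import math

def loop_count(scene_duration, effect_duration):
    """Number of repetitions of a loop effect needed to span the scene
    (illustrative sketch)."""
    return max(1, math.ceil(scene_duration / effect_duration))
```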
[0124] FIG. 9 illustrates an illustrative computer architecture 900
for a device capable of executing the software components described
herein for animation transitions and effects in a spreadsheet
application. Thus, the computer architecture 900 illustrated in
FIG. 9 illustrates an architecture for a server computer, mobile
phone, a PDA, a smart phone, a desktop computer, a netbook
computer, a tablet computer, and/or a laptop computer. The computer
architecture 900 may be utilized to execute any aspects of the
software components presented herein.
[0125] The computer architecture 900 illustrated in FIG. 9 includes
a central processing unit 902 ("CPU"), a system memory 904,
including a random access memory 906 ("RAM") and a read-only memory
("ROM") 908, and a system bus 910 that couples the memory 904 to
the CPU 902. A basic input/output system containing the basic
routines that help to transfer information between elements within
the computer architecture 900, such as during startup, is stored in
the ROM 908. The computer architecture 900 further includes a mass
storage device 912 for storing the operating system 106 and one or
more application programs including, but not limited to, the
spreadsheet application 108, the visualization component 110, other
application programs, or the like. Although not shown in FIG. 9,
the mass storage device 912 also can be configured to store the
spreadsheet data 118, the geographic mapping data 124, the map data
126, and/or graphical data corresponding to one or more of the UIs
114 described herein, if desired.
[0126] The mass storage device 912 is connected to the CPU 902
through a mass storage controller (not shown) connected to the bus
910. The mass storage device 912 and its associated
computer-readable media provide non-volatile storage for the
computer architecture 900. Although the description of
computer-readable media contained herein refers to a mass storage
device, such as a hard disk or CD-ROM drive, it should be
appreciated by those skilled in the art that computer-readable
media can be any available computer storage media or communication
media that can be accessed by the computer architecture 900.
[0127] Communication media includes computer readable instructions,
data structures, program modules, or other data in a modulated data
signal such as a carrier wave or other transport mechanism and
includes any delivery media. The term "modulated data signal" means
a signal that has one or more of its characteristics changed or set
in a manner as to encode information in the signal. By way of
example, and not limitation, communication media includes wired
media such as a wired network or direct-wired connection, and
wireless media such as acoustic, RF, infrared and other wireless
media. Combinations of any of the above should also be included
within the scope of computer-readable media.
[0128] By way of example, and not limitation, computer storage
media may include volatile and non-volatile, removable and
non-removable media implemented in any method or technology for
storage of information such as computer-readable instructions, data
structures, program modules or other data. For example, computer
storage media includes, but is not limited to, RAM, ROM, EPROM, EEPROM,
flash memory or other solid state memory technology, CD-ROM,
digital versatile disks ("DVD"), HD-DVD, BLU-RAY, or other optical
storage, magnetic cassettes, magnetic tape, magnetic disk storage
or other magnetic storage devices, or any other medium that can be
used to store the desired information and which can be accessed by
the computer architecture 900. For purposes of the claims, the
phrase "computer storage medium," and variations thereof, does not
include waves or signals per se and/or communication media.
[0129] According to various embodiments, the computer architecture
900 may operate in a networked environment using logical
connections to remote computers through a network such as the
network 104. The computer architecture 900 may connect to the
network 104 through a network interface unit 914 connected to the
bus 910. It should be appreciated that the network interface unit
914 also may be utilized to connect to other types of networks and
remote computer systems such as, for example, the user computing
device 112, the data source 120, the geocoding services 122, the
map server 128, and/or other systems or devices. The computer
architecture 900 also may include an input/output controller 916
for receiving and processing input from a number of other devices,
including a keyboard, mouse, or electronic stylus (not shown in
FIG. 9). Similarly, the input/output controller 916 may provide
output to a display screen, a printer, or other type of output
device (also not shown in FIG. 9).
[0130] It should be appreciated that the software components
described herein may, when loaded into the CPU 902 and executed,
transform the CPU 902 and the overall computer architecture 900
from a general-purpose computing system into a special-purpose
computing system customized to facilitate the functionality
presented herein. The CPU 902 may be constructed from any number of
transistors or other discrete circuit elements, which may
individually or collectively assume any number of states. More
specifically, the CPU 902 may operate as a finite-state machine, in
response to executable instructions contained within the software
modules disclosed herein. These computer-executable instructions
may transform the CPU 902 by specifying how the CPU 902 transitions
between states, thereby transforming the transistors or other
discrete hardware elements constituting the CPU 902.
[0131] Encoding the software modules presented herein also may
transform the physical structure of the computer-readable media
presented herein. The specific transformation of physical structure
may depend on various factors, in different implementations of this
description. Examples of such factors may include, but are not
limited to, the technology used to implement the computer-readable
media, whether the computer-readable media is characterized as
primary or secondary storage, and the like. For example, if the
computer-readable media is implemented as semiconductor-based
memory, the software disclosed herein may be encoded on the
computer-readable media by transforming the physical state of the
semiconductor memory. For example, the software may transform the
state of transistors, capacitors, or other discrete circuit
elements constituting the semiconductor memory. The software also
may transform the physical state of such components in order to
store data thereupon.
[0132] As another example, the computer-readable media disclosed
herein may be implemented using magnetic or optical technology. In
such implementations, the software presented herein may transform
the physical state of magnetic or optical media, when the software
is encoded therein. These transformations may include altering the
magnetic characteristics of particular locations within given
magnetic media. These transformations also may include altering the
physical features or characteristics of particular locations within
given optical media, to change the optical characteristics of those
locations. Other transformations of physical media are possible
without departing from the scope and spirit of the present
description, with the foregoing examples provided only to
facilitate this discussion.
[0133] In light of the above, it should be appreciated that many
types of physical transformations take place in the computer
architecture 900 in order to store and execute the software
components presented herein. It also should be appreciated that the
computer architecture 900 may include other types of computing
devices, including hand-held computers, embedded computer systems,
personal digital assistants, and other types of computing devices
known to those skilled in the art. It is also contemplated that the
computer architecture 900 may not include all of the components
shown in FIG. 9, may include other components that are not
explicitly shown in FIG. 9, or may utilize an architecture
completely different than that shown in FIG. 9.
[0134] FIG. 10 illustrates an illustrative distributed computing
environment 1000 capable of executing the software components
described herein for animation transitions and effects in a
spreadsheet application. Thus, the distributed computing
environment 1000 illustrated in FIG. 10 can be used to provide the
functionality described herein with respect to the computer system
102. The distributed computing environment 1000 thus may be
utilized to execute any aspects of the software components
presented herein.
[0135] According to various implementations, the distributed
computing environment 1000 includes a computing environment 1002
operating on, in communication with, or as part of the network
1004. The network 1004 also can include various access networks.
According to various implementations, the functionality of the
network 1004 can be provided by the network 104 illustrated in FIG.
1. One or more client devices 1006A-1006N (hereinafter referred to
collectively and/or generically as "clients 1006") can communicate
with the computing environment 1002 via the network 1004 and/or
other connections (not illustrated in FIG. 10). In the illustrated
embodiment, the clients 1006 include a computing device 1006A such
as a laptop computer, a desktop computer, or other computing
device; a slate or tablet computing device ("tablet computing
device") 1006B; a mobile computing device 1006C such as a mobile
telephone, a smart phone, or other mobile computing device; a
server computer 1006D; and/or other devices 1006N. It should be
understood that any number of clients 1006 can communicate with the
computing environment 1002. Two example computing architectures for
the clients 1006 are illustrated and described herein with
reference to FIGS. 9 and 11. It should be understood that the
illustrated clients 1006 and computing architectures illustrated
and described herein are illustrative, and should not be construed
as being limited in any way.
[0136] In the illustrated embodiment, the computing environment
1002 includes application servers 1008, data storage 1010, and one
or more network interfaces 1012. According to various
implementations, the functionality of the application servers 1008
can be provided by one or more server computers that are executing
as part of, or in communication with, the network 1004. The
application servers 1008 can host various services, virtual
machines, portals, and/or other resources. In the illustrated
embodiment, the application servers 1008 host one or more virtual
machines 1014 for hosting applications or other functionality.
According to various implementations, the virtual machines 1014
host one or more applications and/or software modules for providing
the functionality described herein for animation transitions and
effects in a spreadsheet application. It should be understood that
this embodiment is illustrative, and should not be construed as
being limiting in any way. The application servers 1008 also host
or provide access to one or more Web portals, link pages, Web
sites, and/or other information ("Web portals") 1016.
[0137] According to various implementations, the application
servers 1008 also include one or more mailbox services 1018 and one
or more messaging services 1020. The mailbox services 1018 can
include electronic mail ("email") services. The mailbox services
1018 also can include various personal information management
("PIM") services including, but not limited to, calendar services,
contact management services, collaboration services, and/or other
services. The messaging services 1020 can include, but are not
limited to, instant messaging services, chat services, forum
services, and/or other communication services.
[0138] The application servers 1008 also can include one or more
social networking services 1022. The social networking services
1022 can include various social networking services including, but
not limited to, services for sharing or posting status updates,
instant messages, links, photos, videos, and/or other information;
services for commenting or displaying interest in articles,
products, blogs, or other resources; and/or other services. In some
embodiments, the social networking services 1022 are provided by or
include the FACEBOOK social networking service, the LINKEDIN
professional networking service, the MYSPACE social networking
service, the FOURSQUARE geographic networking service, the YAMMER
office colleague networking service, and the like. In other
embodiments, the social networking services 1022 are provided by
other services, sites, and/or providers that may or may not
explicitly be known as social networking providers. For example,
some web sites allow users to interact with one another via email,
chat services, and/or other means during various activities and/or
contexts such as reading published articles, commenting on goods or
services, publishing, collaboration, gaming, and the like. Examples
of such services include, but are not limited to, the WINDOWS LIVE
service and the XBOX LIVE service from Microsoft Corporation in
Redmond, Wash. Other services are possible and are
contemplated.
[0139] The social networking services 1022 also can include
commenting, blogging, and/or microblogging services. Examples of
such services include, but are not limited to, the YELP commenting
service, the KUDZU review service, the OFFICETALK enterprise
microblogging service, the TWITTER messaging service, the GOOGLE
BUZZ service, and/or other services. It should be appreciated that
the above lists of services are not exhaustive and that numerous
additional and/or alternative social networking services 1022 are
not mentioned herein for the sake of brevity. As such, the above
embodiments are illustrative, and should not be construed as being
limited in any way.
[0140] As shown in FIG. 10, the application servers 1008 also can
host other services, applications, portals, and/or other resources
("other resources") 1024. The other resources 1024 can include, but
are not limited to, the geocoding services 122, the map server 128,
the data source 120, and/or other services and/or resources. It
thus can be appreciated that the computing environment 1002 can
provide integration of the concepts and technologies disclosed
herein for animation transitions and effects in a
spreadsheet application with various mailbox, messaging, social
networking, and/or other services or resources. For example, the
concepts and technologies disclosed herein can support sharing
visualizations with social network users, mail recipients, message
recipients or the like. Similarly, users or other entities can
share visualizations and/or spreadsheet data 118 with social
networking users, friends, connections, mail recipients, systems or
devices, combinations thereof, or the like.
[0141] As mentioned above, the computing environment 1002 can
include the data storage 1010. According to various
implementations, the functionality of the data storage 1010 is
provided by one or more databases operating on, or in communication
with, the network 1004. The functionality of the data storage 1010
also can be provided by one or more server computers configured to
host data for the computing environment 1002. The data storage 1010
can include, host, or provide one or more real or virtual
datastores 1026A-1026N (hereinafter referred to collectively and/or
generically as "datastores 1026"). The datastores 1026 are
configured to host data used or created by the application servers
1008 and/or other data. Although not illustrated in FIG. 10, the
datastores 1026 also can host or store the operating system 106,
the spreadsheet application 108, the visualization component 110,
graphics data corresponding to one or more UIs 114, the spreadsheet
data 118, the geographic mapping data 124, the map data 126,
combinations thereof, or the like.
[0142] The computing environment 1002 can communicate with, or be
accessed by, the network interfaces 1012. The network interfaces
1012 can include various types of network hardware and software for
supporting communications between two or more computing devices
including, but not limited to, the clients 1006 and the application
servers 1008. It should be appreciated that the network interfaces
1012 also may be utilized to connect to other types of networks
and/or computer systems.
[0143] It should be understood that the distributed computing
environment 1000 described herein can provide any aspects of the
software elements described herein with any number of virtual
computing resources and/or other distributed computing
functionality that can be configured to execute any aspects of the
software components disclosed herein. According to various
implementations of the concepts and technologies disclosed herein,
the distributed computing environment 1000 provides the software
functionality described herein as a service to the clients 1006. It
should be understood that the clients 1006 can include real or
virtual machines including, but not limited to, server computers,
web servers, personal computers, mobile computing devices, smart
phones, and/or other devices. As such, various embodiments of the
concepts and technologies disclosed herein enable any device
configured to access the distributed computing environment 1000 to
utilize the functionality described herein for animation
transitions and effects in a spreadsheet application.
[0144] Turning now to FIG. 11, an illustrative computing device
architecture 1100 will be described for a computing device that is
capable of executing the various software components described
herein for animation transitions and effects in a spreadsheet
application. The
computing device architecture 1100 is applicable to computing
devices that facilitate mobile computing due, in part, to form
factor, wireless connectivity, and/or battery-powered operation. In
some embodiments, the computing devices include, but are not
limited to, mobile telephones, tablet devices, slate devices,
portable video game devices, and the like. Moreover, the computing
device architecture 1100 is applicable to any of the clients 1006
shown in FIG. 10. Furthermore, aspects of the computing device
architecture 1100 may be applicable to traditional desktop
computers, portable computers (e.g., laptops, notebooks,
ultra-portables, and netbooks), server computers, and other
computer systems, such as described herein with reference to FIG.
9. For example, the single touch and multi-touch aspects disclosed
herein below may be applied to desktop computers that utilize a
touchscreen or some other touch-enabled device, such as a
touch-enabled track pad or touch-enabled mouse.
[0145] The computing device architecture 1100 illustrated in FIG.
11 includes a processor 1102, memory components 1104, network
connectivity components 1106, sensor components 1108, input/output
components 1110, and power components 1112. In the illustrated
embodiment, the processor 1102 is in communication with the memory
components 1104, the network connectivity components 1106, the
sensor components 1108, the input/output ("I/O") components 1110,
and the power components 1112. Although no connections are shown
between the individual components illustrated in FIG. 11, the
components can interact to carry out device functions. In some
embodiments, the components are arranged so as to communicate via
one or more busses (not shown).
[0146] The processor 1102 includes a central processing unit
("CPU") configured to process data, execute computer-executable
instructions of one or more application programs, and communicate
with other components of the computing device architecture 1100 in
order to perform various functionality described herein. The
processor 1102 may be utilized to execute aspects of the software
components presented herein and, particularly, those that utilize,
at least in part, a touch-enabled input.
[0147] In some embodiments, the processor 1102 includes a graphics
processing unit ("GPU") configured to accelerate operations
performed by the CPU, including, but not limited to, operations
performed by executing general-purpose scientific and engineering
computing applications, as well as graphics-intensive computing
applications such as high-resolution video (e.g., 720p, 1080p, and
greater), video games, three-dimensional modeling applications, and
the like. In some embodiments, the processor 1102 is configured to
communicate with a discrete GPU (not shown). In any case, the CPU
and GPU may be configured in accordance with a co-processing
CPU/GPU computing model, wherein the sequential part of an
application executes on the CPU and the computationally-intensive
part is accelerated by the GPU.
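By way of a non-limiting illustration, the following Python sketch
shows the shape of the co-processing model described above: the
sequential part of an application runs on the CPU while a
computationally-intensive kernel is handed to an accelerator. The
accelerate function here is a hypothetical stand-in that executes
on the CPU; a real implementation would dispatch the kernel to a
GPU runtime.

    # Schematic sketch of the co-processing CPU/GPU model. The
    # accelerate() call is a hypothetical stand-in for GPU dispatch.
    def accelerate(kernel, data):
        # In practice this work would be offloaded to the GPU.
        return [kernel(x) for x in data]

    def application(frames):
        results = []
        for frame in frames:          # sequential part: runs on CPU
            prepared = frame * 2      # lightweight setup
            # computationally-intensive part: accelerated
            shaded = accelerate(lambda px: px ** 2, [prepared] * 4)
            results.append(sum(shaded))
        return results

    print(application([1, 2, 3]))  # [16, 64, 144]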
[0148] In some embodiments, the processor 1102 is, or is included
in, a system-on-chip ("SoC") along with one or more of the other
components described herein below. For example, the SoC may include
the processor 1102, a GPU, one or more of the network connectivity
components 1106, and one or more of the sensor components 1108. In
some embodiments, the processor 1102 is fabricated, in part,
utilizing a package-on-package ("PoP") integrated circuit packaging
technique. Moreover, the processor 1102 may be a single core or
multi-core processor.
[0149] The processor 1102 may be created in accordance with an ARM
architecture, available for license from ARM HOLDINGS of Cambridge,
United Kingdom. Alternatively, the processor 1102 may be created in
accordance with an x86 architecture, such as is available from
INTEL CORPORATION of Santa Clara, Calif. and others. In some
embodiments, the processor 1102 is a SNAPDRAGON SoC, available from
QUALCOMM of San Diego, Calif., a TEGRA SoC, available from NVIDIA
of Santa Clara, Calif., a HUMMINGBIRD SoC, available from SAMSUNG
of Seoul, South Korea, an Open Multimedia Application Platform
("OMAP") SoC, available from TEXAS INSTRUMENTS of Dallas, Tex., a
customized version of any of the above SoCs, or a proprietary
SoC.
[0150] The memory components 1104 include a random access memory
("RAM") 1114, a read-only memory ("ROM") 1116, an integrated
storage memory ("integrated storage") 1118, and a removable storage
memory ("removable storage") 1120. In some embodiments, the RAM
1114 or a portion thereof, the ROM 1116 or a portion thereof,
and/or some combination of the RAM 1114 and the ROM 1116 is integrated
in the processor 1102. In some embodiments, the ROM 1116 is
configured to store a firmware, an operating system or a portion
thereof (e.g., operating system kernel), and/or a bootloader to
load an operating system kernel from the integrated storage 1118 or
the removable storage 1120.
[0151] The integrated storage 1118 can include a solid-state
memory, a hard disk, or a combination of solid-state memory and a
hard disk. The integrated storage 1118 may be soldered or otherwise
connected to a logic board upon which the processor 1102 and other
components described herein also may be connected. As such, the
integrated storage 1118 is integrated in the computing device. The
integrated storage 1118 is configured to store an operating system
or portions thereof, application programs, data, and other software
components described herein.
[0152] The removable storage 1120 can include a solid-state memory,
a hard disk, or a combination of solid-state memory and a hard
disk. In some embodiments, the removable storage 1120 is provided
in lieu of the integrated storage 1118. In other embodiments, the
removable storage 1120 is provided as additional optional storage.
In some embodiments, the removable storage 1120 is logically
combined with the integrated storage 1118 such that the total
available storage is made available and shown to a user as a total
combined capacity of the integrated storage 1118 and the removable
storage 1120.
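A minimal sketch of this logical combination follows; the
capacities are illustrative values, not properties of any
particular device.

    # Report integrated and removable storage to the user as one
    # combined capacity, as described above (illustrative values).
    integrated_storage_gb = 32
    removable_storage_gb = 64

    total_available_gb = integrated_storage_gb + removable_storage_gb
    print(f"Available storage: {total_available_gb} GB")  # 96 GB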
[0153] The removable storage 1120 is configured to be inserted into
a removable storage memory slot (not shown) or other mechanism by
which the removable storage 1120 is inserted and secured to
facilitate a connection over which the removable storage 1120 can
communicate with other components of the computing device, such as
the processor 1102. The removable storage 1120 may be embodied in
various memory card formats including, but not limited to, PC card,
CompactFlash card, memory stick, secure digital ("SD"), miniSD,
microSD, universal integrated circuit card ("UICC") (e.g., a
subscriber identity module ("SIM") or universal SIM ("USIM")), a
proprietary format, or the like.
[0154] It can be understood that one or more of the memory
components 1104 can store an operating system. According to various
embodiments, the operating system includes, but is not limited to,
SYMBIAN OS from SYMBIAN LIMITED, WINDOWS MOBILE OS from Microsoft
Corporation of Redmond, Wash., WINDOWS PHONE OS from Microsoft
Corporation, WINDOWS from Microsoft Corporation, PALM WEBOS from
Hewlett-Packard Company of Palo Alto, Calif., BLACKBERRY OS from
Research In Motion Limited of Waterloo, Ontario, Canada, IOS from
Apple Inc. of Cupertino, Calif., and ANDROID OS from Google Inc. of
Mountain View, Calif. Other operating systems are contemplated.
[0155] The network connectivity components 1106 include a wireless
wide area network component ("WWAN component") 1122, a wireless
local area network component ("WLAN component") 1124, and a
wireless personal area network component ("WPAN component") 1126.
The network connectivity components 1106 facilitate communications
to and from a network 1128, which may be a WWAN, a WLAN, or a WPAN.
Although a single network 1128 is illustrated, the network
connectivity components 1106 may facilitate simultaneous
communication with multiple networks. For example, the network
connectivity components 1106 may facilitate simultaneous
communications with multiple networks via one or more of a WWAN, a
WLAN, or a WPAN.
[0156] In some embodiments, the network 1128 can correspond to the
network 104 and/or the network 1004 illustrated and described in
FIGS. 1 and 9-10. In some other embodiments, the network 1128 can
include the network 104 illustrated and described with reference to
FIGS. 1 and 9 and/or the network 1004 illustrated and described in
FIG. 10. In yet other embodiments, the network 1128 can provide
access to the network 104 illustrated and described with reference
to FIGS. 1 and 9 and/or the network 1004 illustrated and described
in FIG. 10.
[0157] The network 1128 may be a WWAN, such as a mobile
telecommunications network utilizing one or more mobile
telecommunications technologies to provide voice and/or data
services to a computing device utilizing the computing device
architecture 1100 via the WWAN component 1122. The mobile
telecommunications technologies can include, but are not limited
to, Global System for Mobile communications ("GSM"), Code Division
Multiple Access ("CDMA") ONE, CDMA2000, Universal Mobile
Telecommunications System ("UMTS"), Long Term Evolution ("LTE"),
and Worldwide Interoperability for Microwave Access ("WiMAX").
Moreover, the network 1128 may utilize various channel access
methods (which may or may not be used by the aforementioned
standards) including, but not limited to, Time Division Multiple
Access ("TDMA"), Frequency Division Multiple Access ("FDMA"), CDMA,
wideband CDMA ("W-CDMA"), Orthogonal Frequency Division
Multiplexing ("OFDM"), Space Division Multiple Access ("SDMA"), and
the like. Data communications may be provided using General Packet
Radio Service ("GPRS"), Enhanced Data rates for Global Evolution
("EDGE"), the High-Speed Packet Access ("HSPA") protocol family
including High-Speed Downlink Packet Access ("HSDPA"), Enhanced
Uplink ("EUL") or otherwise termed High-Speed Uplink Packet Access
("HSUPA"), Evolved HSPA ("HSPA+"), LTE, and various other current
and future wireless data access standards. The network 1128 may be
configured to provide voice and/or data communications with any
combination of the above technologies. The network 1128 may be
configured to or adapted to provide voice and/or data
communications in accordance with future generation
technologies.
[0158] In some embodiments, the WWAN component 1122 is configured
to provide dual-mode or multi-mode connectivity to the network 1128. For
example, the WWAN component 1122 may be configured to provide
connectivity to the network 1128, wherein the network 1128 provides
service via GSM and UMTS technologies, or via some other
combination of technologies. Alternatively, multiple WWAN
components 1122 may be utilized to perform such functionality,
and/or provide additional functionality to support other
non-compatible technologies (i.e., incapable of being supported by
a single WWAN component). The WWAN component 1122 may facilitate
similar connectivity to multiple networks (e.g., a UMTS network and
an LTE network).
[0159] The network 1128 may be a WLAN operating in accordance with
one or more Institute of Electrical and Electronic Engineers
("IEEE") 802.11 standards, such as IEEE 802.11a, 802.11b, 802.11g,
802.11n, and/or a future 802.11 standard (referred to herein
collectively as WI-FI). Draft 802.11 standards are also
contemplated. In some embodiments, the WLAN is implemented
utilizing one or more wireless WI-FI access points. In some
embodiments, one or more of the wireless WI-FI access points is
another computing device with connectivity to a WWAN that is
functioning as a WI-FI hotspot. The WLAN component 1124 is
configured to connect to the network 1128 via the WI-FI access
points. Such connections may be secured via various encryption
technologies including, but not limited to, WI-FI Protected Access
("WPA"), WPA2, Wired Equivalent Privacy ("WEP"), and the like.
[0160] The network 1128 may be a WPAN operating in accordance with
Infrared Data Association ("IrDA"), BLUETOOTH, wireless Universal
Serial Bus ("USB"), Z-Wave, ZIGBEE, or some other short-range
wireless technology. In some embodiments, the WPAN component 1126
is configured to facilitate communications with other devices, such
as peripherals, computers, or other computing devices via the
WPAN.
[0161] The sensor components 1108 include a magnetometer 1130, an
ambient light sensor 1132, a proximity sensor 1134, an
accelerometer 1136, a gyroscope 1138, and a Global Positioning
System sensor ("GPS sensor") 1140. It is contemplated that other
sensors, such as, but not limited to, temperature sensors or shock
detection sensors, also may be incorporated in the computing device
architecture 1100.
[0162] The magnetometer 1130 is configured to measure the strength
and direction of a magnetic field. In some embodiments, the
magnetometer 1130 provides measurements to a compass application
program stored within one of the memory components 1104 in order to
provide a user with accurate directions in a frame of reference
including the cardinal directions, north, south, east, and west.
Similar measurements may be provided to a navigation application
program that includes a compass component. Other uses of
measurements obtained by the magnetometer 1130 are
contemplated.
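As a non-limiting illustration, the following sketch derives a
heading and a cardinal direction from magnetometer readings,
assuming the device is held level and the x/y axes are calibrated;
the sample field values are hypothetical.

    import math

    def compass_heading(mag_x, mag_y):
        # Heading in degrees clockwise from magnetic north, assuming
        # a level device and calibrated x/y field components.
        return (math.degrees(math.atan2(mag_y, mag_x)) + 360.0) % 360.0

    def cardinal(heading):
        # Map a heading to the nearest cardinal direction.
        names = ["north", "east", "south", "west"]
        return names[int((heading + 45.0) % 360.0 // 90.0)]

    h = compass_heading(0.0, 25.0)  # illustrative readings (microtesla)
    print(f"{h:.0f} degrees ({cardinal(h)})")  # 90 degrees (east)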
[0163] The ambient light sensor 1132 is configured to measure
ambient light. In some embodiments, the ambient light sensor 1132
provides measurements to an application program stored within one
of the memory components 1104 in order to automatically adjust the
brightness of a display (described below) to compensate for
low-light and high-light environments. Other uses of measurements
obtained by the ambient light sensor 1132 are contemplated.
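One plausible mapping from measured lux to a display brightness
level is sketched below; the thresholds and logarithmic curve are
assumptions, not values taken from this disclosure.

    import math

    def brightness_from_lux(lux, low=10.0, high=10000.0):
        # Map ambient light to a brightness in [0.0, 1.0], clamped
        # at `low` and `high`; a log scale roughly tracks perception.
        lux = min(max(lux, low), high)
        span = math.log10(high) - math.log10(low)
        return (math.log10(lux) - math.log10(low)) / span

    for lux in (5, 100, 10000):
        print(lux, round(brightness_from_lux(lux), 2))  # 0.0, 0.33, 1.0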
[0164] The proximity sensor 1134 is configured to detect the
presence of an object or thing in proximity to the computing device
without direct contact. In some embodiments, the proximity sensor
1134 detects the presence of a user's body (e.g., the user's face)
and provides this information to an application program stored
within one of the memory components 1104 that utilizes the
proximity information to enable or disable some functionality of
the computing device. For example, a telephone application program
may automatically disable a touchscreen (described below) in
response to receiving the proximity information so that the user's
face does not inadvertently end a call or enable/disable other
functionality within the telephone application program during the
call. Other uses of proximity as detected by the proximity sensor
1134 are contemplated.
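The enable/disable decision described for the telephone example can
be reduced to a small predicate, sketched below with an assumed
distance threshold.

    NEAR_THRESHOLD_CM = 5.0  # assumed cutoff for "near"

    def touchscreen_enabled(call_active, proximity_cm):
        # Disable touch input when a call is active and an object
        # (e.g., the user's face) is close to the device.
        return not (call_active and proximity_cm < NEAR_THRESHOLD_CM)

    print(touchscreen_enabled(True, 2.0))   # False: face near during a call
    print(touchscreen_enabled(True, 30.0))  # True: device away from the ear
    print(touchscreen_enabled(False, 2.0))  # True: no call in progress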
[0165] The accelerometer 1136 is configured to measure proper
acceleration. In some embodiments, output from the accelerometer
1136 is used by an application program as an input mechanism to
control some functionality of the application program. For example,
the application program may be a video game in which a character, a
portion thereof, or an object is moved or otherwise manipulated in
response to input received via the accelerometer 1136. In some
embodiments, output from the accelerometer 1136 is provided to an
application program for use in switching between landscape and
portrait modes, calculating coordinate acceleration, or detecting a
fall. Other uses of the accelerometer 1136 are contemplated.
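The orientation-switching and fall-detection uses mentioned above
can be sketched as follows; the axis convention and thresholds are
assumptions.

    import math

    def orientation(ax, ay):
        # Pick landscape vs. portrait from the dominant gravity
        # component along the device's x/y axes (m/s^2).
        return "landscape" if abs(ax) > abs(ay) else "portrait"

    def in_free_fall(ax, ay, az, threshold=2.0):
        # Near-zero proper acceleration suggests the device is falling.
        return math.sqrt(ax * ax + ay * ay + az * az) < threshold

    print(orientation(9.8, 0.5))        # landscape: gravity along x
    print(in_free_fall(0.1, 0.2, 0.1))  # True: possible drop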
[0166] The gyroscope 1138 is configured to measure and maintain
orientation. In some embodiments, output from the gyroscope 1138 is
used by an application program as an input mechanism to control
some functionality of the application program. For example, the
gyroscope 1138 can be used for accurate recognition of movement
within a three-dimensional environment of a video game application
or some other application. In some embodiments, an application
program utilizes output from the gyroscope 1138 and the
accelerometer 1136 to enhance control of some functionality of the
application program. Other uses of the gyroscope 1138 are
contemplated.
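One common way to combine the two sensors is a complementary
filter, sketched below: the integrated gyroscope rate is smooth but
drifts, while the accelerometer-derived angle is noisy but
drift-free. The blend factor and sample values are assumptions.

    def complementary_filter(angle, gyro_rate, accel_angle, dt,
                             alpha=0.98):
        # Blend the integrated gyro rate (smooth, drifts) with the
        # accelerometer angle (noisy, drift-free).
        return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

    angle = 0.0
    for gyro_rate, accel_angle in [(12.0, 1.5), (11.0, 2.0), (10.5, 2.4)]:
        angle = complementary_filter(angle, gyro_rate, accel_angle, dt=0.1)
        print(round(angle, 2))  # 1.21, 2.3, 3.33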
[0167] The GPS sensor 1140 is configured to receive signals from
GPS satellites for use in calculating a location. The location
calculated by the GPS sensor 1140 may be used by any application
program that requires or benefits from location information. For
example, the location calculated by the GPS sensor 1140 may be used
with a navigation application program to provide directions from
the location to a destination or directions from the destination to
the location. Moreover, the GPS sensor 1140 may be used to provide
location information to an external location-based service, such as
E911 service. The GPS sensor 1140 may obtain location information
generated via WI-FI, WIMAX, and/or cellular triangulation
techniques utilizing one or more of the network connectivity
components 1106 to aid the GPS sensor 1140 in obtaining a location
fix. The GPS sensor 1140 may also be used in Assisted GPS ("A-GPS")
systems.
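For instance, the distance from a GPS fix to a navigation
destination can be computed with the haversine formula, as sketched
below; the coordinates are illustrative.

    import math

    def haversine_km(lat1, lon1, lat2, lon2):
        # Great-circle distance between two latitude/longitude fixes.
        r = 6371.0  # mean Earth radius, km
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = (math.sin(dp / 2) ** 2
             + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
        return 2 * r * math.asin(math.sqrt(a))

    # Illustrative fix: Redmond, Wash. to Seattle, Wash. (~17.5 km)
    print(round(haversine_km(47.674, -122.121, 47.606, -122.332), 1))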
[0168] The I/O components 1110 include a display 1142, a
touchscreen 1144, a data I/O interface component ("data I/O") 1146,
an audio I/O interface component ("audio I/O") 1148, a video I/O
interface component ("video I/O") 1150, and a camera 1152. In some
embodiments, the display 1142 and the touchscreen 1144 are
combined. In some embodiments, two or more of the data I/O
component 1146, the audio I/O component 1148, and the video I/O
component 1150 are combined. The I/O components 1110 may include
discrete processors configured to support the various interfaces
described below, or may include processing functionality built into
the processor 1102.
[0169] The display 1142 is an output device configured to present
information in a visual form. In particular, the display 1142 may
present graphical user interface ("GUI") elements, text, images,
video, notifications, virtual buttons, virtual keyboards, messaging
data, Internet content, device status, time, date, calendar data,
preferences, map information, location information, and any other
information that is capable of being presented in a visual form. In
some embodiments, the display 1142 is a liquid crystal display
("LCD") utilizing any active or passive matrix technology and any
backlighting technology (if used). In some embodiments, the display
1142 is an organic light emitting diode ("OLED") display. Other
display types are contemplated.
[0170] The touchscreen 1144 is an input device configured to detect
the presence and location of a touch. The touchscreen 1144 may be a
resistive touchscreen, a capacitive touchscreen, a surface acoustic
wave touchscreen, an infrared touchscreen, an optical imaging
touchscreen, a dispersive signal touchscreen, an acoustic pulse
recognition touchscreen, or may utilize any other touchscreen
technology. In some embodiments, the touchscreen 1144 is
incorporated on top of the display 1142 as a transparent layer to
enable a user to use one or more touches to interact with objects
or other information presented on the display 1142. In other
embodiments, the touchscreen 1144 is a touch pad incorporated on a
surface of the computing device that does not include the display
1142. For example, the computing device may have a touchscreen
incorporated on top of the display 1142 and a touch pad on a
surface opposite the display 1142.
[0171] In some embodiments, the touchscreen 1144 is a single-touch
touchscreen. In other embodiments, the touchscreen 1144 is a
multi-touch touchscreen. In some embodiments, the touchscreen 1144
is configured to detect discrete touches, single touch gestures,
and/or multi-touch gestures. These are collectively referred to
herein as gestures for convenience. Several gestures will now be
described. It should be understood that these gestures are
illustrative and are not intended to limit the scope of the
appended claims. Moreover, the described gestures, additional
gestures, and/or alternative gestures may be implemented in
software for use with the touchscreen 1144. As such, a developer
may create gestures that are specific to a particular application
program.
[0172] In some embodiments, the touchscreen 1144 supports a tap
gesture in which a user taps the touchscreen 1144 once on an item
presented on the display 1142. The tap gesture may be used for
various reasons including, but not limited to, opening or launching
whatever the user taps. In some embodiments, the touchscreen 1144
supports a double tap gesture in which a user taps the touchscreen
1144 twice on an item presented on the display 1142. The double tap
gesture may be used for various reasons including, but not limited
to, zooming in or zooming out in stages. In some embodiments, the
touchscreen 1144 supports a tap and hold gesture in which a user
taps the touchscreen 1144 and maintains contact for at least a
pre-defined time. The tap and hold gesture may be used for various
reasons including, but not limited to, opening a context-specific
menu.
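The tap family of gestures above is typically distinguished by
contact duration and inter-tap spacing; a sketch with assumed
thresholds follows.

    HOLD_SECONDS = 0.5        # assumed minimum contact for tap and hold
    DOUBLE_TAP_SECONDS = 0.3  # assumed maximum gap within a double tap

    def classify_taps(events):
        # `events`: (touch_down_time, touch_up_time) pairs at roughly
        # the same screen location, in chronological order.
        gestures, previous_up = [], None
        for down, up in events:
            if up - down >= HOLD_SECONDS:
                gestures.append("tap and hold")
            elif (previous_up is not None
                  and down - previous_up <= DOUBLE_TAP_SECONDS
                  and gestures and gestures[-1] == "tap"):
                gestures[-1] = "double tap"
            else:
                gestures.append("tap")
            previous_up = up
        return gestures

    print(classify_taps([(0.0, 0.1), (0.25, 0.35), (1.0, 1.8)]))
    # ['double tap', 'tap and hold']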
[0173] In some embodiments, the touchscreen 1144 supports a pan
gesture in which a user places a finger on the touchscreen 1144 and
maintains contact with the touchscreen 1144 while moving the finger
on the touchscreen 1144. The pan gesture may be used for various
reasons including, but not limited to, moving through screens,
images, or menus at a controlled rate. Multiple finger pan gestures
are also contemplated. In some embodiments, the touchscreen 1144
supports a flick gesture in which a user swipes a finger in the
direction the user wants the screen to move. The flick gesture may
be used for various reasons including, but not limited to,
scrolling horizontally or vertically through menus or pages. In
some embodiments, the touchscreen 1144 supports a pinch and stretch
gesture in which a user makes a pinching motion with two fingers
(e.g., thumb and forefinger) on the touchscreen 1144 or moves the
two fingers apart. The pinch and stretch gesture may be used for
various reasons including, but not limited to, zooming gradually in
or out of a website, map, or picture.
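The motion gestures above reduce to simple geometry: a flick is a
pan with high velocity, and a pinch or stretch is characterized by
the ratio of final to initial finger separation. The thresholds
below are assumptions.

    import math

    def pan_or_flick(displacement_px, duration_s, flick_speed=1000.0):
        # A fast, short swipe reads as a flick; slower motion is a pan.
        return "flick" if displacement_px / duration_s >= flick_speed else "pan"

    def zoom_factor(p1_start, p2_start, p1_end, p2_end):
        # > 1.0 is a stretch (zoom in); < 1.0 is a pinch (zoom out).
        def dist(a, b):
            return math.hypot(a[0] - b[0], a[1] - b[1])
        return dist(p1_end, p2_end) / dist(p1_start, p2_start)

    print(pan_or_flick(300, 0.2))  # flick
    print(pan_or_flick(300, 1.0))  # pan
    # Fingers move from 100 px apart to 250 px apart: zoom in 2.5x.
    print(zoom_factor((100, 300), (200, 300), (50, 300), (300, 300)))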
[0174] Although the above gestures have been described with
reference to the use of one or more fingers for performing the
gestures, other appendages such as toes or objects such as styluses
may be used to interact with the touchscreen 1144. As such, the
above gestures should be understood as being illustrative and
should not be construed as being limiting in any way.
[0175] The data I/O interface component 1146 is configured to
facilitate input of data to the computing device and output of data
from the computing device. In some embodiments, the data I/O
interface component 1146 includes a connector configured to provide
wired connectivity between the computing device and a computer
system, for example, for synchronization purposes. The
connector may be a proprietary connector or a standardized
connector such as USB, micro-USB, mini-USB, or the like. In some
embodiments, the connector is a dock connector for docking the
computing device with another device such as a docking station,
audio device (e.g., a digital music player), or video device.
[0176] The audio I/O interface component 1148 is configured to
provide audio input and/or output capabilities to the computing
device. In some embodiments, the audio I/O interface component 1148
includes a microphone configured to collect audio signals. In some
embodiments, the audio I/O interface component 1148 includes a
headphone jack configured to provide connectivity for headphones or
other external speakers. In some embodiments, the audio I/O
interface component 1148 includes a speaker for the output of audio
signals. In some embodiments, the audio I/O interface component
1148 includes an optical audio-out connector.
[0177] The video I/O interface component 1150 is configured to
provide video input and/or output capabilities to the computing
device. In some embodiments, the video I/O interface component 1150
includes a video connector configured to receive video as input
from another device (e.g., a video media player such as a DVD or
BLURAY player) or send video as output to another device (e.g., a
monitor, a television, or some other external display). In some
embodiments, the video I/O interface component 1150 includes a
High-Definition Multimedia Interface ("HDMI"), mini-HDMI,
micro-HDMI, DisplayPort, or proprietary connector to input/output
video content. In some embodiments, the video I/O interface
component 1150 or portions thereof is combined with the audio I/O
interface component 1148 or portions thereof.
[0178] The camera 1152 can be configured to capture still images
and/or video. The camera 1152 may utilize a charge coupled device
("CCD") or a complementary metal oxide semiconductor ("CMOS") image
sensor to capture images. In some embodiments, the camera 1152
includes a flash to aid in taking pictures in low-light
environments. Settings for the camera 1152 may be implemented as
hardware or software buttons.
[0179] Although not illustrated, one or more hardware buttons may
also be included in the computing device architecture 1100. The
hardware buttons may be used for controlling some operational
aspect of the computing device. The hardware buttons may be
dedicated buttons or multi-use buttons. The hardware buttons may be
mechanical or sensor-based.
[0180] The illustrated power components 1112 include one or more
batteries 1154, which can be connected to a battery gauge 1156. The
batteries 1154 may be rechargeable or disposable. Rechargeable
battery types include, but are not limited to, lithium polymer,
lithium ion, nickel cadmium, and nickel metal hydride. Each of the
batteries 1154 may be made of one or more cells.
[0181] The battery gauge 1156 can be configured to measure battery
parameters such as current, voltage, and temperature. In some
embodiments, the battery gauge 1156 is configured to measure the
effect of a battery's discharge rate, temperature, age and other
factors to predict remaining life within a certain percentage of
error. In some embodiments, the battery gauge 1156 provides
measurements to an application program that is configured to
utilize the measurements to present useful power management data to
a user. Power management data may include one or more of a
percentage of battery used, a percentage of battery remaining, a
battery condition, a remaining time, a remaining capacity (e.g., in
watt hours), a current draw, and a voltage.
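The figures listed above can be derived from a handful of raw gauge
measurements; the sketch below uses illustrative values.

    def power_management_data(capacity_wh, remaining_wh,
                              current_a, voltage_v):
        # Derive user-facing power management data from raw battery
        # gauge measurements (all inputs are illustrative).
        draw_w = current_a * voltage_v
        return {
            "percent_remaining": round(100.0 * remaining_wh / capacity_wh, 1),
            "percent_used": round(100.0 * (1.0 - remaining_wh / capacity_wh), 1),
            "remaining_hours": round(remaining_wh / draw_w, 2),
            "current_draw_w": draw_w,
        }

    print(power_management_data(capacity_wh=42.0, remaining_wh=28.0,
                                current_a=0.8, voltage_v=3.8))
    # {'percent_remaining': 66.7, 'percent_used': 33.3,
    #  'remaining_hours': 9.21, 'current_draw_w': 3.04}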
[0182] The power components 1112 may also include a power
connector, which may be combined with one or more of the
aforementioned I/O components 1110. The power components 1112 may
interface with an external power system or charging equipment via a
power I/O component.
[0183] Based on the foregoing, it should be appreciated that
technologies for animation transitions and effects in a spreadsheet
application have been disclosed herein. Although the subject matter
presented herein has been described in language specific to
computer structural features, methodological and transformative
acts, specific computing machinery, and computer readable media, it
is to be understood that the invention defined in the appended
claims is not necessarily limited to the specific features, acts,
or media described herein. Rather, the specific features, acts, and
media are disclosed as example forms of implementing the
claims.
[0184] The subject matter described above is provided by way of
illustration only and should not be construed as limiting. Various
modifications and changes may be made to the subject matter
described herein without following the example embodiments and
applications illustrated and described, and without departing from
the true spirit and scope of the present invention, which is set
forth in the following claims.
* * * * *