U.S. patent application number 13/856476 was filed with the patent office on 2013-04-04 and published on 2014-02-13 as publication number 20140047381 for a 3D data environment navigation tool.
This patent application is currently assigned to Microsoft Corporation. The applicant listed for this patent is MICROSOFT CORPORATION. The invention is credited to Kevin Fan, Jonathan Edgar Fay, Jai Srinivasan, Curtis G. Wong, and Danhua Zhu.
Application Number: 13/856476
Publication Number: 20140047381
Family ID: 50065859
Publication Date: 2014-02-13
Filed Date: 2013-04-04

United States Patent Application 20140047381
Kind Code: A1
Fan; Kevin; et al.
February 13, 2014
3D DATA ENVIRONMENT NAVIGATION TOOL
Abstract
Concepts and technologies are described herein for providing a
3D data environment navigation tool. In accordance with some
concepts and technologies disclosed herein, the 3D data environment
navigation tool provides a way for a user to manipulate a 3D data
environment in which productivity data is rendered. The 3D data
environment navigation tool may provide a user interacting with the
3D data environment the ability to manipulate the viewing angle of
data rendered in a 3D data environment, thus allowing the user to
"tour" or "move around" the data. The 3D data environment
navigation tool may be configured to aggregate data at various zoom
levels.
Inventors: Fan; Kevin; (Seattle, WA); Zhu; Danhua; (Redmond, WA); Srinivasan; Jai; (Bellevue, WA); Fay; Jonathan Edgar; (Woodinville, WA); Wong; Curtis G.; (Medina, WA)

Applicant: MICROSOFT CORPORATION, Redmond, WA, US

Assignee: Microsoft Corporation, Redmond, WA

Family ID: 50065859

Appl. No.: 13/856476

Filed: April 4, 2013
Related U.S. Patent Documents

Application Number: 61681851
Filing Date: Aug 10, 2012
Current U.S. Class: 715/800

Current CPC Class: G06F 3/04847 20130101; G06F 3/04842 20130101; G06T 19/00 20130101; G06F 16/26 20190101; G06F 3/01 20130101; G06F 3/048 20130101; G06F 16/9537 20190101; G06F 3/04815 20130101; G06F 16/4393 20190101; G06T 11/206 20130101; G06F 16/9038 20190101; G06F 3/0481 20130101; G06F 40/18 20200101; G06T 13/00 20130101; G06F 16/248 20190101; G06T 15/00 20130101; G06F 3/0482 20130101; G06F 40/166 20200101; G06F 16/50 20190101; G06F 16/444 20190101; G06F 16/2477 20190101; G06F 16/29 20190101; G06F 3/0488 20130101; G06F 40/169 20200101; G06T 15/10 20130101

Class at Publication: 715/800

International Class: G06F 3/0481 20060101 G06F003/0481
Claims
1. A computer, comprising: a processor; and a computer-readable
storage medium in communication with the processor, the
computer-readable storage medium comprising computer-executable
instructions stored thereupon that, when executed by the processor,
cause the processor to detect selected data to be rendered in a 3D
data environment, render the selected data in the 3D data
environment in a first zoom level, detect an input to change the
first zoom level to a second zoom level, determine if the selected
data is aggregated based on the input to change the first zoom
level to the second zoom level by determining if the selected data
is below a data aggregation level at the second zoom level,
aggregate the selected data and render the selected data as
aggregated data if the selected data is below the data aggregation
level at the second zoom level, and render the selected data at the
second zoom level if the selected data is above the data
aggregation level at the second zoom level.
2. The computer of claim 1, wherein the data aggregation level
corresponds to a zoom region.
3. The computer of claim 2, wherein the zoom region comprises a
street level, a city level, a state level, a postal code level, a county
level, and a country level.
4. The computer of claim 3, wherein data associated with the zoom
region is derived from a geographic data store.
5. The computer of claim 2, wherein the zoom region comprises a
user-defined region or a geographic level corresponding to the
second zoom level.
6. The computer of claim 5, wherein the geographic level
corresponding to the second zoom level is determined using a
pre-existing relationship between the selected data and a
geographic entity.
7. The computer of claim 5, wherein the geographic level
corresponding to the second zoom level is determined by geometric
polygons.
8. The computer of claim 1, wherein the input to change the first
zoom level to the second zoom level is received by an input
detected at a navigation pane, a keyboard, a mouse, or a tactile
input at a touchscreen.
9. A method for navigating data, the method comprising: detecting,
at a computer executing a visualization component, selected data to
be rendered in a 3D data environment; rendering, at the computer,
the 3D data environment in a first orientation; rendering, at the
computer, the selected data in the 3D data environment in a first
view of the selected data; detecting, at the computer, an input to
change the first orientation; and determining, at the computer, if
the first view of the selected data is changed based on the input
to change the first orientation; if the first view of the selected
data is not changed based on the input to change the first
orientation, changing, at the computer, the first orientation and
maintaining the first view of the selected data; and if the first
view of the selected data is changed based on the input to change
the first orientation, changing, at the computer, the first view of
the selected data to a second view of the selected data and
changing the first orientation.
10. The method of claim 9, wherein receiving the input to change
the first orientation comprises receiving the input from a
constrained input source.
11. The method of claim 10, wherein the constrained input source
comprises a keyboard or a mouse selector button.
12. The method of claim 9, wherein receiving the input to change
the first orientation comprises receiving the input from an
unconstrained input source.
13. The method of claim 12, wherein the unconstrained input source
comprises a mouse wheel or a tactile input on a touchscreen.
14. The method of claim 9, wherein the input to change the first
orientation comprises a zoom in input, a zoom out input, a rotate
input, a pitch input, a yaw input, a pan input, or a tilt
input.
15. The method of claim 9, further comprising receiving a framing
input to frame at least a portion of the selected data rendered in
the 3D data environment, wherein the framing input comprises an
input at a spreadsheet application or an input at the 3D data
environment.
16. The method of claim 15, further comprising changing a viewing
aspect of the at least a portion of the selected data rendered in
the 3D data environment to be framed.
17. A computer-readable storage medium in communication with a
processor, the computer-readable storage medium having
computer-executable instructions stored thereupon that, when
executed by the processor, cause the processor to: detect selected
data to be rendered in a 3D data environment; render the 3D data
environment in a first orientation; render the selected data in the
3D data environment in a first view of the selected data; detect an
input to change the first orientation; and determine if the first
view of the selected data is to be changed based on the input to
change the first orientation; if the first view of the selected
data is not to be changed based on the input to change the first
orientation, change the first orientation based on the input to
change the first orientation and maintain the first view of the
selected data; and if the first view of the selected data is to be
changed based on the input to change the first orientation, change
the first view of the selected data to a second view of the
selected data and change the first orientation based on the input
to change the first orientation.
18. The computer-readable storage medium of claim 17, wherein the
input to change the first orientation comprises a zoom in input, a
zoom out input, a rotate input, a pitch input, a yaw input, a pan
input, or a tilt input.
19. The computer-readable storage medium of claim 17, wherein the
selected data comprises geocoding data used to render the selected
data in a map.
20. The computer-readable storage medium of claim 17, wherein the
second view of the selected data comprises a billboard comprising a
representation of at least a portion of the selected data.
Description
CROSS-REFERENCE TO RELATED APPLICATION
[0001] This application is related to and claims the benefit of
U.S. Provisional Patent Application No. 61/681,851, filed on Aug.
10, 2012, entitled "3D Visualization of Data in Geographical and
Temporal Contexts," the entirety of which is hereby incorporated by
reference.
BACKGROUND
[0002] A spreadsheet application, reporting application, or other
data presentation application may present data in a format for
users to gain insight into the data and the relationships contained
therein. Conventional spreadsheet applications present data to one or
more users in cells typically organized in a column/row
format. A user can input data into one or more cells or have data
automatically input into one or more cells from one or more data
stores or other sources of data. Additionally, the user can
populate additional spreadsheet data cells with data calculated
from other spreadsheet data cells. In this manner, the user can
interact with data in one convenient location, i.e., one or more
spreadsheets, rather than at each data source.
[0003] Although it provides several benefits, displaying data as
numbers or symbols in a spreadsheet environment can be limiting when
analyzing the data. For example, a user may suffer from visual
fatigue when viewing only numbers for an extended period of time.
Further, a user may suffer from mental fatigue when trying to analyze
vast amounts of data presented in numerical format. Thus, a user may
want to interact with the data in a format different from that of
conventional spreadsheet applications, such as numbers in a cell.
[0004] It is with respect to these considerations and others that
the disclosure made herein is presented.
SUMMARY
[0005] Concepts and technologies are described herein for providing
a three-dimensional ("3D") data environment navigation tool. In
accordance with some concepts and technologies disclosed herein,
the 3D data environment navigation tool allows a user to manipulate
a 3D data environment in which spreadsheet data is rendered. The 3D
data environment navigation tool may provide a user interacting
with the 3D data environment the ability to manipulate the viewing
angle of data rendered in a 3D data environment, thus allowing the
user to "tour" the data.
[0006] According to one aspect, disclosed herein is an illustrative
computer which includes a processor and a computer-readable storage
medium in communication with the processor, the computer-readable
storage medium having computer-executable instructions stored
thereupon which, when executed by the processor, cause the
processor to receive selected data to be rendered in a 3D data
environment, render the 3D data environment in a first orientation,
and render the selected data in the 3D data environment in a first
view of the selected data. The computer-readable storage medium
further has computer-executable instructions stored thereupon
which, when executed by the processor, cause the processor to
receive an input to change the first orientation and determine if
the first view of the selected data is changed based on the input
to change the first orientation. If the first view of the selected
data is not changed based on the input to change the first
orientation, the computer-executable instructions cause the
processor to change the first orientation and maintain the first
view of the selected data. If the first view of the selected data
is changed based on the input to change the first orientation, the
computer-executable instructions cause the processor to change the
first view of the selected data to a second view of the selected
data and change the first orientation.
[0007] According to an additional aspect, disclosed herein is a
method that includes receiving selected data to be rendered in a 3D
data environment, rendering the 3D data environment in a first
orientation, and rendering the selected data in the 3D data
environment in a first view of the selected data. The method
further includes receiving an input to change the first orientation
and determining if the first view of the selected data is changed
based on the input to change the first orientation. If the first
view of the selected data is not changed based on the input to
change the first orientation, the method includes changing the
first orientation and maintaining the first view of the selected
data. If the first view of the selected data is changed based on
the input to change the first orientation, the method includes
changing the first view of the selected data to a second view of
the selected data and changing the first orientation.
[0008] According to a further aspect, disclosed herein is an
illustrative computer-readable storage medium in communication with
a processor, the computer-readable storage medium having
computer-executable instructions stored thereupon which, when
executed by the processor, cause the processor to receive selected
data to be rendered in a 3D data environment, render the 3D data
environment in a first orientation and render the selected data in
the 3D data environment in a first view of the selected data. The
computer-executable instructions further include instructions
which, when executed by the processor, cause the processor to
receive an input to change the first orientation and determine if
the first view of the selected data is to be changed based on the
input to change the first orientation. If the first view of the
selected data is not to be changed based on the input to change the
first orientation, the computer-executable instructions cause the
processor to change the first orientation based on the input to
change the first orientation and maintain the first view of the
selected data. If the first view of the selected data is to be
changed based on the input to change the first orientation, the
computer-executable instructions cause the processor to change the
first view of the selected data to a second view of the selected
data and change the first orientation based on the input to change
the first orientation.
[0009] It should be appreciated that the above-described subject
matter may also be implemented as a computer-controlled apparatus,
a computer process, a computing system, or as an article of
manufacture such as a computer-readable storage medium. These and
various other features will be apparent from a reading of the
following Detailed Description and a review of the associated
drawings.
[0010] This Summary is provided to introduce a selection of
concepts in a simplified form that are further described below in
the Detailed Description. This Summary is not intended to identify
key features or essential features of the claimed subject matter,
nor is it intended that this Summary be used to limit the scope of
the claimed subject matter. Furthermore, the claimed subject matter
is not limited to implementations that solve any or all
disadvantages noted in any part of this disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
[0011] FIG. 1 is a system diagram of an illustrative operating
environment that may be used to implement various embodiments
disclosed herein.
[0012] FIG. 2 is a user interface ("UI") diagram showing
spreadsheet data that is selected to be rendered in a 3D data
environment, in accordance with some embodiments.
[0013] FIG. 3 is a UI diagram showing the rendering of the data
selected in FIG. 2 in a 3D data environment, in accordance with
some embodiments.
[0014] FIG. 4 is a line diagram showing a navigation pane for
navigating data rendered in a 3D data environment, in accordance
with some embodiments.
[0015] FIG. 5 is a line diagram showing an alternative navigation
pane for navigating data rendered in a 3D data environment, in
accordance with some embodiments.
[0016] FIG. 6 is a UI diagram showing aspects of rendering data in
a 3D data environment, in accordance with some embodiments.
[0017] FIG. 7 is a UI diagram showing additional aspects of
rendering data in a 3D data environment, in accordance with some
embodiments.
[0018] FIG. 8 is a line diagram showing a system for providing a
navigation tool, in accordance with some embodiments.
[0019] FIG. 9 is a line drawing showing another embodiment for
providing a navigation tool, in accordance with some
embodiments.
[0020] FIGS. 10A-10H are line drawings showing various aspects of
rendered data replacement techniques, in accordance with some
embodiments.
[0021] FIGS. 11A-11B are line drawings illustrating the framing of
data in a 3D data environment, in accordance with some
embodiments.
[0022] FIG. 12 is a flow diagram showing aspects of a method for
providing a 3D data environment navigation tool, in accordance with
some embodiments.
[0023] FIG. 13 illustrates a computer architecture for a device
capable of executing the software components presented herein, in
accordance with some embodiments.
[0024] FIG. 14 is a diagram illustrating a distributed computing
environment capable of implementing aspects of the embodiments
presented herein, in accordance with some embodiments.
[0025] FIG. 15 is a computer architecture diagram illustrating a
computing device architecture capable of implementing aspects of
the embodiments presented herein.
DETAILED DESCRIPTION
[0026] The following detailed description is directed to a 3D data
environment navigation tool. The 3D data environment navigation
tool can be used within an application to provide 3D visualizations
of data. The 3D data environment navigation tool may provide a user
with the ability to move the orientation of a view along one or
more axes of rotation. The 3D data environment navigation tool may
also provide the ability to zoom in or zoom out on the view. The 3D
data environment navigation tool may further provide support for
multiple input modes with which a user may input commands to change a view
of the data rendered in the 3D data environment.
[0027] As used herein, "3D" includes the simulation of a space with
three dimensions. In some examples, the three dimensions are
represented by a spatial coordinate system, such as a 3-dimensional
Euclidean space having three directional axes (e.g. X, Y, and Z).
As used herein, an "orientation" of an element in a 3D data
environment is based on coordinates along the three directional
axes. Further, as used herein, a change in the orientation of an
element in a 3D data environment can include changing the
coordinates of the element along at least one of the three
directional axes. Further, as used herein, a "viewing aspect" can
include a visual appearance, or view, of the 3D data environment to
a user observing the 3D data environment. In some configurations, a
user can input various navigation commands and/or interact with
various controls to change the orientation of the 3D data
environment. The navigation controls can include, but are not
limited to, inputs to pan, pitch, roll, yaw, zoom, tilt, and/or
rotate the 3D data environment. As used herein, "pitch" includes a
change in a viewing aspect by rotation about a lateral axis. As
used herein, "roll" includes a change in viewing aspect by rotation
about a longitudinal axis. As used herein, "yaw" includes a change
in viewing aspect by rotation about a vertical axis.
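By way of a non-limiting illustration, the pitch, roll, and yaw rotations defined above can be modeled with standard rotation matrices about the lateral, longitudinal, and vertical axes. The following Python sketch is illustrative only; the function names are hypothetical and are not part of the disclosed embodiments.

    import math

    def pitch_matrix(theta):
        # Pitch: rotation about the lateral (X) axis.
        c, s = math.cos(theta), math.sin(theta)
        return [[1, 0, 0], [0, c, -s], [0, s, c]]

    def yaw_matrix(theta):
        # Yaw: rotation about the vertical (Y) axis.
        c, s = math.cos(theta), math.sin(theta)
        return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

    def roll_matrix(theta):
        # Roll: rotation about the longitudinal (Z) axis.
        c, s = math.cos(theta), math.sin(theta)
        return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

    def rotate(point, m):
        # Apply a 3x3 rotation matrix to an (x, y, z) coordinate.
        return tuple(sum(m[i][j] * point[j] for j in range(3)) for i in range(3))

    # Pitch a point on the Z-axis 30 degrees about the lateral axis.
    print(rotate((0.0, 0.0, 1.0), pitch_matrix(math.radians(30))))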
[0028] While the subject matter described herein is presented in
the general context of program modules that execute in conjunction
with the execution of an operating system and application programs
on a computer system, those skilled in the art will recognize that
other implementations may be performed in combination with other
types of program modules. Generally, program modules include
routines, programs, components, data structures, and other types of
structures that perform particular tasks or implement particular
abstract data types. Moreover, those skilled in the art will
appreciate that the subject matter described herein may be
practiced with other computer system configurations, including
hand-held devices, multiprocessor systems, microprocessor-based or
programmable consumer electronics, minicomputers, mainframe
computers, and the like.
[0029] In the following detailed description, references are made
to the accompanying drawings that form a part hereof, and in which
are shown by way of illustration specific embodiments or examples.
Referring now to the drawings, in which like numerals represent
like elements throughout the several figures, aspects of a
computing system, computer-readable storage medium, and
computer-implemented methodologies for a 3D data environment
navigation tool and other aspects will be presented.
[0030] Referring now to FIG. 1, aspects of an operating environment
100 for implementing various embodiments presented herein will be
described. The operating environment 100 shown in FIG. 1 includes a
computing device 102 operating on or in communication with the
network 118. In some embodiments, the computing device 102 can
include a desktop computer, a laptop computer, a notebook computer,
an ultra-portable computer, a netbook computer, or a computing
device such as a mobile telephone, a tablet device, a slate device,
a portable video game device, or the like. Illustrative
architectures for the computing device 102 are illustrated and
described herein below with reference to FIGS. 13-15. It should be
understood that the concepts and technologies disclosed herein are
not limited to an operating environment connected to a network or
any external computing system, as various embodiments of the
concepts and technologies disclosed herein can be implemented
locally on the computing device 102.
[0031] An operating system 101 is executing on the computing device
102. The operating system 101 is an executable program for
controlling functions on the computing device 102. The computing
device 102 also can execute a productivity application 104. The
productivity application 104, in some examples, is used by a user
106 to collect, store, manipulate and analyze data stored in
spreadsheet data 112. It should be appreciated that the spreadsheet
data 112 is represented as being stored in a single data store for
purposes of illustration. The spreadsheet data 112 may be stored in
one or more data stores accessible to the computing device 102.
Although the concepts and technologies disclosed herein are not
limited to any type of data stored in the spreadsheet data 112, in
some examples, data stored in the spreadsheet data 112 may be data
associated with various conditions, events, workflow processes,
business environments, and the like, which the user 106 may
use in the productivity application 104.
[0032] In some embodiments, the productivity application 104 may
include, but is not limited to, one or more productivity
application programs that are part of the MICROSOFT OFFICE family
of products from Microsoft Corporation in Redmond, Wash. The
examples of the application programs can include, but are not limited
to, members of the MICROSOFT WORD, MICROSOFT EXCEL, MICROSOFT
POWERPOINT, MICROSOFT ACCESS, MICROSOFT VISIO, or MICROSOFT OUTLOOK
families of application programs. In the described embodiments, the
productivity application 104 is illustrated as including the
MICROSOFT EXCEL application program. The MICROSOFT EXCEL
application program is a spreadsheet application featuring various
functionalities including, but not limited to, calculation,
graphing tools, data pivot tables, and a macro programming language
called VISUAL BASIC for APPLICATIONS. It should be understood that
examples provided herein using MICROSOFT EXCEL are illustrative,
and should not be construed as limiting in any way.
[0033] In addition to accessing data from the spreadsheet data 112,
the productivity application 104 may also be configured to access
data from other sources. In one example, the user 106 may wish to
augment data stored in the spreadsheet data 112 with geographic
information. In such an example, the productivity application 104
may be configured to access map data 122. The productivity
application 104 may be configured to access the map data 122 at a
local storage device associated with the computing device 102
and/or may access the map data 122 via the network 118. The map
data 122 may include, among other information, geographic location
information, digital renderings of maps, and/or other information.
The 3D data visualization component 114 can be configured to
integrate the map data 122 into data stored in the spreadsheet data
112 to be rendered by the 3D data visualization component 114.
[0034] The operating environment 100 also can include a geocoding
component 120. The geocoding component 120 may be a component of
the computing device 102 or a separate component accessible to the
computing device 102. The geocoding component 120 can be accessed
by the productivity application 104 to map or correlate data stored
in the spreadsheet data 112 to location data included in or
represented by the map data 122. It should be appreciated that the
geocoding component 120 is illustrated as a separate component for
purposes of illustration only. In some examples, the geocoding
component 120 may be part of one or more other components or
programs, including, but not limited to, the productivity
application 104. The geocoding component 120 and/or the map data
122 may be provided using various data sources, including, but not
limited to, the BING mapping services provided by Microsoft
Corporation in Redmond, Wash. Because additional and/or alternative
mapping services are possible and are contemplated, it should be
understood that this example is illustrative, and should not be
construed as being limiting in any way.
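As a rough sketch of the correlation performed by the geocoding component 120, the following Python fragment maps spreadsheet rows to coordinates using a lookup table that stands in for a real mapping service; the table contents and field names are assumptions for illustration only.

    # Hypothetical stand-in for location data from the map data 122.
    MAP_DATA = {
        "98052": (47.6801, -122.1206),  # Redmond, WA
        "98101": (47.6101, -122.3344),  # Seattle, WA
    }

    def geocode_rows(rows, zip_column="zip"):
        # Attach (latitude, longitude) to each spreadsheet row by zip code.
        geocoded = []
        for row in rows:
            coords = MAP_DATA.get(row[zip_column])
            if coords is not None:
                geocoded.append({**row, "lat": coords[0], "lon": coords[1]})
        return geocoded

    rows = [{"store": "A", "zip": "98052"}, {"store": "B", "zip": "98101"}]
    print(geocode_rows(rows))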
[0035] The data stored in the spreadsheet data 112 can be rendered
in a 3D data environment by the 3D data visualization component
114. For example, data in the spreadsheet data 112 can be selected
and rendered. As discussed briefly above, the user 106 or another
entity may request rendering of the data to perform various
functions or tasks within the 3D data environment. For example, the
user 106 may request rendering of the data for purposes of
navigating through the data within the 3D data environment. In
another example, the user 106 may request rendering of the data for
purposes of creating or recording a "tour." As used herein, a
"tour" can refer to a created or recorded movement, path, and/or
collection of scenes within a 3D data environment corresponding to
the spreadsheet data 112. The tours can be saved and/or shared to
allow other users to view or watch the tour. Thus, a tour or
navigation of data or information can include manipulating an
orientation of the 3D data environment and/or simulating movement
through or around the 3D data environment. Thus, manipulating the
orientation of the 3D data environment includes moving or rotating
the 3D data environment about various geometric axes.
[0036] To manipulate the orientation of the 3D data environment to
perform various tasks, the user 106 can access a 3D data
environment navigation component 116 provided by the productivity
application 104. The 3D data environment navigation component 116
may provide the user 106 with one or more interfaces or can support
other input methods. The interfaces can be interacted with to
navigate through the 3D data environment rendered by the 3D data
visualization component 114. The user 106 can view an output of the
productivity application 104 using a display or screen. In the
illustrated embodiment, the output is shown on a monitor 108
presenting a display, user interface, or other representation
("display") 110. The display 110 can allow the user 106 to view
and/or visually interface with data stored in the spreadsheet data
112. The productivity application 104 can include a 3D data
visualization component 114. The 3D visualization component 114 can
be configured to allow the user 106 to experience data stored in
the spreadsheet data 112 in a 3D data environment. In particular,
the user 106 can use the 3D data visualization component 114 to
render data included in the spreadsheet data 112 in a 3D data
environment. In some embodiments, the 3D visualization component
114 renders data selected by the user 106. As described above, by
rendering selected data stored in the spreadsheet data 112 in a 3D
data environment, the user 106 may be able to gain additional
knowledge and/or share information about the data with other users,
for example via tours.
[0037] FIG. 2 is a UI diagram showing the selection of spreadsheet
data to be rendered in a 3D data environment. It should be
appreciated that the disclosure provided below using a spreadsheet
application is for purposes of clarity only and does not limit the
disclosure to a spreadsheet application, as other applications or
programs that allow a user to interact with data from various
sources may also be used. Illustrated in FIG. 2 is the display 110
that includes a representation of a portion of data contained in
spreadsheet 202. The spreadsheet 202 has columns 204A-G
(hereinafter collectively and/or generically referred to as
"columns 204") of data that can be stored in the spreadsheet data
112 of FIG. 1. It should be appreciated that the columns 204 may be
populated from data stored in the spreadsheet data 112 or other
sources, such as from databases, internet data sources and the
like.
[0038] In the spreadsheet 202, a column 204F has been populated
with zip codes retrieved from the map data 122 using the geocoding
component 120. There may be several ways in which the columns 204
or other data contained in the spreadsheet 202 can be populated
with data. For example, and not by way of limitation, the user 106
can manually enter in the data. In another example, and not by way
of limitation, the data may be automatically populated within the
spreadsheet 202 with data obtained from other sources such as, for
example, the geocoding component 120, the map data 122, and/or
other sources. Additionally, the data within the spreadsheet 202
may be based on other data and therefore need not originate from an
external source. For example, the data within the spreadsheet 202
may be the result of one or more arithmetic operations on data in
one or more of the columns 204. It should be understood that this
embodiment is illustrative, and should not be construed as being
limiting in any way.
[0039] Data in the spreadsheet 202 can be selected and/or rendered
in a 3D data environment. For example, if the user 106 desires to
render data within the spreadsheet 202 in a 3D data environment,
the user 106 can select one or more of the columns 204 of data
within the spreadsheet 202 and/or particular records included in
the spreadsheet 202 for rendering by the 3D data visualization
component 114. Rendering of the data in the spreadsheet 202 is
illustrated and described in additional detail below.
[0040] FIG. 3 is a UI diagram showing the rendering of the data
within the spreadsheet 202 selected in FIG. 2. In FIG. 3, the data
included in the spreadsheet 202 has been selected. For example, the
user 106 may select the data in one or more of the columns 204 of
data within the spreadsheet 202, and may have requested, commanded,
or directed that the 3D data visualization component 114 render the
selected data. It should be understood that the illustrated
rendering is illustrative and should not be construed as being
limiting in any way.
[0041] The rendering includes a map 300 showing the rendered data.
The map 300 is illustrated as being included in the display 110,
which can be presented on the monitor 108. The map 300 is
illustrated as having multiple data points 302, which can be spread
across and/or throughout the map 300 (in this example, a map of the
United States). As shown, the data points 302 can include clusters
of data points 304A, 304B and 304C (hereinafter collectively and/or
generically referred to as "clusters 304"). The clusters 304 can
include groups and/or sets of the data points 302. Although the map
300 may provide useful information in a default configuration or a
default display format, the user 106 may want to move or modify the
orientation of the map 300 rendered in the display 110 for various
purposes. Because a 3D data environment, like the map 300, can have
several axes of rotation associated with the visualization, the
user 106 can manipulate the 3D visualization using a 3D data
environment navigation component, such as 3D data environment
navigation component 116, to pan, zoom, tilt, and rotate (for
example) the map 300.
[0042] FIG. 4 is a line diagram showing a navigation pane 400 that
may be used in conjunction with the 3D data environment navigation
component 116. The navigation pane 400 can be used to manipulate
the view of the map 300 shown in FIG. 3. It should be understood
that the concepts and technologies disclosed herein are not limited
to the use of any particular type of navigation pane or any
configuration of controls available in a navigation pane, such as
the navigation pane 400 of FIG. 4. The navigation pane 400 and the
following description are illustrated for clarity and descriptive
purposes only and do not limit the concepts and technologies
disclosed herein in any way.
[0043] The navigation pane 400 includes a zoom bar 402 and a
navigation panel 404. One or more features provided in the zoom bar
402 can be used to submit an input to zoom in on and/or zoom out of
data rendered in the map 300 or other 3D data environments. The
zoom bar 402 can include one or more controls that can be
manipulated to perform a zoom function. For example, the zoom bar
402 can include a zoom out button 408 that, when selected, causes
the display 110 of the map 300 to be zoomed out to show a broader
or wider view of the map 300 relative to a view that was rendered
before selection of the zoom out button 408. The zoom bar 402 also
can include a zoom in button 410 that, when selected, causes the
display 110 of the map 300 to be zoomed in to show a more focused
or narrower view of the map 300 relative to a view that was
rendered before selection of the zoom in button 410. The zoom bar
402 also can include a zoom indicator 406, which can display a
level of zoom currently being applied to the map 300, or any other
3D data environment.
[0044] The navigation panel 404 also can be used to move or
manipulate the orientation of the data rendered in the map 300. The
navigation panel 404 can include one or more controls that can be
selected to navigate the 3D data environment. For example, the
navigation panel 404 can include an up control 414 that, when
selected, causes the orientation of the map 300 to move upward
along a Y-axis of spatial coordinates 424. The navigation panel 404
also can include a right control 416 that, when selected, causes
the orientation of the map 300 to move toward the right along a
Z-axis of the spatial coordinates 424. The navigation panel 404
also can include a down control 418 that, when selected, causes the
orientation of the map 300 to move downward along the Y-axis of the
spatial coordinates 424. The navigation panel 404 also can include
a left control 420 that, when selected, causes the orientation of
the map 300 to move leftward along the Z-axis of the spatial
coordinates 424.
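A minimal sketch of the zoom bar 402 and navigation panel 404 behavior described above follows; the step sizes, the zoom range, and the class name are assumptions, not part of the disclosure.

    class NavigationPane:
        def __init__(self):
            self.zoom = 50   # percent shown by the zoom indicator 406
            self.y = 0.0     # position along the Y-axis of coordinates 424
            self.z = 0.0     # position along the Z-axis of coordinates 424

        def zoom_in(self, step=10):          # zoom in button 410
            self.zoom = min(100, self.zoom + step)

        def zoom_out(self, step=10):         # zoom out button 408
            self.zoom = max(0, self.zoom - step)

        def up(self, step=1.0):              # up control 414
            self.y += step

        def down(self, step=1.0):            # down control 418
            self.y -= step

        def right(self, step=1.0):           # right control 416
            self.z += step

        def left(self, step=1.0):            # left control 420
            self.z -= step

    pane = NavigationPane()
    pane.zoom_in()
    pane.right()
    print(pane.zoom, pane.y, pane.z)  # 60 0.0 1.0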
[0045] Additional controls may be provided by menu button 422,
which when selected, may display one or more additional or
alternative navigation or control functions. Further, the
navigation panel 404 may be configured to allow the user 106 to
manipulate the orientation of the map 300 by rotating the
orientation of the map 300 about one or more of the spatial
coordinates 424. Thus, the navigation panel 404 can be interacted
with to tilt the map 300, to pan from one side to the other of the
map 300, to rotate the map 300, and/or to take other actions with
respect to the map 300.
[0046] Depending on the particular configuration, the navigation
pane 400 also may be configured to provide multiple types of
inputs and outputs. For example, the navigation panel 404
illustrated in FIG. 5 can be used to select or deselect one or more
features within the map 300 for viewing. In one embodiment, if a
menu button 422 is selected, the productivity application 104 can
modify the display to show the user 106 a feature panel 426. If the
feature panel 426 is selected from the menu button 422, the
navigation panel 404 can be replaced by the feature panel 426. It
should be understood that the feature panel 426 may also augment or
be rendered in addition to the navigation panel 404, as the concepts
and technologies disclosed herein are not limited to either
configuration.
[0047] The feature panel 426 can provide a feature list 428 showing
various features or illustrations in the 3D data environment that
can be selected using one or more selection buttons 430 to select
or deselect the features or illustrations from the view. For
example, the map 300 may show the properties of various data
rendered in the map 300. An input at the properties button 432 can be
received that instructs the 3D data visualization component 114 to
illustrate various properties of one or more data points rendered in the
map 300. Examples of some features may include, but are not limited
to: properties that may be shapes, sizes, or colors of the data; an
overlay that may be geographical information (such as a map)
associated with the location of the data; and a reset button to
reset the view of the 3D data environment back to a previous view.
It should be appreciated that the concepts and technologies
disclosed herein are not limited to any particular feature that may
be described herein with regards to the feature list 428.
[0048] FIG. 6 and FIG. 7 are UI diagrams showing aspects of
rendering data in a 3D data environment and navigating data within
a 3D data environment. Illustrated in FIG. 6 is a monitor 108 with
a 3D data environment 600 rendered in the display 110. The 3D data
environment 600 can include a rendering of data selected, for
example as described above with reference to FIG. 2, though this is
not necessarily the case. In the 3D data environment 600, by way of
example, the data selected can include building information,
location information, and sales information. In this example, the
3D data environment 600 has been configured so that the data
is rendered visually as buildings. In the 3D data environment 600,
locations of the buildings can correspond to location data of the
particular data point associated with the building, and the size of the
building can correspond to sales data associated with the
particular data point. It should be understood that this embodiment
is illustrative, and should not be construed as being limiting in
any way.
[0049] As shown, a store 602, a store 604, a store 606, a store
608, and a store 610 are placed in various locations within the 3D
data environment 600. Also shown is that the store 602, the store
604, the store 606, the store 608, and the store 610 are of various
sizes, with the store 602 being the largest. In the illustrated
embodiment, the store 602 can correspond to data indicating the
largest amount of sales. Similarly, the store 604 is shown as the
smallest, which can correspond to data indicating the smallest
amount of sales.
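The mapping from a data point to a rendered building might be sketched as follows; the field names and the linear sales-to-height rule are illustrative assumptions rather than the disclosed implementation.

    def data_point_to_building(point, max_sales, max_height=100.0):
        # Location fixes the building's position; sales fixes its height.
        return {
            "x": point["lon"],
            "z": point["lat"],
            "height": max_height * point["sales"] / max_sales,
        }

    stores = [
        {"name": "store 602", "lat": 47.68, "lon": -122.12, "sales": 900},
        {"name": "store 604", "lat": 47.61, "lon": -122.33, "sales": 150},
    ]
    top = max(s["sales"] for s in stores)
    for s in stores:
        print(s["name"], data_point_to_building(s, top))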
[0050] Although the 3D data environment 600 illustrated in FIG. 6
may provide useful information, a viewer or consumer of the 3D data
environment 600, for example the user 106, may manipulate the view
of the 3D data environment 600 within the display 110. According to
various embodiments, the data navigation panel 612 can be used to
manipulate the view of the 3D data environment 600 within display
110. By way of example, the data navigation panel 612 may include a
tilt button 614. The tilt button 614 can be used to tilt the 3D
data environment 600, resulting in the display 700 of FIG. 7,
though this is not necessarily the case.
[0051] In FIG. 7, a 3D data environment 700 is shown as an overhead
view of the 3D data environment 600 rendered in FIG. 6. As noted
above, the 3D data environment 700 shown in FIG. 7 can be shown in
response to selection of the tilt button 614 of the data navigation
panel 612. Because the 3D data environment 700 can be shown at
additional and/or alternative times, it should be understood that
this example is illustrative, and should not be construed as being
limiting in any way.
[0052] In FIG. 7, the store 602, the store 604, the store 606, the
store 608, and the store 610 are illustrated as 2D representations
(e.g. rectangles) of the data rendered in the 3D data environment
600. In this example, relative locations of the stores with respect
to one another may be more easily understood. It should be
appreciated that the types of view manipulation (e.g. pan, zoom,
and tilt) described above are merely illustrative and do not limit
the scope of the concepts and technologies disclosed herein in any
way. Further, it should be appreciated that the representation of
one or more navigation controls, such as the data navigation panel
612 of FIG. 7, is merely illustrative and does not limit the
concepts and technologies disclosed herein to any particular
configuration. Other controls may be used to navigate through data
rendered in a 3D data environment.
[0053] FIG. 8 is a system diagram showing a system providing a
navigation tool. Illustrated in FIG. 8 is the 3D data environment
600 rendered in the display 110 of the monitor 108 of a computer
800. The computer 800 can include a computing device, such as the
computing device 102 of FIG. 1, though this is not necessarily the
case. The computer 800 can be configured to provide a 3D data
environment navigation tool according to various embodiments
disclosed herein.
[0054] In one configuration, the 3D data environment 600 can be
manipulated using one or more input methods, such as a mouse 802
and/or a keyboard 804 of the computer 800. Additionally, the input
sources can be configured to be "constrained" or "unconstrained"
input sources. Constrained input sources, as used herein, can refer
to technologies, control techniques or methods having one or more
discrete or linear inputs such as, for example, a click of a mouse
button, depressing a key on a keyboard, or the like. An
unconstrained input source can refer to technologies, control
techniques or methods having one or more non-discrete or non-linear
inputs, such as the use of a mouse wheel or tactile input on a
touchscreen. It should be understood that the concepts and
technologies disclosed herein are not limited to any category of
constrained or unconstrained input source, as the concepts and
technologies disclosed herein may make use of various combinations
of these and/or other input sources in various configurations.
[0055] In some configurations, a pointer 806 may be moved to
various locations within the 3D data environment 600. A selector
button 808 can be pressed to cause, request, and/or trigger various
navigational controls. For example, the user 106 may move the
pointer 806 to a location 810 and double click the selector button
808. In this example, the computer 800 may be configured to receive
the double click selection and zoom in on the 3D data environment
600, in a fashion similar to the use of the zoom bar 402 as
described in FIGS. 4 and 5, above.
[0056] In another example, the selector button 808 can be selected
and held while the mouse 802 is moved along one axis, for example a
forward or backward direction, to effectuate a zoom control. In a
similar manner, the selector button 808 can be selected and held
while the mouse 802 is moved along another axis, for example a
leftward or rightward direction, to move the 3D data environment
600 along one or more axes in a manner similar to the use of the
data navigation panel 612 as disclosed in FIGS. 6 and 7.
Additionally, the data rendered in the 3D data environment 600 can
be explored by moving the pointer 806 to one of the data rendered,
such as the store 606, and selecting the store 606 using the selector
button 808 of the mouse 802. In some configurations, in response to a
selection of one or more data points for further exploration, the
computer 800 can modify the display 110 to frame, highlight, or
center the selected data, as explained by way of example in FIGS.
11A and 11B. In this configuration, the navigation method using the
mouse 802 may be termed a highly constrained navigation
control.
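One way to realize the drag behaviors described above is to map mouse deltas to zoom and pan commands while the selector button is held, as in this hedged Python sketch; the rates and the camera representation are assumptions.

    def handle_mouse_drag(camera, dx, dy, zoom_rate=0.01, pan_rate=0.05):
        # Vertical drag zooms (forward/backward motion); horizontal drag pans.
        camera["zoom"] = max(0.0, min(1.0, camera["zoom"] - dy * zoom_rate))
        camera["pan"] += dx * pan_rate
        return camera

    camera = {"zoom": 0.5, "pan": 0.0}
    # Dragging forward (negative dy) zooms in; dragging right pans right.
    print(handle_mouse_drag(camera, dx=4, dy=-10))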
[0057] FIG. 9 is a line drawing showing another embodiment for
providing a navigation tool. In this example, touch inputs can be
used to cause navigational commands or other input within the 3D
data environment 600. For example, the user 106 may touch a
location 900 with a right index finger 902 and perform selecting
acts similar to the selecting acts described using the selector
button 808 of mouse 802. In this example, the user 106 can touch
the surface of the monitor 108 in a manner similar to pushing down
the selector button 808 of mouse 802.
[0058] In another example, the user 106 can place and hold the
right index finger 902 on the location 900 and, thereafter while
maintaining contact with the surface of monitor 108, move the
location 900. In this configuration, by way of example, the user
106 can use a touch input to navigate through the 3D data
environment 600 in a manner similar to the use of the data
navigation panel 612 as described in FIGS. 6 and 7. In another
example, the user 106 can place and hold the right index finger 902
on the location 900 and place and hold a left index finger 904 on a
location 906 and, while maintaining contact with the screen, move
the left index finger 904 and the right index finger 902 in
relation to each other (sometimes referred to as a "pinch gesture")
to effectuate a zoom navigation control, similar to the use of the
zoom bar 402 described in FIGS. 4 and 5.
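The pinch gesture could, for example, scale the zoom level by the ratio of the finger separations at the end and the start of the gesture; the sketch below is one plausible formulation, not the disclosed implementation.

    import math

    def touch_distance(p1, p2):
        return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

    def pinch_zoom(zoom, start_points, end_points):
        # Spreading fingers (ratio > 1) zooms in; pinching zooms out.
        ratio = touch_distance(*end_points) / touch_distance(*start_points)
        return max(0.0, min(1.0, zoom * ratio))

    # Finger separation doubles from 100 to 200 pixels: zoom 0.4 -> 0.8.
    print(pinch_zoom(0.4, [(100, 100), (200, 100)], [(50, 100), (250, 100)]))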
[0059] In some configurations, the user 106 may zoom in or zoom out
to a level that causes rendered data to become imperceptible or
causes a visually confusing view. For example, if the user 106
zooms out on the 3D data environment 600 in FIG. 6 to a certain
level, the rendered data, for example the store 602, the store 604,
the store 606, the store 608, and the store 610, may become so
close in location relative to each other that the rendered data
appears as a single point of data or the user 106 cannot readily
distinguish between each of the rendered data. In some
configurations, to avoid or reduce the effects of data rendered in
this way, an annotation, such as a billboard, can be displayed to
replace one or more portions of the rendered data.
[0060] FIGS. 10A-10H are line drawings showing various aspects of
rendered data replacement techniques. FIG. 10A illustrates an
illustrative scaling chart 1001 that may be used to determine how
rendered data is scaled when zoomed in on or zoomed out from. As
used herein, scaling includes the determination of the shapes,
sizes and/or placement of various data within a 3D data environment
(by way of example) based on a level of zoom applied to a 3D data
environment. It should be appreciated that an algorithm or other
program representing scaling chart 1001 can be included in the 3D
data visualization component 114.
[0061] The scaling chart 1001 may be used or applied by the 3D data
visualization component 114 to determine how various rendered data
appears in a 3D data environment once a zoom in or zoom out
navigation input is received from the user 106. The vertical axis
is entitled, "Scale", which represents the scaling applied to the
objects (e.g. the 3D representations of selected data) in a 3D data
environment. In this example, scaling can refer to the size of an
object based on a level of zoom. The horizontal axis, entitled,
"Camera Distance From Earth Surface", is representative of the
level of zoom applied to the 3D data environment, with the level
entitled, "Sea Level" being 100% zoom (or the closest the 3D data
environment can be rendered to a specific location), and the level
entitled, "Camera Ceiling" being 0% zoom (or the farthest the 3D
data environment can be rendered), with various levels in between.
It should be appreciated that titles of the levels of zoom
illustrated in FIG. 10A are merely illustrative, as other titles
may be used. Further, it should be appreciated that the concepts
and technologies disclosed herein are not limited to the particular
levels illustrated in FIG. 10A, as one or more levels may be
used.
[0062] In the configuration illustrated in FIG. 10A, between zoom
levels "Sea Level" and "Street Level", the data rendered in a 3D
data environment may not be scaled. When a zoom input, either zoom
in or zoom out, is received by the productivity application 104 at
zoom levels between "Sea Level" and "Street Level", the size of the
objects can remain the same. When a zoom input, either zoom in or
zoom out, is received by the productivity application 104 at zoom
levels between "Street Level" and "State Level", the sizes of the
objects can be decreased in an exponential (or curved) fashion. When
a zoom input, either zoom in or zoom out, is received by the
productivity application 104 at zoom levels between "State Level"
and "Country Level", the sizes of the objects can remain constant
from their previous size. When a zoom input, either zoom in or zoom
out, is received by the productivity application 104 at zoom levels
between "Country Level" and "Camera Ceiling", the objects can be
replaced by a "billboard" or other annotation that replaces a
portion of the object with another 3D object, such as text.
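The scaling behavior just described is, in effect, a piecewise function of camera distance. The following Python sketch captures that shape; the band boundaries (in arbitrary distance units) and the 0.25 floor are invented for illustration and do not come from the scaling chart 1001 itself.

    import math

    # Hypothetical band boundaries, in arbitrary camera-distance units.
    SEA_LEVEL, STREET, STATE, COUNTRY, CEILING = 0.0, 1.0, 500.0, 3000.0, 10000.0

    def object_scale(camera_distance):
        if camera_distance <= STREET:
            return 1.0            # Sea Level..Street Level: no scaling
        if camera_distance <= STATE:
            # Street Level..State Level: exponential decay down to 0.25.
            t = (camera_distance - STREET) / (STATE - STREET)
            return math.exp(-t * math.log(4))
        if camera_distance <= COUNTRY:
            return 0.25           # State..Country Level: hold previous size
        return None               # Country Level..Camera Ceiling: billboard

    for d in (0.5, 250.0, 1500.0, 5000.0):
        print(d, object_scale(d))  # None signals billboard replacement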
[0063] FIGS. 10B-C illustrate one configuration in which the sizes
of the rendered data are changed by a zoom input, and further
illustrate how, at a level of zoom, the rendered data is replaced
by another 3D object. As illustrated, a data point 1002 and a data
point 1004 are rendered in a 3D data environment 1000. By way of
illustration, an 80 percent zoom level is applied to the 3D data
environment 1000, resulting in the shapes and sizes of the data
point 1002 and the data point 1004 as illustrated. In FIG. 10C, a
30 percent zoom level has been applied to the 3D data environment
1000, resulting in a scaling operation in which the sizes of the
data point 1002 and the data point 1004 are smaller when compared
to the sizes of the data point 1002 and the data point 1004
illustrated in FIG. 10B. Further, the distance of the data point
1002 and the data point 1004 relative to each other can be smaller
when compared to the distance of the data point 1002 and the data
point 1004 illustrated in FIG. 10B.
[0064] As illustrated in FIG. 10C, it may be difficult, at this
zoom level, for a user 106 or other entity to visually
differentiate between the data point 1002 and the data point 1004.
If a still lower level of zoom is applied, for example, a zoom
level between "Country Level" and "Camera Ceiling" as described in
FIG. 10A, the 3D representations of the data point 1002 and the
data point 1004 may be of such a size and/or location that visually
differentiating between the data point 1002 and the data point 1004
in the 3D data environment 1000 may be difficult. Therefore, in
some configurations, a billboard 1006 may be substituted for the 3D
representations of data point 1002 and data point 1004. The
billboard 1006 can have information in textual form relating to the
data point 1002 and/or the data point 1004, thus providing the user
106 with some information relating to those data points while
minimizing the impact of a low zoom level.
[0065] In some configurations, the user 106 may want data to
visually aggregate based on various factors. For example, the user
106 may want data in a specific geographic region relating to one
zoom level aggregated at another zoom level. In some
implementations, the geographic region is determined by the 3D data
visualization component 114 using information about the rendered
data. In other implementations, the geographic region is specified
by the user 106 or other entity. The aggregation of data may
provide various benefits. The rendered data may be aggregated to a
degree that relates to the zoom level. For example, when reviewing
sales data on a country-wide basis, it may be beneficial to aggregate
data for a particular state and show the data as aggregated data
rather than individual city or county data. This may visually
reduce the amount of information presented to the user 106.
[0066] In some configurations, the 3D data visualization component
114 may automatically decide a geographic level, such as a city
level, state level, country level, and the like, to aggregate data
by, based on a zoom level. Further, even if the user 106 does
not specify all geographic levels in data, the 3D data
visualization component 114 may be configured to determine
aggregation geographic levels using various technologies. For
example, the 3D data visualization component 114 may use a
preexisting defined relationship between geographic entities to
determine data not previously provided.
[0067] For example, data may have a state/province column, but may
only provide data relating to New Jersey, Pennsylvania, and New
York. The 3D data visualization component 114 may be configured to
determine that New Jersey, Pennsylvania, and New York are in the
United States. Thus, the data may be aggregated at a country level
even though the data may not include the country. In some
configurations, instead of using previously known geographical
relationships, the user 106 may define the relationships using
known geographic levels. For example, the user 106 may define a
relationship of the Pacific North West to include the states of
Washington and Oregon. In another example, the user 106 may define
a relationship as a sales territory including Morris, Union, and
Camden Counties in New Jersey. The 3D data visualization component
114 may use these defined relationships to determine aggregation
levels.
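A small sketch of aggregation driven by preexisting and user-defined geographic relationships follows; the lookup tables, field names, and values are examples, not data from the disclosure.

    # Preexisting relationship between states and a country.
    STATE_TO_COUNTRY = {"New Jersey": "United States",
                        "Pennsylvania": "United States",
                        "New York": "United States"}

    # User-defined relationship, per the Pacific North West example above.
    USER_REGIONS = {"Pacific North West": {"Washington", "Oregon"}}

    def aggregate(rows, region_of):
        totals = {}
        for row in rows:
            region = region_of(row["state"])
            if region:
                totals[region] = totals.get(region, 0) + row["sales"]
        return totals

    rows = [{"state": "New Jersey", "sales": 10},
            {"state": "New York", "sales": 25},
            {"state": "Washington", "sales": 7}]

    # Country-level rollup using the preexisting relationship.
    print(aggregate(rows, STATE_TO_COUNTRY.get))   # {'United States': 35}

    # Rollup using the user-defined region instead.
    def user_region(state):
        return next((name for name, states in USER_REGIONS.items()
                     if state in states), None)
    print(aggregate(rows, user_region))            # {'Pacific North West': 7}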
[0068] In some configurations, the geographic aggregation level may
be determined using geometric polygons. The data may have
coordinates, including latitude/longitude, associated with it that
can be provided either by the user 106 or by using geographic data
from a data store such as the map data store 122. If the polygons
and coordinates are provided for a county, state, country, etc., the
3D data visualization component 114 may determine which data belongs
to a particular county, state, country, and the like, and may,
thereafter, aggregate the data using those levels. In some
configurations, the user 106 may also provide user-defined polygons
with related coordinates. The 3D data visualization component 114
may then determine which data lies in the user-defined polygons to
determine the data to aggregate in particular coordinates. These
and other aspects are further explained in relation to FIGS. 10E-10H.
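The polygon-based assignment described in this paragraph can be realized with a standard ray-casting point-in-polygon test, sketched below; the region and point values are illustrative.

    def point_in_polygon(point, polygon):
        # Ray casting: a horizontal ray from the point crosses the polygon's
        # edges an odd number of times if and only if the point is inside.
        x, y = point
        inside = False
        n = len(polygon)
        for i in range(n):
            x1, y1 = polygon[i]
            x2, y2 = polygon[(i + 1) % n]
            if (y1 > y) != (y2 > y):
                if x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
                    inside = not inside
        return inside

    # Assign each (lon, lat) data point to the first region containing it.
    regions = {"user region A": [(0, 0), (10, 0), (10, 10), (0, 10)]}
    for p in [(5, 5), (15, 5)]:
        owner = next((name for name, poly in regions.items()
                      if point_in_polygon(p, poly)), None)
        print(p, "->", owner)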
[0069] Shown in FIG. 10E is a data aggregation chart 1020. The data
aggregation chart 1020 has several zoom regions 1022. Illustrated
by way of example are zoom regions "STREET LEVEL," "CITY LEVEL,"
"STATE LEVEL," and "COUNTRY LEVEL." Other zoom regions may include,
but are not limited to, postal codes and county levels. The zoom
regions 1022 can be used to define levels above which data is
aggregated and below which data is not aggregated. The zoom bar
1024 is an exemplary mechanism for determining the level of zoom to
which data is aggregated. The zoom bar 1024 includes a zoom level
indicator 1026 and a relational data aggregation level 1028.
[0070] The zoom level indicator 1026 corresponds to a current zoom
level of a 3D data environment. The data aggregation level 1028
corresponds to the zoom level below which data is aggregated and
above which the data is rendered in the current zoom level. The
difference between the data aggregation level 1028 and the zoom
level indicator 1026 may be adjusted or may vary depending on
various configurations or settings. Further, the difference between
the data aggregation level 1028 and the zoom level indicator 1026
may correspond to the difference between zoom regions 1022. In one
implementation, the zoom level indicator 1026 may be set to various
levels of zoom greater than the data aggregation level 1028. For
example, the zoom level indicator 1026 may indicate a zoom level at
one level, whereas the data aggregation level 1028 may be set to
cause the aggregation of data one level below the zoom level
corresponding to the zoom level indicator 1026. So, in this
example, if the user 106 zooms out to the country level of the zoom
regions, the data at or below the state level of the zoom regions
may be aggregated and displayed in a 3D data environment as
aggregated, not separate, data.
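The relationship between the zoom level indicator 1026 and the data
aggregation level 1028 described above can be expressed in a short
sketch; the region names and the default offset of one zoom region
are illustrative assumptions.

    # Sketch of the zoom-bar relationship described above: the data
    # aggregation level 1028 trails the zoom level indicator 1026 by
    # a configurable number of zoom regions.

    ZOOM_REGIONS = ["STREET LEVEL", "CITY LEVEL", "STATE LEVEL",
                    "COUNTRY LEVEL"]

    def aggregation_level(current_zoom, offset=1):
        """Return the region at or below which data is aggregated."""
        index = ZOOM_REGIONS.index(current_zoom)
        return ZOOM_REGIONS[max(index - offset, 0)]

    # Zooming out to the country level aggregates data at or below
    # the state level, as in the example above.
    print(aggregation_level("COUNTRY LEVEL"))  # STATE LEVEL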
[0071] FIGS. 10F and 10G further illustrate the data aggregation
aspect based on geographic zoom levels. A map 1040 is displayed in
a display 1042; rendered on the map 1040 are data clusters
1044A-1044E. The data clusters 1044A-1044E may correspond to sales
data rendered in the map 1040. The data clusters 1044A-1044E may
include sales data for a particular city. As shown in FIG. 10F, the
zoom level of the map 1040 may correspond to a city level. The data
clusters 1044A-1044E are positioned based on the locations of the
data associated with them.
[0072] The user 106 may want to zoom out from the map 1040 to see
more data associated with the data rendered in FIG. 10F. FIG. 10G
illustrates the display 1042 displaying a map 1046. The zoom level
of the map 1046 is at a country level. Shown are data points
1048A-1048G. The data points 1048A-1048G correspond to the data
rendered in FIG. 10F, but at a country level rather than the city
level illustrated in FIG. 10F. The data points 1048A-1048G are
aggregated
data of the particular states from which the data originates. For
example, data point 1048E corresponds to the data clusters
1044A-1044E of FIG. 10F. When zoomed to a country level, the data
clusters 1044A-1044E may be aggregated to the data point 1048E.
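By way of a non-limiting illustration, the following sketch
collapses per-city clusters into one aggregated data point per
state, positioning each aggregated point at the value-weighted
centroid of its members. The record layout and the centroid
placement rule are assumptions made for illustration.

    # Sketch of rolling city-level clusters up to state-level data
    # points when the view zooms out. The records are invented.
    from collections import defaultdict

    clusters = [  # (state, lat, lon, value)
        ("New Jersey", 40.74, -74.17, 120.0),
        ("New Jersey", 39.95, -75.12, 60.0),
        ("Pennsylvania", 39.95, -75.17, 80.0),
    ]

    def aggregate_clusters(clusters):
        """Return one (lat, lon, total) aggregated point per state."""
        groups = defaultdict(list)
        for state, lat, lon, value in clusters:
            groups[state].append((lat, lon, value))
        points = {}
        for state, members in groups.items():
            total = sum(v for _, _, v in members)
            lat = sum(la * v for la, _, v in members) / total
            lon = sum(lo * v for _, lo, v in members) / total
            points[state] = (lat, lon, total)
        return points

    print(aggregate_clusters(clusters))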
[0073] It should be understood that in some configurations the user
106 may not need to specify or provide data regarding the locations
to aggregate. For example, the 3D data visualization component 114
of FIG. 1 may be configured to take street-level data and, using
data provided by geographic data stores such as the map data store
122 of FIG. 1, derive the state, county, or country from the
data. In some configurations, the data may have regions already
provided, and thus, the 3D data visualization component 114 may not
need to determine the various geographic regions.
[0074] In some configurations, the user 106 may want to specify the
geographic regions to which data is aggregated. Returning to FIG.
10E, the data aggregation chart 1020 corresponds to geographic
regions, such as a street, city, state, or country. But, depending
on the data, different geographic regions may have very different
amounts of data associated with them. For example, data relating to
the sale of snow skis may have highly dense clusters around
locations known for snow skiing, while the same data may have large
sparse regions not typically associated with snow skiing.
Aggregating data in which the states are treated as equal
geographic boundaries may cause a loss of information due to the
aggregation of the dense clusters while providing little to no
useful information in states in which no data is present.
Therefore, in some configurations, the user 106 or other entity may
delineate specific locations of aggregation rather than using
geographic boundaries. An aspect of this is further illustrated in
FIG. 10H.
[0075] Illustrated in a map 1050 of FIG. 10H are user-defined
regions 1052A-1052F. It should be understood that the regions
1052A-1052F may be defined by various technologies. For example,
the user 106 may use various inputs to delineate the regions
1052A-1052F. Some of the inputs may include, but are not limited
to, graphical regions, regional definitions provided by a data
store such as the map data 122, organizational regions provided by
a company, and activity regions provided by an entity such as the
3D data visualization component 114. An example of activity regions
may be regions defined according to the overall activity of the
selected data. For example, it may be beneficial to use the
selected data to divide a 3D rendering of that data into regions
having the same or similar activity, whereby regions of large
activity are split into multiple regions to reduce the activity
within any one region, and regions of sparse activity are collected
together into one region to closely approximate the activity of the
other regions. It should be appreciated that the present disclosure
is not limited to regions that are user-defined, as other
technologies may be used to define the regions 1052A-1052F. For
example, the 3D data visualization component 114 may be configured
to automatically delineate the regions 1052A-1052F based on various
factors such as the data rendered in the map 1050.
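One greedy strategy for such activity-based regioning is sketched
below: sparse regions are merged until each group's total activity
approximates a target, while dense regions stand alone (or would be
split further). The greedy strategy itself is an assumption; the
disclosure does not prescribe a particular algorithm.

    # Greedy sketch of activity balancing: merge low-activity
    # regions so each resulting group approximates a target level
    # of activity. The region names and values are invented.

    def balance_regions(activity, target):
        """Merge low-activity regions toward `target` per group."""
        groups, current, total = [], [], 0.0
        for name, value in sorted(activity.items(),
                                  key=lambda kv: kv[1]):
            current.append(name)
            total += value
            if total >= target:
                groups.append((tuple(current), total))
                current, total = [], 0.0
        if current:  # leftover sparse regions form a final group
            groups.append((tuple(current), total))
        return groups

    activity = {"MT": 2.0, "WY": 1.0, "ID": 3.0,
                "CO": 40.0, "UT": 35.0}
    mean = sum(activity.values()) / len(activity)
    print(balance_regions(activity, mean))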
[0076] The regions 1052A-1052F define the locations in which data
is aggregated at a particular zoom level. An example is the region
line 1054, which delineates generally the western United States as
a single region. Data that may be displayed individually at lower
zoom levels, such as a city or street zoom level, may be aggregated
in the region 1052A when at a country zoom level. Thus, referring
back to the snow ski example provided above, using various
embodiments described herein, sparse data areas may be aggregated
together to more closely align with the amount of data associated
with the dense data areas.
[0077] In some configurations, the user 106 may want to analyze the
data rendered in a 3D data environment. FIGS. 11A-11B illustrate
one configuration in which the productivity application 104 changes
a viewing aspect of certain selected data rendered in a 3D data
environment based on an input from the user 106. FIG. 11A is a line
drawing illustrating the framing of data in a 3D data environment.
As used herein, a viewing aspect can refer to the appearance,
orientation, view, and the like of the data selected to be framed.
As illustrated in this configuration, the computing device
102 has rendered data 1100 from a spreadsheet 1102 into a 3D data
environment 1104, illustrated in FIG. 11B. While navigating through
the 3D data environment 1104, certain data rendered within the 3D
data environment 1104 may be the focus of additional analysis.
Illustrated in FIG. 11A are data points 1106 within 3D data
environment 1104 that have been selected for focus. The 3D data
environment 1104 of FIG. 11B is modified so that a viewing aspect
of the data points 1106 is changed in the framing area 1108.
[0078] It should be understood that the concepts and technologies
disclosed herein are not limited to any particular manner in which
the data in the 3D data environment 1104 is framed using framing
area 1108, as other methods or technologies may be used. Further,
it should be understood that framing as presently disclosed is not
limited to an operation originating with the data 1100 within the
spreadsheet 1102. For example, the productivity application 104 may
receive an input, wherein the input is a selection of certain
rendered data in the 3D data environment 1104. In response to the
selection, the data selected in the 3D data environment 1104 may be
framed in the spreadsheet 1102. Additionally, in some
configurations, when the data points 1106 are framed in the framing
area 1108, the 3D data environment 1104 may be configured to change
the view, for example, by zooming in or zooming out, depending on
various user preferences or the configuration of the system.
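One way a framing operation such as that shown in FIGS. 11A-11B
might adjust the view is to compute a bounding sphere around the
selected data points and position the camera far enough back to fit
it, as in the following sketch. The field of view and the padding
factor are illustrative assumptions.

    # Sketch of framing selected 3D data points: find their
    # bounding sphere and a camera distance that fits it in view.
    import math

    def frame_points(points, fov_degrees=60.0, padding=1.2):
        """Return (center, camera_distance) framing the points."""
        xs, ys, zs = zip(*points)
        center = (sum(xs) / len(xs), sum(ys) / len(ys),
                  sum(zs) / len(zs))
        # Radius of the bounding sphere around the centroid.
        radius = max(math.dist(center, p) for p in points)
        # Distance at which that sphere fills the field of view.
        distance = (padding * radius
                    / math.tan(math.radians(fov_degrees) / 2))
        return center, distance

    selected = [(1.0, 2.0, 0.5), (3.0, 1.0, 0.0), (2.0, 4.0, 1.0)]
    print(frame_points(selected))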
[0079] Turning now to FIG. 12, a flow diagram showing aspects of a
method 1200 for providing a 3D data environment navigation tool
within a 3D data environment is illustrated, according to an
illustrative embodiment. It should be understood
that the operations of the methods disclosed herein are not
necessarily presented in any particular order and that performance
of some or all of the operations in an alternative order(s) is
possible and is contemplated. The operations have been presented in
the demonstrated order for ease of description and illustration.
Operations may be added, omitted, and/or performed simultaneously,
without departing from the scope of the appended claims.
[0080] It also should be understood that the illustrated methods
can be ended at any time and need not be performed in their
entirety. Some or all operations of the methods, and/or
substantially equivalent operations, can be performed by execution
of computer-readable instructions included on a computer-storage
medium, as defined herein. The term "computer-readable
instructions," and variants thereof, as used in the description and
claims, is used expansively herein to include routines,
applications, application modules, program modules, programs,
components, data structures, algorithms, and the like.
Computer-readable instructions can be implemented on various system
configurations, including single-processor or multiprocessor
systems, minicomputers, mainframe computers, personal computers,
hand-held computing devices, microprocessor-based or programmable
consumer electronics, combinations thereof, and the like.
[0081] Thus, it should be appreciated that the logical operations
described herein are implemented (1) as a sequence of computer
implemented acts or program modules running on a computing system
and/or (2) as interconnected machine logic circuits or circuit
modules within the computing system. The implementation is a matter
of choice dependent on the performance and other requirements of
the computing system. Accordingly, the logical operations described
herein are referred to variously as states, operations, structural
devices, acts, or modules. These operations, structural devices,
acts, and modules may be implemented in software, in firmware, in
special-purpose digital logic, or in any combination thereof.
[0082] The operations of the method 1200 are described herein below
as being implemented, at least in part, by the productivity
application 104, the 3D data visualization component 114, and the
3D data environment navigation component 116, or combinations
thereof. One or more of the operations of the method 1200 may
alternatively or additionally be implemented, at least in part, by
similar components in either the computing device 102 or a
similarly configured server computer.
[0083] The method 1200 begins at operation 1202 and proceeds to
operation 1204, wherein the computing device 102 detects selected
data to be rendered in a 3D data environment. In some embodiments,
the data to be rendered can be selected by the user 106 or other
entity. The selection of the data can be received by the
productivity application 104 executing on the computing device 102.
Various methods can be used to select the data to be rendered in
the 3D data environment. For example, the data can be selected
using the keyboard 804, the mouse 802 or the monitor 108, if
configured to be a touchscreen capable of receiving tactile inputs.
It should be appreciated that the concepts and technologies
disclosed herein are not limited to any particular data selection
input method. Additionally, it should be appreciated that the
concepts and technologies disclosed herein are not limited to any
particular type of selected data. For example, data can be selected
from the spreadsheet data 112, the map data 122, other sources of
data (not shown), or any combination thereof.
[0084] From operation 1204, the method 1200 proceeds to operation
1206, wherein the computing device 102 renders the 3D data
environment. The 3D data environment can take various forms. In one
configuration, the 3D data environment can include a map in which
selected data can be geographically rendered. In another
configuration, the 3D data environment can include a
three-dimensional space, wherein one or more of the axes of the
space represent one or more data types to be rendered in the 3D
data environment. For example, the data can include sales
data, store opening data, and geographical location data. Thus, in
that example, the 3D data environment rendered can use the sales
data, store opening data, and geographical location data as the
axes in a three-dimensional space.
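By way of a non-limiting illustration, mapping data types onto the
axes of a three-dimensional space might look like the following
sketch; the column names are invented for illustration.

    # Sketch of mapping selected data columns onto the axes of a
    # 3D space. The records and column names are invented.
    records = [
        {"sales": 120.0, "store_opened": 2008, "longitude": -74.17},
        {"sales": 80.0,  "store_opened": 2011, "longitude": -75.17},
    ]
    AXES = ("sales", "store_opened", "longitude")  # x, y, z

    points = [tuple(r[a] for a in AXES) for r in records]
    print(points)  # [(120.0, 2008, -74.17), (80.0, 2011, -75.17)]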
[0085] From operation 1206, the method 1200 proceeds to operation
1208, wherein the selected data is rendered in the 3D data
environment that was rendered in operation 1206. The rendering of
the 3D data environment at operation 1206 and the selected data in
the 3D data environment at operation 1208 can be completed in
various manners. For example, if the selected data includes
geographical or location data, the selected data can be rendered in
a map such as the map 300 of FIG. 3. In another example, if the
selected data includes relative position and size information, the
selected data can be rendered in a 3D data environment such as the
3D data environment 600 in FIGS. 5-9.
[0086] From operation 1208, the method 1200 proceeds to operation
1210, wherein the computing device 102 detects an input to change
the orientation of the 3D data environment. In some configurations,
it
may be desirable or necessary to change the orientation of the 3D
data environment to further explore aspects of the selected data
rendered in the 3D data environment. For example, the user 106 may
want to take a tour of the selected data rendered in the 3D data
environment by changing the orientation (or view) of the selected
data. Furthermore, the user 106 can use various methods to change
the orientation of the 3D data environment, such as by using the
keyboard 804, the mouse 802 or the monitor 108 if configured to be
a touchscreen capable of receiving tactile input. The user also can
use various navigational controls, such as the navigation pane 400
of FIG. 4 or the data navigation panel 612 of FIGS. 6-9.
Additionally, the user 106 can use tactile input on a touchscreen
(e.g., finger, toe, or other device that provides a physical
interface with the touchscreen), as illustrated by way of example
in FIG. 9. Because the input to change the orientation of the 3D
data environment can be received in additional and/or alternative
manners, it should be understood that these examples are
illustrative, and should not be construed as being limiting in any
way. The method 1200 proceeds to operation 1212, where the
computing device 102 determines if the view of the selected data is
to be changed. In particular, the computing device 102 can
determine how the view of the selected data in a 3D data
environment is changed based on the input to change the orientation
of the 3D data environment at operation 1210.
[0087] Although not limited to any particular way in which the view
of the selected data rendered in a 3D data environment is changed
based on an input to change the orientation of the 3D data
environment, some examples include: changing the size of the data
based on a zoom out or zoom in operation, as illustrated by way of
example in FIGS. 10B and 10C; changing the data from one type of
configuration to another type of configuration, illustrated by way
of example as changing the data illustrated in FIGS. 10B and 10C to
a billboard illustrated in FIG. 10D; and/or changing the orientation
of the data in a manner similar to the change in orientation of the
3D data environment, as illustrated by way of example in FIGS. 6
and 7.
[0088] If it is determined that the view of the selected data
rendered in the 3D data environment is not to be changed, the
method 1200 proceeds to operation 1214. In operation 1214, the
computing device 102 can change the orientation of the 3D data
environment, and the method 1200 ends at operation 1216. In this
configuration, the view (or orientation) of the selected data
rendered in the 3D data environment was not changed even though the
orientation of the 3D data environment was changed. An example of
this configuration is illustrated in FIG. 10D, wherein the data
point 1002 and the data point 1004 were replaced by the billboard
1006 because the zoom level was between "Country Level" and "Camera
Ceiling," as described in FIG. 10A. Between those levels, in some
configurations, inputs to change the orientation of the 3D data
environment 1000 would not result in a change of the view of the
data point 1002 and the data point 1004.
[0089] If it is determined that the view of the selected data
rendered in the 3D data environment is to be changed, the method
1200 proceeds to operation 1218, wherein the change in the view of
the selected data is applied, and the method 1200 proceeds to
operation 1220, wherein the orientation of the 3D data environment
and the selected data are changed based on the input received at
operation 1210. The method thereafter ends at operation 1216.
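The control flow of the method 1200, including the branch at
operation 1212, can be summarized in the following sketch. The
arguments and the state dictionary are trivial stand-ins; the
disclosure defines operations, not an API.

    # Control-flow sketch of the method 1200 as described above.

    def method_1200(selected_data, orientation_input, change_view):
        """Mirror the branch at operation 1212 of FIG. 12."""
        state = {"environment": "rendered",    # operation 1206
                 "data": list(selected_data),  # operations 1204, 1208
                 "orientation": "initial"}
        # Operation 1210: an input to change orientation is detected.
        if change_view:                        # operation 1212
            # Operation 1218: apply the change in view of the data.
            state["view"] = "changed for " + orientation_input
            # Operation 1220: change orientation of environment/data.
            state["orientation"] = orientation_input
        else:
            # Operation 1214: change orientation of environment only.
            state["orientation"] = orientation_input
        return state                           # ends (operation 1216)

    print(method_1200(["point A", "point B"], "tilt left",
                      change_view=True))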
[0090] FIG. 13 illustrates an illustrative computer architecture
1300 for a device capable of executing the software components
described herein. Thus, the computer architecture 1300 illustrated
in FIG. 13 illustrates an architecture for a server computer, a
mobile phone, a PDA, a smart phone, a desktop computer, a netbook
computer, a tablet computer, and/or a laptop computer. The computer
architecture 1300 may be utilized to execute any aspects of the
software components presented herein.
[0091] The computer architecture 1300 illustrated in FIG. 13
includes a central processing unit ("CPU") 1302, a system memory
1304, including a random access memory ("RAM") 1306 and a read-only
memory ("ROM") 1308, and a system bus 1310 that couples the memory
1304 to the CPU 1302. A basic input/output system containing the
basic routines that help to transfer information between elements
within the computer architecture 1300, such as during startup, is
stored in the ROM 1308. The computer architecture 1300 further
includes a mass storage device 1312 for storing the operating
system 101 from FIG. 1 and one or more application programs
including, but not limited to, the productivity application 104,
the 3D data visualization component 114 and the 3D data environment
navigation component 116.
[0092] The mass storage device 1312 is connected to the CPU 1302
through a mass storage controller (not shown) connected to the bus
1310. The mass storage device 1312 and its associated
computer-readable media provide non-volatile storage for the
computer architecture 1300. Although the description of
computer-readable media contained herein refers to a mass storage
device, such as a hard disk or CD-ROM drive, it should be
appreciated by those skilled in the art that computer-readable
media can be any available computer storage media or communication
media that can be accessed by the computer architecture 1300.
[0093] Communication media includes computer readable instructions,
data structures, program modules, or other data in a modulated data
signal such as a carrier wave or other transport mechanism and
includes any delivery media. The term "modulated data signal" means
a signal that has one or more of its characteristics changed or set
in such a manner as to encode information in the signal. By way of
example, and not limitation, communication media includes wired
media such as a wired network or direct-wired connection, and
wireless media such as acoustic, RF, infrared and other wireless
media. Combinations of any of the above should also be included
within the scope of computer-readable media.
[0094] By way of example, and not limitation, computer storage
media may include volatile and non-volatile, removable and
non-removable media implemented in any method or technology for
storage of information such as computer-readable instructions, data
structures, program modules or other data. For example, computer
storage media includes, but is not limited to, RAM, ROM, EPROM,
EEPROM, flash memory or other solid state memory technology,
CD-ROM, digital versatile disks ("DVD"), HD-DVD, BLU-RAY, or other
optical storage, magnetic cassettes, magnetic tape, magnetic disk
storage or other magnetic storage devices, or any other medium that
can be used to store the desired information and which can be
accessed by the computer architecture 1300. For purposes of the
claims, the
phrase "computer storage medium," and variations thereof, does not
include waves or signals per se and/or communication media.
[0095] According to various embodiments, the computer architecture
1300 may operate in a networked environment using logical
connections to remote computers through a network such as the
network 118. The computer architecture 1300 may connect to the
network 118 through a network interface unit 1316 connected to the
bus 1310. It should be appreciated that the network interface unit
1316 also may be utilized to connect to other types of networks and
remote computer systems. The computer architecture 1300 also may
include an input/output controller 1318 for receiving and
processing input from a number of other devices, including a
keyboard, mouse, or electronic stylus (illustrated by way of
example in FIG. 13). Similarly, the input/output controller 1318
may provide output to a display screen, a printer, or other type of
output device.
[0096] It should be appreciated that the software components
described herein may, when loaded into the CPU 1302 and executed,
transform the CPU 1302 and the overall computer architecture 1300
from a general-purpose computing system into a special-purpose
computing system customized to facilitate the functionality
presented herein. The CPU 1302 may be constructed from any number
of transistors or other discrete circuit elements, which may
individually or collectively assume any number of states. More
specifically, the CPU 1302 may operate as a finite-state machine,
in response to executable instructions contained within the
software modules disclosed herein. These computer-executable
instructions may transform the CPU 1302 by specifying how the CPU
1302 transitions between states, thereby transforming the
transistors or other discrete hardware elements constituting the
CPU 1302.
[0097] Encoding the software modules presented herein also may
transform the physical structure of the computer-readable media
presented herein. The specific transformation of physical structure
may depend on various factors, in different implementations of this
description. Examples of such factors may include, but are not
limited to, the technology used to implement the computer-readable
media, whether the computer-readable media is characterized as
primary or secondary storage, and the like. For example, if the
computer-readable media is implemented as semiconductor-based
memory, the software disclosed herein may be encoded on the
computer-readable media by transforming the physical state of the
semiconductor memory. For example, the software may transform the
state of transistors, capacitors, or other discrete circuit
elements constituting the semiconductor memory. The software also
may transform the physical state of such components in order to
store data thereupon.
[0098] As another example, the computer-readable media disclosed
herein may be implemented using magnetic or optical technology. In
such implementations, the software presented herein may transform
the physical state of magnetic or optical media, when the software
is encoded therein. These transformations may include altering the
magnetic characteristics of particular locations within given
magnetic media. These transformations also may include altering the
physical features or characteristics of particular locations within
given optical media, to change the optical characteristics of those
locations. Other transformations of physical media are possible
without departing from the scope and spirit of the present
description, with the foregoing examples provided only to
facilitate this description.
[0099] In light of the above, it should be appreciated that many
types of physical transformations take place in the computer
architecture 1300 in order to store and execute the software
components presented herein. It also should be appreciated that the
computer architecture 1300 may include other types of computing
devices, including hand-held computers, embedded computer systems,
personal digital assistants, and other types of computing devices
known to those skilled in the art. It is also contemplated that the
computer architecture 1300 may not include all of the components
shown in FIG. 13, may include other components that are not
explicitly shown in FIG. 13, or may utilize an architecture
completely different than that shown in FIG. 13.
[0100] FIG. 14 illustrates an illustrative distributed computing
environment 1400 capable of executing the software components
described herein for providing the concepts and technologies
described herein. Thus, the distributed computing
environment 1400 illustrated in FIG. 14 can be used to provide the
functionality described herein. The distributed computing
environment 1400 thus may be utilized to execute any aspects of the
software components presented herein.
[0101] According to various implementations, the distributed
computing environment 1400 includes a computing environment 1402
operating on, in communication with, or as part of the network 118.
The network 118 also can include various access networks. One or
more client devices 1406A-1406N (hereinafter referred to
collectively and/or generically as "clients 1406") can communicate
with the computing environment 1402 via the network 118 and/or
other connections (not illustrated in FIG. 14). In the illustrated
embodiment, the clients 1406 include a computing device 1406A such
as a laptop computer, a desktop computer, or other computing
device; a slate or tablet computing device ("tablet computing
device") 1406B; a mobile computing device 1406C such as a mobile
telephone, a smart phone, or other mobile computing device; a
server computer 1406D; and/or other devices 1406N. It should be
understood that any number of clients 1406 can communicate with the
computing environment 1402. It should be understood that the
illustrated clients 1406 and computing architectures illustrated
and described herein are illustrative, and should not be construed
as being limiting in any way.
[0102] In the illustrated embodiment, the computing environment
1402 includes application servers 1408, data storage 1410, and one
or more network interfaces 1412. According to various
implementations, the functionality of the application servers 1408
can be provided by one or more server computers that are executing
as part of, or in communication with, the network 118. The
application servers 1408 can host various services, virtual
machines, portals, and/or other resources. In the illustrated
embodiment, the application servers 1408 host one or more virtual
machines 1414 for hosting applications or other functionality.
According to various implementations, the virtual machines 1414
host one or more applications and/or software modules for providing
the functionality described herein. It should be understood that
this embodiment is illustrative, and should not be construed as
being limiting in any way. The application servers 1408 also host
or provide access to one or more Web portals, link pages, Web
sites, and/or other information ("Web portals") 1416.
[0103] According to various implementations, the application
servers 1408 also include one or more mailbox services 1418 and one
or more messaging services 1420. The mailbox services 1418 can
include electronic mail ("email") services. The mailbox services
1418 also can include various personal information management
("PIM") services including, but not limited to, calendar services,
contact management services, collaboration services, and/or other
services. The messaging services 1420 can include, but are not
limited to, instant messaging services, chat services, forum
services, and/or other communication services.
[0104] The application servers 1408 also can include one or more
social networking services 1422. The social networking services
1422 can include various social networking services including, but
not limited to, services for sharing or posting status updates,
instant messages, links, photos, videos, and/or other information;
services for commenting or displaying interest in articles,
products, blogs, or other resources; and/or other services. In some
embodiments, the social networking services 1422 are provided by or
include the FACEBOOK social networking service, the LINKEDIN
professional networking service, the MYSPACE social networking
service, the FOURSQUARE geographic networking service, the YAMMER
office colleague networking service, and the like. In other
embodiments, the social networking services 1422 are provided by
other services, sites, and/or providers that may or may not
explicitly be known as social networking providers. For example,
some web sites allow users to interact with one another via email,
chat services, and/or other means during various activities and/or
contexts such as reading published articles, commenting on goods or
services, publishing, collaboration, gaming, and the like. Examples
of such services include, but are not limited to, the WINDOWS LIVE
service and the XBOX LIVE service from Microsoft Corporation in
Redmond, Wash. Other services are possible and are
contemplated.
[0105] The social networking services 1422 also can include
commenting, blogging, and/or microblogging services. Examples of
such services include, but are not limited to, the YELP commenting
service, the KUDZU review service, the OFFICETALK enterprise
microblogging service, the TWITTER messaging service, the GOOGLE
BUZZ service, and/or other services. It should be appreciated that
the above lists of services are not exhaustive and that numerous
additional and/or alternative social networking services 1422 are
not mentioned herein for the sake of brevity. As such, the above
embodiments are illustrative, and should not be construed as being
limiting in any way.
[0106] As shown in FIG. 14, the application servers 1408 also can
host other services, applications, portals, and/or other resources
("other resources") 1424. The other resources 1424 can include, but
are not limited to, the productivity application 104, the 3D data
visualization component 114 and/or the 3D data environment
navigation component 116. It thus can be appreciated that the
computing environment 1402 can provide integration of the concepts
and technologies disclosed herein with various mailbox, messaging,
social networking, and/or other services or resources.
[0107] As mentioned above, the computing environment 1402 can
include the data storage 1410. According to various
implementations, the functionality of the data storage 1410 is
provided by one or more data stores operating on, or in
communication with, the network 118. The functionality of the data
storage 1410 also can be provided by one or more server computers
configured to host data for the computing environment 1402. The
data storage 1410 can include, host, or provide one or more real or
virtual datastores 1426A-1426N (hereinafter referred to
collectively and/or generically as "datastores 1426"). The
datastores 1426 are configured to host data used or created by the
application servers 1408 and/or other data. Although not
illustrated in FIG. 14, the datastores 1426 also can host or store
data stores 224A-224N in data store 224 shown in FIG. 2.
[0108] The computing environment 1402 can communicate with, or be
accessed by, the network interfaces 1412. The network interfaces
1412 can include various types of network hardware and software for
supporting communications between two or more computing devices
including, but not limited to, the clients 1406 and the application
servers 1408. It should be appreciated that the network interfaces
1412 also may be utilized to connect to other types of networks
and/or computer systems.
[0109] It should be understood that the distributed computing
environment 1400 described herein can provide any aspects of the
software elements described herein with any number of virtual
computing resources and/or other distributed computing
functionality that can be configured to execute any aspects of the
software components disclosed herein. According to various
implementations of the concepts and technologies disclosed herein,
the distributed computing environment 1400 provides the software
functionality described herein as a service to the clients 1406. It
should be understood that the clients 1406 can include real or
virtual machines including, but not limited to, server computers,
web servers, personal computers, mobile computing devices, smart
phones, and/or other devices. As such, various embodiments of the
concepts and technologies disclosed herein enable any device
configured to access the distributed computing environment 1400 to
utilize the functionality described herein.
[0110] Turning now to FIG. 15, an illustrative computing device
architecture 1500 is shown for a computing device that is capable
of executing various software components described herein for
navigating data within a 3D data environment. The computing device
architecture 1500 is applicable to computing devices that
facilitate mobile computing due, in part, to form factor, wireless
connectivity, and/or battery-powered operation. In some
embodiments, the computing devices include, but are not limited to,
mobile telephones, tablet devices, slate devices, portable video
game devices, and the like. Moreover, the computing device
architecture 1500 is applicable to any of the clients 1406 shown in
FIG. 14. Furthermore, aspects of the computing device architecture
1500 may be applicable to traditional desktop computers, portable
computers (e.g., laptops, notebooks, ultra-portables, and
netbooks), server computers, and other computer systems, such as
described herein with reference to FIG. 1. For example, the single
touch and multi-touch aspects disclosed herein below may be applied
to desktop computers that utilize a touchscreen or some other
touch-enabled device, such as a touch-enabled track pad or
touch-enabled mouse.
[0111] The computing device architecture 1500 illustrated in FIG.
15 includes a processor 1502, memory components 1504, network
connectivity components 1506, sensor components 1508, input/output
("I/O") components 1510, and power components 1512. In the
illustrated embodiment, the processor 1502 is in communication with
the memory components 1504, the network connectivity components
1506, the sensor components 1508, the I/O components 1510, and the
power components 1512. Although no connections are shown between
the individual components illustrated in FIG. 15, the components
can interact to carry out device functions. In some embodiments,
the components are arranged so as to communicate via one or more
busses (not shown).
[0112] The processor 1502 includes a central processing unit
("CPU") configured to process data, execute computer-executable
instructions of one or more application programs, and communicate
with other components of the computing device architecture 1500 in
order to perform various functionality described herein. The
processor 1502 may be utilized to execute aspects of the software
components presented herein and, particularly, those that utilize,
at least in part, a touch-enabled input.
[0113] In some embodiments, the processor 1502 includes a graphics
processing unit ("GPU") configured to accelerate operations
performed by the CPU, including, but not limited to, operations
performed by executing general-purpose scientific and engineering
computing applications, as well as graphics-intensive computing
applications such as high resolution video (e.g., 720P, 1080P, and
greater), video games, three-dimensional ("3D") modeling
applications, and the like. In some embodiments, the processor 1502
is configured to communicate with a discrete GPU (not shown). In
any case, the CPU and GPU may be configured in accordance with a
co-processing CPU/GPU computing model, wherein the sequential part
of an application executes on the CPU and the
computationally-intensive part is accelerated by the GPU.
[0114] In some embodiments, the processor 1502 is, or is included
in, a system-on-chip ("SoC") along with one or more of the other
components described herein below. For example, the SoC may include
the processor 1502, a GPU, one or more of the network connectivity
components 1506, and one or more of the sensor components 1508. In
some embodiments, the processor 1502 is fabricated, in part,
utilizing a package-on-package ("PoP") integrated circuit packaging
technique. Moreover, the processor 1502 may be a single core or
multi-core processor.
[0115] The processor 1502 may be created in accordance with an ARM
architecture, available for license from ARM HOLDINGS of Cambridge,
United Kingdom. Alternatively, the processor 1502 may be created in
accordance with an x86 architecture, such as is available from
INTEL CORPORATION of Santa Clara, Calif. and others. In some
embodiments, the processor 1502 is a SNAPDRAGON SoC, available from
QUALCOMM of San Diego, Calif., a TEGRA SoC, available from NVIDIA
of Santa Clara, Calif., a HUMMINGBIRD SoC, available from SAMSUNG
of Seoul, South Korea, an Open Multimedia Application Platform
("OMAP") SoC, available from TEXAS INSTRUMENTS of Dallas, Tex., a
customized version of any of the above SoCs, or a proprietary
SoC.
[0116] The memory components 1504 include a random access memory
("RAM") 1514, a read-only memory ("ROM") 1516, an integrated
storage memory ("integrated storage") 1518, and a removable storage
memory ("removable storage") 1520. In some embodiments, the RAM
1514 or a portion thereof, the ROM 1516 or a portion thereof,
and/or some combination of the RAM 1514 and the ROM 1516 is
integrated in the processor 1502. In some embodiments, the ROM 1516
is
configured to store a firmware, an operating system or a portion
thereof (e.g., operating system kernel), and/or a bootloader to
load an operating system kernel from the integrated storage 1518 or
the removable storage 1520.
[0117] The integrated storage 1518 can include a solid-state
memory, a hard disk, or a combination of solid-state memory and a
hard disk. The integrated storage 1518 may be soldered or otherwise
connected to a logic board upon which the processor 1502 and other
components described herein also may be connected. As such, the
integrated storage 1518 is integrated in the computing device. The
integrated storage 1518 is configured to store an operating system
or portions thereof, application programs, data, and other software
components described herein.
[0118] The removable storage 1520 can include a solid-state memory,
a hard disk, or a combination of solid-state memory and a hard
disk. In some embodiments, the removable storage 1520 is provided
in lieu of the integrated storage 1518. In other embodiments, the
removable storage 1520 is provided as additional optional storage.
In some embodiments, the removable storage 1520 is logically
combined with the integrated storage 1518 such that the total
available storage is made available and shown to a user as a total
combined capacity of the integrated storage 1518 and the removable
storage 1520.
[0119] The removable storage 1520 is configured to be inserted into
a removable storage memory slot (not shown) or other mechanism by
which the removable storage 1520 is inserted and secured to
facilitate a connection over which the removable storage 1520 can
communicate with other components of the computing device, such as
the processor 1502. The removable storage 1520 may be embodied in
various memory card formats including, but not limited to, PC card,
CompactFlash card, memory stick, secure digital ("SD"), miniSD,
microSD, universal integrated circuit card ("UICC") (e.g., a
subscriber identity module ("SIM") or universal SIM ("USIM")), a
proprietary format, or the like.
[0120] It can be understood that one or more of the memory
components 1504 can store an operating system. According to various
embodiments, the operating system includes, but is not limited to,
SYMBIAN OS from SYMBIAN LIMITED, WINDOWS MOBILE OS from Microsoft
Corporation of Redmond, Wash., WINDOWS PHONE OS from Microsoft
Corporation, WINDOWS from Microsoft Corporation, PALM WEBOS from
Hewlett-Packard Company of Palo Alto, Calif., BLACKBERRY OS from
Research In Motion Limited of Waterloo, Ontario, Canada, IOS from
Apple Inc. of Cupertino, Calif., and ANDROID OS from Google Inc. of
Mountain View, Calif. Other operating systems are contemplated.
[0121] The network connectivity components 1506 include a wireless
wide area network component ("WWAN component") 1522, a wireless
local area network component ("WLAN component") 1524, and a
wireless personal area network component ("WPAN component") 1526.
The network connectivity components 1506 facilitate communications
to and from the network 118, which may be a WWAN, a WLAN, or a
WPAN. Although a single network 118 is illustrated, the network
connectivity components 1506 may facilitate simultaneous
communication with multiple networks. For example, the network
connectivity components 1506 may facilitate simultaneous
communications with multiple networks via one or more of a WWAN, a
WLAN, or a WPAN.
[0122] The network 118 may be a WWAN, such as a mobile
telecommunications network utilizing one or more mobile
telecommunications technologies to provide voice and/or data
services to a computing device utilizing the computing device
architecture 1500 via the WWAN component 1522. The mobile
telecommunications technologies can include, but are not limited
to, Global System for Mobile communications ("GSM"), Code Division
Multiple Access ("CDMA") ONE, CDMA2000, Universal Mobile
Telecommunications System ("UMTS"), Long Term Evolution ("LTE"),
and Worldwide Interoperability for Microwave Access ("WiMAX").
Moreover, the network 118 may utilize various channel access
methods (which may or may not be used by the aforementioned
standards) including, but not limited to, Time Division Multiple
Access ("TDMA"), Frequency Division Multiple Access ("FDMA"), CDMA,
wideband CDMA ("W-CDMA"), Orthogonal Frequency Division
Multiplexing ("OFDM"), Space Division Multiple Access ("SDMA"), and
the like. Data communications may be provided using General Packet
Radio Service ("GPRS"), Enhanced Data rates for Global Evolution
("EDGE"), the High-Speed Packet Access ("HSPA") protocol family
including High-Speed Downlink Packet Access ("HSDPA"), Enhanced
Uplink ("EUL") or otherwise termed High-Speed Uplink Packet Access
("HSUPA"), Evolved HSPA ("HSPA+"), LTE, and various other current
and future wireless data access standards. The network 118 may be
configured to provide voice and/or data communications with any
combination of the above technologies. The network 118 may be
configured to or adapted to provide voice and/or data
communications in accordance with future generation
technologies.
[0123] In some embodiments, the WWAN component 1522 is configured
to provide dual-multi-mode connectivity to the network 118. For
example, the WWAN component 1522 may be configured to provide
connectivity to the network 118, wherein the network 118 provides
service via GSM and UMTS technologies, or via some other
combination of technologies. Alternatively, multiple WWAN
components 1522 may be utilized to perform such functionality,
and/or provide additional functionality to support other
non-compatible technologies (i.e., incapable of being supported by
a single WWAN component). The WWAN component 1522 may facilitate
similar connectivity to multiple networks (e.g., a UMTS network and
an LTE network).
[0124] The network 118 may be a WLAN operating in accordance with
one or more Institute of Electrical and Electronic Engineers
("IEEE") 802.11 standards, such as IEEE 802.11a, 802.11b, 802.11g,
802.11n, and/or future 802.11 standard (referred to herein
collectively as WI-FI). Draft 802.11 standards are also
contemplated. In some embodiments, the WLAN is implemented
utilizing one or more wireless WI-FI access points. In some
embodiments, one or more of the wireless WI-FI access points are
another computing device with connectivity to a WWAN that are
functioning as a WI-FI hotspot. The WLAN component 1524 is
configured to connect to the network 118 via the WI-FI access
points. Such connections may be secured via various encryption
technologies including, but not limited to, WI-FI Protected Access
("WPA"), WPA2, Wired Equivalent Privacy ("WEP"), and the like.
[0125] The network 118 may be a WPAN operating in accordance with
Infrared Data Association ("IrDA"), BLUETOOTH, wireless Universal
Serial Bus ("USB"), Z-Wave, ZIGBEE, or some other short-range
wireless technology. In some embodiments, the WPAN component 1526
is configured to facilitate communications with other devices, such
as peripherals, computers, or other computing devices via the
WPAN.
[0126] The sensor components 1508 include a magnetometer 1528, an
ambient light sensor 1530, a proximity sensor 1532, an
accelerometer 1534, a gyroscope 1536, and a Global Positioning
System sensor ("GPS sensor") 1538. It is contemplated that other
sensors, such as, but not limited to, temperature sensors or shock
detection sensors, also may be incorporated in the computing device
architecture 1500.
[0127] The magnetometer 1528 is configured to measure the strength
and direction of a magnetic field. In some embodiments, the
magnetometer 1528 provides measurements to a compass application
program stored within one of the memory components 1504 in order to
provide a user with accurate directions in a frame of reference
including the cardinal directions, north, south, east, and west.
Similar measurements may be provided to a navigation application
program that includes a compass component. Other uses of
measurements obtained by the magnetometer 1528 are
contemplated.
[0128] The ambient light sensor 1530 is configured to measure
ambient light. In some embodiments, the ambient light sensor 1530
provides measurements to an application program stored within one
of the memory components 1504 in order to automatically adjust the
brightness of a display (described below) to compensate for
low-light and high-light environments. Other uses of measurements
obtained by the ambient light sensor 1530 are contemplated.
[0129] The proximity sensor 1532 is configured to detect the
presence of an object or thing in proximity to the computing device
without direct contact. In some embodiments, the proximity sensor
1532 detects the presence of a user's body (e.g., the user's face)
and provides this information to an application program stored
within one of the memory components 1504 that utilizes the
proximity information to enable or disable some functionality of
the computing device. For example, a telephone application program
may automatically disable a touchscreen (described below) in
response to receiving the proximity information so that the user's
face does not inadvertently end a call or enable/disable other
functionality within the telephone application program during the
call. Other uses of proximity as detected by the proximity sensor
1532 are contemplated.
[0130] The accelerometer 1534 is configured to measure proper
acceleration. In some embodiments, output from the accelerometer
1534 is used by an application program as an input mechanism to
control some functionality of the application program. For example,
the application program may be a video game in which a character, a
portion thereof, or an object is moved or otherwise manipulated in
response to input received via the accelerometer 1534. In some
embodiments, output from the accelerometer 1534 is provided to an
application program for use in switching between landscape and
portrait modes, calculating coordinate acceleration, or detecting a
fall. Other uses of the accelerometer 1534 are contemplated.
[0131] The gyroscope 1536 is configured to measure and maintain
orientation. In some embodiments, output from the gyroscope 1536 is
used by an application program as an input mechanism to control
some functionality of the application program. For example, the
gyroscope 1536 can be used for accurate recognition of movement
within a 3D data environment of a video game application or some
other application. In some embodiments, an application program
utilizes output from the gyroscope 1536 and the accelerometer 1534
to enhance control of some functionality of the application
program. Other uses of the gyroscope 1536 are contemplated.
[0132] The GPS sensor 1538 is configured to receive signals from
GPS satellites for use in calculating a location. The location
calculated by the GPS sensor 1538 may be used by any application
program that requires or benefits from location information. For
example, the location calculated by the GPS sensor 1538 may be used
with a navigation application program to provide directions from
the location to a destination or directions from the destination to
the location. Moreover, the GPS sensor 1538 may be used to provide
location information to an external location-based service, such as
E911 service. The GPS sensor 1538 may obtain location information
generated via WI-FI, WIMAX, and/or cellular triangulation
techniques utilizing one or more of the network connectivity
components 1506 to aid the GPS sensor 1538 in obtaining a location
fix. The GPS sensor 1538 may also be used in Assisted GPS ("A-GPS")
systems.
[0133] The I/O components 1510 include a display 1540, a
touchscreen 1542, a data I/O interface component ("data I/O") 1544,
an audio I/O interface component ("audio I/O") 1546, a video I/O
interface component ("video I/O") 1548, and a camera 1550. In some
embodiments, the display 1540 and the touchscreen 1542 are
combined. In some embodiments two or more of the data I/O component
1544, the audio I/O interface component 1546, and the video I/O
component 1548 are combined. The I/O components 1510 may include
discrete processors configured to support the various interfaces
described below, or may include processing functionality built in
to the processor 1502.
[0134] The display 1540 is an output device configured to present
information in a visual form. In particular, the display 1540 may
present graphical user interface ("GUI") elements, text, images,
video, notifications, virtual buttons, virtual keyboards, messaging
data, Internet content, device status, time, date, calendar data,
preferences, map information, location information, and any other
information that is capable of being presented in a visual form. In
some embodiments, the display 1540 is a liquid crystal display
("LCD") utilizing any active or passive matrix technology and any
backlighting technology (if used). In some embodiments, the display
1540 is an organic light emitting diode ("OLED") display. Other
display types are contemplated.
[0135] The touchscreen 1542 is an input device configured to detect
the presence and location of a touch. The touchscreen 1542 may be a
resistive touchscreen, a capacitive touchscreen, a surface acoustic
wave touchscreen, an infrared touchscreen, an optical imaging
touchscreen, a dispersive signal touchscreen, an acoustic pulse
recognition touchscreen, or may utilize any other touchscreen
technology. In some embodiments, the touchscreen 1542 is
incorporated on top of the display 1540 as a transparent layer to
enable a user to use one or more touches to interact with objects
or other information presented on the display 1540. In other
embodiments, the touchscreen 1542 is a touch pad incorporated on a
surface of the computing device that does not include the display
1540. For example, the computing device may have a touchscreen
incorporated on top of the display 1540 and a touch pad on a
surface opposite the display 1540.
[0136] In some embodiments, the touchscreen 1542 is a single-touch
touchscreen. In other embodiments, the touchscreen 1542 is a
multi-touch touchscreen. In some embodiments, the touchscreen 1542
is configured to detect discrete touches, single touch gestures,
and/or multi-touch gestures. These are collectively referred to
herein as gestures for convenience. Several gestures will now be
described. It should be understood that these gestures are
illustrative and are not intended to limit the scope of the
appended claims. Moreover, the described gestures, additional
gestures, and/or alternative gestures may be implemented in
software for use with the touchscreen 1542. As such, a developer
may create gestures that are specific to a particular application
program.
[0137] In some embodiments, the touchscreen 1542 supports a tap
gesture in which a user taps the touchscreen 1542 once on an item
presented on the display 1540. The tap gesture may be used for
various reasons including, but not limited to, opening or launching
whatever the user taps. In some embodiments, the touchscreen 1542
supports a double tap gesture in which a user taps the touchscreen
1542 twice on an item presented on the display 1540. The double tap
gesture may be used for various reasons including, but not limited
to, zooming in or zooming out in stages. In some embodiments, the
touchscreen 1542 supports a tap and hold gesture in which a user
taps the touchscreen 1542 and maintains contact for at least a
pre-defined time. The tap and hold gesture may be used for various
reasons including, but not limited to, opening a context-specific
menu.
[0138] In some embodiments, the touchscreen 1542 supports a pan
gesture in which a user places a finger on the touchscreen 1542 and
maintains contact with the touchscreen 1542 while moving the finger
on the touchscreen 1542. The pan gesture may be used for various
reasons including, but not limited to, moving through screens,
images, or menus at a controlled rate. Multiple finger pan gestures
are also contemplated. In some embodiments, the touchscreen 1542
supports a flick gesture in which a user swipes a finger in the
direction the user wants the screen to move. The flick gesture may
be used for various reasons including, but not limited to,
scrolling horizontally or vertically through menus or pages. In
some embodiments, the touchscreen 1542 supports a pinch and stretch
gesture in which a user makes a pinching motion with two fingers
(e.g., thumb and forefinger) on the touchscreen 1542 or moves the
two fingers apart. The pinch and stretch gesture may be used for
various reasons including, but not limited to, zooming gradually in
or out of a website, map, or picture.
[0139] Although the above gestures have been described with
reference to the use of one or more fingers for performing the
gestures, other appendages such as toes or objects such as styluses
may be used to interact with the touchscreen 1542. As such, the
above gestures should be understood as being illustrative and
should not be construed as being limiting in any way.
[0140] The data I/O interface component 1544 is configured to
facilitate input of data to the computing device and output of data
from the computing device. In some embodiments, the data I/O
interface component 1544 includes a connector configured to provide
wired connectivity between the computing device and a computer
system, for example, for synchronization operation purposes. The
connector may be a proprietary connector or a standardized
connector such as USB, micro-USB, mini-USB, or the like. In some
embodiments, the connector is a dock connector for docking the
computing device with another device such as a docking station,
audio device (e.g., a digital music player), or video device.
[0141] The audio I/O interface component 1546 is configured to
provide audio input and/or output capabilities to the computing
device. In some embodiments, the audio I/O interface component 1546
includes a microphone configured to collect audio signals. In some
embodiments, the audio I/O interface component 1546 includes a
headphone jack configured to provide connectivity for headphones or
other external speakers. In some embodiments, the audio I/O
interface component 1546 includes a speaker for the output of audio
signals. In some embodiments, the audio I/O interface component
1546 includes an optical audio cable out.
[0142] The video I/O interface component 1548 is configured to
provide video input and/or output capabilities to the computing
device. In some embodiments, the video I/O interface component 1548
includes a video connector configured to receive video as input
from another device (e.g., a video media player such as a DVD or
BLURAY player) or send video as output to another device (e.g., a
monitor, a television, or some other external display). In some
embodiments, the video I/O interface component 1548 includes a
High-Definition Multimedia Interface ("HDMI"), mini-HDMI,
micro-HDMI, DisplayPort, or proprietary connector to input/output
video content. In some embodiments, the video I/O interface
component 1548 or portions thereof is combined with the audio I/O
interface component 1546 or portions thereof.
[0143] The camera 1550 can be configured to capture still images
and/or video. The camera 1550 may utilize a charge coupled device
("CCD") or a complementary metal oxide semiconductor ("CMOS") image
sensor to capture images. In some embodiments, the camera 1550
includes a flash to aid in taking pictures in low-light
environments. Settings for the camera 1550 may be implemented as
hardware or software buttons.
[0144] Although not illustrated, one or more hardware buttons may
also be included in the computing device architecture 1500. The
hardware buttons may be used for controlling some operational
aspect of the computing device. The hardware buttons may be
dedicated buttons or multi-use buttons. The hardware buttons may be
mechanical or sensor-based.
[0145] The illustrated power components 1512 include one or more
batteries 1552, which can be connected to a battery gauge 1554. The
batteries 1552 may be rechargeable or disposable. Rechargeable
battery types include, but are not limited to, lithium polymer,
lithium ion, nickel cadmium, and nickel metal hydride. Each of the
batteries 1552 may be made of one or more cells.
[0146] The battery gauge 1554 can be configured to measure battery
parameters such as current, voltage, and temperature. In some
embodiments, the battery gauge 1554 is configured to measure the
effect of a battery's discharge rate, temperature, age and other
factors to predict remaining life within a certain percentage of
error. In some embodiments, the battery gauge 1554 provides
measurements to an application program that is configured to
utilize the measurements to present useful power management data to
a user. Power management data may include one or more of a
percentage of battery used, a percentage of battery remaining, a
battery condition, a remaining time, a remaining capacity (e.g., in
watt hours), a current draw, and a voltage.
[0147] The power components 1512 may also include a power
connector, which may be combined with one or more of the
aforementioned I/O components 1510. The power components 1512 may
interface with an external power system or charging equipment via a
power I/O component (not illustrated).
[0148] Based on the foregoing, it should be appreciated that
concepts and technologies for providing a 3D data environment
navigation tool have been disclosed herein. Although the subject
matter presented herein has been described in language specific to
computer structural features, methodological and transformative
acts, specific computing machinery, and computer readable media, it
is to be understood that the invention defined in the appended
claims is not necessarily limited to the specific features, acts,
or media described herein. Rather, the specific features, acts and
mediums are disclosed as example forms of implementing the
claims.
[0149] The subject matter described above is provided by way of
illustration only and should not be construed as limiting. Various
modifications and changes may be made to the subject matter
described herein without following the example embodiments and
applications illustrated and described, and without departing from
the true spirit and scope of the present invention, which is set
forth in the following claims.
* * * * *