U.S. patent application number 15/591989 was filed with the patent office on 2017-05-10 and published on 2017-11-16 for VIEWPORT FOR MULTI APPLICATION USER INTERFACE. The applicant listed for this patent is SAP SE. The invention is credited to Florian Jann, Michael Krenkler, Tina Rauschenbach, Kai Richter, Jamila Schon, Emil Voutta, and Marc Arno Ziegler.

Application Number: 20170329483 (15/591989)
Family ID: 60294606
Publication Date: 2017-11-16

United States Patent Application 20170329483
Kind Code: A1
Jann; Florian; et al.
November 16, 2017
VIEWPORT FOR MULTI APPLICATION USER INTERFACE
Abstract
In one general aspect, a method and system are described for
generating a user interface. The method may include obtaining a
plurality of viewports, providing, for display in a display device,
the user interface depicting at least one of the plurality of
viewports in the display. In response to receiving a request to add
one or more additional viewports, the method may include generating
the one or more additional viewports, appending the one or more
additional viewports to the user interface, and generating an
updated user interface to include the plurality of viewports and
the one or more additional viewports. The method may also include
displaying the updated user interface with a selected one or more
of the additional viewports being scrolled into view on the display
of the display device.
Inventors: Jann; Florian (Heidelberg, DE); Krenkler; Michael (Wiesloch, DE); Voutta; Emil (Heidelberg, DE); Rauschenbach; Tina (Mannheim, DE); Ziegler; Marc Arno (Mauer, DE); Schon; Jamila (Heidelberg, DE); Richter; Kai (Muehltal, DE)

Applicant: SAP SE, Walldorf, DE

Family ID: 60294606
Appl. No.: 15/591989
Filed: May 10, 2017
Related U.S. Patent Documents

Application Number | Filing Date
62335888 | May 13, 2016
62335892 | May 13, 2016
62335895 | May 13, 2016
62335897 | May 13, 2016
62335899 | May 13, 2016
62335873 | May 13, 2016
62335875 | May 13, 2016
62335879 | May 13, 2016
62335883 | May 13, 2016
62335886 | May 13, 2016
62335887 | May 13, 2016
Current U.S. Class: 1/1
Current CPC Class: G06F 3/04817 20130101; G06F 3/04845 20130101; G06F 16/9577 20190101; G06F 3/0481 20130101; G06F 9/451 20180201; G06F 40/106 20200101; G06F 3/0482 20130101; G06F 3/04812 20130101; G06F 3/0484 20130101; G06F 3/0486 20130101; H04L 67/02 20130101; G06F 3/0485 20130101; G06F 16/2428 20190101; G06F 2203/04803 20130101; G06F 3/04847 20130101; G06F 8/38 20130101
International Class: G06F 3/0485 20130101 G06F003/0485; G06F 3/0482 20130101 G06F003/0482; G06F 3/0484 20130101 G06F003/0484
Claims
1. A computer-implemented method for generating a user interface,
the method comprising: obtaining a plurality of viewports, at least
some of the plurality of viewports including personalized content
from a plurality of sources and application functions integrated
from a plurality of different applications; providing, for display
in a display device, the user interface depicting at least one of
the plurality of viewports in the display, the plurality of
viewports being scrollable in the user interface; in response to
receiving a request to add one or more additional viewports,
generating the one or more additional viewports, appending the one
or more additional viewports to the user interface, and generating
an updated user interface to include the plurality of viewports and
the one or more additional viewports; and displaying the updated
user interface with a selected one or more of the additional
viewports being scrolled into view on the display of the display
device, wherein the selected one or more of the additional
viewports is based at least in part on a size of the display of the
display device.
2. The computer-implemented method of claim 1, wherein each
viewport is a partial view of the user interface and wherein the
plurality of sources include applications, social media platforms,
messaging facilities, and stored data pertaining to a role of a
user accessing the user interface.
3. The computer-implemented method of claim 1, further comprising
providing scrolling functionality between the plurality of
viewports, the scrolling functionality being animated by fading out
and downward scaling a first viewport before fading in and upward
scaling a second viewport in the user interface.
4. The computer-implemented method of claim 1, wherein a first
viewport in the plurality of viewports includes a container with a
user profile portion, a configuration settings portion, and a
plurality of controls, a second viewport in the plurality of
viewports includes a launch-able content container, a third
viewport in the plurality of viewports includes a notification
container, and a fourth viewport in the plurality of viewports
includes a header toolbar, the header toolbar including a plurality
of controls.
5. The computer-implemented method of claim 4, wherein the
plurality of controls are associated with a context of content
being displayed in one or more of the plurality of viewports.
6. The computer-implemented method of claim 1, wherein each
viewport provides a partial view of the user interface and wherein
the user interface has an adjustable surface area.
7. The computer-implemented method of claim 1, further comprising
resizing at least one viewport in the plurality of viewports, in
response to detecting new information is available to be displayed
in the user interface.
8. A system for generating a user interface, the system comprising:
a shell container, executing in a web browser and providing a
plurality of services for configuring a plurality of viewports in a
user interface; an application container, executing in the web
browser, the application container being programmed to, obtain the
plurality of viewports, at least some of the plurality of viewports
including personalized content from a plurality of sources and
application functions integrated from a plurality of different
applications; provide, for display in a display device, the user
interface depicting at least one of the plurality of viewports in
the display, the plurality of viewports being scrollable in the
user interface; in response to receiving a request to add one or
more additional viewports, generate the one or more additional
viewports, appending the one or more additional viewports to the
user interface, and generating an updated user interface to include
the plurality of viewports and the one or more additional
viewports; and display the updated user interface with a selected
one or more of the additional viewports being scrolled into view on
the display of the display device.
9. The system of claim 8, wherein each viewport is a partial view
of the user interface and wherein the plurality of sources include
applications, social media platforms, messaging facilities, and
stored data pertaining to a role of a user accessing the user
interface.
10. The system of claim 8, wherein the application container is
further programmed to provide scrolling functionality between the
plurality of viewports, the scrolling functionality being animated
by fading out and downward scaling a first viewport before fading
in and upward scaling a second viewport in the user interface.
11. The system of claim 8, wherein a first viewport in the
plurality of viewports includes a container with a user profile
portion, a configuration settings portion, and a plurality of
controls, a second viewport in the plurality of viewports includes
a launch-able content container, a third viewport in the plurality
of viewports includes a notification container, and a fourth
viewport in the plurality of viewports includes a header toolbar,
the header toolbar including a plurality of controls.
12. The system of claim 8, wherein each viewport provides a partial
view of the user interface and wherein the user interface has an
adjustable surface area.
13. The system of claim 8, wherein the application container is
further programmed to resize at least one viewport in the plurality
of viewports, in response to detecting new information is available
to be displayed in the user interface.
14. A computer program product for generating a user interface, the
computer program product being tangibly embodied on a
non-transitory computer-readable storage medium and comprising
instructions that, when executed by at least one computing device,
are configured to cause the at least one computing device to:
obtain a plurality of viewports, at least some of the plurality of
viewports including personalized content from a plurality of
sources and application functions integrated from a plurality of
different applications; provide, for display in a display device,
the user interface depicting at least one of the plurality of
viewports in the display, the plurality of viewports being
scrollable in the user interface; in response to receiving a
request to add one or more additional viewports, generate the one
or more additional viewports, append the one or more additional
viewports to the user interface, and generate an updated user
interface to include the plurality of viewports and the one or more
additional viewports; and display the updated user interface with a
selected one or more of the additional viewports being scrolled
into view on the display of the display device.
15. The computer program product of claim 14, wherein each viewport
is a partial view of the user interface and wherein the plurality
of sources include applications, social media platforms, messaging
facilities, and stored data pertaining to a role of a user
accessing the user interface.
16. The computer program product of claim 14, further comprising
scrolling functionality between the plurality of viewports, the
scrolling functionality being animated by fading out and downward
scaling a first viewport before fading in and upward scaling a
second viewport in the user interface.
17. The computer program product of claim 14, wherein a first
viewport in the plurality of viewports includes a container with a
user profile portion, a configuration settings portion, and a
plurality of controls, a second viewport in the plurality of
viewports includes a launch-able content container, a third
viewport in the plurality of viewports includes a notification
container, and a fourth viewport in the plurality of viewports
includes a header toolbar, the header toolbar including a plurality
of controls.
18. The computer program product of claim 17, wherein the plurality
of controls are associated with a context of content being
displayed in one or more of the plurality of viewports.
19. The computer program product of claim 14, wherein each viewport
provides a partial view of the user interface and wherein the user
interface has an adjustable surface area.
20. The computer program product of claim 14, further comprising
resizing at least one viewport in the plurality of viewports, in
response to detecting new information is available to be displayed
in the user interface.
Description
CROSS-REFERENCE TO RELATED APPLICATION
[0001] This application claims priority to and the benefit of U.S.
Provisional Application No. 62/335,888, filed May 13, 2016, U.S.
Provisional Application No. 62/335,892, filed May 13, 2016, U.S.
Provisional Application No. 62/335,895, filed May 13, 2016, U.S.
Provisional Application No. 62/335,897, filed May 13, 2016, U.S.
Provisional Application No. 62/335,899, filed May 13, 2016, U.S.
Provisional Application No. 62/335,873, filed May 13, 2016, U.S.
Provisional Application No. 62/335,875, filed May 13, 2016, U.S.
Provisional Application No. 62/335,879, filed May 13, 2016, U.S.
Provisional Application No. 62/335,883, filed May 13, 2016, U.S.
Provisional Application No. 62/335,886, filed May 13, 2016, and
U.S. Provisional Application No. 62/335,887, filed May 13, 2016,
each of which provisional application is incorporated by reference
in its entirety.
TECHNICAL FIELD
[0002] This description generally relates to user interfaces and
user experiences. The description, in particular, relates to
systems and techniques for providing a user experience for
accessing and viewing data and information related to multiple
software applications on a computing device.
BACKGROUND
[0003] Users may utilize or interact with multiple software
applications at the same time. The multiple applications may be
hosted on the same or different types of computer platforms or
systems and accessed from the users' client devices. In example
implementations, the different types of computer platforms or
systems may include, for example, SAP HANA, SAP ABAP, or other
enterprise-type computer platforms or systems.
[0004] In example implementations, the suite of the multiple
applications which an enterprise may deploy (and which users may
need to use for their work) may be large. A sample of the large
number of applications that may be deployed by an enterprise for
its operations may, for example, include applications in the areas
or domains of Finance, R&D, Engineering, Human Resources,
Manufacturing, etc. Different subsets of these applications may be
used in the work of enterprise personnel, who, for example, may
have a variety of different roles. Each user may have a need to use
a different respective subset of the multiple applications, based,
for example, on the user's role in the enterprise.
[0005] Consideration is now given to a viewport mechanism for
displaying content and applications in an expandable user
interface.
SUMMARY
[0006] A system of one or more computers can be configured to
perform particular operations or actions by virtue of having
software, firmware, hardware, or a combination of them installed on
the system that in operation causes or cause the system to perform
the actions. One or more computer programs can be configured to
perform particular operations or actions by virtue of including
instructions that, when executed by data processing apparatus,
cause the apparatus to perform the actions. One general aspect
includes a computer-implemented method for generating a user
interface. The method may include obtaining a plurality of
viewports. At least some of the plurality of viewports including
personalized content from a plurality of sources and application
functions integrated from a plurality of different applications.
The method may include providing, for display in a display device,
the user interface depicting at least one of the plurality of
viewports in the display. The plurality of viewports may be
scrollable in the user interface. In response to receiving a
request to add one or more additional viewports, the method may
include generating the one or more additional viewports, appending
the one or more additional viewports to the user interface, and
generating an updated user interface to include the plurality of
viewports and the one or more additional viewports. The method may
also include displaying the updated user interface with a selected
one or more of the additional viewports being scrolled into view on
the display of the display device. The selected one or more of the
additional viewports may be based at least in part on a size of the
display of the display device. Other embodiments of this aspect
include corresponding computer systems, apparatus, and computer
programs recorded on one or more computer storage devices, each
configured to perform the actions of the methods.
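For illustration only, the general aspect above can be sketched as plain logic that appends viewports to a user interface and selects one of the added viewports to scroll into view based on the display size. All names here (Viewport, appendViewports, selectViewportToScroll) and the fit-by-minimum-width selection policy are hypothetical assumptions, not taken from the application.

```typescript
// Hypothetical sketch: a user interface holds an ordered list of viewports;
// a request may append additional viewports, and one of the added viewports
// is scrolled into view, chosen at least in part based on display size.

interface Viewport {
  id: string;
  minWidthPx: number; // smallest display width on which this viewport is shown
}

interface UserInterface {
  viewports: Viewport[];
}

// Append one or more additional viewports, returning an updated interface.
function appendViewports(ui: UserInterface, added: Viewport[]): UserInterface {
  return { viewports: [...ui.viewports, ...added] };
}

// Select which of the newly added viewports to scroll into view: here, the
// first added viewport that fits the display (an assumed policy).
function selectViewportToScroll(
  added: Viewport[],
  displayWidthPx: number
): Viewport | undefined {
  return added.find((v) => v.minWidthPx <= displayWidthPx);
}
```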
[0007] Implementations may include one or more of the following
features. The computer-implemented method in which each viewport is
a partial view of the user interface and where the plurality of
sources include applications, social media platforms, messaging
facilities, and stored data pertaining to a role of a user
accessing the user interface. The computer-implemented method
further including providing scrolling functionality between the
plurality of viewports, the scrolling functionality being animated
by fading out and downward scaling a first viewport before fading
in and upward scaling a second viewport in the user interface. The
computer-implemented method where a first viewport in the plurality
of viewports includes a container with a user profile portion, a
configuration settings portion, and a plurality of controls, a
second viewport in the plurality of viewports includes a
launch-able content container, a third viewport in the plurality of
viewports includes a notification container, and a fourth viewport
in the plurality of viewports includes a header toolbar, the header
toolbar including a plurality of controls. The computer-implemented
method where the plurality of controls are associated with a
context of content being displayed in one or more of the plurality
of viewports. The computer-implemented method where each viewport
provides a partial view of the user interface and where the user
interface has an adjustable surface area. The computer-implemented
method further including resizing at least one viewport in the
plurality of viewports, in response to detecting new information is
available to be displayed in the user interface. Implementations of
the described techniques may include hardware, a method or process,
or computer software on a computer-accessible medium.
[0008] The details of one or more implementations are set forth in
the accompanying drawings and the description below. Further
features of the disclosed subject matter, its nature and various
advantages will be more apparent from the accompanying drawings,
the following detailed description, and the appended claims.
BRIEF DESCRIPTION OF THE DRAWINGS
[0009] FIG. 1A is a screen shot of an example personalized user
interface (UI) display, in accordance with the principles of the
present disclosure.
[0010] FIG. 1B is an illustration showing an example login screen
displayed in a shell main container.
[0011] FIG. 1C is an illustration showing an example launchpad
displayed in a shell main container.
[0012] FIG. 1D is an illustration showing an example active
application screen (an overview page) displayed in a shell main
container.
[0013] FIG. 1E is an illustration showing an example object page
displayed in a shell main container.
[0014] FIG. 1F is an illustration showing an example footer
toolbar.
[0015] FIG. 1G is an illustration showing an example me area that
can be displayed in a left container.
[0016] FIG. 1H is an illustration showing an example notification
area that can be displayed in a right container.
[0017] FIG. 1I is an illustration showing an example copilot user
interface.
[0018] FIG. 1J is an illustration of a timeline user interface that
can display timeline entries.
[0019] FIG. 2 is a diagram of an example system that can implement
the user interfaces and user experiences described herein.
[0020] FIG. 3 is a diagram of an example system that can implement
the launchpad for the user interfaces and user experiences
described herein.
[0021] FIGS. 4A-4C illustrate screenshots depicting examples of the
viewport.
[0022] FIGS. 5A-5E illustrate screenshots of example user
interfaces depicting viewports.
[0023] FIG. 6 is an illustration of an example process for
generating a user interface.
[0024] Like reference symbols in the various drawings indicate like
elements.
DETAILED DESCRIPTION
[0025] The present disclosure relates to graphical user interfaces
of software applications that display content, referred to herein
as the "main content," together with functions and other
information besides the main content, i.e., supplemental content.
Such applications may include, among other things, standalone
software programs with a built-in display module that generates a
graphical user interface (e.g., a viewport), as described in the
example embodiments herein. Alternatively, display functionality
may be provided separately, e.g., as an add-on package, a plug-in
or through a separate program that communicates with a main content
providing program via an Application Program Interface (API). The
main content providing program and/or the display program may be
executed locally on a user device and/or remotely, as a Web
application, for example.
[0026] Example embodiments are described in which a personalized
web interface (referred to herein as a viewport) is switchable to
display main content in a scrollable graphical user interface. The
viewport includes user selectable options to switch the display to
the functions and information, thus ensuring access to everything
the user may need in a convenient, space-saving, and visually appealing way. The options use minimal display space so that the user interface can include similar features across different computing devices.
[0027] Referring to FIG. 1A, an example display of a viewport 100 with a launchpad 101 is shown, in accordance with the principles of the present disclosure. Launchpad 101 may be included in a center
container 120 (e.g., "Work") with content relevant to the user's
work, domain, or role in the enterprise. A left side container 110
(e.g., "ME") with content personal to the user may pertain to
content in the launchpad 101 or may be independent of content in
the launchpad 101. A right side container 130 (e.g.,
"Notifications") may include notifications directed to the user
that pertain to content in the launchpad or other content. In some
implementations, these containers 110, 120, and 130 may be referred
to collectively herein as viewports. In some implementations, the
combined areas 110, 120 and 130 (or other screens) are referred to
collectively herein as a viewport. In accordance with the
principles of the present disclosure, the personalized web
interface may be presented as a uniquely integrated, multifaceted
user interface which may, in effect, transform a single-screen view
on the client computer device into three multifunctional screen
areas (e.g., Left/Center/Right "viewports").
[0028] In some implementations, the viewport may function as an
entry point to access software applications and associated content.
The viewport may be configured to provide a single screen view that
depicts three (or more) multifunctional screen areas. In one
example, the three areas are displayed in parallel as a left panel,
a center panel, and a right panel. The center panel may include a
workspace area which can display a launchpad (e.g., home screen) or
one or more active application areas (e.g., screens) that a user
has launched from the launchpad (or tile/link in the launchpad).
The left panel may include a Me Area that provides various
generalized functionalities related to the user and operation and
personalization of the environments described herein. The right
panel may include a Notifications Area that displays a broad array
of notification types (System Alerts, messages, reminders, tasks
alerts, etc.) in a customizable listing format.
[0029] In an example embodiment, the functions and information
described herein are assigned to at least one virtual extension of
a viewport. That is, a portion of the display area can be
displayed, while other portions are virtual extensions and only
displayed as a user or algorithm scrolls to place one or more of
the other portions into view. In one example, a virtual extension
can include a first extension area to the left of the viewport and
a second extension area to the right of the viewport. When the main
content is selected, the extension area(s) are hidden from
display.
[0030] In another example embodiment, the viewport is switched to
display selected supplemental content by triggering a graphical
icon inside a viewport. Alternatively, if the display is
touch-sensitive, a viewport may be switched by a touch gesture such
as a swiping motion towards or away from the corresponding
extension area. A viewport may be switched back to the main
content, e.g., by triggering a respective icon or using a
gesture.
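The switching behavior described above can be modeled, for illustration, as a small state machine over the main content and its two extension areas. The state names and the mapping of swipe direction to revealed area are assumptions made for this sketch and are not prescribed by the disclosure.

```typescript
// Hypothetical state model for viewport switching: the display shows the
// main content ("center") or one of its virtual extensions ("left", "right").
type ViewportState = "left" | "center" | "right";

// Switching may be triggered by a graphical icon or, on a touch-sensitive
// display, by a swipe gesture toward or away from the extension area.
function switchViewport(
  current: ViewportState,
  gesture: "swipeLeft" | "swipeRight"
): ViewportState {
  if (gesture === "swipeRight") {
    // Reveal the left extension, or return from the right one.
    return current === "right" ? "center" : "left";
  }
  // swipeLeft: reveal the right extension, or return from the left one.
  return current === "left" ? "center" : "right";
}
```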
[0031] In another example embodiment, trigger icons indicate when
new or unread information is available inside a respective
extension area. The indication can be a numerical counter, a
symbol, a special graphic or animation, etc. Thus, the user need
not leave the current context, i.e., the main content, to be
alerted to new information.
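As one illustration of a numerical-counter indication, a trigger icon's badge text could be derived from the unread count as follows; the empty-badge and "99+" cap conventions are assumptions of this sketch, not taken from the application.

```typescript
// Hypothetical badge for a trigger icon: indicate new or unread items in an
// extension area without the user leaving the main content.
function badgeLabel(unreadCount: number): string {
  if (unreadCount <= 0) return ""; // no indicator when nothing is new
  if (unreadCount > 99) return "99+"; // cap long counts (assumed convention)
  return String(unreadCount);
}
```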
[0032] In an example embodiment, the supplemental content is
displayed by moving the corresponding extension area over to the
viewport. The movement may be animated in the manner of a camera
pan. However, other movements such as instantaneous display or
fading in and out are also possible.
[0033] In yet another example embodiment, at least part of the main
content remains on display in the viewport when the extension area
is displayed. The main content may be shifted away from a central
portion of the viewport and reduced in size (e.g., scaled down to
75% of its original size) to direct the user's attention to the
supplemental content. In this display state, the main content may
be partially cut off by the border of the viewport.
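The shift-and-scale display state might be expressed as a CSS transform computed from which extension area is shown. The 75% scale factor comes from the description above; the 300-pixel shift amount and function name are hypothetical.

```typescript
// Hypothetical CSS transform for the main content when an extension area is
// shown: shift the content off-center and scale it down to 75% of its
// original size to direct attention to the supplemental content.
function mainContentTransform(shownExtension: "left" | "right" | "none"): string {
  if (shownExtension === "none") return "none";
  // Push the main content away from the revealed extension (assumed amount).
  const shiftPx = shownExtension === "left" ? 300 : -300;
  return `translateX(${shiftPx}px) scale(0.75)`;
}
```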
[0034] As shown in FIG. 1A, a Work viewport 120 is located in the
center of the display screen. The Work viewport 120 may, for
example, display either the launchpad 101 or an active application
screen that was previously selected or opened from the launchpad
tile array. The left Me viewport 110 may, for example, provide
various generalized functionalities related to the user and their
operation and personalization. The right Notifications viewport 130
may, for example, display one or more of a broad array of
notification types (System Alerts, messages, reminders, tasks
alerts, etc.) in a customizable listing format.
[0035] The launchpad or home screen in the viewport, which may be available at all times and in any application, may provide a clear
screen orientation for accessing corresponding application
information as well as generalized functionalities and navigations
without ever disrupting a user's context of their current task. On
a client computer device (e.g., a mobile device), which has a
limited display screen area, a personalized UI display may be
adapted to present fewer of the three multifunctional screen areas
or viewports on the device's limited display screen area. For
example, only the Center, Left/Center or Center/Right screen areas
or viewports may be presented on a mobile device's display
screen.
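The adaptation to limited display screen area can be sketched as a rule that chooses which of the three screen areas to present; the pixel breakpoints and the specific subsets chosen per device class here are assumptions for illustration only.

```typescript
// Hypothetical adaptation rule: on narrow screens, present fewer of the
// three multifunctional screen areas (Left/Center/Right viewports).
function visibleViewports(screenWidthPx: number): string[] {
  if (screenWidthPx < 600) return ["center"]; // phone: Work area only
  if (screenWidthPx < 1024) return ["center", "right"]; // tablet: Center/Right
  return ["left", "center", "right"]; // desktop: all three areas
}
```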
[0036] For convenience in description, the terms "Work viewport",
"center viewport", "launchpad", "home screen" and "home page" may
be used interchangeably herein because each may persist as a
user-configured starting point in which to access content.
[0037] A client computer device structure or framework provides a
viewport for a web interface for access to, or interaction with, a
suite of multiple and diverse applications (or data sources), in
accordance with the principles of the present disclosure. The
viewport can be used for the multiple and diverse applications and
may, for example, provide services to a user for
application-to-application navigation, personalization, search, and
incident creation. The Viewport may be designed to provide a
common, same, or unified user experience (UX) to the user when
launching, accessing, or interacting with one or more of the
multiple applications. In an example implementation, a backend or
gateway computer system (which may be connected to multiple
applications or hosts) may generate the viewport. The Viewport may
be delivered or presented as a web page on the client computer
device and serve as a single web-based entry point for multiple
applications and analytics across platforms and devices.
[0038] As indicated above, the content of the viewport may be
organized in one or more containers (e.g., main or center "shell"
container, left container, right container) for display on a
display screen of a client computer device. The main container may
contain the launchpad (e.g., home page), which may act as the
starting or focal location for initiating
application-to-application navigation, personalization, search, and
incident creation, just to name a few examples.
[0039] Each of the multiple applications may be represented by, or
delivered via, content (e.g., a graphical user element (GUI), link,
tile, factsheet, or other object) on the viewport (or within the
launchpad). Further, the content of the launchpad may be customized
or personalized to a user (e.g., based on user role, authorization
level, user interests or needs, etc.) for access to, or interaction
with, a selected subset of the multiple applications (or data
sources). Each of the selected subset of multiple applications may be represented by a specific object (e.g., a tile or link) on the
viewport (or within the launchpad). The specific object (e.g., tile
or link) may be identified or labelled by a name, title, or icon
indicating the specific application which the specific object
represents. The tile or link (e.g., by a single click) may be used
as an application launcher on the viewport (e.g., web interface) to
launch the application that the tile or link represents.
[0040] The tiles corresponding to the specific applications
represented on the launchpad may be organized as a group or array
of tiles in a "tiles area" of the UI hosting the launchpad.
Similarly, links corresponding to specific applications represented
on the launchpad may be organized as a list of links in a "links
area." A Design Time Tool (available, for example, in a menu
or via a tile or link on the launchpad) may allow users or
administrators to define which applications should be displayed as
links or tiles on the launchpad. Users/Administrators may
personalize the tiles area and the link list area to a user.
[0041] One or more containers of the viewport may have adjustable
amounts of displayed content (e.g., number of tiles) (and
correspondingly adjustable display size or display area) so that
the same viewport can be adapted for display on different-sized
display screens of different client device types (e.g., smartphone,
smart watches, laptops, work station, tablet, desktop computer,
etc.), and across all possible deployment options (e.g., on
premise, cloud, as-a-service, etc.). Which of the one or more containers are displayed on the display screen at a given moment may depend, for example, on the status of tasks or activities of the user
navigating the viewport, and also, for example, on the size of the
display screen of the client computer device available for
display.
[0042] In example implementations, a container (e.g., center
container, launchpad) may be used to display main or core content
for a user (e.g., application/tiles relevant to a user's work or
role). The launchpad may serve as a shell container to access all
content. Other containers may include different panels with
different floorplans for different content corresponding to user interests or activities (e.g., a "ME" panel displaying information
or personal data about a user, a "notifications center" displaying
notifications (e.g., e-mail, text messages, alerts, etc.) for the
user, a panel displaying discussion threads or boards, an Overview
Page, an Object Page (e.g., a floorplan to view, edit and create
objects), a panel displaying context and ad-hoc workflows, a panel
displaying dynamic sidebar information, a dynamic side content
panel, etc.). The dynamic side content is a layout control that allows additional content, such as a timeline, chat, or additional information, to be displayed in a way that flexibly adapts to different screen sizes. In some implementations, if no notifications are available, the launchpad may overtake space typically set aside for notifications. In some implementations, the launchpad may be placed with a visual effect, including sliding in from the top of a UI and bouncing into place in the UI.
[0043] In example implementations, the applications (which, for
example, may be a set of applications implemented on HTML5/CSS/JS
technology using SAPUI5 framework) delivered via launchpad 101 may
adhere to a consistent, responsive design that allows users to
seamlessly experience the applications across interaction
channels--desktop, tablet, mobile, etc. Further, the applications
delivered via the launchpad may include legacy applications
implemented on traditional platforms using legacy UI technologies
(e.g., FPM/WDA, SAPGUI for HTML, SAPGUI for Windows, etc.). Access
to legacy applications may, for example, be provided via
corresponding links in a links area of the personalized UI
display.
[0044] In an example implementation of the personalized UI display,
a start screen (e.g., main container, "launchpad" or home page) may
present assigned applications as so-called "tiles" (e.g., tile 150,
tile 151, tile 152, etc.). Tiles (which are user-activatable UI
elements) may only be used as application launchers for launching
applications and presenting the applications on the launchpad. An
App Descriptor defines a Navigation Intent (i.e., a Semantic Object
plus an Action) used to launch the transaction; a Title, Subtitle,
and Icon for the Application Launcher (i.e., the text of the tile);
and Parameters (e.g., an order number).
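The relationship between an App Descriptor and the tile it configures can be sketched as a plain data structure. The field names and the intent/URL formats below are illustrative assumptions for this sketch, not the actual descriptor schema of any particular framework:

```javascript
// Illustrative sketch of an app descriptor: field names and the
// intent format are assumptions made for this example.
function makeAppDescriptor({ semanticObject, action, title, subtitle, icon, parameters }) {
  return {
    // Navigation Intent = Semantic Object + Action
    intent: `${semanticObject}-${action}`,
    title,                        // main text of the tile
    subtitle,                     // secondary text of the tile
    icon,
    parameters: parameters || {}, // e.g., { orderNumber: "4711" }
  };
}

// Build the navigation target a tile would launch when activated.
function resolveIntent(descriptor) {
  const query = Object.entries(descriptor.parameters)
    .map(([key, value]) => `${key}=${encodeURIComponent(value)}`)
    .join("&");
  return query ? `#${descriptor.intent}?${query}` : `#${descriptor.intent}`;
}
```

For example, a sales-order tile built with `makeAppDescriptor({ semanticObject: "SalesOrder", action: "display", ... })` would resolve to an intent such as `#SalesOrder-display?orderNumber=4711`.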
[0045] A user may use these tiles (e.g., tile 150, tile 151, tile
152, etc.) to launch or navigate to specific applications.
Incorporated into the launchpad may be a launchpad Designer tool,
which allows assignment of tiles to users and user groups for
customization or personalization (e.g., based on user role) of
launchpad 101. As a general rule, each of the multiple applications
(for which launchpad 101 serves as an interface) may correspond to
at least one tile. An exception to the general rule may be for
factsheet applications, which need not be represented by tiles.
However, factsheets may optionally still be saved as and
represented by tiles on launchpad 101 if desired.
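The role-based assignment performed by a designer tool, as described above, can be illustrated as a simple filter: each tile carries a set of assigned user groups, and a user's launchpad shows only the tiles whose groups intersect the user's own. The data shapes here are assumptions for illustration:

```javascript
// Sketch of role-based tile assignment: assignments map tiles to
// user groups; a user's launchpad contains the tiles matching at
// least one of the user's groups. Shapes are illustrative.
function tilesForUser(assignments, userGroups) {
  return assignments
    .filter(a => a.groups.some(g => userGroups.includes(g)))
    .map(a => a.tile);
}
```

A finance user would thus see only finance-assigned tiles, while a user holding several roles would see the union of the corresponding tile sets.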
[0046] In accordance with the principles of the present disclosure,
a tile that represents an application (e.g., on launchpad 101 or
any other UI), apart from serving as a UI element or button for
launching the application and displaying the application
identifier, may be a container that displays different types of
additional information or content. The additional information may
include, for example, informative text, numbers, and charts. The
displayed tile content may be static or dynamic. The displayed tile
content may be dynamically updated and may include, for example,
data (e.g., trends or key performance indicators (KPIs), and
application status, etc.) supplied by the backend systems or
applications that the tile represents.
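The static-versus-dynamic tile content described above can be sketched as a container that optionally holds a backend data supplier. The class and its interface are hypothetical, a minimal illustration rather than an actual tile implementation:

```javascript
// Sketch of a tile container whose displayed content may be static
// (no supplier) or dynamically refreshed from a backend callback.
// The interface is a hypothetical illustration.
class Tile {
  constructor(title, supplier) {
    this.title = title;       // static part: application identifier
    this.supplier = supplier; // optional backend supplier of dynamic data
    this.content = null;
  }
  // Pull fresh content (e.g., a KPI, trend, or status) if dynamic.
  refresh() {
    if (this.supplier) {
      this.content = this.supplier();
    }
    return this.content;
  }
}
```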
[0047] The multiple applications described herein may be hosted on
the same or different types of computer platforms or systems
(possibly including some applications hosted on the client device
itself). In example implementations, the different types of
computer platforms or systems may include, for example, SAP HANA,
SAP ABAP, or other enterprise-type computer platforms or
systems.
[0048] In example implementations, the suite of the multiple
applications which an enterprise may deploy for its operations
(e.g., in the areas or domains of Finance, R&D, Engineering,
Human Resources, Manufacturing, etc.) may be large. Different
subsets of these applications may be used in the work of enterprise
personnel who may have a variety of different roles. Each user may
have a need to use a different respective subset of the multiple
applications, based, for example, on the user's role in the
enterprise.
[0049] In general, viewports (e.g., viewports 110, 120, and 130)
may each represent a partial view of a larger surface. By opening
up this surface beyond the borders of a window (i.e., beyond the
borders of the actual screen), the architecture described herein
can extend to larger screens and collaborative wall displays. For
example, if a screen or window is too small, the user will only see
the viewport that fits the screen or window. On the other hand, if
the virtual screen is wider
(e.g., multi-screen displays), the systems and methods described
herein can provide an advantage of allowing a widening of the
viewport to offer a panoramic view of the surface. While
maintaining the promise to responsively support small devices, the
systems and methods described herein offer the possibility to also
target larger displays.
[0050] The viewport also provides the advantage of a natural user
experience compared to the classical off-canvas designs that are
common in mobile applications. As shown in FIG. 1A, two off-screen
areas are shown, the Me area (e.g., viewport 110) with
user-specific information and a Notifications area (e.g., viewport
130) on the right. Each off-screen area is populated using
system-driven information. Users can access these areas through
actions in a shell bar on the top left and top right corners. The
transition that is shown upon accessing such content depicts a
smoothly animated lateral move that mimics the user's head turning
to the left and to the right in a panoramic view. User interaction
with the content can be mapped to mimic natural user (e.g., human)
gestures or input controls. The surface generated by the view
therefore removes any screen limitations. Such a surface offers
additional space for user-specific and system-driven data.
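The panoramic "head turn" navigation described above can be modeled minimally as a viewport that focuses on one of three areas laid side by side on the wider surface. The area names and clamping behavior are assumptions made for this sketch:

```javascript
// Minimal model of a viewport as a partial view of a wider surface
// holding three areas (Me area, main content, notifications).
// Names and behavior are illustrative.
class Viewport {
  constructor() {
    this.areas = ["me", "main", "notifications"];
    this.focus = 1; // start on the main content area
  }
  // Mimic a head turn to the left: bring the Me area into view.
  turnLeft() {
    if (this.focus > 0) this.focus -= 1;
    return this.areas[this.focus];
  }
  // Mimic a head turn to the right: bring notifications into view.
  turnRight() {
    if (this.focus < this.areas.length - 1) this.focus += 1;
    return this.areas[this.focus];
  }
  visibleArea() { return this.areas[this.focus]; }
}
```

Turning past either edge simply leaves the edge area in view, mirroring the fixed three-area layout rather than an infinite scroll.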
[0051] The Me Area is located in an off-screen area to the left.
Because this area is located off-screen, it is not permanently
visible to the user. In order for the Me Area to slide into view,
the user can click on the profile image located in the top left
corner of the screen--an action that mimics the user turning his or
her head to the left. This action will also trigger the viewport to
move to the left and the main content area to zoom out. As the Me
Area slides into view, the user will be able to access information
relevant to both the user and his or her usage environment. This
includes, for example, the user's profile picture and access to
online state, settings and preferences, a catalog of available apps
(App Finder), tools to personalize the current content in the main
area, and objects and apps recently visited by the user.
[0052] The Me Area may be available from each screen in the main
content area. On the background surface, the different areas
co-exist and influence one another. While most actions in the Me
Area are available independently of the current context, some of
the actions will be directly tied to the content shown in the main
content area. For example, settings will display the settings page
for the specific app in the main content area (not yet available).
Additionally, personalization options might only be available if
the respective screen is visible in the main area. In some
implementations, an option to allow users to view a list of their
most recently visited items is provided. This is especially useful
for those users who are used to working with a limited set of apps
or objects as it significantly simplifies their navigation.
[0053] The right off-screen area is dedicated to providing
system-driven information. This may include system-generated
notifications of events to which a user has subscribed. The system
may provide more live insights and actions, making a real-time push
channel increasingly important.
[0054] A notification center can provide system-generated
notifications from various sources such as the workflow inbox or
chat notifications. Notifications can be prioritized and grouped
into groups of similar items. Through these configurations, the
user will be able to access more information about a notification
and take immediate action.
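The prioritization and grouping described above can be sketched as grouping notifications by source and ordering each group by priority. The field names (`source`, `priority`) are assumptions for this illustration:

```javascript
// Sketch: group notifications by source (e.g., workflow inbox,
// chat) and sort each group by priority, highest first.
// Field names are illustrative assumptions.
function groupNotifications(notifications) {
  const groups = new Map();
  for (const n of notifications) {
    if (!groups.has(n.source)) groups.set(n.source, []);
    groups.get(n.source).push(n);
  }
  for (const items of groups.values()) {
    items.sort((a, b) => b.priority - a.priority);
  }
  return groups;
}
```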
[0055] Similar to the Me Area, the notification area is accessible
from every app that is shown in the main content area. Here, too,
the user can bring the notification area into focus through a
virtual turn of the head--that is, by clicking on the notification
icon on the top right corner of the screen.
[0056] The notification area exists independently of the
application in the main content area. The big difference between
this area and the notifications on the home page of the launchpad
is that the launchpad home area displays notifications within the
launch tiles. By separating the notifications from the tiles, the
design guides the user and makes the user aware of critical and
actionable issues immediately. Other types of information may be
suitable for display in the notification area, such as progress
indicators for long-running tasks (for example, for a build or
deployment process).
[0057] With the design of the viewport, the systems and methods
described herein can concurrently manage different screen areas
without sacrificing simplicity and responsiveness. The viewport
offers a partial view of a potentially infinite surface on which
content and functionality can be placed either in a fixed layout
with the three main areas, or in a more flexible layout of multiple
areas.
[0058] In one example, the Me Area slides into view from the left
to offer users access to various user-related information including
personalization, profile, settings and interaction history.
Similarly, the notification area slides into view from the right to
offer users access to system-driven information that helps them to
become aware of critical, real-time information. The notification
area may also offer other system-driven content.
[0059] FIG. 1B is an illustration showing an example login screen
110 displayed in the shell main container 104. The login screen 110
provides a UI that allows a user to enter credentials in order to
log into and begin a personalized and customized UX. In the example
shown in FIG. 1B, the login screen 110 appears to drop into the
shell main container 104 from a virtual extension area located
along a top of a display area. In some implementations, the virtual
extension area can be placed along the bottom of the display area.
In some implementations, the virtual extension area can be placed
to the left and/or the right of the display area.
[0060] FIG. 1C is an illustration showing an example launchpad 101
displayed in the shell main container 104. The launchpad 101 can be
a web-based entry point (or homepage) for enterprise applications
that can execute (run) across multiple platforms and computing
devices. In the example shown in FIG. 1C, the launchpad 101 appears
to drop into the shell main container 104 from the top of a display
area. In some implementations, the virtual extension area can be
placed along the bottom of the display area. In some
implementations, the virtual extension area can be placed to the
left and/or the right of the display area.
[0061] The launchpad 101 can serve as a bracket around (or a base
for) a set (or group) of enterprise applications, providing a
single point of entry for the set of enterprise applications. In
the example shown in FIG. 1C, the launchpad 101 presents (displays
on a screen of a computing device of a user) each application
represented by a tile. A tile can be a container that represents
the application. Each tile can display different types of content.
A user can interact with each tile to navigate to the specific
enterprise application associated with the tile. In addition, when
designing a tile to represent a specific application, a programmer
can assign a tile to a specific user or group of users. The
launchpad 101 can provide one or more services. The one or more
services can include, but are not limited to,
application-to-application navigation, personalization, role-based
application assignments, search, and incident creation.
[0062] The launchpad 101 can be a role based, personalized,
real-time and contextual aggregation point for business
applications and analytics. The launchpad 101 can run (execute) on
multiple computing devices including, but not limited to, desktop
computers and mobile computing devices such as laptop computers,
tablet computers, notebook computers, personal digital assistants
(PDAs), smartphones, mobile phones, smart watches, etc.). In
addition, the launchpad 101 can be deployed on multiple platforms
(e.g., Linux, Windows, Windows Phone, Mac®, iOS®, OS X®, Android®,
etc.).
[0063] The launchpad 101 includes tiles 114a-h. Each tile can
display different types of content. For example, tile 114a can be a
news and feeds tile that can enhance collaboration by providing a
user with information about the enterprise. The tiles 114a-h can be
individually color-coded. A color can represent a particular role
(e.g., finance, human resources, supply chain management (SCM),
customer relationship management (CRM), etc.). The tiles 114a-h can
be associated with a group 116. Tile 114f can be a key performance
indicator (KPI) tile. Tile 114b can be a basic launch tile. Tile
114d can be a monitoring tile. Tile 114g can display a comparison
chart for specific content.
[0064] The launchpad 101 includes a link list area 118 that
includes links 119a-f. The link list area 118 is an area on the
launchpad 101 that can provide links to enterprise applications
represented by the tiles 114a-h. For example, a user can select and
drag a tile from the tile area on the launchpad 101 into the link
list area 118 to create a link to the application associated with
(represented by) the tile. In some implementations, the launchpad
101 can include a footer toolbar (e.g., footer toolbar 132 as shown
in FIG. 1F). In some implementations, the footer toolbar can appear
to float over the content displayed in the launchpad 101.
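The drag-to-link interaction described above can be illustrated as deriving a link entry from a tile's title and navigation target. The object shapes are hypothetical and serve only to make the relationship concrete:

```javascript
// Sketch: dragging a tile from the tile area into the link list
// creates a link pointing at the same application the tile
// represents. Data shapes are illustrative.
function createLinkFromTile(linkList, tile) {
  return [...linkList, { text: tile.title, target: tile.intent }];
}
```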
[0065] In some implementations, the shell toolbar 108 can display a
search icon 111 and a copilot launch icon 113. A user can select
(click on) the copilot launch icon 113 to launch a copilot UI. A
copilot UI will be described in more detail with reference to FIG.
1I.
[0066] FIG. 1D is an illustration showing an example active
application screen (overview page 120) displayed in the shell main
container 104. The enterprise applications that can be accessed by
a user by way of the launchpad 101 and then subsequently displayed
in an active application screen (e.g., the overview page 120) can
include, but are not limited to, transactional applications,
analytical applications, and fact sheet applications (contextual
navigation applications). Transactional applications can allow a
user to create, change and/or approve processes with guided
navigation. Analytical applications can provide a user with a
visual overview of a dedicated topic for monitoring and tracking
purposes to allow for further key performance indicator (KPI)
related analysis. Fact sheet applications can allow a user to view
essential information about an object and to allow navigation
between related objects.
[0067] The overview page 120 can visualize all of the information a
user may need for a specific business context (business domain) on
a single page or screen. The information can be displayed in one or
more variable content packages (VCPs) or cards 122a-i. Each card
can be a container of content for organizing large amounts of
information on an equal plane within the overview page 120. In some
implementations, a user can rearrange the position of the cards
122a-i on the overview page 120. In some implementations, a user
defines, adds, or deletes cards included in the overview page
120.
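The card rearrangement described above can be sketched as a simple list operation over the overview page's cards, a minimal illustration rather than an actual card model:

```javascript
// Sketch: rearrange cards on an overview page by moving a card
// from one position to another without mutating the original list.
function moveCard(cards, from, to) {
  const next = [...cards];
  const [card] = next.splice(from, 1); // remove the card being dragged
  next.splice(to, 0, card);            // insert it at the drop position
  return next;
}
```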
[0068] An overview page (e.g., the overview page 120) can be a
selectable application (e.g., from the launchpad 101) providing an
integrated gateway into enterprise applications and application
content included in the launchpad 101. The UI of the overview page
(e.g., the overview page 120) can provide a user with a visual
summary of data, links, actions, and content that are relevant to a
business domain of expertise of a user and relevant to a selected
role of the user within the domain. The visual summary can be
presented in one or more cards (e.g., the cards 122a-i) that
display live content to a user at-a-glance without the user having
to open multiple applications and perform multiple drill downs
through application content to find and present the content.
[0069] In some implementations, the overview page 120 can include a
footer toolbar (e.g., footer toolbar 132 as shown in FIG. 1F). In
some implementations, the footer toolbar can appear to float over
the content displayed in the overview page 120.
[0070] In some implementations, an enterprise system can determine
content displayed on an overview page (e.g., the overview page
120). In addition or in the alternative, a selection of one or more
business domains and one or more roles of a user in the business or
enterprise can determine content displayed on an overview page
(e.g., the overview page 120). In some implementations, a user can
make the selection using a settings UI included in a launchpad
(e.g., the launchpad 101). In some implementations, a user can
select one or more business domains and/or one or more roles of the
user in the enterprise by way of an overview page (e.g., the
overview page 120). Selecting one or more business domains and/or
one or more roles of the user in the enterprise by way of the
overview page can maintain absolute relevance to the individual
user and the way in which the user works.
[0071] In some implementations, the user can personalize the layout
and placement of one or more cards (e.g., the cards 122a-i)
included in a UI of an overview page (e.g., the overview page 120)
and the display of content included in each card. The
personalization can enhance the workplace productivity of the
user.
[0072] FIG. 1E is an illustration showing an example object page
(object page 124) displayed in the shell main container 104. An
object page can be a floor-plan used to represent objects in a UI.
An object page can be used to display, create, or edit an object.
An object can represent a business entity (e.g., a customer, a
sales order, a product, an account, etc.). Enterprise applications
that reflect a specific scenario (e.g., a sales order, an account
status) can be bundled using an object. The object page can include
a header area 126, a navigation area 128, a content area 130, and,
in some implementations, a footer toolbar (e.g., footer toolbar 132
as shown in FIG. 1F). In some implementations, the footer toolbar
can appear to float over the content displayed in the object page
124. For example, referring to FIG. 1C, a user can select the tile
114f and an object page can be displayed to the user.
[0073] FIG. 1F is an illustration showing an example footer
toolbar (e.g., footer toolbar 132). In some implementations,
referring to FIG. 1A, the footer toolbar 132 can appear at the
bottom of a screen displayed in the shell main container 104, the
left container 102, and/or the right container 106. For example, as
described herein with reference to FIGS. 1C-E, a footer toolbar
(e.g., the footer toolbar 132) can be displayed at the bottom of
the launchpad 101, the overview page 120, and the object page 124.
The footer toolbar (e.g., the footer toolbar 132) can continue to
appear at the bottom of the screen of the display area of the
display device even as the displayed screen is scrolled. The footer
toolbar (e.g., the footer toolbar 132) can appear to hover over or
float over the content being displayed on the screen. The footer
toolbar 132 can include buttons or controls 134a-k. The controls
134a-k can be selected by a user in order to perform one or more
actions that can affect content included on the page being
displayed on the screen. The controls 134a-k are examples of
controls that can be included in a footer toolbar. In some
implementations, the controls can be different, fewer than, or more
than the controls 134a-k. The type and number of controls included
in a footer toolbar can be based on the type of page being
displayed and/or the content being displayed in the page.
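The page-type-dependent control selection described above can be sketched as a mapping from page type to a control list. The specific control names and page types below are invented for illustration:

```javascript
// Sketch: the controls shown in the footer toolbar depend on the
// type of page being displayed. The mapping is hypothetical.
function footerControls(pageType) {
  const base = ["share"]; // controls common to every page type
  switch (pageType) {
    case "object":    return [...base, "save", "cancel", "edit"];
    case "overview":  return [...base, "refresh"];
    case "launchpad": return base;
    default:          return base;
  }
}
```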
[0074] FIG. 1G is an illustration showing an example me area (e.g.,
me area 136) that can be displayed in the left container 102. In
some implementations, the me area 136 can be displayed in the right
container 106. The me area 136 includes an upper section 138 and a
lower section 140. The upper section 138 includes a user icon 142.
Selecting (clicking on) the user icon 142 can provide a user
profile. A dropdown indicator button 144 displays a status of the
user and, if selected, a user can logout of an application. The
upper section 138 includes navigation targets 146a-e. Selection of
(clicking on) a navigation target by a user triggers a
corresponding functionality (e.g., an application) associated with
that navigation target. The me area 136 can provide various
generalized functionalities as they are related to a user.
[0075] The upper section 138 can include sort selections 146a-b. A
user can select (click on) a sort selection (e.g., one of the sort
selections 146a-b) to determine how the listing of the recent
activities included in the lower section 140 will be sorted and
displayed.
[0076] The lower section 140 of the me area 136 includes a list of
recent activities 148a-c. The recent activities 148a-c can include
links 156a-c, respectively, that when selected (clicked on) by a
user can navigate the user back to the shell main container 104,
opening an application (or function) that corresponds to the link
in the shell main container 104. Recent activity items can include,
but are not limited to, enterprise applications, triggered
searches, co-pilot collections, and co-pilot drafts.
[0077] FIG. 1H is an illustration showing an example notification
area (e.g., notification area 150) that can be displayed in the
right container 106. In some implementations, the notification area
150 can be displayed in the left container 102. The notification
area 150 includes notifications 152 a-c. A user interacting with
the UI in the notification area 150 can take immediate action on a
notification. A notification item (e.g., notifications 152 a-c) can
have an indicator (e.g., notification indicators 154a-c) that can
indicate the status of the notification. For example, a
notification indicator can be color coded to indicate a particular
status of the notification.
[0078] A user can reject a notification by selecting (clicking on)
a reject selection (e.g., a reject selection 156a-b). For example,
a user can reject the notification 152a by selecting (clicking on)
the reject selection 156a. The rejection of the notification 152a
(the notification status) can be indicated by content included in
(e.g., a color of) a notification indicator 154a. A user can
acknowledge a notification by selecting (clicking on) an
acknowledge selection (e.g., an acknowledge selection 158a-b). For
example, a user can acknowledge the notification 152b by selecting
(clicking on) the acknowledge selection 158b. The acknowledgement
of the notification 152b (the notification status) can be indicated
by content included in (e.g., a color of) a notification indicator
154b.
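The reject/acknowledge transitions with color-coded indicators described above can be sketched as a small status map. The particular states and colors are assumptions chosen for illustration:

```javascript
// Sketch of notification status transitions with a color-coded
// indicator. The states and colors are illustrative assumptions.
const INDICATOR = { pending: "blue", rejected: "red", acknowledged: "green" };

// Return a new notification with the updated status and matching
// indicator color, leaving the original object untouched.
function setStatus(notification, status) {
  if (!(status in INDICATOR)) throw new Error(`unknown status: ${status}`);
  return { ...notification, status, indicator: INDICATOR[status] };
}
```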
[0079] A user can drill down into a relevant application by
selecting (clicking on) a more info selection (e.g., a more info
selection 160a-b). In some cases, a user may contact someone
directly in response to a notification.
[0080] FIG. 1I is an illustration showing an example copilot UI
(e.g., copilot UI 162). For example, referring to FIG. 1C, a
copilot application can be launched from the launchpad 101 when a
user selects (clicks on) the copilot launch icon 113. The copilot
application can provide (generate and display) the copilot UI 162.
In some cases, the copilot UI 162 can float over the UI included in
the launchpad 101. As a floating UI control, the copilot UI 162 can
be visually unobtrusive and flexible in its cross-functional
omnipresent implementation across any device or application
screen.
[0081] The example copilot UI 162 is an example copilot start page
or start screen. The start screen (the copilot UI 162) can be an
entry point for copilot functionality for an enterprise system.
[0082] The copilot UI 162 can provide shortcuts to different
copilot features. For example, as shown in FIG. 1I, a collection
can be represented by an entry in a collection list 164 that
includes collection list entries 164a-d. A copilot collection can
be a cluster of items in relation to a specific topic. For example,
an item can be a note, a screenshot, a chat message, a copilot
message, an object, or a quick create. In some implementations, the
items included in the collection can be homogeneous (e.g., all of
the items are of the same type). In some implementations, the items
included in a collection can be non-homogeneous (e.g., the items
can be of different types). Each collection list entry 164a-d can
provide a representation of a collection that can include a title,
a timestamp (e.g., last changed), a visual content summary, and a
textual content preview. In some implementations, the collection
list 164 can be searched and/or filtered.
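The search and filter behavior of the collection list described above can be sketched as filtering by item type or by a query against the collection title. The data shapes are assumptions made for this sketch:

```javascript
// Sketch: filter a copilot collection list by the type of the
// items it contains and/or a text query against the title.
// Shapes are illustrative assumptions.
function filterCollections(collections, { type, query } = {}) {
  return collections.filter(c =>
    (!type || c.items.some(item => item.type === type)) &&
    (!query || c.title.toLowerCase().includes(query.toLowerCase())));
}
```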
[0083] For example, the selection of a copilot shortcut 166a-d can
allow a user to create and navigate to a new collection with a
specified intention. The selection of a copilot create icon 168
located in a copilot footer toolbar 170 can create and navigate to
a new plain collection. The selection of a copilot settings icon
172 located in the copilot footer toolbar 170 can allow a user
access to copilot settings (e.g., display a copilot settings UI,
open a copilot settings application, etc.).
[0084] Copilot entries can be living, gradually growing artifacts
and software entities that can accompany a user from the
identification of an issue to a solution for the issue, while
providing support in the form of relevant context and actions.
Copilot entries can serve as memory aides while the copilot entries
can incrementally evolve into valuable transactional tasks and
collaborations as they mature in meaningful ways that bridge a gap
between predefined application functionality and processes based on
personal ways of working for a user. Though the example shown in
FIG. 1I describes launching the copilot application from the
launchpad 101, referring to FIG. 1A, the copilot application can be
launched from other screens displayed in (included in) the shell
main container 104, the left container 102, and/or the right
container 106.
[0085] Copilot entries can be made ready for users to use when
communicating, collaborating, and creating actionable transactions
in desktop or mobile scenarios. For example, copilot text entries
can be analyzed for recognizing and identifying relevant text
related objects. Copilot text entries can emphasize displayed text,
and a copilot application can recommend contextual entities for use
in a current task. The copilot application can understand user
context and can intelligently propose selections, auto-entries, and
user options.
[0086] A smart template can provide a framework for generating user
interfaces at runtime for an enterprise application. For example, a
smart template can be used to generate the UI for the overview page
120 as shown in FIG. 1D. In another example, a smart template can
be used to generate the UI for the object page 124, as shown in
FIG. 1E. A smart template can provide a framework for generating
the user interfaces based on metadata annotations and predefined
templates for the most used application patterns. The use of smart
templates can ensure design consistency by providing centralized
high quality code by using predefined templates and controllers.
The use of smart templates can keep applications up to date with
evolving design guidelines. The use of smart templates can reduce
an amount of front-end code used in building enterprise
applications. The term "smart" can refer to annotations that add
semantics and structures to provided data. The term "smart" can
also refer to the way in which the templates understand the
semantics.
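The idea of generating a UI at runtime from metadata annotations plus a predefined template can be illustrated minimally as below. The annotation vocabulary and template shape are invented for this sketch and do not reflect any actual smart-template format:

```javascript
// Sketch of the smart-template idea: a predefined template names
// the fields of an application pattern, and metadata annotations
// supply the semantics (here, just labels). The vocabulary is a
// hypothetical illustration.
function renderFromTemplate(template, annotations) {
  return template.fields
    .filter(field => annotations[field] !== undefined)
    .map(field => ({ field, label: annotations[field].label }));
}
```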
[0087] FIG. 1J is an illustration of a timeline UI (e.g., the
timeline 174). A timeline UI (e.g., the timeline 174) can display
timeline entries 176a-e. For example, the entries can be events,
objects, and/or posts listed and displayed in a chronological
order. The timeline 174 includes nodes 178a-d that correspond to
respective timeline entries 176a-d.
[0088] The timeline 174 can be used for collaborative
communications. The timeline 174 can be configured in multiple
different ways depending on use case implementations. For example,
the timeline 174 can provide information about changes of an object
or about events related to an object. The timeline 174 can provide
information about generated entries (e.g., value XY changed from A
to B) or about manual entries (e.g., comments from an individual).
In some implementations, the latest entry is at the top of a list
displayed by a timeline. In some implementations, the timeline 174
can be displayed along with a business object. In some cases, the
timeline 174 can be displayed to the right of the business
object.
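The chronological ordering described above, with the latest entry at the top of the list, can be sketched as a sort over timestamped entries (the `timestamp` field is an assumption of this sketch):

```javascript
// Sketch: order timeline entries chronologically with the latest
// entry at the top, without mutating the original list.
function sortTimeline(entries) {
  return [...entries].sort((a, b) => b.timestamp - a.timestamp);
}
```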
[0089] Two example versions of a timeline can include a basic
timeline and a social timeline. A basic timeline can be a read-only
timeline. A social timeline can allow for interaction and
collaboration among users.
[0090] FIG. 2 is a diagram of an example system 200 that can
implement the user interfaces and user experiences described
herein. The system 200 includes an enterprise computing system 202,
a network 204, and client computing devices 206a-e.
[0091] For example, computing device 206a can be a mobile phone, a
smartphone, a personal digital assistant, or other type of mobile
computing device. The computing device 206a includes a display
device 220. For example, computing device 206b can be a laptop or
notebook computer. The computing device 206b includes a display
device 222. For example, computing device 206c can be a tablet
computer. The computing device 206c includes a display device 224.
For example, the computing device 206d can be a wearable device
such as a smartwatch. The computing device 206d includes a display
device 226. For example, the computing device 206e can be a desktop
computer. The computing device 206e can include a display device
228. A user of the computing devices 206a-e can use/interface with
the display devices 220, 222, 224, 226, and 228, respectively, when
interacting with the enterprise computing system 202. The computing
devices 206a-e can display on the display devices 220, 222, 224,
226, and 228 any of the screens and UIs described herein.
[0092] The enterprise computing system 202 can include one or more
computing devices such as a web management server 214, a frontend
server 230, a backend server 208, and a mobile device management
server 210. The enterprise computing system 202 can also include a
database management computing system 212 that includes a database
management server 212a and a database 212b. Though not specifically
shown in FIG. 2, each server (the web management server 214, the
frontend server 230, the backend server 208, the mobile device
management server 210, and the database management server 212a) can
include one or more processors and one or more memory devices. Each
server can run (execute) a server operating system.
[0093] In some first implementations, the client computing devices
206a-d (e.g., the mobile computing devices) can communicate with
the enterprise computing system 202 (and the enterprise computing
system 202 can communicate with the client computing devices
206a-d) by way of the mobile device management server 210. The
mobile device management server 210 includes one or more mobile
device platform application(s) 216. By using the mobile device
platform application(s) 216, the enterprise computing system 202
can deliver cross-platform, secure, and scalable applications to
the computing devices 206a-d, independent of the mobile computing
device-type (e.g., laptop, notebook, smartwatch, mobile phone, PDA,
etc.) and independent of the operating system running on the
computing device 206a-d. In these implementations, the mobile
device management server 210 can then communicate with the web
management server 214.
[0094] In some second implementations, the client computing devices
206a-e (both the mobile computing devices (computing devices
206a-d) and the desktop computing device 206e) can communicate with
the enterprise computing system 202 (and specifically with the web
management server 214), and the enterprise computing system 202
(and specifically with the web management server 214) can
communicate with each of the client computing devices 206a-e, using
the network 204. The web management server 214 includes a web
dispatcher application 218. In both the first implementations and
the second implementations, the web dispatcher application 218 can
act as a "software web switch" accepting or rejecting connections
to the enterprise computing system 202.
[0095] In some implementations, the network 204 can be a public
communications network (e.g., the Internet, cellular data network,
dialup modems over a telephone network) or a private communications
network (e.g., private LAN, leased lines). In some implementations,
the computing devices 206a-e can communicate with the network 204
using one or more high-speed wired and/or wireless communications
protocols (e.g., 802.11 variations, WiFi, Bluetooth, Transmission
Control Protocol/Internet Protocol (TCP/IP), Ethernet, IEEE 802.3,
etc.).
[0096] The frontend server 230 can include product specific UI
Add-On Applications 232 and a UI infrastructure 234. The UI
infrastructure 234 can include a design portion and a runtime
portion. The frontend server 230 can decouple a lifecycle of a UI
(e.g., design and runtime deployment) from the backend server 208.
The decoupling can allow UI applications to interface with a
plurality of different databases. The decoupling provides a single
point of UI design, access, and maintenance allowing for theming,
branding, configuring, and personalizing a UI without a need for
development privileges to the backend server 208 (e.g., no need to
have backend administrative rights). The decoupling can result in a
more secure enterprise computing system. The decoupling can provide
for rule-based dispatching of requests in a multi-system landscape
(e.g., for approvals including aggregation).
[0097] The frontend server 230 includes a gateway 236. The gateway
236 can provide a way to connect devices, environments, and
platforms to enterprise software based on market standards. The
gateway 236 can enable the development of UIs for use in different
environments (e.g., social and collaboration environments). The
gateway 236 can enable the development of UIs for use on different
types of client computing devices (e.g., client computing devices
206a-e). The gateway 236 can enable the development of UIs for use
in internet-based applications.
[0098] The backend server 208 can include a bundle (a set) of
business applications (e.g., business suite 238). The business
applications can be transactional applications, analytical
applications, and fact sheet and contextual navigation
applications. Transactional applications can allow task-based
access to tasks that can include create and change. In addition or
in the alternative, transactional applications can allow access to
entire processes with guided navigation. Analytical applications
can provide a user with a visual overview of complex tasks for
monitoring and tracking purposes. Fact sheet applications and
contextual navigation applications involve search and explore
activities. Fact sheet applications and contextual navigation can
allow a user to view essential information about an object and can
allow contextual navigation between related objects.
[0099] The database management computing system 212 includes a
database management server 212a that can run (execute) applications
that can manage a database 212b. For example, the database 212b can
be an in-memory, column-oriented, relational database (e.g., SAP
HANA.RTM.). The database management computing system 212 can
include extended application services 240 that can embed a full
featured application server, web server, and development
environment within the database management computing system 212.
The extended application services 240 can include application
content 242 and reuse content 244 for use by the enterprise
computing system 202 when providing a personalized, responsive, and
simple UX across different types of computing devices and
deployment options.
[0100] FIG. 3 is a diagram of an example system 300 that can
implement the launchpad for the user interfaces and user
experiences described herein. The launchpad acts as a runtime shell
environment for the apps described herein, in which the personalized
home page is one feature among many services. The launchpad
is based on a unified shell architecture. The guiding principle of
the unified shell is to have a single, platform-independent,
client-side runtime environment which can be hosted on different
server platforms (e.g., SAP NetWeaver AS ABAP, SAP HANA XS, SAP
HANA CloudPlatform).
[0101] In general, the framework described herein may support
modularizing comprehensive JavaScript applications. That is,
instead of defining and loading one large bundle of JavaScript
code, an application can be split into smaller parts which then can
be loaded at runtime at the time when they are requested. These
smaller individual files are called modules.
[0102] A module is a JavaScript file that can be loaded and
executed in a browser. The module may include a name, a
description, a dependency, and a declaration location. The content
bundled in a module is up to the developer, but the content typically
shares a common topic: for example, the module may form a JavaScript
class or namespace, or its functions may address a specific concern
such as client-to-server communication or mathematical
utilities.
[0103] Modules have no predefined syntax or structure, but module
developers can use the name, declaration, description, or
dependency to identify such modules. The name identifies the module
and is used with jQuery.sap.require to load the module. As human
readers associate a module with the main JavaScript object declared
in it, the module names by convention are a hierarchical sequence
of dot-separated identifiers like sap.ui.core.Core. A developer can
use all but the last identifier to group modules in a logical
and/or organizational order, similar to packages in Java, and can
use the last identifier to give the module a semantic name.
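The naming convention above can be sketched in plain JavaScript. The helper below is hypothetical (not part of SAPUI5) and only illustrates splitting a dotted module name into its grouping prefix (similar to a Java package) and its semantic name:

```javascript
// Hypothetical helper: split a dotted module name such as
// "sap.ui.core.Core" into its grouping prefix and its semantic name.
function splitModuleName(name) {
  const parts = name.split(".");
  return {
    packagePath: parts.slice(0, -1).join("."), // grouping identifiers
    semanticName: parts[parts.length - 1]      // the module's own name
  };
}

const info = splitModuleName("sap.ui.core.Core");
// info.packagePath === "sap.ui.core", info.semanticName === "Core"
```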
[0104] Modules can declare themselves and their location of content
by calling the static jQuery.sap.declare function with their name.
This helps SAPUI5 to check at runtime whether a loaded module
contains the expected content by comparing the required name
against the declared name. As a side effect, jQuery.sap.declare
ensures that the parent namespace of the module name exists in the
current global namespace (window). For modules without declaration,
the framework assumes that the module has the expected content and
declares it with the name that was used for loading. In some cases
a module declaration is mandatory.
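The namespace side effect described above can be illustrated with a minimal sketch. This is not the actual jQuery.sap.declare implementation; it only mimics the described behavior of ensuring that the parent namespace of a dotted module name exists on a root object (in a browser, that root would be `window`):

```javascript
// Minimal sketch of the declare side effect: create every parent
// identifier of the module name on a root object, but not the module
// itself. `globalRoot` stands in for the browser's `window` object.
const globalRoot = {};

function declareModule(name) {
  const parts = name.split(".");
  let ns = globalRoot;
  for (const part of parts.slice(0, -1)) {
    ns[part] = ns[part] || {}; // create missing namespace levels
    ns = ns[part];
  }
  return ns; // the parent namespace object
}

declareModule("sap.ui.core.Core");
// globalRoot.sap.ui.core now exists as an (initially empty) object
```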
[0105] The description of a module is any JavaScript comment
preceding the module's declaration statement and is intended to
help to decide whether a module is useful for the intended purpose.
The configuration UI displays the description next to the module
name.
[0106] Modules can use the jQuery.sap.require method to load other
modules they depend on. While jQuery.sap.require internally has the
effect of a loadModule call, it can also be regarded as a
dependency declaration. The dependency declarations can be
evaluated at runtime, but can also be analyzed at built time or at
runtime on the server.
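The load-on-demand behavior described above can be sketched with a small in-memory registry. The `defineModule`/`requireModule` names are hypothetical; SAPUI5's actual loader (jQuery.sap.require) resolves module names to URLs and fetches script files, but the dependency-first, load-once semantics are the same idea:

```javascript
// Simplified sketch of on-demand module loading with dependency
// declarations. Modules are loaded once and cached; dependencies are
// resolved (and loaded) before the requesting module is evaluated.
const registry = new Map(); // name -> { deps, factory }
const loaded = new Map();   // name -> evaluated module content

function defineModule(name, deps, factory) {
  registry.set(name, { deps, factory });
}

function requireModule(name) {
  if (loaded.has(name)) return loaded.get(name); // load each module once
  const mod = registry.get(name);
  if (!mod) throw new Error("Unknown module: " + name);
  const resolvedDeps = mod.deps.map(requireModule); // dependencies first
  const content = mod.factory(...resolvedDeps);
  loaded.set(name, content);
  return content;
}

defineModule("app.util.math", [], () => ({ add: (a, b) => a + b }));
defineModule("app.Main", ["app.util.math"], (math) => ({
  run: () => math.add(2, 3)
}));

requireModule("app.Main").run(); // 5
```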
[0107] In one example, the unified shell offers unified services
with platform-independent interfaces (APIs) (e.g., services 301) to
the hosted apps and shell components. The implementations of these
services can utilize different service adapters for the respective
platform to carry out platform-specific behavior. The unified shell
can be enabled using a shell container 302, shell services 304, and
a shell renderer 306. In some implementations, the shell container
may be independent of shell services 304 by utilizing the shell
renderer 306.
[0108] Applications (e.g., apps) 308 may be embedded in an
application container 310. As this is an independent re-use
component, the embedding aspect is decoupled from the renderer 306.
The application container 310 can, for example, host SAPUI5
components, Web Dynpro ABAP applications and SAP GUI for HTML
transactions.
[0109] The shell services 304 and renderers 306 are managed by the
central shell container 302. The shell container 302 utilizes a
runtime configuration 312, which defines the concrete
implementations for services 314, adapters 316, and shell renderer
306, as well as global settings like theme, language, system and
user data. The runtime configuration 312 is fed by a number of
settings, including, but not limited to static configuration
settings in the hosting HTML page, dynamic configuration data read
from the front-end server during startup, and/or dynamic settings
passed as query parameters in the URL.
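The merging of the three configuration sources named above can be sketched as a simple override chain. The property names and the precedence order (URL parameters overriding server data overriding static page settings) are illustrative assumptions, not a documented contract:

```javascript
// Sketch: runtime configuration 312 fed by three sources, with later
// sources overriding earlier ones. Values here are placeholders.
const staticPageConfig = { theme: "themeA", language: "EN" }; // hosting HTML page
const frontendServerConfig = { language: "DE" };              // read at startup
const urlQueryConfig = { theme: "themeB" };                   // ?theme=... in URL

const runtimeConfig = Object.assign(
  {},
  staticPageConfig,
  frontendServerConfig,
  urlQueryConfig
);
// runtimeConfig: { theme: "themeB", language: "DE" }
```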
[0110] In some implementations, the JavaScript components shown in
FIG. 3 are embedded into a single HTML page. The launchpad
implementation of the SAP NetWeaver ABAP front-end server may
contain a standard page called, for example, Fiorilaunchpad.html
318, or another URL directed to one or more viewports 320. Users may
create custom start pages which utilize the shell with different
static configurations.
[0111] The web browser can use HTTP and OData to access application
back-end systems 322 and the UI front-end server 324 (e.g., service
implementations 326 and UI content 328) via web dispatcher
330.
[0112] Users can embed apps into the launchpad. When embedding
applications into the launchpad, the system 300 differentiates
between applications based on SAP GUI for HTML or Web Dynpro ABAP,
which can be embedded using an iFrame (i.e., an inline frame), and
applications based on SAPUI5. As the latter have been implemented
using the same UI technology as the launchpad, they can be embedded
directly into the launchpad using DOM injection. This approach also
allows smooth, animated UI transitions and the reuse of shared
components at runtime. Therefore, applications have to be implemented
as self-contained SAPUI5 components, as described below.
[0113] In a specific example, users can embed SAPUI5 Applications
into the launchpad using the application container 310 configured
with the following parameters: the URL (root path) of the
application and the name of the SAPUI5 component. The root path is
a path where the component controller for the SAPUI5 app (e.g., the
Component.js file) is located. The application container 310
registers the component namespace as module path for the
application URL.
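The registration step described above can be illustrated with a small map-based sketch. The two functions below are stand-ins for SAPUI5's module path registration, and the namespace and URL values are placeholders:

```javascript
// Sketch: map a component namespace to the application URL so that the
// component controller (Component.js in the app's root folder) can be
// resolved. Names and paths here are illustrative.
const modulePaths = new Map();

function registerModulePath(namespace, url) {
  modulePaths.set(namespace, url);
}

function resolveComponentController(namespace) {
  const root = modulePaths.get(namespace);
  if (!root) throw new Error("Unregistered namespace: " + namespace);
  return root + "/Component.js"; // controller file in the root folder
}

registerModulePath("my.company.app", "/apps/myapp");
resolveComponentController("my.company.app"); // "/apps/myapp/Component.js"
```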
[0114] The SAPUI5 component is defined with a file structure having
a file named Component.js, which should be located in the root
folder of the application being embedded. The definition of an
SAPUI5 component includes the component metadata. The component
metadata includes a config object containing additional
information. The launchpad-specific configuration is defined in
this config object.
[0115] The launchpad evaluates the following properties of the
component configuration:
[0116] ResourceBundle--Path to the resource bundle that holds the
translated app title. Example: i18n/i18n.properties.
[0117] TitleResource--Key of the app title text in the resource
bundle. The title is typically displayed in the browser tab.
[0118] FavIcon--Path to the "favicon" (*.ico) file for the app,
which is typically displayed in the address bar or next to the
window title or tab title.
[0119] HomeScreenIconPhone, homeScreenIconPhone@2,
homeScreenIconTablet, and/or homeScreenIconTablet@2--Paths to icons
with
different resolutions that are used when users add the (launchpad
page containing the) app to their mobile devices' home screens. The
properties with an @2 suffix enable referral to special icons for
high-resolution devices.
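The properties listed above sit in the config object of the component metadata. The sketch below shows that config object as a plain JavaScript object; in a real Component.js it would appear inside the metadata of a UIComponent subclass. The resourceBundle path matches the example given above, while all other values are placeholders:

```javascript
// Sketch of the launchpad-specific config object inside the component
// metadata. All values except resourceBundle are placeholder paths/keys.
const componentMetadata = {
  config: {
    resourceBundle: "i18n/i18n.properties", // holds the translated app title
    titleResource: "shellTitle",            // key of the title text (placeholder)
    favIcon: "img/favicon.ico",
    homeScreenIconPhone: "img/icon_57.png",
    "homeScreenIconPhone@2": "img/icon_114.png", // high-resolution variant
    homeScreenIconTablet: "img/icon_72.png",
    "homeScreenIconTablet@2": "img/icon_144.png"
  }
};
```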
[0120] The launchpad uses URL hashes for its own navigation. Direct
manipulation of the location hash would interfere with the
launchpad navigation. For cross-app navigation, use the
Cross-Application Navigation service. For inner-app navigation, use
the SAPUI5 routing API. Ensure that all controls created by your
component are destroyed when the component is destroyed. Avoid
using sap.ui.localResources inside your Component.js file.
sap.ui.localResources registers a path relative to the main page
(Fiorilaunchpad.html).
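Launchpad navigation hashes follow an intent pattern of the form "#SemanticObject-action", which is why apps should not manipulate the location hash directly. The helper below is hypothetical and only illustrates the hash shape; in a real app, the Cross-Application Navigation service builds and applies such hashes:

```javascript
// Hypothetical sketch of an intent-based navigation hash:
// "#SemanticObject-action?key=value&...". Parameter handling is
// simplified relative to the real navigation service.
function buildIntentHash(semanticObject, action, params = {}) {
  const query = Object.entries(params)
    .map(([k, v]) => k + "=" + encodeURIComponent(v))
    .join("&");
  return "#" + semanticObject + "-" + action + (query ? "?" + query : "");
}

buildIntentHash("SalesOrder", "display", { id: "4711" });
// "#SalesOrder-display?id=4711"
```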
[0121] FIG. 4A is an example screenshot 400 of a scrollable screen
area. The scrollable screen area may provide one or more viewports
that a user can scroll through. For example, the entire screen area
may be a viewport that can be scrolled onto and off of a display
screen. In another example, each region (e.g., container) within
the screenshot 400 may be a viewport that can be scrolled between
other viewports. As shown, the screenshot 400 includes a left
container 402, a shell main container 404, a right container 406,
and a shell toolbar 408. In general, it may be possible for a user
to scroll (e.g., pan) left and right across different regions
(e.g., container 402, container 404, container 406, and container
408) on a display device screen.
[0122] In one example, the shell toolbar 408 can be used to toggle
between viewports. As shown in FIG. 4B, a screenshot 410 includes a
representation 411 of the Shell Toolbar 408. The representation 411
may be provided when the Shell Toolbar 408 is off the screen. For
example, if the user chooses to view other viewports that do not
include the toolbar 408, then a representation 411 of the toolbar
can be provided. The representation 411 includes a Toggle Me Area
control 412 that can toggle between viewports (e.g., container 402,
container 404, container 406, and container 408). The
representation 411 also includes a Back to launchpad control 414 to
enable the user to return to their launchpad viewport. A Toggle
Notifications control 416 is shown to enable the user to toggle the
Notifications in and out of a view of the screen.
[0123] As shown in FIG. 4C, a screenshot 420 depicts a number of
Viewports 422, 424, and 426 viewable on a display screen of a
computing device 428. The Viewports 422-426 may be selected by a
user to show additional data associated with each respective
Viewport. For example, if a user selects an item on the launchpad
viewport 424, a scrollable overlay 430 can be presented in part
within the screen of device 428. In one example, the overlay 430
may be a single viewport that is scrollable by the user. In another
example, a number of Viewports can be represented by overlay 430.
If the user selects a portion of the overlay, any Me area or
notification area viewports may be hidden to display additional
overlay data.
[0124] The architecture described herein can also enable a viewport
that can be translated, faded, zoomed, and/or scaled on a display
screen. As shown in FIG. 5A, a screenshot 500 includes a left
container 502, a main container 504, and a right container 506. The
left container 502 may include a Me area while the right container
506 includes a notification area. More or fewer containers can be
shown and any of the containers may be presented in any position on
the screen in one or more viewports (or virtually off of the
screen). The viewports described herein can support parallax
side-to-side scrolling. In one example, when scrolling content, the
user can shrink and fade content (e.g., as shown at arrow 508) from
the main container 504. In another example, the user can scale up
content and move content to another container/viewport, as shown by
arrow 512 in FIG. 5B. Users can also zoom into and out of a region
on a container/viewport.
[0125] FIG. 5C illustrates a screenshot 520 of an example animation
that can occur when a user interacts with a portion of a viewport.
In particular, if the user is viewing a viewport (e.g., an open
viewport), and selects a profile icon 522, the profile image is
faded from a user profile picture into a cancel icon 524. The fade
includes a gradual removal of the icon by scaling down (e.g.,
shrinking from larger to smaller) the profile image. When the user
closes the viewport, the cancel icon 524 is faded into the profile
icon. The fade in includes scaling the cancel icon 524 from smaller
to larger. In one example, if the left container is open and the
Main container or Notification container is clicked, the same
procedure can occur as in when closing the left
Viewport/container.
[0126] FIG. 5D illustrates a screenshot 530 that depicts a user
moving content from a main container 504. Here, the user is moving
a launchpad into left container 502 from main container 504. When
the user begins to move the content, an animation is generated by
the systems described herein to scale down (e.g., shrink) the
launchpad element as the element is dragged to the left between
containers/viewports 504 and 502.
[0127] FIG. 5E illustrates a screenshot 540 that depicts a user
moving content from a notification container 506. Here, the user is
moving notifications from right container 506 into main container
504. When the user begins to move the content, an animation is
generated by the systems described herein to scale up (e.g.,
enlarge) the notification element as the element is dragged to the
left between containers/viewports 506 and 504.
[0128] FIG. 6 is an illustration of an example process 600 for
generating a user interface. The user interface may include any
number of viewport facilities (e.g., viewports). A viewport may
represent a partial view of the user interface and may include
information from any number of sources. The sources may include,
but are not limited to applications, social media platforms,
messaging facilities, and stored data pertaining to a role of a
user accessing the user interface. In some implementations, the
user interface may be an adjustable surface area.
[0129] In general, viewports are responsive, flexible, and
extensible for users. The sizing, placement, location, arrangement,
etc. of artifacts within a viewport may also be responsive and
flexible, as well as alterable, and dynamically configurable.
Particular aspects of a viewport may be supported by any number of
different mechanisms including, for example, workflow engines,
rules, events, the rating or ranking of an event, alerts,
prioritizations, and/or the definition/assignment of user
roles.
[0130] In some implementations, viewports may receive, pull,
retrieve, etc. information from any number of sources including for
example Microsoft Exchange, Microsoft Lync, SMS/MMS/IM/etc.
messaging facilities, social media platforms, internal and/or
external enterprise systems, etc. A viewport may receive
information manually through user input. A viewport may employ any
number of different automatic learning capabilities. For example, a
viewport may be used to determine context or user-related
information provided to the viewport. A viewport may continuously
and dynamically evaluate a range of data and/or parameters to
identify information that may be relevant to a user (or a
definition of a rule for a user).
[0131] During various processing activities a viewport may examine
any number of items including things that are on a user's mind and
work that is pending within a user's role. During processing
activities, a viewport may incorporate and/or leverage any number
of pieces of data, including information from a wearable device
(e.g., a Fitbit), location information (e.g., from a GPS or LBS
facility), etc. Various aspects of a viewport may leverage or
draw upon different bodies of configuration information, for
example, the definition and/or description of different roles that
a user may take, encompassing activities, behaviors, priorities,
tasks, etc. A user may have one or more roles.
[0132] A user may browse, traverse, etc. elements of a viewport in
any number of ways. While a viewport may improve and/or augment
aspects of a Fiori environment, a viewport is not limited to just a
Fiori environment and may operate and function in any number of other
environments.
[0133] A viewport may reside, operate, etc. within a user interface
on any target device including for example any combination of one
or more of inter alia a desktop, a notebook, a tablet, a smart
phone, a smart watch, etc. and may among other things dynamically
adjust or adapt aspects of itself to account for any particulars
(e.g., display real estate, input mechanism(s), etc.) of a target
device.
[0134] A viewport may include one or more graphical user interfaces
of software applications that display content (e.g., main content)
together with functions and other information besides the main
content, i.e., supplemental content. Such applications may include
standalone software programs that include a built-in display module
that generates a graphical user interface as described in the
example embodiments herein. Alternatively, display functionality
may be provided separately, e.g., as an add-on package, a plug-in
or through a separate program that communicates with a main content
providing program via an Application Program Interface (API). The
main content providing program and/or the display program may be
executed locally on a user device and/or remotely, as a Web
application, for example.
[0135] Example embodiments are described in which a display area,
referred to herein as a "viewport," is switchable to display the
main content at a different time than the above described functions
and information. When the main content is selected for display, the
viewport provides the user with a clear screen orientation,
allowing the user to focus on his current task. Additionally, the
viewport includes user selectable options to switch the display to
the functions and information, thus ensuring access to everything
the user may need in a convenient, space saving and visually
appealing way. The options require minimal space to display, so
that the user interface can be made to look essentially the same
across
different computer devices. Thus, the viewport has a responsive
design.
[0136] In an example embodiment, the functions and information are
assigned to at least one virtual extension of the viewport.
Preferably, the virtual extension includes a first extension area
to the left of the viewport and a second extension area to the
right of the viewport. When the main content is selected, the
extension area(s) are hidden from display.
[0137] In another example embodiment, the viewport is switched to
display selected supplemental content by triggering a graphical
icon inside the viewport. Alternatively, if the display is
touch-sensitive, the viewport may be switched by a touch gesture
such as a swiping motion towards or away from the corresponding
extension area. The viewport may be switched back to the main
content, e.g., by triggering a respective icon or using a
gesture.
[0138] In another example embodiment, trigger icons indicate when
new or unread information is available inside a respective
extension area. The indication can be a numerical counter, a
symbol, a special graphic or animation, etc. Thus, the user need
not leave the current context, i.e., the main content, to be
alerted to new information.
[0139] In yet another example embodiment, the supplemental content
is displayed by moving the corresponding extension area over to the
viewport. Preferably, the movement is animated in the manner of a
camera pan. However, other movements such as instantaneous display
or fading in and out are also possible.
[0140] In another example embodiment, at least part of the main
content remains on display in the viewport when the extension area
is displayed. The main content may be shifted away from a central
portion of the viewport and reduced in size (e.g., scaled down to
75% of its original size) to direct the user's attention to the
supplemental content. In this display state, the main content may
be partially cut off by the border of the viewport.
[0141] Referring to FIG. 6, a process 600 is depicted for
generating a user interface. The user interface may be customized
and includes a set of UI elements (e.g., graphical targets,
navigation links, icons, graphs, pictorial data, applications,
etc.). Each viewport may include any or all architecture including,
but not limited to a shell container, an application container, a
renderer, services (e.g., shell services), a runtime configuration,
web browser functionality, etc. The user interface and viewports
may have access to frontend servers and backend servers (e.g.,
server 324 and server 322).
[0142] At block 602, the process 600 includes obtaining a plurality
of viewports. The viewports may be generated by architecture 300
and provided to a frontend server 324. At least some of the
plurality of viewports may include personalized content from a
plurality of sources and application functions integrated from a
plurality of different applications. This may be because particular
viewports may include information or applications for a user to
carry out a particular assigned role.
[0143] At block 604, the process 600 includes providing, for
display in a display device, the user interface depicting at least
one of the plurality of viewports (e.g., containers) in the
display. For example, a viewport 404 (FIG. 4A) is displayed in a
main view of a user interface. Viewport 408 is a toolbar that is
also provided in the main view of the user interface. The viewports
402 and 406 are not presently shown on a display device while the
display device is displaying viewport 404. However, the user
interface is scrollable and as such, a user can select (e.g.,
touch, swipe, select control) to move the user interface left or
right to respectively capture viewport 406 or viewport 402. In some
implementations, the user interface can provide scrolling
functionality between the plurality of viewports that includes
animations such as a fading out and downward scaling of a first
viewport before fading in and upward scaling of a second viewport
in the user interface. If the display of the display device were
large enough to display all of the viewports in FIG. 4A, then all
viewports can be displayed.
[0144] At block 606, the process 600 includes generating the one or
more additional viewports, appending the one or more additional
viewports to the user interface, and generating an updated user
interface to include the plurality of viewports and the one or more
additional viewports, in response to receiving a request to add one
or more additional viewports. For example, if a user wishes to add
a viewport for displaying scheduling and email content for one
project and another viewport for displaying drawings and purchase
costs for a second project, two viewports can be generated and
appended or interleaved into the current user interface. That is,
additional containers (e.g., views, shells, etc.) can be generated
for the user interface.
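Blocks 602-606 can be sketched as simple data operations: obtain a list of viewports, then generate and append the requested additional viewports to produce an updated user interface. The object shapes and function names below are illustrative assumptions, not part of process 600 as claimed:

```javascript
// Sketch of generating and appending viewports (blocks 602-606).
// A viewport is modeled as a plain object; real viewports would carry
// containers, renderers, services, etc.
function generateViewport(name) {
  return { name, content: [] };
}

function addViewports(userInterface, requestedNames) {
  const additional = requestedNames.map(generateViewport);
  // Append without mutating the existing user interface.
  return {
    ...userInterface,
    viewports: [...userInterface.viewports, ...additional]
  };
}

const ui = { viewports: [generateViewport("launchpad")] };
const updated = addViewports(ui, ["projectA", "projectB"]);
// updated.viewports.length === 3; ui is left unchanged
```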
[0145] At block 608, the process 600 includes displaying the
updated user interface with a selected one or more of the
additional viewports being scrolled into view on the display of the
display device. The selected one or more of the additional
viewports is based at least in part on a size of the display of the
display device. That is, a bigger display can allow for concurrent
display of additional viewports.
[0146] In one example, a first viewport of the plurality of
viewports may include a container with a user profile portion, a
configuration settings portion, and a plurality of controls, as
shown in the left container (e.g., viewport) 502 in FIG. 5A. A
second viewport of the plurality of viewports may include a
launch-able content container, as shown in FIG. 5A (e.g., launchpad
504). A third viewport of the plurality of viewports may include a
notification container, as shown in the right container 506 of FIG.
5A. A fourth viewport of the plurality of viewports may include a
header toolbar. The header toolbar may include a plurality of
controls, as shown in FIG. 4A (toolbar 408). In some
implementations, the plurality of controls are associated with a
context of content being displayed in one or more of the plurality
of viewports.
[0147] In some implementations, the process 600 may include
resizing at least one viewport in the plurality of viewports, in
response to detecting new information is available to be displayed
in the user interface. For example, if new notifications are
available, the systems described herein may resize a viewport to
show the indication of the new notifications.
[0148] The various systems and techniques described herein may be
implemented in digital electronic circuitry, or in computer
hardware, firmware, software, or in combinations of them. The
various techniques may be implemented as a computer program product,
i.e., a computer program tangibly embodied in an information
carrier, e.g., in a machine readable non-transitory storage device,
for execution by, or to control the operation of, data processing
apparatus, e.g., a programmable processor, a computer, or multiple
computers. A computer program, such as the computer program(s)
described above, can be written in any form of programming
language, including compiled or interpreted languages, and can be
deployed in any form, including as a standalone program or as a
module, component, subroutine, or other unit suitable for use in a
computing environment. A computer program can be deployed to be
executed on one computer or on multiple computers at one site or
distributed across multiple sites and interconnected by a
communication network.
[0149] Processors suitable for the execution of a computer program
include, by way of example, both general and special purpose
microprocessors, and any one or more processors of any kind of
digital computer. Generally, a processor will receive instructions
and data from a read only memory or a random access memory or both.
Elements of a computer may include at least one processor for
executing instructions and one or more memory devices for storing
instructions and data. Generally, a computer also may include, or
be operatively coupled to receive data from or transfer data to, or
both, one or more mass storage devices for storing data, e.g.,
magnetic, magnetooptical disks, or optical disks. Information
carriers suitable for embodying computer program instructions and
data include all forms of nonvolatile memory, including by way of
example semiconductor memory devices, e.g., EPROM, EEPROM, and
flash memory devices; magnetic disks, e.g., internal hard disks or
removable disks; magnetooptical disks; and CDROM and DVD-ROM disks.
The processor and the memory may be supplemented by, or
incorporated in special purpose logic circuitry.
[0150] Implementations may be implemented in a computing system
that includes a backend component, e.g., as a data server, or that
includes a middleware component, e.g., an application server, or
that includes a frontend component, e.g., a client computer having
a graphical user interface or a Web browser through which a user
can interact with an implementation, or any combination of such
backend, middleware, or frontend components. Components may be
interconnected by any form or medium of digital data communication,
e.g., a communication network. Examples of communication networks
include a local area network (LAN) and a wide area network (WAN),
e.g., the Internet.
* * * * *