U.S. patent application number 14/035266 was filed with the patent office on 2013-09-24 and published on 2014-03-27 as publication number 20140089833 for a method and apparatus for providing multi-window in a touch device.
This patent application is currently assigned to Samsung Electronics Co. Ltd. The applicant listed for this patent is Samsung Electronics Co. Ltd. Invention is credited to Daesik HWANG, Hyesoon JEONG, Jeonghoon KIM, Dongjun LEE, and Jonghwa OH.
Application Number: 14/035266
Publication Number: 20140089833
Document ID: /
Family ID: 49263142
Publication Date: 2014-03-27

United States Patent Application 20140089833
Kind Code: A1
HWANG, Daesik; et al.
March 27, 2014
METHOD AND APPARATUS FOR PROVIDING MULTI-WINDOW IN TOUCH DEVICE
Abstract
A method of executing an application in a touch device is
provided. The method includes displaying an execution screen of a
first application as a full screen, receiving an input of an
execution event for executing a second application, configuring a
multi-window in a split scheme when the execution event is released
on a specific window, and independently displaying screens of the
first application and the second application through respective
split windows.
Inventors: HWANG, Daesik (Suseong-gu, KR); JEONG, Hyesoon (Chilgok-gun, KR); KIM, Jeonghoon (Gumi-si, KR); LEE, Dongjun (Gumi-si, KR); OH, Jonghwa (Dalseo-gu, KR)
Applicant: Samsung Electronics Co. Ltd. (Suwon-si, KR)
Assignee: Samsung Electronics Co. Ltd. (Suwon-si, KR)
Family ID: 49263142
Appl. No.: 14/035266
Filed: September 24, 2013
Current U.S. Class: 715/769; 715/780; 715/781; 715/800
Current CPC Class: G06F 2203/04803 (20130101); G06F 3/0488 (20130101); G06F 3/04817 (20130101); G06F 3/04886 (20130101); G06F 3/0486 (20130101); G06F 3/0481 (20130101)
Class at Publication: 715/769; 715/781; 715/780; 715/800
International Class: G06F 3/0481 (20060101); G06F 3/0486 (20060101)

Foreign Application Data
Date: Sep 24, 2012 | Code: KR | Application Number: 10-2012-0105898
Claims
1. A method of executing an application in a touch device, the
method comprising: displaying an execution screen of a first
application as a full screen; receiving an input of an execution
event for executing a second application; configuring a
multi-window in a split scheme when the execution event is released
on a specific window; and independently displaying screens of the
first application and the second application through respective
split windows.
2. The method of claim 1, wherein the execution event is an event
for selecting an execution icon of the second application to be
additionally executed from a tray and moving the selected execution
icon into a screen.
3. The method of claim 1, further comprising outputting a feedback
for a window corresponding to an updated location of an execution
icon when an execution icon is moved and is not released.
4. The method of claim 3, wherein the outputting of the feedback
comprises confirming a window of a region where the execution icon
is moved and currently located through a location trace of the
execution icon.
5. The method of claim 4, further comprising releasing the
execution icon by dropping the execution icon on a window where the
execution icon is currently located.
6. The method of claim 2, wherein the independently displaying of
the screens comprises respectively displaying a screen
corresponding to a size of a corresponding window in which the
first application and the second application are executed.
7. The method of claim 1, further comprising: displaying a screen
of a plurality of applications through the multi-window; receiving
an input of an execution event for an additional application while
displaying the screen of the plurality of applications; executing
the additional application through a window selected to execute the
additional application; and processing an application previously
executed through the selected window as a background, and
displaying a screen of the additional application through the
selected window.
8. The method of claim 7, further comprising comparing the number
of currently executed execution windows with split information,
and determining whether the number of execution windows corresponds
to a value set in the split information when the input of the
execution event for selecting the additional application is
received.
9. The method of claim 2, wherein the tray is moved to another
region in the screen according to a user input.
10. The method of claim 2, wherein a floating key pad is provided
when a text input is requested from an application of a specific
window during an operation by the multi-window.
11. The method of claim 10, wherein the floating key pad is moved
to another region in the screen according to a user input.
12. The method of claim 11, wherein an input character by the
floating key pad is input to a text input window provided from the
application of the specific window and is displayed.
13. The method of claim 1, wherein respective windows of the
multi-window are separated by a separator.
14. The method of claim 13, wherein the sizes of the respective
windows are changed according to a movement of the separator.
15. A method of executing an application in a touch device, the
method comprising: executing a first application corresponding to
user selection and displaying the application through one window as
a full screen; receiving a first event input for selecting and
moving a second application when the first application is executed;
determining a multi-window split scheme and a region to which the
first event is input; outputting a feedback for a window in which
the second application is able to be executed and the region to
which the first event is input; receiving a second event input of
executing the second application; configuring the multi-window in
response to the second event input; and independently displaying a
screen of the first application and a screen of the second
application respectively through corresponding windows separated by
the multi-window.
16. The method of claim 15, wherein the first event comprises an
event for selecting an execution icon of the second application to
be additionally executed from a tray and moving the selected
execution icon into a screen.
17. The method of claim 15, wherein the second event comprises
moving the execution icon and releasing the execution icon from a
current window when the first event is not released.
18. A method of executing an application in a touch device, the
method comprising: displaying an execution screen of a first
application as a full screen; sliding-in a tray including an
execution icon of an application according to a user input when the
first application is executed; receiving an input for selecting an
execution icon of a second application from the tray and dragging
the selected execution icon into the full screen; receiving an
input for dropping the execution icon in a specific window while
the execution icon is dragged; executing the second application in
response to the drop input of the execution icon; splitting a full
screen into windows for displaying screens of the first application
and the second application; and displaying a screen of the second
application through the specific window in which the execution icon
is dropped, and displaying the screen of the first application
through another split window.
19. The method of claim 18, further comprising sliding-out the tray
when the execution icon is selected and is moved to the full
screen.
20. The method of claim 18, further comprising: blanking a region
to which the execution icon is allocated in the tray when the
execution icon is selected and is moved to the full screen; and
restoring again the blanked region when the tray is slid-out.
21. The method of claim 18, further comprising outputting a
feedback for the specific window into which the execution icon is
dragged while the execution icon is moved on the full screen
according to the drag.
22. The method of claim 18, further comprising: popping-up a
floating key pad when a text input is requested from an application
of a specific window from among the split windows; and inputting a
text for the application of the specific window according to a user
input using the floating key pad.
23. The method of claim 22, further comprising moving the floating
key pad to another region in a screen to input a text for an
application of another window.
24. The method of claim 18, wherein the windows for displaying the
screens of the first application and the second application are
split through a separator.
25. The method of claim 24, further comprising changing sizes of
the windows according to a movement of the separator.
26. A touch device comprising: a touch screen configured to display
a screen interface of a multi-window environment, to display
screens of a plurality of applications through a plurality of
windows separated in the screen interface, and to receive an event
input for operating the plurality of applications; and a controller
configured to control execution of the plurality of applications in
the multi-window environment, and to control to independently
display screens of at least two applications according to a user
selection from among a plurality of executed applications through
the plurality of windows.
27. The touch device of claim 26, wherein the controller receives
an input of an execution event for executing a second application
when an execution screen of a first application is displayed as a
full screen, configures a multi-window according to a split scheme
when the execution event is released from a specific window, and
controls to independently display screens of the first application
and the second application through respective split windows.
28. The touch device of claim 27, wherein the controller controls a
feedback output for a window of a moved location while an execution
icon is moved in a state in which the execution event is not
released.
29. The touch device of claim 27, wherein, when execution of an
additional application is received while displaying screens of the
plurality of applications through the multi-window, the controller
executes the additional application through a selected window for
executing the additional application, processes an application
previously executed through the selected window as a background,
and controls to display a screen of the additional application
through the selected window.
30. The touch device of claim 26, wherein the screen interface
comprises an execution icon of an application and a tray moved to
another region in a screen according to a user input.
31. The touch device of claim 26, wherein the screen interface
comprises a floating key pad which is popped up when a text input
is requested from an application of a specific window, and moved to
another region in a screen according to a user input.
32. The touch device of claim 26, wherein the screen interface
comprises a separator for separating respective windows according
to the multi-window environment and changing sizes of the
respective windows according to a user input.
33. The touch device of claim 32, wherein the controller determines
a changing size of the respective windows according to movement of
the separator.
34. A computer readable recording medium having recorded thereon a
program for executing a process comprising receiving an input of an
execution event for executing a second application when an
execution screen of a first application is displayed as a full
screen, configuring a multi-window in a split scheme when the
execution event is released on a specific window, and independently
displaying screens of the first application and the second
application through respective split windows.
35. The recording medium of claim 34, wherein the program further
comprises outputting a feedback for a window of a moved location
when an execution icon is moved and when the execution icon is not
released.
Description
CROSS-REFERENCE TO RELATED APPLICATION(S)
[0001] This application claims the benefit under 35 U.S.C.
.sctn.119(a) of a Korean patent application filed on Sep. 24, 2012
in the Korean Intellectual Property Office and assigned Serial No.
10-2012-0105898, the entire disclosure of which is hereby
incorporated by reference.
TECHNICAL FIELD
[0002] The present disclosure relates to a method and an apparatus
for operating a function in a touch device. More particularly, the
present disclosure relates to a method of providing a multi-window
in a touch device so that a plurality of applications may be
efficiently used through multi-splitting of a window on one screen
provided from the touch device, and an apparatus thereof.
BACKGROUND
[0003] In recent years, with the development of digital technology,
various mobile devices such as a mobile communication terminal, a
Personal Digital Assistant (PDA), an electronic note device, a
smart phone, a tablet Personal Computer (PC), and the like, each
capable of processing communication and personal information while
a user is moving, have been introduced. These mobile devices have
developed to a mobile convergence stage including the traditional
field of communication and other terminal fields. The mobile device
may have various functions, such as the ability to process an audio
call or an image call, to transmit and receive a message such as a
Short Message Service (SMS)/Multimedia Message Service (MMS) message,
an e-mail, or an electronic note, and to support photography,
broadcast playback, video playback, music playback, Internet
information, a messenger, and a Social Networking Service (SNS).
[0004] However, due to the characteristically small screen of the
touch device, only one application view can be provided at a time,
and any additional application is displayed through a pop-up.
Accordingly, in the related art, even when a plurality of
applications are executed simultaneously, only one application view
is provided on the current screen according to the user selection.
That is, the related art cannot efficiently use a plurality of
applications.
[0005] Therefore, a need exists for a method and apparatus in which
a plurality of applications may be efficiently used by splitting a
window displayed on one screen of the touch device.
[0006] The above information is presented as background information
only to assist with an understanding of the present disclosure. No
determination has been made, and no assertion is made, as to
whether any of the above might be applicable as prior art with
regard to the present disclosure.
SUMMARY
[0007] Aspects of the present disclosure are to address at least
the above-mentioned problems and/or disadvantages and to provide at
least the advantages described below. Accordingly, an aspect of the
present disclosure is to provide a method of implementing a
multi-window environment in a single system of a touch device
composed of at least two split windows and an apparatus
thereof.
[0008] Another aspect of the present disclosure is to provide a
method of providing a multi-window in a touch device capable of
maximizing the usability of the touch device by a user by splitting
one screen into at least two windows to easily arrange and execute
a plurality of applications and an apparatus thereof.
[0009] Another aspect of the present disclosure is to provide a
method of supporting a multi-window environment in a touch device
capable of simply changing a layout for convenience of an operation
of a plurality of applications in the multi-window environment and
supporting the convenience of a user operation in the multi-window
environment and an apparatus thereof.
[0010] Another aspect of the present disclosure is to provide a
method of supporting a multi-window in a touch device capable of
minimizing the burden of a user operation in a multi-window
environment and increasing the user's convenience with respect to a
plurality of applications by freely adjusting the windows of the
plurality of applications, and an apparatus thereof.
[0011] Another aspect of the present disclosure is to provide a
method of supporting a multi-window environment in a touch device
capable of providing large amounts of information and various
experiences to the user by implementing a multi-window environment
in a touch device and an apparatus thereof.
[0012] Another aspect of the present disclosure is to provide a
method of supporting a multi-window environment capable of
improving convenience for a user and usability of the touch device
by implementing an optimal environment for supporting a multi-window
in a touch device, and an apparatus thereof.
[0013] In accordance with an aspect of the present disclosure, a
method of executing an application in a touch device is provided.
The method includes displaying an execution screen of a first
application as a full screen, receiving an input of an execution
event for executing a second application, configuring a
multi-window in a split scheme when the execution event is released
on a specific window, and individually displaying screens of the
first application and the second application through respective
split windows.
[0014] In accordance with another aspect of the present disclosure,
a method of executing an application in a touch device is provided.
The method includes executing a first application corresponding to
a user selection and displaying the application through one window
as a full screen, receiving a first event input for selecting and
moving a second application when the first application is executed,
determining a multi-window split scheme and a region to which a
first event is input, outputting a feedback for a window in which
the second application is able to be executed and the region to
which the first event is input, receiving a second event input for
executing the second application, configuring the multi-window in
response to the second event input, and independently displaying a
screen of the first application and a screen of the second
application through corresponding windows separated by the
multi-window.
[0015] In accordance with another aspect of the present disclosure,
a method of executing an application in a touch device is provided.
The method includes displaying an execution screen of a first
application as a full screen, sliding-in a tray including an
execution icon of an application according to a user input when the
first application is executed, receiving an input for selecting an
execution icon of a second application from the tray and dragging
the selected execution icon into the full screen, receiving an
input for dropping the execution icon in a specific window while
the execution icon is dragged, executing the second application in
response to the drop input of the execution icon, splitting a full
screen into windows for displaying screens of the first application
and the second application, and displaying a screen of the second
application through the specific window in which the execution icon
is dropped and displaying the screen of the first application
through another split window.
[0016] In order to achieve the above objects, there is provided a
computer readable recording medium recording a program for
executing the methods in a processor.
[0017] In accordance with another aspect of the present disclosure,
a touch device is provided. The touch device includes a touch
screen configured to display a screen interface of a multi-window
environment, to display screens of a plurality of applications
through a plurality of windows split in the screen interface, and
to receive an event input for operating the applications, and a
controller configured to control execution of the applications in
the multi-window environment, and to control to independently
display screens of at least two applications through the windows
according to a user selection from among a plurality of executed
applications.
[0018] In accordance with another aspect of the present disclosure,
a computer readable recording medium having recorded thereon a
program performing a method is provided. The method includes
receiving an input of an execution event for executing a second
application when an execution screen of a first application is
displayed as a full screen, configuring a multi-window in a split
scheme when the execution event is released on a specific window,
and individually displaying screens of the first application and
the second application through respective split windows.
[0019] Other aspects, advantages, and salient features of the
disclosure will become apparent to those skilled in the art from
the following detailed description, which, taken in conjunction
with the annexed drawings, discloses various embodiments of the
present disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
[0020] The above and other aspects, features, and advantages of
certain embodiments of the present disclosure will be more apparent
from the following description taken in conjunction with the
accompanying drawings, in which:
[0021] FIG. 1 is a block diagram schematically illustrating a
configuration of a touch device according to an embodiment of the
present disclosure;
[0022] FIG. 2 is a diagram of a screen schematically illustrating a
screen interface in a touch device according to an embodiment of
the present disclosure;
[0023] FIG. 3 is a diagram schematically illustrating an operation
of a multi-window in a touch device according to an embodiment of
the present disclosure;
[0024] FIG. 4 is a diagram schematically illustrating an operation
for splitting a multi-window in a touch device according to an
embodiment of the present disclosure;
[0025] FIGS. 5, 6, 7, 8, 9, 10, 11, and 12 are diagrams
illustrating examples of an operation screen operating a tray for
rapidly executing an application in a multi-window environment
according to an embodiment of the present disclosure;
[0026] FIGS. 13, 14, 15, 16, and 17 are diagrams illustrating
examples of an operation screen operating a plurality of
applications in a multi-window environment according to an
embodiment of the present disclosure;
[0027] FIGS. 18, 19, 20, 21, 22, and 23 are diagrams illustrating
examples of operating a plurality of applications in a multi-window
environment according to an embodiment of the present
disclosure;
[0028] FIGS. 24, 25, 26, 27, 28, and 29 are diagrams illustrating
examples of operating a key pad for text input in a multi-window
environment according to an embodiment of the present
disclosure;
[0029] FIG. 30 is a diagram illustrating an example of operating a
plurality of applications in a multi-window environment according
to an embodiment of the present disclosure;
[0030] FIGS. 31, 32, 33, and 34 are diagrams illustrating examples
of an operation screen providing information with respect to a
plurality of applications executed according to a multi-window
environment in a touch device according to an embodiment of the
present disclosure;
[0031] FIG. 35 is a flowchart illustrating a method of executing an
additional application by switching a multi-window environment in a
touch device according to an embodiment of the present disclosure;
and
[0032] FIG. 36 is a flowchart illustrating a method of executing an
additional application in a multi-window environment in a touch
device according to an embodiment of the present disclosure.
[0033] The same reference numerals are used to represent the same
elements throughout the drawings.
DETAILED DESCRIPTION
[0034] The following description with reference to the accompanying
drawings is provided to assist in a comprehensive understanding of
various embodiments of the present disclosure as defined by the
claims and their equivalents. It includes various specific details
to assist in that understanding but these are to be regarded as
merely exemplary. Accordingly, those of ordinary skill in the art
will recognize that various changes and modifications of the
various embodiments described herein can be made without departing
from the scope and spirit of the present disclosure. In addition,
descriptions of well-known functions and constructions may be
omitted for clarity and conciseness.
[0035] The terms and words used in the following description and
claims are not limited to the bibliographical meanings, but, are
merely used by the inventor to enable a clear and consistent
understanding of the present disclosure. Accordingly, it should be
apparent to those skilled in the art that the following description
of various embodiments of the present disclosure is provided for
illustration purpose only and not for the purpose of limiting the
present disclosure as defined by the appended claims and their
equivalents.
[0036] It is to be understood that the singular forms "a," "an,"
and "the" include plural referents unless the context clearly
dictates otherwise. Thus, for example, reference to "a component
surface" includes reference to one or more of such surfaces.
[0037] The present disclosure relates to a method of providing a
multi-window in a touch device which splits a screen of the touch
device into at least two windows in a split scheme to provide a
multi-window and allows a user to efficiently use a plurality of
applications through the multi-window on one screen and an
apparatus thereof.
[0038] Embodiments of the present disclosure may include selecting
an additional application in a touch device to determine a screen
split scheme upon execution of a drag, and may provide feedback for
a corresponding window in which an additional application is able to
be executed from among the respective windows split from one screen.
Accordingly, the user may know where an additional application
being executed exists. Further, according to an embodiment of the
present disclosure, when the additional application is executed at
a location selected by the user, a screen of the application may be
displayed suitable for the size of a corresponding window.
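The drag feedback described above amounts to a hit test of the current drag location against the bounds of the split windows. The following Kotlin sketch is only a hypothetical model of that behavior; the names Rect, WindowRegion, and updateDragFeedback are invented for illustration and do not appear in the disclosure.

```kotlin
// Hypothetical sketch of the drag feedback in paragraph [0038]; names are illustrative.
data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int) {
    fun contains(x: Int, y: Int) = x in left until right && y in top until bottom
}

data class WindowRegion(val id: Int, val bounds: Rect, var highlighted: Boolean = false)

// While the execution icon is dragged (the event is not yet released), highlight
// the window whose bounds contain the current pointer location.
fun updateDragFeedback(windows: List<WindowRegion>, x: Int, y: Int): WindowRegion? {
    var target: WindowRegion? = null
    for (w in windows) {
        w.highlighted = w.bounds.contains(x, y)
        if (w.highlighted) target = w
    }
    return target
}

fun main() {
    val windows = listOf(
        WindowRegion(0, Rect(0, 0, 800, 640)),      // upper window
        WindowRegion(1, Rect(0, 640, 800, 1280))    // lower window
    )
    println(updateDragFeedback(windows, x = 400, y = 900)?.id)   // prints 1
}
```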
[0039] Hereinafter, a configuration of a touch device and a method
of controlling an operation thereof according to embodiments of the
present disclosure will be described with reference to the
accompanying drawings. A configuration of the touch device and a
method of controlling an operation thereof according to embodiments
of the present disclosure are not limited to the following
description, but are also applicable to various additional
embodiments based on the embodiments described herein.
[0040] FIG. 1 is a block diagram schematically illustrating a
configuration of a touch device according to an embodiment of the
present disclosure.
[0041] Referring to FIG. 1, the touch device of the present
disclosure may include a Radio Frequency (RF) communication unit
110, a user input unit 120, a display unit 130, an audio processor
140, a memory 150, an interface unit 160, a controller 170, and a
power supply 180. Since the constituent elements shown in FIG. 1 are
not all essential, the touch device of the present disclosure may be
implemented with more or fewer elements than those described
above.
[0042] The RF communication unit 110 may include at least one or
more modules capable of performing a wireless communication between
the touch device and a wireless communication system or between the
touch device and a network in which another device is located. For
example, the RF communication unit 110 may include a mobile
communication module 111, a Wireless Local Area Network (WLAN)
module 113, a short range communication module 115, a location
calculation module 117, and a broadcasting reception module
119.
[0043] The mobile communication module 111 transmits and receives a
wireless signal to and from at least one of a base station, an
external terminal, various servers (e.g., an integration server, a
provider server, a content server, or the like). The wireless
signal may include a voice call signal, an image call signal, or
data of various formats according to the transmission/reception of
a character/multi-media message. The mobile communication module
111 may access at least one of various servers under control of the
controller 170 to receive an application available in a touch
device according to user selection.
[0044] The WLAN module 113 may be a module for accessing the wireless
Internet and forming a wireless LAN link with another touch device,
and may be installed inside or outside the touch device.
Wireless Internet techniques may include Wireless LAN/Wi-Fi (WLAN),
Wireless broadband (Wibro), World Interoperability for Microwave
Access (Wimax), and High Speed Downlink Packet Access (HSDPA). The
WLAN module 113 may access at least one of various servers to
receive a usable application from the touch device according to
user selection under control of controller 170. Further, when a
WLAN link is formed with another touch device, the WLAN module 113
may transmit or receive an application according to the user
selection to or from another touch device.
[0045] The short range communication module 115 is a module for
short range communication. The short range communication techniques
may include Bluetooth, Bluetooth Low Energy (BLE), Radio Frequency
IDentification (RFID), Infrared Data Association (IrDA), Ultra
Wideband (UWB), ZigBee, and Near Field Communication (NFC). When
the short range communication module 115 connects short range
communication with another touch device, the short range
communication module 115 may transmit or receive an application
according to the user selection to or from another touch
device.
[0046] The location calculation module 117 is a module for
acquiring a location of the touch device. For example, the location
calculation module 117 includes a Global Positioning System (GPS)
module. The location calculation module 117 may calculate distance
information from at least three base stations together with exact
time information, and may apply trigonometry to the calculated
information so that three-dimensional current location information
according to latitude, longitude, and altitude may be calculated.
The location calculation module 117 may also continuously receive
the current location of the touch device from at least three
satellites in real time to calculate location information. The
location information of the touch device may be acquired by various
schemes.
[0047] The broadcasting reception module 119 receives a
broadcasting signal (e.g., a TV broadcasting signal, a radio
broadcasting signal, a data broadcasting signal) and/or information
(e.g., a broadcasting channel, a broadcasting program or
information about a broadcasting service provider) from an external
broadcasting management server through a broadcasting channel
(e.g., a satellite channel or a terrestrial channel).
[0048] The user input unit 120 generates input data for controlling
an operation of the touch device by the user. The user input unit 120
may be configured by a key pad, a dome switch, a touch pad (e.g., a
resistive/capacitive type), a jog wheel, and a jog switch. The user
input unit 120 may be implemented in the form of a button outside
the touch device, and some buttons may be implemented by a touch
panel.
[0049] The display unit 130 displays (i.e., outputs) information
processed by the touch device. For example, when the touch device
is in a call mode, the display unit 130 displays User Interface
(UI) or Graphical UI (GUI) associated with a call. When the touch
device is in an image call mode or a shooting mode, the display
unit 130 displays a photographed and/or received image, or the
corresponding UI and GUI.
[0050] In the present disclosure, the display unit 130 may display
an execution screen with respect to various functions (or
applications) executed in the touch device through one or more
windows, as will be illustrated in relation to the following
figures, for instance FIG. 3. The execution screen may therefore
display data relating to multiple applications. In particular, the
display unit 130 may provide at least two split screen regions
according to a split scheme, and may provide the split screen
regions to one window, respectively to form a multi-window. That
is, the display unit 130 may display a screen corresponding to the
multi-window environment, and may display an execution screen with
respect to a plurality of applications through a multi-window,
which is split regions. In this case, the display unit 130 may
simultaneously display a screen of one window and a screen of
another window in parallel. The display unit 130 may display a
separator for separating respective windows, that is, split
regions. The display unit 130 may display a tray (or an application
launcher) for efficiently and intuitively executing an application
according to the multi-window environment. The tray comprises a
screen region in which, for instance, icons representing respective
applications may be displayed and selected. The tray may comprise a
pop-up object displayed upon the screen. The tray may be moved
within the screen. The display unit 130 may display a virtual input
device (e.g., a touch key pad or a floating key pad) which is freely
moved in the full screen region. Further, the display unit 130 may
receive a user input on a full screen (the whole of the available
screen area of the display unit 130) or on an individual window
screen provided through one or more windows in a multi-window
environment, and may transfer an input signal according to the user
input to the controller 170. Further, the display unit 130 may
support screen display in a landscape mode, screen display in a
vertical mode (portrait mode), and a screen switch display
according to variation between the landscape mode and the vertical
mode according to the orientation or a change in the orientation of
the touch device. An embodiment of a screen of the display unit 130
operated according to an embodiment of the present disclosure will
be described herein.
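As a rough structural sketch (assumed names, not the disclosed implementation), the screen interface elements just described, namely split windows, a separator, a movable tray, and a movable floating key pad, could be modeled as follows in Kotlin:

```kotlin
// Assumed data model for the screen interface of paragraph [0050]; purely illustrative.
data class Overlay(var x: Int, var y: Int, var visible: Boolean = false)  // tray or floating key pad

data class ScreenInterface(
    val windowApps: MutableList<String?> = mutableListOf(null),  // application shown in each split window
    var separatorPosition: Int = 0,                              // pixel offset of the separator
    val tray: Overlay = Overlay(0, 0),
    val floatingKeyPad: Overlay = Overlay(0, 0),
    var landscape: Boolean = false                               // display mode follows device orientation
) {
    // Both the tray and the floating key pad may be moved freely within the screen.
    fun moveOverlay(overlay: Overlay, newX: Int, newY: Int) {
        overlay.x = newX
        overlay.y = newY
    }
}
```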
[0051] The display unit 130 may include at least one of a Liquid
Crystal Display (LCD), a Thin Film Transistor-Liquid Crystal
Display (TFT LCD), a Light Emitting Diode (LED), an Organic
Light-Emitting Diode (OLED), an Active Matrix OLED (AMOLED), a
flexible display, a bendable display, and a 3D display. Some of
the above displays may be implemented as a transparent display
configured in a transparent type or a light transmittance type such
that the outside is visible therethrough.
[0052] When a touch panel detecting a touch operation forms a layer
structure with the display unit 130 (e.g., a "touch screen"), the
display unit 130 may be used as an input device as well as an
output device. The touch panel may convert pressure applied to a
specific part of the display unit 130 or a variation in capacitance
created at the specific part of the display unit 130 into an
electric input signal. The touch panel may detect a touched
location, an area, or pressure upon touch. When there is touch
input with respect to the touch panel, a signal(s) corresponding to
the touch input is sent to a touch controller (not shown). The
touch controller (not shown) processes the signal(s) and transmits
corresponding data to the controller 170. Accordingly, the
controller 170 may recognize which region of the display unit 130
is touched.
[0053] The audio processor 140 transmits an audio signal input from
the controller 170 to a speaker 141, and transfers an audio signal
such as a voice input from the microphone 143 to the controller
170. The audio processor 140 converts voice/sound data into an
audible sound and outputs the audible sound through the speaker 141
under the control of the controller 170. The audio processor 140
may convert an audio signal such as a voice input from the
microphone 143 into a digital signal, and may transfer the digital
signal to the controller 170.
[0054] The speaker 141 may output audio data received from the RF
communication unit 110 or stored in the memory 150 in a call mode,
a record mode, a media contents play mode, a photographing mode, or
a multimedia mode. The speaker 141 may output a sound signal
associated with a function (e.g., a receiving call connection, a
sending call connection, a music file play, a video file play, an
external output, or the like) performed in the touch device.
[0055] The microphone 143 may receive and process an external sound
signal to electric voice data in a call mode, a record mode, a
voice recognition mode, or a photographing mode. The processed
voice data are converted into a transmissible format and the
converted data are output to a mobile communication base station
through the mobile communication module 111. Various noise removal
algorithms for removing a noise generated during a procedure of
receiving an external sound signal may be implemented in the
microphone 143.
[0056] The memory 150 may store a program for processing and control
of the controller 170, and may temporarily store input/output data
(e.g., a telephone number, a message, audio,
media contents [e.g., a music file or a video file], or an
application). The memory 150 may store a use frequency (e.g.,
frequencies in the use of an application, frequencies in media
contents, or frequencies in a phone number, a message, and in
multi-media), an importance, a priority, or a preference according
to a function operation of the touch device. The memory 150 may
store data regarding a vibration or a sound of various patterns
output upon touch input on the touch screen. In particular, the
memory 150 may store split information with respect to a screen
split scheme for operating a multi-window, application information
to be registered in the tray, or application information executed
by multi-tasking by the multi-window.
[0057] The memory 150 may include a storage medium having at least
one of memory types including a flash memory type, a hard disk
type, a micro type, a card type (e.g., an SD card or XD card
memory), Random Access Memory (RAM), Static Random Access Memory
(SRAM), Read-Only Memory (ROM), Programmable Read-Only Memory
(PROM), Electrically Erasable Programmable Read-Only Memory
(EEPROM), Magnetic RAM (MRAM), a magnetic disc, or an optical disc.
The touch device may also operate in association with a web storage
that executes the storage function of the memory 150 on the Internet.
[0058] The interface unit 160 serves as a passage to all external
devices connected to the touch device. The interface
unit 160 may receive data or power from an external device,
transfer the data or power to each element inside of the touch
device, or transmit data of the inside of touch device to an
external device. For example, the interface unit 160 may include a
wire/wireless headset port, an external charger port, a
wire/wireless data port, a memory card port, a port of connecting a
device having an identity module, an audio I/O (input/output) port,
a video I/O (input/output) port and an earphone port. The interface
unit 160 includes an interface for connecting with an external
device in a wired or wireless scheme.
[0059] The controller 170 controls an overall operation of the
touch device. For example, the controller 170 performs control
associated with an operation of an application according to a voice
call, a data communication, an image call, or operating a
multi-window environment. The controller 170 may include a separate
multi-media module (not shown) for operating a multi-window
function. According to certain embodiments of the present
disclosure, the multi-media module (not shown) may be implemented
in the controller 170 and may be implemented separately from the
controller 170.
[0060] More particularly, the controller 170 may control a series
of operations for supporting a multi-window function according to
embodiments of the present disclosure. For example, the controller
170 may control execution of a plurality of applications in a
multi-window environment. The controller 170 may control the
independent display of screens relating to at least two
applications according to user selection from among a plurality of
executed applications through the plurality of windows.
[0061] For example, the controller 170 may receive an execution
event input, for instance a touch input, for executing a second
application in a state in which an execution screen of the first
application is displayed as a full screen (that is, occupying all
or substantially all of the available screen area within the
display unit 130). The controller 170 may control a feedback output
(for instance, visual feedback) with respect to a window where a
dragged icon relating to the second application is currently
located, or another movement location before the execution event is
released. If the execution event is released when located over a
specific window, the controller 170 may configure a multi-window
according to a pre-set split scheme, and may control to
independently display a screen of the first application and the
second application through respective split windows.
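A minimal sketch of this control flow, under the assumption of a two-window split scheme, is given below. The class and function names (MultiWindowController, onExecutionEvent, windowAt) are hypothetical; the sketch only illustrates the sequence of drag feedback, release over a window, and the split configuration that follows.

```kotlin
// Hypothetical sketch of the controller behaviour in paragraph [0061]; not the disclosed code.
sealed class ExecutionEvent {
    data class DragMoved(val x: Int, val y: Int) : ExecutionEvent()   // icon dragged, not released
    data class Released(val x: Int, val y: Int) : ExecutionEvent()    // icon dropped on a window
}

class MultiWindowController(private val fullScreenApp: String) {
    private val windows = mutableMapOf<Int, String>()                 // window id -> application

    // windowAt maps a screen coordinate to the window id of the pre-set split scheme.
    fun onExecutionEvent(event: ExecutionEvent, newApp: String, windowAt: (Int, Int) -> Int) {
        when (event) {
            is ExecutionEvent.DragMoved ->
                println("feedback: highlight window ${windowAt(event.x, event.y)}")
            is ExecutionEvent.Released -> {
                val target = windowAt(event.x, event.y)
                windows[target] = newApp              // dropped application takes the target window
                windows[1 - target] = fullScreenApp   // previous full-screen application keeps the other window
                println("multi-window configured: $windows")
            }
        }
    }
}

fun main() {
    val controller = MultiWindowController(fullScreenApp = "Internet")
    val windowAt = { _: Int, y: Int -> if (y < 640) 0 else 1 }        // simple top/bottom split scheme
    controller.onExecutionEvent(ExecutionEvent.DragMoved(400, 900), "MAP", windowAt)
    controller.onExecutionEvent(ExecutionEvent.Released(400, 900), "MAP", windowAt)
}
```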
[0062] Further, when an input requesting execution of an additional
application is received while displaying screens of a plurality of
applications through multi-windows, the controller 170 may control
execution of the additional application through a window selected
to execute the additional application. In this case, the controller
170 processes the application previously executed through the
selected window as a background application (that is, without
continuing to display it), and controls to
display the additional application screen through the selected
window.
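Under the same assumptions, replacing a window's foreground application with an additional application, while keeping the previous one running in the background, might look like the following sketch; WindowSlot and launch are invented names.

```kotlin
// Hypothetical sketch of paragraph [0062]; the previously displayed application
// moves to the background when an additional application is launched in its window.
class WindowSlot(val id: Int) {
    private val background = ArrayDeque<String>()   // applications still executing but not displayed
    var foreground: String? = null
        private set

    fun launch(app: String) {
        foreground?.let { background.addLast(it) }  // keep the old application as a background process
        foreground = app                            // display the additional application in this window
    }
}

fun main() {
    val window = WindowSlot(id = 1)
    window.launch("MAP")
    window.launch("Messenger")                      // MAP is processed as background
    println("window ${window.id} now shows ${window.foreground}")
}
```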
[0063] Further, the controller 170 may control the display of a
tray, a separator, or a floating key pad provided from a screen
interface according to the multi-window environment. The controller
170 may allow the displayed tray, separator or floating key pad to
be moved within the screen according to a user input or otherwise.
More particularly, the controller 170 may determine (i.e., change)
the size of each window according to the multi-window environment
in accordance with the movement of the separator.
[0064] A detailed control operation of the controller 170 will be
described in examples of an operation of the touch device and a
control method thereof with reference to the following drawings.
[0065] The power supply 180 uses power which is applied from an
external power source or an internal power source thereto, and
supplies power necessary to operate each constituent element under
control of the controller 170.
[0066] Various embodiments according to the present disclosure may
be implemented in a recording medium which may be read by a
computer or a similar device using software, hardware or a
combination thereof. According to hardware implementation, various
embodiments of the present disclosure may be implemented using at
least one of Application Specific Integrated Circuits (ASICs),
Digital Signal Processors (DSPs), Digital Signal Processing Devices
(DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate
Arrays (FPGAs), processors, controllers, micro-controllers,
microprocessors, and an electric unit for executing the functions.
In some cases, embodiments of this disclosure may be implemented by
the controller 170. According to the software implementation,
various embodiments of procedures and functions according to this
specification may be implemented by separate software modules. The
software modules may perform one or more functions and operations
described in the specification.
[0067] The recording medium may include a computer readable
recording medium recording a program processing to receive an input
of an execution event for executing a second application in a state
in which an execution screen of the first application is displayed
on a full screen, to output feedback with respect to a window of a
moved location when the execution event is moved while not being
released, to configure a multi-window according to a preset split
scheme when the execution event is released from the moved specific
window, or to independently display screen of the first and second
applications through respective split windows.
[0068] Further, the touch device of the present disclosure
illustrated in FIG. 1 may include various information communication
devices, multi-media devices supporting a function of the present
disclosure, and an application device thereof, such as various
devices using an Application Processor (AP), a Graphic Processing
Unit (GPU), and a Central Processing Unit (CPU). For example, the
touch device includes devices such as a tablet Personal Computer
(PC), a Smart Phone, a digital camera, a Portable Multimedia Player
(PMP), a media player, a portable game terminal, a Personal Digital
Assistant (PDA) as well as mobile communication terminals operating
based on respective communication protocols corresponding to
various communication systems.
[0069] FIG. 2 is a diagram of a screen schematically illustrating a
screen interface in a touch device according to an embodiment of
the present disclosure.
[0070] Referring to FIG. 2, a screen interface for supporting a
multi-window environment in a touch device according to certain
embodiment of the present disclosure includes execution regions 210
and 230 split from one screen to display an execution screen of an
application. That is, within the screen there are separate
execution regions 210 and 230 in which execution screens relating
to separate applications can be displayed. Each execution region
210 and 230 may be referred to as a separate window, and
collectively the separate windows may be referred to as
multi-windows or a multi-window environment. Furthermore, the
screen interface includes a separator 200 separating at least two
execution regions 210 and 230 split according to a split scheme to
adjust a window size of the execution regions 210 and 230. The
split scheme refers to the relative disposition and size of the two
or more execution regions 210 and 230 or windows within the
multi-window environment. It will be appreciated that if there are
more than two windows within the multi-window environment then
further separators may be required. The respective execution
regions 210 and 230 split according to the multi-window environment
may include a navigation region, a scroll region, or a text input
region which are independently formed according to the execution
application or the respective execution applications.
[0071] Further, the screen interface of the present disclosure
provides a tray 300 for conveniently supporting execution of an
application using respective windows separated as a multi-window.
The tray 300 is installed in the touch device and includes one or
more execution icons (or a shortcut icon) 400 from among all
executable applications or includes only some applications
according to settings of the user. The tray 300 may be arranged
such that it appears to slide-in (i.e., be displayed) on the screen
or to slide-out and be hidden from the screen. The tray 300 may
include a handle item 350 capable of receiving a user command (for
instance, a touch input or a touch and drag input) for switching
between the slide-in and slide-out states. In addition, the tray
300 may support scrolling through execution icons 400 in the tray
300, and the execution icons 400 in the tray 300 may be edited,
added, or removed according to user selection. Although it has been
illustrated in FIG. 2 that the tray 300 is disposed in a row, the
tray 300 may be disposed in two or more rows, which may be changed
according to user selection.
[0072] Although it has been illustrated in FIG. 2 that a screen of
a touch device is split into two execution regions (i.e., windows)
210 and 230 through one separator 200, according to an embodiment
of the present disclosure, the screen of the touch device may be
split into a larger number of windows up to a maximum number N
(N>1, N=natural number) where N is proportional to the screen
size. Accordingly, one or more separators 200 may be provided in
response to the number of windows, that is, according to the split
scheme that configures a multi-window environment. For example,
when the screen of the touch device is split into two execution
regions as shown in FIG. 2, one separator 200 may be provided. When
the screen of the touch device is split into three execution
regions, two separators 200 may be provided. When the screen of the
touch device is split into four execution regions, two or three
separators 200 may be provided according to the split region.
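The relation between the number of windows and the number of separators described above can be expressed compactly. The small Kotlin sketch below is illustrative only; the function names are assumptions, and the four-window case follows the text in allowing either two crossing separators or three separators depending on the arrangement.

```kotlin
// Illustrative helper for paragraph [0072]; names are hypothetical.
// For a linear split, N windows need N - 1 separators; a four-window layout may
// use two crossing separators or three separators depending on the arrangement.
fun separatorCount(windowCount: Int, gridOfFour: Boolean = false): Int = when {
    windowCount <= 1 -> 0
    gridOfFour && windowCount == 4 -> 3
    else -> windowCount - 1
}

// Evenly split the screen height into stacked window extents (a simple split scheme).
fun splitHeights(screenHeight: Int, windowCount: Int): List<IntRange> =
    List(windowCount) { i ->
        val top = i * screenHeight / windowCount
        val bottom = (i + 1) * screenHeight / windowCount
        top until bottom
    }

fun main() {
    println(separatorCount(3))      // 2 separators for three windows
    println(splitHeights(1280, 2))  // [0..639, 640..1279]
}
```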
[0073] FIG. 3 is a diagram schematically illustrating an operation
of a multi-window in a touch device according to an embodiment of
the present disclosure.
[0074] Referring to FIG. 3, a screen example of reference numeral
<301> indicates a screen example of a touch device when the
touch device executes an Internet application. More particularly,
the screen of reference numeral <301> indicates a state in
which the Internet application is displayed as a full screen
through one window. The full screen consumes all or substantially
all of the available screen space (which may, for instance, be less
than the total screen size to allow for status bars to be
continuously displayed).
[0075] A screen example of reference numeral <303> indicates
a screen example of a touch device when two applications are
executed through a multi-window. For example, the user may
additionally execute a map (MAP) application in a state in which a
full screen of the Internet application is displayed. Accordingly,
as shown in the screen example of reference numeral <303>,
one screen is split into different execution regions by two windows
through the separator 200, and execution screens of an Internet
application and a MAP application are provided through respective
execution regions (windows). In this manner, a plurality of
applications split among at least two screens may be simultaneously
operated according to embodiments of the present disclosure.
[0076] A screen example of reference numeral <305> indicates
a screen example where sizes of respective windows are changed
according to a user operation from a screen of reference numeral
<303>. For example, the user moves (e.g., via a touch & drag)
the separator 200 to adjust a window size of an execution region in
which the Internet application is executed and an execution region
in which a MAP application is executed. According to embodiments of
the present disclosure, when adjusting the window size by movement
of the separator 200, the screen size of the application may be
suitably changed according to a variation in the window size of a
corresponding execution region.
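A minimal sketch of this resize behavior, assuming a simple two-window vertical split, is shown below; the names SplitLayout and moveSeparator are invented for illustration.

```kotlin
// Hypothetical sketch of paragraph [0076]: dragging the separator resizes the two
// adjacent windows, and each application's screen is redrawn to fit its new size.
class SplitLayout(val screenHeight: Int, initialSeparatorY: Int) {
    var separatorY: Int = initialSeparatorY
        private set
    val topWindowHeight get() = separatorY
    val bottomWindowHeight get() = screenHeight - separatorY

    // Called while the user drags the separator; clamp so neither window disappears.
    fun moveSeparator(newY: Int, minWindowHeight: Int = 100) {
        separatorY = newY.coerceIn(minWindowHeight, screenHeight - minWindowHeight)
    }
}

fun main() {
    val layout = SplitLayout(screenHeight = 1280, initialSeparatorY = 640)
    layout.moveSeparator(900)   // user drags the separator downward
    println("Internet window: ${layout.topWindowHeight}px, MAP window: ${layout.bottomWindowHeight}px")
}
```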
[0077] FIG. 4 is a diagram schematically illustrating an operation
for separating a multi-window in a touch device according to an
embodiment of the present disclosure.
[0078] Referring to FIG. 4, a screen example of reference numeral
<401> indicates a case where a screen is split into two
windows for a multi-window environment and a screen example when an
application A and an application B are executed through two windows
separated through one separator 200.
[0079] Screen examples of reference numerals <403> and
<405> indicate a case where a screen is split into three
windows for a multi-window environment, and indicates a screen
example when applications A, B, and C are executed through three
windows using two separators 200.
[0080] As illustrated in screen examples of reference numerals
<403> and <405>, the screen split of the present
disclosure may be separated into various forms according to
settings of the user, and the split scheme may be pre-defined.
[0081] FIGS. 5, 6, 7, 8, 9, 10, 11, and 12 are diagrams
illustrating examples of an operation screen operating a tray for
rapidly executing an application in a multi-window environment
according to embodiments of the present disclosure.
[0082] Referring to FIGS. 5, 6, 7, 8, 9, 10, 11, and 12, FIG. 5
illustrates a screen example of a touch device when the touch
device displays an idle screen (or home screen).
[0083] Although an idle screen is displayed as a full screen in the
screen example of FIG. 5, an execution screen of a specific
application may be displayed as a full screen. More particularly,
FIG. 5 illustrates an example where the idle screen is operated in
a normal mode before operating the multi-window environment. That
is, according to an embodiment of the present disclosure, the touch
device may be operated in a multi-window mode and a normal mode and
may switch between the two.
[0084] The user may activate the tray 300 to be indicated on the
idle screen as illustrated in FIG. 6 in a state in which the idle
screen is displayed, according to an embodiment of the present
disclosure. For example, the user may input a menu operation
through the displayed idle screen of the touch device to display
the tray 300. Alternatively, the tray 300 may be displayed through
selection of a function key for executing a multi-window mode, or
in response to a touch event set to execute the multi-window mode
(e.g., a gesture having a specific pattern such as figures and
characters). Accordingly, the touch device may activate and
indicate (display) a tray 300 on a pre-set region on an idle screen
as shown in FIG. 6. For example, the tray 300 may be disposed at a
left frame (a left edge) of a rectangular full screen such that the
full screen (currently displaying the idle screen in FIG. 6) is
reduced in size. The tray 300 also may be provided in the form of
an overlay through a separate layer on a currently displayed
screen, and may have a handle item 350, such that the tray 300
overlaps the idle screen, as shown in FIG. 6.
[0085] The user may input a movement event (e.g., a touch &
drag) moving the tray 300 to another region on a screen as shown in
FIG. 7 in a state in which the tray 300 is displayed on an idle
screen, according to an embodiment of the present disclosure. For
example, the user may touch a part of the tray 300 to input a
movement event to drag the tray to a different part of the screen
(for instance, toward the opposite side of the screen, e.g., toward
the right frame of the window, that is, the right edge of the
screen). Accordingly, the touch device may provide a User
Interface (UI) or a Graphic User Interface (GUI) that separates the
tray 300 from the left frame according to the movement event to
move with the drag in response to the drag of the user. In this
case, when the tray 300 is moved in a specific direction greater
than a predetermined range (e.g., based on a center of a screen) in
response to a drag movement of the user, the touch device may
change and display a direction of a handle item 350 of the tray
300. That is, the touch device may differently display the handle
item 350 for sliding-in the tray 300 in a screen according to a
region in which the tray 300 is located. For example, the handle
item 350 illustrated in FIG. 6 may be switched to a direction of a
handle item 350 as illustrated in FIG. 7 according to a movement of
the tray 300.
[0086] Referring to FIG. 7, the user may move the tray 300 close to
a desired region to release the input movement event. That is, the
user may release drag input for moving the tray 300. Then, the
touch device may determine a moved region of the tray 300 and
arrange and display the tray 300 on the determined region. For
example, as shown in FIG. 8, the touch device may arrange and
provide the tray 300 at a right frame of a window (a right edge of
the screen). That is, if a user input for moving the tray 300 is
released, the touch device displays a screen as illustrated in FIG.
8. That is, a tray 300 provided in the screen in the touch device
shown in FIG. 6 is switched as illustrated in FIG. 8 according to a
movement of the tray 300. The touch device may determine an
arranged region of the tray 300 according to a movement degree of
the tray 300. For example, the touch device may arrange the tray
300 at a window frame (screen edge) closest to the moved region
(based on a point of contact of a user input on the tray 300). For
instance, when the user input is released when the tray 300 is
closest to the left frame (the left edge of the screen), the tray
300 is arranged and displayed at the left frame (the left edge).
When the user input is released when the tray 300 is closest to a
respective right, upper, or lower frame (edge of the screen), the tray
300 is arranged and displayed at the respective right, upper or
lower frame (edge).
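The edge-docking behavior described here amounts to choosing the screen edge nearest to the point at which the drag is released. This Kotlin sketch is an assumption-level illustration (Edge and nearestEdge are invented names), not the disclosed implementation; the handle item 350 would then be drawn pointing away from the chosen edge, into the screen.

```kotlin
// Hypothetical sketch for paragraphs [0085] and [0086]: dock the tray at the
// screen edge closest to the release point of the drag.
enum class Edge { LEFT, RIGHT, TOP, BOTTOM }

fun nearestEdge(x: Int, y: Int, screenWidth: Int, screenHeight: Int): Edge {
    val distances = mapOf(
        Edge.LEFT to x,
        Edge.RIGHT to screenWidth - x,
        Edge.TOP to y,
        Edge.BOTTOM to screenHeight - y
    )
    return distances.entries.minByOrNull { it.value }!!.key
}

fun main() {
    // Drag released near the right side of an 800 x 1280 screen: the tray docks at the right edge.
    println(nearestEdge(x = 760, y = 500, screenWidth = 800, screenHeight = 1280))   // RIGHT
}
```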
[0087] In this manner, screen examples where the tray 300 is
arranged in different locations according to a user input are
illustrated in FIG. 6 (arranged at a left frame), in FIG. 8
(arranged at a right frame), in FIG. 9 (arranged at an upper
frame), and in FIG. 10 (arranged at a lower frame). That is,
according to embodiments of the present disclosure, referring to
FIGS. 6 to 10, an arranged location of the tray 300 may be changed
in real time according to user input.
[0088] FIG. 11 illustrates a screen example of a slide-out, that
is, a hidden state in a state in which the tray 300 is arranged at
a lower frame as shown in FIG. 10.
[0089] Referring to FIG. 11, if the tray is slid-out, the tray 300
is not displayed on a screen but only a handle item 350 of the tray
300 may be displayed. In the present disclosure, a slide-out of the
tray 300 is achieved by a user input using the handle item 350, or
the tray 300 may be automatically slid-out when the user input does
not occur for a predetermined time in a slide-in state. When a
specific execution icon 400 is selectively moved to the screen from
the tray 300 according to the user input, the tray 300 may be
automatically slid-out.
[0090] Further, when the user touches the handle item 350 and moves
it (i.e., with a drag, a flick, or the like) toward the inner area
of the screen in a state in which the tray 300 is slid-out, the tray
300 may be slid-in.
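A minimal sketch of the slide-in/slide-out rules above, under the assumption of a hypothetical TrayController with a configurable timeout, is shown below; the names and the 3-second value are illustrative only:

    class TrayController(private val hideTimeoutMs: Long = 3_000) {
        var slidIn = false
            private set
        private var lastInputAt = 0L

        // Any user input on the tray keeps it slid-in and restarts the idle timer.
        fun onUserInput(nowMs: Long) { lastInputAt = nowMs; slidIn = true }

        // Called periodically: slide out automatically when no input occurs for the timeout.
        fun onTick(nowMs: Long) {
            if (slidIn && nowMs - lastInputAt >= hideTimeoutMs) slidIn = false
        }

        // Slide out when an execution icon is dragged from the tray onto the screen.
        fun onIconDraggedIntoScreen() { slidIn = false }
    }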
[0091] FIG. 12 illustrates a screen example when a screen of a
landscape mode is displayed according to a rotation of the touch
device in a screen display of a portrait mode as illustrated in
FIGS. 6, 7, 8, 9, 10, and 11, according to embodiments of the
present disclosure. When the touch device is switched from the
landscape mode to the portrait mode or from the portrait mode to
the landscape mode, the tray 300 may be arranged and provided at a
location corresponding to a direction arranged in a previous mode.
For example, when the touch device switches from the landscape mode
to the portrait mode in a state in which the tray 300 is arranged at
the left frame from the user's viewpoint (the left edge in the
landscape mode), the tray 300 may be automatically arranged and
provided at the left frame from the user's viewpoint (the left edge
in the portrait mode). That is, regardless of the mode switch, the
tray 300 may be arranged and provided at the same location based on
the user's viewpoint.
[0092] Referring to FIG. 12, screens of respective applications of
split execution regions (windows) are rotated and provided
according to a mode switch, and the window size split by the
separator 200 may be maintained in accordance with a previous
state.
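One way to keep the separator-defined window sizes consistent across a portrait/landscape switch, sketched here purely as an assumption (the disclosure does not prescribe an implementation), is to store the split as a ratio rather than in pixels:

    // Hypothetical sketch: the separator 200 position is kept as a ratio of the screen
    // height, so the split survives a rotation with its proportions unchanged.
    data class SplitState(val ratio: Float)  // 0.0f..1.0f, fraction occupied by the upper window

    fun separatorPositionFor(state: SplitState, newScreenHeightPx: Int): Int =
        (state.ratio * newScreenHeightPx).toInt()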
[0093] FIGS. 13, 14, 15, 16, and 17 are diagrams illustrating
examples of an operation screen operating a plurality of
applications in a multi-window environment according to an
embodiment of the present disclosure.
[0094] Referring to FIGS. 13, 14, 15, 16, and 17, FIG. 13
illustrates a screen example of a touch device when the touch
device executes one application (e.g., Internet application) as a
full screen. As shown in FIG. 13, the tray 300 is activated,
slid-out, and hidden on the screen so that only the handle item 350
is displayed on the screen.
[0095] The user may select (e.g., touch & drag) the handle item
350 in a state in which the Internet application is displayed to
slide-in the tray 300 on a screen as shown in FIG. 14. When the
user input with respect to the handle item 350 is detected in a
state in which the tray 300 is slid-out, the touch device displays a screen
as shown in FIG. 14. That is, a screen of the touch device
illustrated in FIG. 13 is switched according to the user input as
illustrated in FIG. 14.
[0096] In a state in which the tray 300 is displayed, the user may
select, from among the application execution icons 400 previously
registered in the tray 300, an execution icon 410 of an application
to be additionally executed in the multi-window environment, and
input an event that moves the icon onto the screen. For example, the
user selects (i.e., touches) the execution icon 410 capable of
executing a map application in the tray 300 and inputs an event that
moves (i.e., drags) the execution icon into the screen region
currently displaying the Internet application while the touch is
maintained.
[0097] Then, the touch device displays a state in which the
execution icon 410 is moved into the screen in response to a user
input as shown in FIG. 15. In this case, the touch device confirms
a region in which the execution icon 410 is located and a split
scheme as illustrated in FIG. 15 and outputs to the user a feedback
for the execution region in which the application of the execution
icon 410 is to be executed (illustrated by the hatched box in
FIG. 15). The feedback may be expressed by various schemes which
may be intuitively recognized by the user such as focusing a
corresponding window in which the execution icon 410 is located
among windows of the split execution region, highlighting and
displaying only a corresponding window, or changing a color of a
corresponding window.
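The feedback step can be sketched as a simple hit test, shown below with hypothetical names (WindowRegion, windowUnderDrag); the window that contains the dragged icon's current location is the one to be focused, highlighted, or recolored:

    data class WindowRegion(val id: String, val left: Int, val top: Int, val right: Int, val bottom: Int) {
        fun contains(x: Int, y: Int) = x in left until right && y in top until bottom
    }

    // Returns the split window currently under the dragged execution icon, or null if
    // the icon is outside every window (e.g., still over the tray).
    fun windowUnderDrag(windows: List<WindowRegion>, dragX: Int, dragY: Int): WindowRegion? =
        windows.firstOrNull { it.contains(dragX, dragY) }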
[0098] When an execution icon 410 in the tray 300 enters the screen
according to the user input, the UI or GUI may provide a fade-out
effect such that the space in which the execution icon 410 was
located in the tray 300 remains blank. Further, when the execution
icon 410 is separated from the tray 300 and enters the screen, the
tray 300 may be slid-out. That is, a screen of the touch device
illustrated in FIG. 15 may be switched as illustrated in FIG. 16
according to the user input.
[0099] The blank processing of the present disclosure is provided
for intuitive recognition by the user. When the tray 300 is
slid-out, that is, when FIG. 15 is switched to FIG. 16, the
blank-processed space in the tray 300 created by separation of the
execution icon may be restored to its original shape. That is, as
illustrated in the screen example of FIG. 18 to be described later,
the space in which the execution icon 410 was located may be
provided in the state it had when the icon was present.
[0100] Further, in the case of FIGS. 15 and 16, the multi-window
environment may be split into two execution regions having two
windows, an upper window and a lower window. In addition, FIG. 15
illustrates a case where the current location of the execution icon
410 is in the upper window according to a user input, and where the
lower window is focused when the execution icon 410 is moved to a
lower side of the screen in a state in which the touch input at the
execution icon 410 is maintained.
[0101] Referring to FIG. 16, the user may move the execution icon
410 to a lower side of the screen in a state in which a touch input
to the execution icon 410 is maintained, and input an event of
releasing a touch input to the execution icon 410 in the lower
window. For example, when the lower window is focused and displayed
in a state in which the execution icon 410 is dragged and moved to
the lower window, the user may release (i.e., drag & drop) a
touch input to the execution icon 410.
[0102] Accordingly, referring to FIG. 17, the touch device executes
an application (i.e., the map application) associated with the
execution icon 410 in response to the user input and displays an
execution screen of the application on the lower window. In this
case, if a previous application, such as the Internet application,
is executing as a full screen and execution of an additional
application, such as the map application, is detected, the touch
device separates the full screen into two split execution regions
through the separator 200 to form two separate windows. Further,
the touch device displays a screen of the additional application
(i.e., map application) through a window (e.g., a lower window) of
an execution region in which the execution icon 410 is located, and
displays a screen of the previous application (i.e., Internet
application) through a window (e.g., an upper window) of another
execution region.
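A minimal sketch of the drop handling described above, with hypothetical names (SplitLayout, onIconDropped) and an assumed half-and-half pre-set split scheme, is as follows:

    data class SplitLayout(val upperApp: String, val lowerApp: String, val separatorY: Int)

    // When the execution icon is dropped, split the full screen into two windows and
    // give the dropped (second) application the window where the drop occurred; the
    // previous (first) application keeps the other window.
    fun onIconDropped(
        previousApp: String,         // e.g., the Internet application shown full screen
        droppedApp: String,          // e.g., the map application dragged from the tray
        droppedInLowerHalf: Boolean,
        screenHeightPx: Int
    ): SplitLayout {
        val separatorY = screenHeightPx / 2  // assumed pre-set split scheme: two equal windows
        return if (droppedInLowerHalf)
            SplitLayout(upperApp = previousApp, lowerApp = droppedApp, separatorY = separatorY)
        else
            SplitLayout(upperApp = droppedApp, lowerApp = previousApp, separatorY = separatorY)
    }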
[0103] In this case, upon execution of the additional application,
the touch device displays a screen of a suitable size corresponding
to a window (e.g., a lower window) size of an execution region in
which the additional application is executed. Further, the touch
device displays a screen of the previous application as a full
screen or a partial screen in a window (e.g., an upper window) of a
split execution region according to a characteristic of a previous
application, and displays a screen of the additional application in
a window (lower window) of another split execution region as a full
screen or a partial screen upon splitting the screen.
[0104] For example, when the previous application and the
additional application are each an application capable of playing
content, such as a video, the touch device may change to a screen
of a suitable size corresponding to a window (e.g., an upper window
and a lower window) of a split execution region and display a play
screen in a corresponding window as a full screen. When the
previous application and the additional application are each an
application capable of displaying a text or a list, such as an
Internet application, the touch device may display only a partial
screen corresponding to a size of a corresponding window (i.e.,
upper window, lower window) of the split execution region.
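The full-screen/partial-screen rule above can be sketched as follows, assuming each application exposes a coarse characteristic (the enum and class names are illustrative, not part of the disclosure):

    enum class AppKind { MEDIA, TEXT_OR_LIST }

    sealed class WindowContent {
        data class ScaledFull(val targetWidth: Int, val targetHeight: Int) : WindowContent()
        data class PartialViewport(val visibleWidth: Int, val visibleHeight: Int) : WindowContent()
    }

    // Media-playing applications (e.g., video) are scaled to fill their window; text or
    // list applications (e.g., Internet) show only the portion that fits the window.
    fun contentForWindow(kind: AppKind, windowWidth: Int, windowHeight: Int): WindowContent =
        when (kind) {
            AppKind.MEDIA -> WindowContent.ScaledFull(windowWidth, windowHeight)
            AppKind.TEXT_OR_LIST -> WindowContent.PartialViewport(windowWidth, windowHeight)
        }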
[0105] As illustrated in screen examples of FIGS. 13, 14, 15, 16
and 17, according to embodiments of the present disclosure, when
the touch device executes an application, an execution screen of a
first application may be displayed as the full screen. Further, the
touch device may receive an execution event input (e.g., a user
input which selects an execution icon 400 from the tray 300 and
moves to the screen) for executing a second application from a user
while displaying the first application as a full screen. In this
case, when the execution event is moved into the screen while not
being released, the touch device may output feedback with respect
to the window of a location to which the execution event is moved
(i.e., a location to which the execution icon 400 is being moved
(i.e., dragged) according to a user input). Further, when the
execution event is released in a moved specific window (e.g., when
a user drops an execution icon 400 dragged into a region of a
specific window after a selection thereof), a multi-window may be
configured according to a pre-set split scheme, and screens of the
first application and the second application may be independently
displayed through respective split windows.
[0106] FIGS. 18, 19, 20, 21, 22, and 23 are diagrams illustrating
examples of operating a plurality of applications in a multi-window
environment according to an embodiment of the present
disclosure.
[0107] Referring to FIGS. 18, 19, 20, 21, 22, and 23, FIG. 18
illustrates a screen example of a touch device when the tray 300 is
slid-in according to the user input using a handle item 350 in a
state in which the touch device displays screens of different
applications through each window of two split execution regions as
illustrated in FIG. 17.
[0108] The user may select an execution icon 430 of an application
(e.g., a note application) to be additionally executed from among
the execution icons 400 previously registered in the tray 300 in
accordance with the foregoing operation, and input an event that
moves the icon onto the screen as illustrated in FIG. 19.
[0109] Accordingly, the touch device moves the execution icon 430
into the screen in response to the user input as illustrated in
FIG. 19, and outputs to the user feedback for the execution region,
corresponding to the moved location, in which the application of the
execution icon 430 is to be executed. A slide-out operation of the
tray 300 according to the movement of the execution icon 430 and an
execution operation of an application (e.g., a note application) of
the execution icon 430 correspond to the foregoing operation. In
this case, FIG. 19 illustrates a case where a touch input to the
execution icon 430 is moved to an upper window of the screen and is
released (i.e., drag & drop).
[0110] Referring to FIG. 20, the touch device executes an
application (e.g., a note application) of an execution icon 430 in
response to the user input and displays an execution screen of the
application on an upper window. In this case, the touch device
processes the application (e.g., an Internet application)
previously executed through the upper window in the background (not
displayed), and displays a screen of the additional application
(e.g., a note application) whose execution is newly requested
through the upper window. Further, the touch device may continuously
execute the application (e.g., a map application) allocated to the
lower window and continuously display a screen (e.g., the currently
progressing screen) according to the execution state through the
lower window.
[0111] In this manner, as illustrated in screen examples of FIGS.
18, 19, and 20, according to embodiments of the present disclosure,
the touch device may receive a user input for executing an
additional application while displaying a screen of a plurality of
applications through the multi-window. Accordingly, the touch
device may execute the additional application through a
corresponding window selected by the user for executing the
additional application. Upon executing the additional application,
the application previously executed through the selected window may
be processed as a background, and the additional application screen
may be displayed through the selected window.
[0112] The user may change the window size for two split execution
regions through the separator 200 as illustrated in FIG. 20. That
is, FIGS. 21, 22, and 23 illustrate an operation of changing the
window size according to the user input in a state in which a
window of split execution regions of the touch device is
displayed.
[0113] The user may input an event to select, as illustrated in
FIG. 21, the separator 200 in a screen like FIG. 20 and to move the
selected separator 200 in a specific direction (e.g., upward or
downward). For example, the user may input an event which touches
the separator 200 as illustrated in FIG. 21 and drags the separator
200 to a lower direction of the screen in a state in which the
touch is maintained.
[0114] Accordingly, the touch device displays a moved state of the
separator 200 in response to a user input as illustrated in FIG.
21. In this case, the touch device may change and display only the
moving state of the separator 200 according to a user input while
maintaining the screen of the application displayed through each
window in its current state as shown in FIG. 21. However, according
to embodiments of the present disclosure, the touch device may,
through a window size control scheme, adaptively change and display
a screen of an application according to the window size changed when
the separator 200 is moved according to the user input.
[0115] The user may input an event which moves the separator 200
according to the size ratio to which each window is to be adjusted
and then releases the touch input to the separator 200. For example,
the user may drag the separator 200 and release (i.e., drag &
drop) the touch input to the separator 200 in a state in which the
separator 200 is moved to a location in the lower window as
illustrated in FIG. 21.
[0116] Accordingly, the touch device changes and displays a window
size according to movement of the separator 200 in response to the
user input as shown in FIG. 22. In this case, the touch device
changes and displays a display state of a screen of an application
allocated to each window (e.g., upper window and lower window)
according to variation in the window size. For example, as shown in
FIG. 22, previously hidden contents may be displayed on the screen
of the application displayed on the upper window as that window's
size increases, and the screen of the application displayed on the
lower window may be provided with its displayed region reduced as
that window's size decreases.
[0117] FIG. 23 illustrates an opposite case of FIG. 22, and
illustrates a screen example in a state in which a separator 200 is
moved to an upper direction of a screen according to a user input,
and accordingly the size of an upper window is reduced and the size
of the lower window is enlarged.
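The separator-based resizing of FIGS. 21 to 23 may be sketched as below; the names and the minimum window height are assumptions for illustration only:

    data class WindowSizes(val upperHeight: Int, val lowerHeight: Int)

    // Dragging the separator 200 downward enlarges the upper window (FIG. 22) and
    // dragging it upward enlarges the lower window (FIG. 23); each window keeps a
    // minimum height.
    fun resizeBySeparator(screenHeight: Int, separatorDropY: Int, minWindowHeight: Int = 100): WindowSizes {
        val upper = separatorDropY.coerceIn(minWindowHeight, screenHeight - minWindowHeight)
        return WindowSizes(upperHeight = upper, lowerHeight = screenHeight - upper)
    }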
[0118] FIGS. 24, 25, 26, 27, 28, and 29 are diagrams illustrating
examples of operating a key pad for text input in a multi-window
environment according to an embodiment of the present
disclosure.
[0119] Referring to FIGS. 24, 25, 26, 27, 28, and 29, the present
disclosure provides a touch key pad (e.g., a floating key pad) 500
having a different form from a normal touch key pad for efficiently
operating a multi-window environment. That is, according to
embodiments of the present disclosure, a touch key pad operated in
a normal mode providing a screen of one application as a full
screen, and a floating key pad 500 operated in a multi-window mode
providing a screen of a plurality of applications as an individual
screen through screen split may be differentially provided. In the
present disclosure, the floating key pad 500 is not fixed to a
pre-defined region like a normal touch key pad, but may be freely
moved around in a screen of the touch device in response to the
user input. The floating key pad of the present disclosure may be
provided in the form of a pop-up when a text input is requested
(e.g., by a user input selecting a text input window of an
application of a specific window) from an application of the
specific window selected by the user from among the applications of
the plurality of windows separated as a multi-window in the
multi-window environment.
[0120] Referring to FIGS. 24, 25, 26, 27, 28 and 29, FIG. 24
illustrates a screen example of a touch device in a state in which
the touch device displays a screen of different applications
through each window of two split execution regions.
[0121] Referring to FIG. 25, the user may display a floating key pad
at a predetermined region (e.g., a pre-defined region or a
previously executed region) according to a user input in a state in
which screens of a plurality of applications according to a
multi-window environment are simultaneously displayed. For example, the user may
input a menu operation of the touch device, function key selection
for executing the floating key pad 500, or a touch event (e.g., a
gesture having a specific pattern such as figures and characters)
set to execute the floating key pad 500. More particularly, in the
present disclosure, when a text input window in which a text input
is possible is selected on an application screen executed on a
window of each split execution region, the floating key pad 500 may
be automatically executed and be provided on the screen.
[0122] Referring to FIG. 25, the touch device activates a floating
key pad 500 at one region of a screen operated as the multi-window.
For example, when the floating key pad 500 is activated, it may be
provided such that the bottom end of the floating key pad 500
adheres to the lower frame. In the present
disclosure, the floating key pad 500 has a separate layer and may
be provided in an overlay form on screens according to a
multi-window.
[0123] The user may input a movement event (e.g., a touch &
drag) moving the floating key pad 500 to another region on the
screen as illustrated in FIG. 26 in a state in which the floating
key pad 500 is displayed on the screen. For example, the user may
input a movement event which touches and drags a part of the
floating key pad 500 to another region (e.g., upward) of the
screen. Accordingly, the touch device may provide UI or GUI in which
the floating key pad 500 is separated from the lower frame according
to the movement event and moves along with the drag of the user.
[0124] The user may move the floating key pad 500 to a desired
location and release the input movement event as shown in FIG. 27.
That is, the user may release a drag input for moving the floating
key pad 500. Accordingly, the touch device may arrange and display
the floating key pad 500 in a location in which the drag input is
released.
[0125] According to embodiments of the present disclosure, user
input may be received both in the respective windows of the split
execution regions and on the floating key pad 500 in a state in
which the floating key pad 500 is provided. In this case, a user
input for the floating key pad 500 is received in the region that
the floating key pad 500 occupies, and a user input for a
corresponding window may be received in the remaining region.
[0126] Referring to FIG. 27, the user may perform a text input
using the floating key pad 500 in a state in which the floating key
pad 500 is displayed. For example, it is assumed that the user
inputs a text on a screen of an application executing on the upper
window. In this case, the user selects the upper window (i.e.,
selects any one region (e.g., a text input window) in which a text
input is possible from an application screen of an upper window),
and selects and inputs a desired character button on the floating
key pad 500.
[0127] Referring to FIGS. 27 and 28, the user selects a text input
window 610 on a screen of an application executing through the
upper window to implement a state in which the text input is
possible. Further, the user may sequentially input respective
buttons to which characters p, s, and y are allocated to input
"psy" using the floating key pad 500. Accordingly, the touch device
may input and display a corresponding character on the text input
window 610 in response to the user input as illustrated in FIGS. 27
and 28.
[0128] Referring to FIG. 28, the touch device may provide a result
for a text (e.g., "psy") input to the text input window 610 of an
application executing on the upper window to the floating key pad
500 in the form of an underlay as illustrated in FIG. 28. For
example, in the example of FIG. 28, a text input into the text input
window 610 may be provided through a recommendation region 620 of a
new layout recommending a searched result corresponding to the text
input into the text input window 610 while maintaining a current
state. The recommendation region 620 may be provided in such a way
that it overlies the screen of an application, and the floating key
pad 500 overlies the recommendation region 620. That is, the
floating key pad 500 may be disposed at the uppermost position and
may maintain its current state.
[0129] The text input to the text input window 610 may be input to
the same layer as an application screen and may be directly
provided thereon. For example, in a case of a text input window into
which receiver information is input, like a mail application
executed in the lower window, and unlike the example of FIG. 28,
only an input result may be displayed through a text input window
of an application screen without a separate new layer.
[0130] Referring to FIG. 28, the user may select any one
recommended result in a state in which a recommendation region 620
is displayed on the floating key pad 500 as an underlay, or operate
(i.e., command) search execution for a text input to the text input
window 610. A corresponding result screen is illustrated in FIG.
29. That is, a screen of a touch device illustrated in FIG. 28 is
switched as illustrated in FIG. 29 according to a user input.
[0131] Referring to FIG. 29, after a text shown in text input
window 610 is input through the floating key pad 500 according to
user input, when function execution for a corresponding application
(e.g., a search execution, a mail transmission execution, a memo
storage execution, a message transmission execution, or the like)
is input, the floating key pad 500 is removed from the screen, and
a result for the execution may be provided from a corresponding
window of an application executing the function. For example,
referring to FIGS. 28 and 29, a search result for "psy" input from
an application of an upper window may be provided through the upper
window.
[0132] FIG. 30 is a diagram illustrating an example of operating a
plurality of applications in a multi-window environment according
to an embodiment of the present disclosure.
[0133] Referring to FIG. 30, FIG. 30 illustrates a screen example
when specific setting for respective windows is changed according
to the user input in a state in which the touch device displays
screens of different applications through respective windows of two
split execution regions.
[0134] According to embodiments of the present disclosure, a
function may be independently set in every split window. That is, a
function suitable for a characteristic of an execution application
of a window selected by the user from among windows of split
execution regions may be changed. For example, the user may select
a left window from among windows of split execution regions, and
operate a pre-set function (e.g., operate a function key provided
to control a volume). Accordingly, the touch device may identify a
characteristic of the application executing through the left window.
Further, the touch device may display a volume setting item 700
according to the characteristic of the identified application (e.g.,
a media playing capability, like a video playing capability), and
may provide feedback of a setting value changed according to the
user input. In
this case, when the user defines a setting of screen brightness
with respect to the media characteristic, a screen brightness
setting item (not shown) instead of the volume setting item 700 may
be provided on the screen, and a feedback where brightness of the
screen is changed according to the user input may be provided.
Further, the setting for an application executing on the right
window may be changed in accordance with the foregoing scheme.
[0135] As described above, when a function setting is changed
according to a user input on a specific window, an independent
setting may be achieved for each window. For example, when a volume
or screen brightness is set on the left window, a setting value may
be reflected and displayed only for the left window.
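A minimal sketch of such per-window settings, assuming a hypothetical MultiWindowSettings store keyed by window identifier, is shown below; a value changed while one window is selected is stored for, and applied to, that window only:

    data class WindowSettings(var volume: Int = 50, var brightness: Int = 50)

    class MultiWindowSettings {
        private val perWindow = mutableMapOf<String, WindowSettings>()

        // Changing the volume (or brightness) on the selected window affects only that window.
        fun setVolume(windowId: String, value: Int) {
            perWindow.getOrPut(windowId) { WindowSettings() }.volume = value
        }

        fun setBrightness(windowId: String, value: Int) {
            perWindow.getOrPut(windowId) { WindowSettings() }.brightness = value
        }

        fun settingsFor(windowId: String): WindowSettings =
            perWindow.getOrPut(windowId) { WindowSettings() }
    }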
[0136] FIGS. 31, 32, 33, and 34 are diagrams illustrating examples
of an operation screen providing information for a plurality of
applications executed according to a multi-window environment in a
touch device according to an embodiment of the present
disclosure.
[0137] Referring to FIGS. 31, 32, 33, and 34, FIG. 31 illustrates a
screen example of a touch device when the touch device displays a
list for a plurality of applications executed according to a
multi-window environment. Referring to FIG. 31, a list of
applications executed in the multi-window environment by the user
may be provided through a full screen according to user selection.
The user may input a menu operation of the touch device, a function
key selection for executing the list, or a touch event (e.g., a
gesture having a specific pattern such as figures or characters) set
to execute the list, in a state in which a function by the
multi-window is operating or the screen is converted into an idle
screen.
Accordingly, as illustrated in FIG. 31, the touch device may
display a list for applications currently executed (including
background execution) through UI or GUI set as FIG. 31.
[0138] Referring to FIG. 31, applications which are executed by the
user in the multi-window environment and currently maintain the
execution may be provided in a specific arrangement format. For
example, the applications may be arranged and provided in an
execution order or a random order. FIG. 31 illustrates a list
including an E-mail application 910, a Video Player application
920, a Note application 930, a Map application 940, and a Play
Store application 950.
[0139] Referring to FIGS. 32 and 33, although not displayed on an
initial list screen of FIG. 31, remaining applications (e.g., Gmail
application 960, Wi-Fi application 970, and Phone application 980)
hidden according to scroll (or navigation) control of the user may
be spread and displayed. That is, the list illustrated in FIG. 31
includes different applications which are not displayed through the
screen but are hidden. The number of applications included in the
initial list may be suitably set in consideration of intuition of
the user according to the size of a screen of the touch device.
When the number of executing applications is greater than the
preset number, the excess applications may be hidden as illustrated
in examples of FIGS. 31 to 34. Information for the applications of
the list may be provided in such a manner that an information
display region of an application (e.g., Video Player application
920) disposed at a lower side among the applications is mainly
allocated and the information display region becomes gradually
reduced in the upward direction. Accordingly, the uppermost
application (e.g., Play Store application 950) may display only a
state bar capable of discriminating a corresponding
application.
[0140] Further, as shown in FIG. 31, an application (e.g., E-mail
application 910) disposed at a lowermost region so as to display
only a state bar may correspond to at least one application which
was most recently executed by the user or was displayed on the
screen just before execution of the list. In this manner, the
application disposed at the lowermost region may be fixed and
provided at a corresponding region regardless of scroll control of
the user, or the fixed arrangement may be disabled according to a
user setting.
[0141] Further, a list screen for the execution applications of the
present disclosure may include a command region 800 for supporting
a variety of command types (e.g., an application scroll, a
termination of application execution, an application search, or the
like) for the execution applications in the list. More
particularly, the list screen may include a scroll item 850 for
controlling a scroll (or a spread) for the applications in the
list. That is, the user may scroll the applications in the list
through a user input using the scroll item 850. The touch device
may provide UI or GUI where information of applications overlapped
according to a user input scheme for the scroll item 850 is spread.
In this case, when the user input scheme is a single input that is
repeated, the touch device may control one scroll step (e.g., one
spread) in response to each corresponding input. When the user input
scheme maintains an input (e.g., touched) state of the scroll item
850, the touch device may continuously control automatic scroll
while the user input is maintained.
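The two scroll behaviors of the scroll item 850 may be sketched as follows, with hypothetical names and an arbitrary step size:

    class ListScroller(private val stepPx: Int = 80) {
        var offsetPx = 0
            private set

        // A single (repeated) input on the scroll item advances the list by one step.
        fun onTap() { offsetPx += stepPx }

        // While the scroll item stays touched, the list keeps scrolling automatically.
        fun onHoldTick(touchStillDown: Boolean) {
            if (touchStillDown) offsetPx += stepPx
        }
    }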
[0142] The user may select (touch) the scroll item 850 to maintain
the input in a state in which the list is displayed as illustrated
in FIG. 31. Accordingly, when a user input for the scroll item 850
is detected, the touch device displays a screen where information of
the applications is spread from top to bottom as illustrated in
FIGS. 32, 33, and 34. That is, the list screen of the touch device
illustrated in FIG. 31 is switched as shown in FIGS. 32, 33, and 34
according to the user input.
[0143] Referring to FIGS. 32, 33, and 34, UI or GUI may be provided
in such a manner that the Video Player application 920 is pulled
downward in response to the user input using the scroll item 850
and disappears from the screen while information of the other
applications disposed on the upper side is gradually spread and is
sequentially pulled downward. Further, when the list is scrolled
according to scroll control according to the user input, referring
to FIGS. 33 and 34, other hidden applications (e.g., a Gmail
application 960 (FIG. 33), a Wi-Fi application 970 [FIG. 34], a
Phone application 980 [FIG. 34], or the like) may be sequentially
displayed on the screen. In this case, as illustrated in FIGS. 32,
33, and 34, an E-mail application 910 may be fixed at a
corresponding location to be continuously displayed.
[0144] As illustrated in FIGS. 31, 33, and 34, the user may select
an item of a specific application in a state in which the list is
displayed, or during scroll control. Accordingly, the touch device
may display the selected application as a full screen. Referring to
FIGS. 31, 33, and 34, when the user input is achieved by the scroll
item 850 until scroll for all applications included in the list is
achieved, that is, when all the applications in the list are spread
and pulled downward, the touch device may automatically display a
recently executed application (i.e., an application [e.g., E-mail
application 910] fixed and arranged at the lowermost side) as a
full screen.
[0145] FIG. 35 is a flowchart illustrating a method of operating a
multi-window environment in a touch device according to an
embodiment of the present disclosure. More particularly, FIG. 35
illustrates an example of switching to the multi-window environment
during an operation of one window.
[0146] Referring to FIG. 35, a controller 170 executes an
application (hereinafter, referred to as a "first application")
corresponding to user selection at operation 3501, and controls
screen display for the executing first application at operation
3503. In this case, the controller 170 controls display of a full
screen of the first application through one window.
[0147] The controller 170 receives an execution standby event input
for executing an additional application (e.g., a "second
application") in a state in which the first application is executed
at operation 3505, and determines a preset multi-window split scheme
at operation 3507. In the present disclosure, the execution standby
event may refer to an event for additionally executing and
displaying another application by a multi-window environment in a
state in which the user executes and displays any one application.
More particularly, the execution standby event may refer to an
event which allows the user to activate (e.g., slide in) the tray
300 on the screen and select an execution icon of an application to
be additionally executed from the activated tray 300 to move (e.g.,
drag) into the screen.
[0148] When the execution icon is moved from the tray 300 and enters
the screen, the controller 170 traces and determines a moved
location of the execution icon at operation 3509. The controller 170
may confirm the window of the region in which the execution icon is
currently located through a location trace of the execution icon.
[0149] The controller 170 controls feedback output for a window of
an execution region in which an additional application is able to
be executed in response to the determined split scheme and a
location of an execution icon at operation 3511. That is, the
controller 170 may control feedback output for a specific window at
the location in which the execution icon is being dragged while the
execution icon is moved over the full screen according to the drag.
For example, the controller 170 may focus and display a window of a
location to which the execution icon is moved.
[0150] If an execution event of the second application by the
execution icon is input at operation 3513, the controller 170 splits a screen
at operation 3515 and controls execution of the second application
at operation 3517. The execution event may be an event dropping the
execution icon in one region of the screen. The controller 170
identifies a region (e.g., a region where an execution icon is
dragged and dropped [i.e., a drag & drop]) where the execution
icon is moved to generate an execution event, splits a full screen
for the first application, and determines a region in which the
execution event is generated among the split regions as one window
(i.e., execution region) for displaying a screen of the second
application.
[0151] Upon executing the second application, the controller 170
controls display of a screen having a suitable size corresponding
to the window size of the split execution region (i.e., an
execution region in which the second application is executed) at
operation 3519. Here, the controller 170 may display a screen of
the first application in a window (e.g., an upper window) of a
split execution region as a full screen or a partial screen, and
display a screen of the second application in a window (e.g., a
lower window) of another split execution region as a full screen or
a partial screen. For example, when the first application or the
second application is an application having a capability of playing
media, like a video, the controller 170 may change into a screen of
a suitable size pertinent to a corresponding window size of a split
execution region, and display a playing screen in the window as the
full screen. When the first application and the second application
are applications having a characteristic of a text or a list, like
an Internet application, the controller 170 may display a partial screen in
response to a corresponding window size of the split execution
region. That is, according to embodiments of the present
disclosure, a screen of the first application and a screen of the
second application may be independently displayed on a
corresponding window by implementing the multi-window
environment.
[0152] That is, if an input where the execution icon is dropped on
a specific window during drag is received, the controller 170 may
execute the second application in response to a drop input of the
execution icon. In this case, when executing the second
application, the controller 170 may split the full screen into
windows for displaying screens of the first application and the
second application. Further, the controller 170 may display a
screen of the second application through the specific window in
which the execution icon is dropped, and display a screen of the
first application through another split window.
[0153] FIG. 36 is a flowchart illustrating a method of operating a
multi-window environment in a touch device according to an
embodiment of the present disclosure. In particular, FIG. 36
illustrates an operation example in which an additional application
is executed while operating the multi-window.
[0154] Referring to FIG. 36, when displaying screens of a plurality
of applications by a multi-window at operation 3601, the controller
170 may receive an input for selecting an additional application to
be additionally executed at operation 3603. That is,
according to embodiments of the present disclosure, another
application may be further executed while independently displaying
screens of a plurality of different applications through respective
split windows in the multi-window environment.
[0155] If an input for selecting an additional application is
received in the multi-window environment, the controller 170
determines a split scheme and a currently executed window (e.g., an
"execution window") at operation 3605. For example, the controller
170 may confirm how many window splits are allowed for the screen
split of the multi-window environment through the pre-defined split
information, and determine how many windows are currently split and
operated.
[0156] The controller 170 compares the number of execution windows
with the split information to determine whether the number of
execution windows corresponds to a maximum value set in the
pre-defined split information at operation 3607. For example, the
controller 170 may determine whether the pre-defined split
information is 3 and the number of currently executed windows is 3.
If the number of execution windows does not correspond to the
maximum value set in the split information (NO of operation 3607),
the controller 170 controls execution of a corresponding operation
at operation 3609.
[0157] For example, as described above, the controller 170 may
control an additional screen split for executing the additional
application, execution of the additional application according
thereto, and screen display for a plurality of applications. This
may correspond to an operation for controlling execution of the
additional application due to screen split on the full screen as
illustrated in an example of FIG. 35.
[0158] If the number of execution windows corresponds to the
maximum value set to the split information (i.e., YES of operation
3607), the controller 170 traces and determines a location for a
user input selecting an execution region for executing the
additional application at operation 3611. For example, when the
user selects an execution icon of an application to be additionally
executed from the tray 300 and moves the selected icon into the
screen, the controller 170 may trace and determine a moved location
of the execution icon.
[0159] The controller 170 provides feedback for an execution region
in which the additional application is able to be executed in
response to the determined location at operation 3613. For example,
when the execution icon is moved from the tray 300 and enters the
screen, the controller 170 focuses and displays the window of the
location to which the execution icon is moved.
[0160] If an execution event for the additional application is
input at operation 3615, the controller 170 executes the additional
application and controls processing of a previous application
executed in a corresponding execution region as a background at
operation 3617.
[0161] For example, when executing the additional application in
response to the user input, the controller 170 may process an
application previously executed through a window selected to
execute the additional application as the background, and may
display the screen of the additional application whose execution is
requested through a corresponding window. That is, the controller 170
may process the previous application allocated to a corresponding
window as a background to continuously execute the application, and
may just replace a screen displayed on a corresponding window.
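The branch in FIG. 36 may be sketched as below, with hypothetical names (LaunchAction, launchAdditionalApp): when the number of execution windows is below the maximum allowed by the pre-defined split information, the screen is split further; otherwise the application previously shown in the selected window is sent to the background and the additional application's screen replaces it in that window:

    sealed class LaunchAction {
        data class SplitAndLaunch(val newWindowCount: Int) : LaunchAction()
        data class ReplaceInWindow(val windowId: String, val backgroundedApp: String) : LaunchAction()
    }

    fun launchAdditionalApp(
        executionWindows: Map<String, String>,  // windowId -> application currently displayed
        maxSplit: Int,
        selectedWindowId: String
    ): LaunchAction =
        if (executionWindows.size < maxSplit)
            LaunchAction.SplitAndLaunch(newWindowCount = executionWindows.size + 1)
        else
            LaunchAction.ReplaceInWindow(
                windowId = selectedWindowId,
                backgroundedApp = executionWindows.getValue(selectedWindowId)
            )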
[0162] Upon executing the additional application, the controller
170 may control a screen display corresponding to a window size of
an execution region in which the additional application is executed
at operation 3619. For example, the controller 170 may display a
screen of the additional application in a window of a corresponding
execution region as a full screen or a partial screen.
[0163] Here, when the additional application is an application
having a capability of playing media, like a video, the controller
170 changes the display into a screen having a suitable size corresponding to a
window size of a corresponding execution region, and may display a
playing screen in the window as a full screen. When the additional
application is an application having a capability of processing a
text or a list, e.g., an Internet application, the controller 170
may display a partial screen corresponding to a window size of the
corresponding execution region.
[0164] The foregoing various embodiments of the present disclosure
may be implemented in an executable program command form by various
computer means and be recorded in a computer readable recording
medium. In this case, the computer readable recording medium may
include a program command, a data file, and a data structure
individually or a combination thereof. In the meantime, the program
command recorded in a recording medium may be specially designed or
configured for the present disclosure or be known to a person
having ordinary skill in a computer software field to be used. The
computer readable recording medium includes Magnetic Media such as
hard disk, floppy disk, or magnetic tape, Optical Media such as
Compact Disc Read Only Memory (CD-ROM) or Digital Versatile Disc
(DVD), Magneto-Optical Media such as floptical disk, and a hardware
device such as ROM, RAM, or flash memory storing and executing
program commands. Further, the program command includes a machine
language code created by a compiler and a high-level language code
executable by a computer using an interpreter. The foregoing
hardware device may be configured to be operated as at least one
software module to perform an operation of an embodiment of the
present disclosure, and vice versa.
[0165] Accordingly, embodiments provide a program comprising code
for implementing apparatus or a method as claimed in any one of the
claims of this specification and a machine-readable storage storing
such a program. Still further, such programs may be conveyed
electronically via any medium, for example a communication signal
carried over a wired or wireless connection and embodiments
suitably encompass the same.
[0166] As described above, according to the method and the
apparatus for providing a multi-window in a touch device of the
present disclosure, the user may simultaneously use a plurality of
applications in a determined split-screen layout or a free-style
layout in a simple manner. For example, in order to split the screen to use a
multi-window in a state in which one application is executed as a
full screen, the user drags an additional application from the tray
to drag & drop the application to a determined location or a
free location, thereby simultaneously operating a plurality of
applications.
[0167] Further, according to the present disclosure, the user may
easily arrange and confirm a plurality of applications on one screen
through a multi-window, and freely change each window of the
multi-window to a desired layout, thereby relieving the burden and
trouble of efficiently configuring a screen and operating a
plurality of applications.
[0168] According to the present disclosure, large amounts of
information and various user experiences may be provided to the
user through the multi-window environment. Further, according to
the present disclosure, the user may efficiently and simultaneously
perform an operation with respect to various applications by a
multi-window environment on a small screen of the touch device. For
example, the user may simultaneously perform other operations such
as creation of messages and mail while viewing and listening to a
video on one screen of the touch device. Accordingly, according to
the present disclosure, an optimal environment capable of
supporting a multi-window environment in the touch device is
implemented so that convenience for the user can be improved, and
usability, convenience, and competitiveness of the touch device can
be improved. The present disclosure may be simply implemented in
various types of touch devices and other corresponding devices.
[0169] It will be appreciated from the foregoing description that,
in certain embodiments of the invention, features concerning the
graphic design of user interfaces are combined with interaction
steps or means to achieve a technical effect.
[0170] It will be appreciated from the foregoing description that,
in certain embodiments of the invention, graphic features
concerning technical information (e.g. internal machine states) are
utilised to achieve a technical effect.
[0171] Certain embodiments aim to achieve the technical effect of
enhancing the precision of an input device.
[0172] Certain embodiments aim to achieve the technical effect of
lowering a burden (e.g. a cognitive, operative, operational,
operating, or manipulative burden) of a user when performing
certain computer or device interactions.
[0173] Certain embodiments aim to achieve the technical effect of
providing a more efficient man-machine (user-machine)
interface.
[0174] While the present disclosure has been shown and described
with reference to various embodiments thereof, it will be
understood by those skilled in the art that various changes in form
and details may be made therein without departing from the spirit
and scope of the present disclosure as defined by the appended
claims and their equivalents.
* * * * *