U.S. patent application number 15/874385 was filed with the patent office on 2018-01-18 and published on 2018-05-24 as publication number 20180143751 for methods and devices for user interactive interfaces on touchscreens.
This patent application is currently assigned to Beijing Xiaomi Technology Co., Ltd. The applicant listed for this patent is Beijing Xiaomi Technology Co., Ltd. Invention is credited to Yu GUO, Jing LIU, Yang SHEN, and Xiaojun WU.
Application Number: 15/874385
Publication Number: 20180143751
Family ID: 47198464
Publication Date: 2018-05-24

United States Patent Application 20180143751
Kind Code: A1
LIU; Jing; et al.
May 24, 2018
METHODS AND DEVICES FOR USER INTERACTIVE INTERFACES ON
TOUCHSCREENS
Abstract
The present application is directed to devices and methods for
touchscreen interactive interfaces. The method may comprise steps of:
combining two or more single-function buttons on the user interface
of the touchscreen device into one multifunction button; monitoring
a real-time touch operation with the multifunction button by a
user; determining the type of the touch operation according to the
time of the touch operation; and unfolding the two or more
single-function buttons one after another around the multifunction
button according to the type of the touch operation.
Inventors: LIU; Jing; (Beijing, CN); WU; Xiaojun; (Beijing, CN); SHEN; Yang; (Beijing, CN); GUO; Yu; (Beijing, CN)

Applicant: Beijing Xiaomi Technology Co., Ltd.; Beijing; CN

Assignee: Beijing Xiaomi Technology Co., Ltd.; Beijing; CN

Family ID: 47198464

Appl. No.: 15/874385

Filed: January 18, 2018
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number
13909570           | Jun 4, 2013 | 9910558
15874385           |             |
Current U.S. Class: 1/1

Current CPC Class: G06F 3/0482 (20130101); G06F 3/04845 (20130101); G06F 2203/04804 (20130101); G06F 3/04886 (20130101)

International Class: G06F 3/0482 (20060101); G06F 3/0488 (20060101)

Foreign Application Data

Date        | Code | Application Number
Jun 5, 2012 | CN   | 201210184210.1
Claims
1. A method for operating an interactive user interface on a
touchscreen device, the method comprising: combining, by a
touchscreen device comprising a memory and a processor in
communication with the memory, two or more single-function buttons
on a user interface of the touchscreen device into one
multifunction button at a docking location; monitoring, by the
touchscreen device, a real-time touch operation of the interactive
user interface by a user; determining, by the touchscreen device, a
type of the real-time touch operation according to a length of time
of the real-time touch operation; when the type of the real-time
touch operation is determined to be a first type, unfolding, by the
touchscreen device, the two or more single-function buttons in a
user touch area around the docking location of the multifunction
button; and when the type of the real-time touch operation is
determined to be a second type: adjusting, by the touchscreen
device, the docking location of the multifunction button according
to a dragging movement operation performed by the user, and
unfolding, by the touchscreen device, the two or more
single-function buttons in the user touch area around the docking
location of the multifunction button.
2. The method according to claim 1, wherein the determining the
type of the real-time touch operation according to the length of
time of the real-time touch operation comprises: judging, by the
touchscreen device, whether the length of time of the real-time
touch operation satisfies a preset condition of the first type;
when it is judged that the length of time of the real-time touch
operation satisfies the preset condition of the first type,
determining, by the touchscreen device, the type of the real-time
touch operation to be the first type; and when it is judged that
the length of time of the real-time touch operation does not
satisfy the preset condition of the first type, determining, by the
touchscreen device, the type of the real-time touch operation to be
the second type.
3. The method according to claim 2, wherein the judging whether the
length of time of the real-time touch operation satisfies the
preset condition of the first type comprises: judging, by the
touchscreen device, whether the length of time of the real-time
touch operation is shorter than a first predetermined threshold
value.
4. The method according to claim 1, wherein, when the type of the
real-time touch operation is determined to be the second type, the
adjusting the docking location of the multifunction button
according to the dragging movement operation performed by the user
comprises: determining, by the touchscreen device, whether the user
performs the dragging movement operation on the multifunction
button; and when it is determined that the user performs the
dragging movement operation on the multifunction button: obtaining,
by the touchscreen device, a moved location of the multifunction
button when the user stops the dragging movement operation, and
adjusting, by the touchscreen device, the docking location of the
multifunction button according to the moved location of the
multifunction button when the user stops the dragging movement
operation.
5. The method according to claim 1, wherein: the docking location
of the multifunction button is movable within a movable area of the
interactive user interface.
6. The method according to claim 1, further comprising: dividing,
by the touchscreen device, a movable area of the interactive user
interface into a plurality of subareas, and determining, by the
touchscreen device, a center of each of the plurality of
subareas.
7. The method according to claim 6, wherein: the plurality of
subareas comprises three subareas.
8. The method according to claim 1, wherein, when the type of the
real-time touch operation is determined to be the second type, the
adjusting the docking location of the multifunction button
according to the dragging movement operation performed by the user
further comprises: obtaining, by the touchscreen device, a moved
location of the multifunction button; determining, by the
touchscreen device, an adjusted docking location of the
multifunction button as a center of a particular subarea where the
moved location is located; and adjusting, by the touchscreen
device, the docking location of the multifunction button according
to the adjusted docking location of the multifunction button.
9. The method according to claim 8, wherein the obtaining the moved
location of the multifunction button further comprises:
determining, by the touchscreen device, whether a current location
of the multifunction button is outside of a movable area of the
interactive user interface in a vertical direction; adjusting, by
the touchscreen device, a vertical coordinate of the current
location of the multifunction button to a second predetermined
threshold value; and maintaining, by the touchscreen device, a
horizontal coordinate of the current location of the multifunction
button.
10. The method according to claim 1, wherein, when the type of the
real-time touch operation is determined to be a second type, the
unfolding the two or more single-function buttons in the user touch
area around the docking location of the multifunction button
comprises: unfolding, by the touchscreen device, the two or more
single-function buttons into an arc around the multifunction
button, wherein: the arc has a predetermined radius, and an angle
measure of the arc is dependent on the docking location of the
multifunction button.
11. The method according to claim 10, wherein distances between any
two neighboring single-function buttons of the two or more
single-function buttons are equal, and distances between any of the
two or more single-function buttons and the multifunction button
are equal.
12. The method according to claim 1, wherein the multifunction
button is a semitransparent button.
13. A touchscreen device with an interactive user interface,
comprising: a memory storing instructions for conducting operations
on the touchscreen device; and a processor in communication with
the memory, wherein, when the processor executes the instructions,
the processor is configured to cause the touchscreen device to:
combine two or more single-function buttons on a user interface of
the touchscreen device into one multifunction button at a docking
location; monitor a real-time touch operation of the interactive
user interface by a user; determine a type of the real-time touch
operation according to a length of time of the real-time touch
operation; when the type of the real-time touch operation is
determined to be a first type, unfold the two or more
single-function buttons in a user touch area around the docking
location of the multifunction button; and when the type of the
real-time touch operation is determined to be a second type: adjust
the docking location of the multifunction button according to a
dragging movement operation performed by the user, and unfold the
two or more single-function buttons in the user touch area around
the docking location of the multifunction button.
14. The touchscreen device according to claim 13, wherein, when the
processor is configured to cause the touchscreen device to
determine the type of the real-time touch operation according to
the length of time of the real-time touch operation, the processor
is configured to cause the touchscreen device to: judge whether the
length of time of the real-time touch operation satisfies a preset
condition of the first type; when it is judged that the length of
time of the real-time touch operation satisfies the preset
condition of the first type, determine the type of the real-time
touch operation to be the first type; and when it is judged that
the length of time of the real-time touch operation does not
satisfy the preset condition of the first type, determine the type
of the real-time touch operation to be the second type.
15. The touchscreen device according to claim 14, wherein, when the
processor is configured to cause the touchscreen device to judge
whether the length of time of the real-time touch operation
satisfies the preset condition of the first type, the processor is
configured to cause the touchscreen device to: judge whether the
length of time of the real-time touch operation is shorter than a
first predetermined threshold value.
16. The touchscreen device according to claim 13, wherein, when the
type of the real-time touch operation is determined to be the
second type and the processor is configured to cause the
touchscreen device to adjust the docking location of the
multifunction button according to the dragging movement operation
performed by the user, the processor is configured to cause the
touchscreen device to: determine whether the user performs the
dragging movement operation on the multifunction button; and when
it is determined that the user performs the dragging movement
operation on the multifunction button: obtain a moved location of
the multifunction button when the user stops the dragging movement
operation, and adjust the docking location of the multifunction
button according to the moved location of the multifunction button
when the user stops the dragging movement operation.
17. The touchscreen device according to claim 13, wherein, when the
processor executes the instructions, the processor is further
configured to cause the touchscreen device to: divide a movable
area of the interactive user interface into a plurality of
subareas, and determine a center of each of the plurality of
subareas.
18. The touchscreen device according to claim 17, wherein: the
plurality of subareas comprises three subareas.
19. The touchscreen device according to claim 13, wherein, when the
type of the real-time touch operation is determined to be the
second type and the processor is configured to cause the
touchscreen device to adjust the docking location of the
multifunction button according to the dragging movement operation
performed by the user, the processor is further configured to cause
the touchscreen device to: obtain a moved location of the
multifunction button; determine an adjusted docking location of the
multifunction button as a center of a particular subarea where the
moved location is located; and adjust the docking location of the
multifunction button according to the adjusted docking location of
the multifunction button.
20. The touchscreen device according to claim 19, wherein, when the
processor is configured to cause the touchscreen device to obtain
the moved location of the multifunction button, the processor is
further configured to cause the touchscreen device to: determine
whether a current location of the multifunction button is outside
of a movable area of the interactive user interface in a vertical
direction; adjust a vertical coordinate of the current location of
the multifunction button to a second predetermined threshold value;
and maintain a horizontal coordinate of the current location of the
multifunction button.
21. The touchscreen device according to claim 13, wherein, when the
type of the real-time touch operation is determined to be the
second type and the processor is configured to cause the touchscreen
device to unfold the two or more single-function buttons in the
user touch area around the docking location of the multifunction
button, the processor is further configured to cause the
touchscreen device to: unfold the two or more single-function
buttons into an arc around the multifunction button, wherein: the
arc has a predetermined radius, and an angle measure of the arc is
dependent on the docking location of the multifunction button.
22. The touchscreen device according to claim 21, wherein distances
between any two neighboring single-function buttons of the two or
more single-function buttons are equal, and distances between any
of the two or more single-function buttons and the multifunction
button are equal.
23. The touchscreen device according to claim 13, wherein the
multifunction button is a semitransparent button.
Description
PRIORITY STATEMENT
[0001] This application is a continuation under 35 U.S.C. § 120 of
U.S. application Ser. No. 13/909,570, filed on Jun. 4, 2013,
pending, which is based upon and claims the benefit of Chinese
Patent Application No. 201210184210.1, filed on Jun. 5, 2012, all
of which are incorporated herein by reference in their entireties.
FIELD OF THE INVENTION
[0002] The present invention relates generally to touchscreen
technology and, more particularly, to methods and devices for user
interactive interfaces on touchscreens.
BACKGROUND
[0003] With the development of wireless communication technology
and touchscreen technology, more and more electronic devices, such
as touchscreen cellphones and tablet PCs, adopt touchscreens and
have become hand-held terminals frequently used by users.
[0004] On the user interface of current touchscreen devices, the
user operation point is usually fixed at a certain location. As
shown in FIG. 1, taking a touchscreen cellphone for example, the
device is generally divided into the top touch area and buttons
101, the cellphone supporting component 102, the touchable screen
103, and the bottom touch area and buttons 104. As shown in FIG. 1,
the user touchable area of a current touchscreen cellphone is
mainly characterized in that there are several fixed buttons on the
top and bottom of the cellphone screen for the user's touch
operation, and the touch area is generally rectangular.
[0005] Applicant has found through research that, because most
users operate touchscreen devices with one thumb, as shown in FIG.
2, a screen that is too large is inconvenient for the user to
operate, which increases unnecessary user actions on the
touchscreen device and thus degrades the performance of the
touchscreen device.
SUMMARY
[0006] A preferred embodiment of the present application provides a
user interface interactive method which combines two or more
single-function buttons on the user interface of the touchscreen
device into one multifunction button. The method may include steps
of: monitoring the real-time touch operation with the multifunction
button by a user; determining the type of the touch operation
according to the time of the touch operation; and unfolding the two
or more single-function buttons one after another around the
multifunction button according to the type of the touch
operation.
[0007] In a preferred embodiment, the type of the touch operation
may include Type I touch operation and Type II touch operation,
which may be determined according to the time of the touch
operation. Determining the type of the touch operation may include
steps of: judging whether the time of the touch operation satisfies
the preset conditions of Type I touch operation. If it does, the
type of the touch operation may be determined as Type I touch
operation; if not, the type of the touch operation may be
determined as Type II touch operation.
[0008] In a preferred embodiment, Type I touch operation may be a
short press and Type II touch operation may be a long press.
Determining whether the time of a touch operation satisfies the
preset conditions of Type I touch operation may include determining
whether the time of the touch operation is shorter than a preset
first threshold value.
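As a minimal illustration of this duration-based classification, the sketch below compares the touch duration against a single threshold. All names and the 0.5-second value are hypothetical; the application does not specify a threshold.

```python
# Hypothetical preset first threshold value, in seconds.
FIRST_THRESHOLD = 0.5

def classify_touch(duration: float) -> str:
    """Classify a touch as Type I (short press) when its duration
    is shorter than the threshold, otherwise as Type II (long
    press)."""
    if duration < FIRST_THRESHOLD:
        return "type_i_short_press"
    return "type_ii_long_press"
```

A tap lasting 0.2 s would thus unfold the buttons in place, while a press held past the threshold would enter the drag-to-move mode.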
[0009] In a preferred embodiment, to unfold the two or more
single-function buttons one after another around the multifunction
button according to the type of a touch operation may include steps
of: unfolding the two or more single-function buttons one after
another in a user touch area around the multifunction button when
the touch operation is Type I touch operation; and when the touch
operation is Type II touch operation, adjusting the docking
location of the multifunction button according to the movement
operation of the multifunction button by the user, and unfolding
the two or more single-function buttons one after another in the
user touch area around the docking location of the multifunction
button after the multifunction button moves and/or has been
moved.
[0010] In a preferred embodiment, unfolding the two or more
single-function buttons one after another in the user touch area
around the multifunction button may include: unfolding the two or
more single-function buttons evenly into an arc according to a
preset radius around the multifunction button, wherein the
distances of any two neighboring single-function buttons of the two
or more single-function buttons are equal and the distances from
any of the two or more single-function buttons to the multifunction
button are equal.
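One way to realize such an even arc layout is equal angular spacing at a fixed radius, which yields equal chord distances between neighboring buttons and an equal distance from every button to the multifunction button. The sketch below assumes a particular angle range and coordinate convention; all names are hypothetical.

```python
import math

def unfold_positions(center, radius, n, start_deg=180.0, end_deg=360.0):
    """Place n single-function buttons evenly along an arc of the
    given radius around the multifunction button at `center`.
    Equal angular steps give equal neighbor distances and an equal
    radius from every button to the multifunction button."""
    cx, cy = center
    if n == 1:
        angles = [(start_deg + end_deg) / 2.0]
    else:
        step = (end_deg - start_deg) / (n - 1)
        angles = [start_deg + i * step for i in range(n)]
    return [(cx + radius * math.cos(math.radians(a)),
             cy + radius * math.sin(math.radians(a)))
            for a in angles]
```

Narrowing the angle range (as claims 10 and 21 suggest for docking locations near a screen edge) only changes `start_deg`/`end_deg`; the equal-spacing property is preserved.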
[0011] In a preferred embodiment, adjusting the docking location of
the multifunction button according to the movement operation of the
multifunction button by the user may include: monitoring whether
the user has moved the multifunction button; if the user has moved
the multifunction button, obtaining the moved location of the
multifunction button after the user stops the movement operation;
and when the user stops the touch operation, determining the
docking location of the multifunction button according to the area
of the user interface to which the multifunction button has been
moved.
[0012] A preferred embodiment of the present application may also
include: dividing the movable area of the multifunction button on
the user interface into three equal subareas, wherein the movable
area is a horizontal band located at the bottom of the touchscreen
device whose height is a preset second threshold value; and
determining center coordinates of the three subareas.
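The trisection can be sketched as below, assuming a coordinate origin at the top-left of the screen (so the movable band occupies the bottom `band_height` pixels); all names are hypothetical.

```python
def subarea_centers(screen_width, screen_height, band_height):
    """Divide the movable band (height `band_height`, flush with
    the bottom of the screen) into three equal subareas and return
    the center coordinate of each, left to right."""
    sub_width = screen_width / 3.0
    # Vertical center of the band, measured from the top edge.
    center_y = screen_height - band_height / 2.0
    return [(sub_width * (i + 0.5), center_y) for i in range(3)]
```

For a 300x600 screen with a 100-pixel band, this yields centers at x = 50, 150, and 250, all at y = 550.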
[0013] Obtaining the moved location of the multifunction button
after it is moved may include: detecting whether the multifunction
button is out of the movable area in a vertical direction during
its movement and, if it is, correcting the vertical coordinate of
the moved location of the multifunction button to the second
threshold value of the movable area. The horizontal coordinate of
the moved location may remain the same.
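The vertical correction might look like the following sketch, again assuming a top-left origin so that the band's upper edge sits at `screen_height - band_height`; the names are hypothetical.

```python
def clamp_to_movable_area(x, y, screen_height, band_height):
    """If the dragged multifunction button has left the movable
    band in the vertical direction, pull its vertical coordinate
    back to the band's upper boundary; the horizontal coordinate
    is kept as-is."""
    band_top = screen_height - band_height
    # y grows downward, so anything above the band has y < band_top.
    return (x, max(y, band_top))
```

Dragging the button up and out of the band thus snaps it back to the band's edge without disturbing its horizontal position.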
[0014] Determining the docking location of the multifunction button
according to the area of the moved location on the user interface
may include: determining the docking location of the multifunction
button as the center coordinate of the current subarea where the
moved location is located.
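Snapping the docking location to the center of the containing subarea can be sketched as follows, using the center coordinates determined for the three subareas; all names are hypothetical.

```python
def docking_location(moved_x, centers, screen_width):
    """Return the center coordinate of the subarea (left, middle,
    or right third of the screen) that contains the moved
    horizontal position `moved_x`. `centers` holds the precomputed
    subarea centers, left to right."""
    sub_width = screen_width / 3.0
    # Clamp the index so a drag to the far right edge stays in bounds.
    index = min(int(moved_x // sub_width), len(centers) - 1)
    return centers[index]
```

A release anywhere in the right third of the screen therefore docks the button at the right subarea's center, giving three stable resting positions.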
[0015] In a preferred embodiment, unfolding the two or more
single-function buttons one after another in the user touch area
around the docking location of the multifunction button after the
multifunction button is moved may include: in the user touch area
around the docking location of the multifunction button, unfolding
evenly the two or more single-function buttons into an arc
according to a preset radius around the multifunction button,
wherein the distances of any two neighboring single-function
buttons of the two or more single-function buttons are equal and
the distances from any of the two or more single-function buttons
to the multifunction button are equal.
[0016] In a preferred embodiment, the multifunction button is a
semitransparent button.
[0017] According to a preferred embodiment of the present
application, a user interface interactive method for touchscreen
devices, which combines two or more single-function buttons on the
user interface of the touchscreen device into one multifunction
button, may include: monitoring a
real-time screen-touch operation with the multifunction button by a
user; determining a type of the touch operation according to the
time of the touch operation; and judging whether the type of the
touch operation satisfies preset conditions to move the
multifunction button. If the touch operation satisfies the preset
conditions, determine the docking location of the multifunction
button according to the movement operation of the multifunction
button by the user; if not, unfold the two or more single-function
buttons one after another around the multifunction button.
[0018] A user interface interactive device for touchscreen devices
may include a combination module, which may be used to combine two
or more single-function buttons on the user interface of the
touchscreen device into one multifunction button; a monitoring
module, which may be used to monitor the real-time touch operation
with the multifunction button by the user; a first determination
module, which may be used to determine the type of the touch
operation according to the time of the touch operation; and an
interaction module, which may be used to unfold the two or more
single-function buttons one after another around the multifunction
button according to the type of the touch operation.
[0019] In a preferred embodiment, the type of the touch operation
includes Type I touch operation and Type II touch operation, and
the first determination module may particularly comprise: a first
judgment submodule, which may be used to judge whether the time of
the touch operation satisfies the preset conditions of Type I touch
operation; a first determination submodule, which may be used to
determine the type of the touch operation as the Type I touch
operation when the result of the judgment submodule is yes; and a
second determination submodule, which may be used to determine the
type of the touch operation as the Type II touch operation when the
result of the judgment submodule is no.
[0020] In a preferred embodiment, the Type I touch operation may be
a short press and the Type II touch operation may be a long press.
The first judgment submodule may be configured to judge whether
the time of the touch operation is shorter than the preset first
threshold value.
[0021] In a preferred embodiment, the interaction module may
include a first interaction submodule, an adjustment submodule, and
a second interaction submodule, wherein, when the touch operation
is Type I touch operation, the first interaction submodule may be
used to unfold the two or more single-function buttons one after
another in the user touch area around the multifunction button;
when the touch operation is Type II touch operation, the adjustment
submodule may be used to adjust the docking location of the
multifunction button according to the movement operation of the
multifunction button by the user; and the second interaction
submodule may be used to unfold the two or more single-function
buttons one after another in the user touch area around the docking
location of the multifunction button after it is moved.
[0022] In a preferred embodiment, the first interaction submodule
may be configured to unfold evenly the two or more single-function
buttons into an arc around the multifunction button according to
the preset radius, wherein the distances of any two neighboring
single-function buttons of the two or more single-function buttons
are equal and the distances from any of the two or more
single-function buttons to the multifunction button are equal.
[0023] In a preferred embodiment, the adjustment submodule may
include a monitoring submodule and an obtaining submodule. The
monitoring submodule may be used to monitor whether the user has
moved the multifunction button. When the result of the monitoring
submodule is yes, the obtaining submodule may obtain a moved
location of the multifunction button when the user stops the
movement operation.
[0024] The interaction module may also include a third
determination submodule, which may be used to determine the docking
location of the multifunction button according to the area of the
moved location on the user interface when the user stops the
movement operation.
[0025] In a preferred embodiment, the adjustment submodule may also
include a trisection module, which may be used to divide the
movable area of the multifunction button on the user interface into
three equal subareas, wherein the movable area may be a horizontal
band located at the bottom of the touchscreen device whose height
may be the preset second threshold value.
[0026] The adjustment submodule may also include a second
determination module, which may be used to determine the center
coordinate points of the three subareas.
[0027] Accordingly, the obtaining submodule may include a detection
submodule, which may be used to detect whether the multifunction
button is out of the movable area in the vertical direction during
its movement; and a correction submodule, which may be used to
correct the vertical coordinate of the moved location of the
multifunction button to the second threshold value of the movable
area, while the horizontal coordinate of the moved location remains
the same.
[0028] Accordingly, the third determination submodule may be
configured to determine the docking location of the multifunction
button as the center coordinate point of the current subarea where
the moved location is located.
[0029] In a preferred embodiment, the second interaction submodule
may be configured to unfold evenly the two or more single-function
buttons into an arc according to the preset radius in the user
touch area around the docking location of the multifunction button,
wherein the distances of any two neighboring single-function
buttons of the two or more single-function buttons may be equal and
the distances from any of the two or more single-function buttons
to the multifunction button may be equal.
[0030] According to a preferred embodiment of the present
application, a user interface interactive device for a touchscreen
device may include a combination module, which may be used to
combine two or more single-function buttons on the user interface
of the touchscreen device into one multifunction button; a
monitoring module, which may be used to monitor the real-time touch
operation with the multifunction button by the user; a first
determination module, which may be used to determine the type of
the touch operation according to the time of the touch operation; a
judgment module, which may be used to judge whether the type of the
touch operation satisfies the preset conditions to move the
multifunction button; a second determination module, which may be
used to determine the docking location of the multifunction button
according to the movement operation of the multifunction button by
the user when the result of the judgment module is yes; and an
unfolding module, which may be used to unfold the two or more
single-function buttons one after another around the multifunction
button when the result of the judgment module is no.
[0031] According to a preferred embodiment of the present
application, a touchscreen device may include any of the devices
described above.
[0032] By combining several single-function buttons into one
multifunction button, users no longer need to consider the location
of every single-function button during operation; instead, they may
only need to operate the multifunction button. In addition, as the
combined single-function buttons unfold around the multifunction
button, users may also control where the single-function buttons
they need appear on the screen by moving the multifunction button,
which makes it convenient for users to use the single-function
buttons on the touchscreen. Furthermore, this convenient operation
does not increase the number of operations on the touchscreen and
thus reduces wear on the touchscreen of the touchscreen devices
from users' operation.
BRIEF DESCRIPTION OF THE DRAWINGS
[0033] The embodiments described below may be more fully understood
by reading the following description in conjunction with the
drawings, in which:
[0034] FIG. 1 is a schematic diagram of the user interface of a
touchscreen device;
[0035] FIG. 2 is a schematic diagram of a user operation area of
the touchscreen device;
[0036] FIG. 3 is a flow diagram of an embodiment 1 of a first
method of a preferred embodiment of the present invention;
[0037] FIG. 4 is a schematic diagram of a location of the
multifunction button in embodiment 1 of the first method of the
present invention;
[0038] FIG. 5 is a flow diagram of step 302 in embodiment 1 of the
first method of the present invention;
[0039] FIG. 6 is an instance flow diagram of step 302 in embodiment
1 of the first method of the present invention;
[0040] FIG. 7 is a schematic diagram of the interface when an
embodiment of the present invention unfolds several single-function
buttons on the user interface;
[0041] FIG. 8 is a flow diagram of an embodiment 2 of the first
method of the present invention;
[0042] FIGS. 9 and 10 are schematic diagrams of unfolding
single-function buttons when a docking location of the
multifunction button is on the left and right side of the user
interface;
[0043] FIG. 11 is a flow diagram of an embodiment 3 of the first
method of the present invention;
[0044] FIG. 12 is a schematic diagram of an embodiment of the
second method of the present invention;
[0045] FIG. 13 is a structural schematic diagram of embodiment 1 of
the first device of the present invention;
[0046] FIG. 14 is a structural schematic diagram of a first
determination module 1103 in embodiment 1 of the first device of
the present invention;
[0047] FIG. 15 is a structural schematic diagram of interaction
module 1304 in embodiment 1 of the first device of the present
invention;
[0048] FIG. 16 is a structural schematic diagram of an embodiment 2
of the first device of the present invention;
[0049] FIG. 17 is a structural schematic diagram of an embodiment 3
of the first device of the present invention; and
[0050] FIG. 18 is a structural schematic diagram of an embodiment
of the second device of the present invention.
DETAILED DESCRIPTION
[0051] Preferred embodiments will now be described more fully with
reference to the accompanying drawings, in which preferred
embodiments are shown. Preferred embodiments may, however, be
embodied in many different forms and should not be construed as
being limited to the preferred embodiments set forth herein;
rather, preferred embodiments are provided so that this disclosure
will be thorough and complete, and will fully convey the concept of
the invention to one skilled in the art. In the drawings, the
regions are exaggerated for clarity and are not necessarily to scale.
Like reference numerals in the drawings denote like elements, and
thus, their description will not be repeated.
[0052] The present application is directed to user interface
interactive methods and devices for touchscreen devices. Each of
the touchscreen devices may include a non-transitory
computer-readable and/or processor-readable storage medium and a
processor in communication with the non-transitory
computer-readable and/or processor-readable storage medium. The
non-transitory computer-readable and/or processor-readable storage
medium may store sets of instructions for conducting operations on
a touchscreen. The processor may be configured to execute the sets
of instructions, which conduct the user interface interactive
methods. Here, the computer may be any type of electrical
computational device, such as a laptop, a desktop, a tablet, a
mobile phone (e.g., a feature phone or a smartphone), and/or a
personal digital assistant (PDA), etc. The computer may also be
the electrical computation part of any other device, such as the
electrical computation part of a camera, a GPS device, and/or a
motor vehicle, etc. In an even broader sense, the term computer as
used here may refer to any electrical design that is capable of
running programs and processing data with a processor.
[0053] According to the methods, the processor may combine two or
more single-function buttons on the user interface of the
touchscreen device into one multifunction button and unfold the two
or more single-function buttons around the multifunction button
when a user touches the multifunction button. Because there may be
only one multifunction button, its location may be fixed, and all
buttons may be unfolded around the multifunction button at the
bottom of the touchscreen device, it is convenient for the user to
operate the touchscreen device.
[0054] In the embodiments of the present application, for
convenience of description, a touchscreen cellphone is adopted to
illustrate the present application. However, the embodiments of the
present invention may also be applied to other touchscreen devices
like tablet PCs.
[0055] FIG. 3 shows a flow diagram of embodiment 1 of the first
user interface interactive method for touchscreen devices of the
present application. The method may include using a processor to
execute sets of instructions saved in a non-transitory
computer-readable medium to conduct the following steps:
[0056] Step 301: combining two or more single-function buttons on a
user interface of the touchscreen device into one multifunction
button.
[0057] When implementing the embodiment of the present invention,
the processor may first combine several single-function buttons on
the user interface of the touchscreen device into one multifunction
button. For example, the processor may combine several
single-function buttons on the top or bottom of the touchscreen
device into one multifunction button. A default location of the
multifunction button may be preset at the central bottom location
of the touchscreen device. FIG. 4 is a schematic diagram showing
that a default location of the multifunction button 410 may be
preset at the central bottom of a touchscreen 420 of a device
400.
[0058] Step 302: monitoring a real-time touch operation on the
multifunction button conducted by the user.
[0059] In this step, the processor may monitor in real time whether
the user has conducted and/or is conducting a touch operation with
the multifunction button. If yes, the processor may start a timer
to detect the time of the touch operation on the multifunction
button conducted by the user.
[0060] Step 303: determining the type of the touch operation
according to the time of the touch operation.
[0061] In practical use, a user may long-press, short-press, or
double-click the multifunction button. Therefore, there may be
several types of touch operation in this step. For example, Type I
touch operation may be defined as a short press, Type II touch
operation may be defined as a long press, Type III touch operation
may be defined as a double click, Type IV touch operation may be
defined as yet another type of touch operation, and so on. Because
the common operations in practical use are mainly long press and
short press, the example embodiments in the present application
distinguish only two types of touch operation: Type I touch
operation and Type II touch operation. However, the spirit of the
example embodiments may also be applied to situations that include
more than two types of touch operation.
[0062] As shown in FIG. 5, conducting step 303 may require the
processor to conduct the following steps:
[0063] Step 501: judging whether the time of the touch operation
satisfies preset conditions of Type I touch operation. If it does,
the processor may proceed to step 502; if not, the processor may
proceed to step 503. The preset conditions may be set according to
whether Type I touch operation is defined as a long press or a
short press. For example, if Type I touch operation is defined as a
short press, the preset conditions may be set as a short-press
duration range.
[0064] Step 502: determining the type of the touch operation as the
Type I touch operation. If the time of the touch operation
satisfies the time requirements of Type I touch operation, the
processor may determine the type of the touch operation as Type I
touch operation.
[0065] Step 503: determining the type of the touch operation as the
Type II touch operation. If the time of the touch operation does
not satisfy the time requirements of Type I touch operation, the
processor may determine the type of the touch operation as Type II
touch operation. This example addresses only the case of two touch
operation types. In situations where there are more than two types
of touch operation, those skilled in the art may adjust and/or
amend the steps above accordingly in order to distinguish all types
of touch operation.
[0066] To help those skilled in the art understand the
implementation of the present application, illustrations are made
below for the situation in which Type I touch operation is a short
press and Type II touch operation is a long press. Accordingly,
step 302 may include the following steps as shown in FIG. 6:
[0067] Step 601: judging, by the processor, whether the time of the
touch operation is greater than a preset first threshold value. If
it is, the processor may proceed to step 602; if not, the processor
may proceed to step 603.
[0068] The preset first threshold value may be determined before
step 601 and serves as a critical point to distinguish a long press
touch from a short press touch. For example, the preset first
threshold value may be 0.8 second (0.8 s). In implementation, the
touch operation time may be obtained by monitoring a touchesBegan
event (i.e., the beginning of a touch operation) and a touchesEnded
event (i.e., the end of the touch operation) of the UIView (i.e.,
the user interface view). A timer may be started by the processor
when the touchesBegan event is triggered and stopped by the
processor when the touchesEnded event is triggered, and the elapsed
time on the timer is the touch operation time.
[0069] Step 602: determine the touch operation as long press. When
the touch operation time is longer than the preset first threshold
value (e.g., 0.8 s), the type of the touch operation by the user
may be determined as long press.
[0070] Step 603: determine the touch operation as short press. When
the touch operation time is shorter than the preset first threshold
value (e.g., 0.8 s), the type of the touch operation by the user
may be determined as short press.
[0071] It should be noted that when the touch operation time equals
the preset first threshold value (e.g., 0.8 s), the type of the
touch operation may be determined as either long press or short
press according to the actual setting, which may be configured by
the user.
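The duration-based classification of steps 601 through 603 can be sketched as follows. This is an illustrative outline only; the 0.8 s threshold, the function name, and the tie-breaking flag are assumptions for the example rather than elements of the claimed method.

```python
# Illustrative sketch of the duration-based classification in steps 601-603.
# The threshold value and the tie-breaking behavior are example assumptions.

LONG_PRESS_THRESHOLD_S = 0.8  # preset first threshold value (example)

def classify_touch(began_at: float, ended_at: float,
                   treat_equal_as_long: bool = True) -> str:
    """Classify a touch from its touchesBegan/touchesEnded timestamps (s)."""
    duration = ended_at - began_at
    if duration > LONG_PRESS_THRESHOLD_S:
        return "long press"
    if duration < LONG_PRESS_THRESHOLD_S:
        return "short press"
    # Duration exactly equal to the threshold: user-configurable behavior.
    return "long press" if treat_equal_as_long else "short press"

print(classify_touch(10.0, 10.3))   # short press (0.3 s)
print(classify_touch(10.0, 11.0))   # long press (1.0 s)
```

With more touch types (e.g., double click), the same structure extends to additional branches keyed on the count and spacing of touches rather than duration alone.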
[0072] Step 304: unfolding the two or more single-function buttons
one after another around the multifunction button according to the
type of the touch operation.
[0073] In this step, the processor may unfold the two or more
single-function buttons one after another around the multifunction
button according to the type of the touch operation. The manner of
unfolding may differ with the type of the touch operation. For example,
when the touch operation is Type I touch operation, the processor
may unfold the two or more single-function buttons one after
another in a user touch area around the multifunction button
directly; when the touch operation is Type II touch operation, the
processor may first adjust a docking location (i.e., an actual
location that a button is finally placed by the processor after a
user moves the button from its original location on the
touchscreen) of the multifunction button according to the movement
operation of the multifunction button by the user. Then the
processor may unfold the two or more single-function buttons one
after another in the user touch area around the docking location of
the multifunction button after it is moved.
[0074] It should be understood that, in practical use, when the
user short-presses the multifunction button (that is, when Type I
touch operation is a short press), the processor may unfold the
combined single-function buttons one after another around the
multifunction button into an arc. When the user long-presses the
multifunction button (that is, when Type II touch operation is a
long press), the processor may monitor whether the multifunction
button is moved by the user from its original location, place the
multifunction button at a docking location after it is moved by the
user, and unfold the combined single-function buttons one after
another around the docking location of the multifunction button.
These actions may be conducted by the processor simultaneously or
one after another. The situation of the user long-pressing the
multifunction button will be described in detail in embodiment 2
and embodiment 3.
[0075] It should be noted that, according to the user's usage
habits or aesthetic preferences, when the multifunction button is
short-pressed, the user may have the processor unfold the two or
more single-function buttons evenly into an arc of a preset radius
around the multifunction button, wherein the distances between any
two neighboring single-function buttons of the two or more
single-function buttons are equal and the distances from each of
the two or more single-function buttons to the multifunction button
are equal.
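The even arc distribution described above can be sketched with elementary trigonometry; the function name, radius, and angular span below are illustrative assumptions. Equal angular steps on a circle yield equal distances between neighboring buttons and an equal distance from every button to the center, as the paragraph requires.

```python
import math

def unfold_positions(center, radius, count, start_deg=0.0, span_deg=180.0):
    """Place `count` single-function buttons evenly on an arc around `center`.

    Equal angular steps give equal chord distances between neighboring
    buttons, and every button lies at the same `radius` from the
    multifunction button at `center`.
    """
    cx, cy = center
    if count == 1:
        angles = [start_deg + span_deg / 2.0]
    else:
        step = span_deg / (count - 1)
        angles = [start_deg + i * step for i in range(count)]
    return [(cx + radius * math.cos(math.radians(a)),
             cy + radius * math.sin(math.radians(a))) for a in angles]

# Five buttons on a half circle above a bottom-center multifunction button:
for x, y in unfold_positions((160.0, 0.0), 100.0, 5):
    print(round(x, 1), round(y, 1))
```

Other distributions (vertical or horizontal rows, free placement) only require replacing the angle list with a different sequence of target points.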
[0076] FIG. 7 is a schematic diagram of the interface when several
single-function buttons are unfolded in the user interface, wherein
FIG. 7 illustrates 5 single-function buttons, and the distances of
any two neighboring single-function buttons 401 are equal and the
distances from any of single-function buttons 401 to the
multifunction button 410 are equal. It should be noted that FIG. 7
is just an example embodiment of the even unfolding of
single-function buttons. The single-function buttons may also be
unfolded at will around the multifunction button without limiting
the distribution of the single-function buttons as long as the user
can see these single-function buttons.
[0077] In this example embodiment, the unfolding diameter of the
single-function buttons may be 1/2 to 2/3 of the diameter of the
multifunction button, while the user may decide the sizes of the
single-function buttons and the multifunction button. This example
embodiment is just an illustration. In addition, the number of the
single-function buttons, and whether the single-function buttons
are unfolded with a background or text labels, may be defined by
the user. It should be noted that when
the number of unfolded single-function buttons is more than a
maximum number that is allowed to be displayed on the screen of the
touchscreen device, the number of unfolded single-function buttons
may be set according to the current location of the multifunction
button. For example, when the multifunction button is on the left
or right side, the number of single-function buttons may be no more
than 6. When the number of single-function buttons exceeds 6, only
5 of the single-function buttons may be displayed and the last
button (i.e., the 6th button displayed) may be a "Show More"
button. When the user touches the "Show More" button, an action
sheet and/or a menu may pop up for the user to choose from more
actions provided thereon. When the multifunction button is on the
bottom middle of the screen, the number of the buttons may be no
more than 11 and when the number of single-function buttons exceeds
11, only 10 single-function buttons are displayed. The last button
(i.e., the 10th button displayed) may be a "Show More" button. When
the user touches the "Show More" button, an action sheet and/or a
menu may pop up for the user to choose from more actions provided
thereon.
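The "Show More" capping rule described above can be sketched as follows. The function and dock-location names are illustrative assumptions; the caps of 6 (side docks) and 11 (bottom middle) follow the example values in the text.

```python
def visible_buttons(buttons, dock):
    """Cap the number of unfolded single-function buttons by dock location.

    Side docks show at most 6 buttons; a bottom-middle dock shows at most
    11. When the list exceeds the cap, the last visible slot becomes a
    "Show More" button that opens an action sheet or menu.
    """
    cap = 6 if dock in ("left", "right") else 11
    if len(buttons) <= cap:
        return list(buttons)
    return list(buttons[:cap - 1]) + ["Show More"]

print(visible_buttons(list("ABCDEFG"), "left"))
# ['A', 'B', 'C', 'D', 'E', 'Show More']
```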
[0078] It should be understood that, in practical use, the user may
also set the multifunction button to be semitransparent, which does
not affect the screen display of the touchscreen device and still
achieves the effect of the example embodiment of the present
application.
[0079] In the example embodiment, after several single-function
buttons are combined into one multifunction button with the method
above, users no longer need to consider the location of every
single-function button during operation; instead, they only need to
operate the multifunction button. In addition, as the combined
single-function buttons unfold around the multifunction button,
users may also control where on the screen the single-function
buttons they need appear by moving the multifunction button, which
makes it convenient for users to use the single-function buttons on
the touchscreen. Furthermore, this convenient operation does not
increase the number of operations on the touchscreen and thus may
reduce wear on the touchscreen of the touchscreen devices from
users' operations.
[0080] FIG. 8 illustrates a flow chart of an embodiment 2 of the
first user interface interactive method for touchscreen devices of
the present invention. In this embodiment, the touch operation type
may include Type I and Type II touch operation, wherein Type II
touch operation is illustrated as long press and Type I touch
operation is illustrated as short press. In this example embodiment
(i.e., embodiment 2), the processor may execute sets of
instructions stored in the non-transitory computer-readable medium
to conduct the following steps:
[0081] Step 801: combining two or more single-function buttons on
the user interface of the touchscreen device into one multifunction
button.
[0082] Step 802: monitoring a real-time touch operation conducted
by a user on the multifunction button.
[0083] The implementation of step 801 and step 802 is similar to
step 301 and step 302, respectively, in embodiment 1, as shown in
FIG. 3.
[0084] Step 803: judging whether the time of the touch operation is
longer than a preset first threshold value. If it is, the processor
may determine that the touch operation type is long press.
[0085] Step 804: determining the type of the touch operation as
long press.
[0086] After determining the type of the touch operation is long
press, the processor may adjust the docking location of the
multifunction button according to the movement operation of the
multifunction button by the user, wherein step 805 to step 807
illustrate how to adjust the docking location of the multifunction
button according to the movement operation of the multifunction
button by the user.
[0087] Step 805: monitoring whether the user has moved the
multifunction button to a new location different from its original
location. If so, the processor may proceed to step 806.
[0088] In this step, the processor may also monitor whether the
user moves the multifunction button while long-pressing it. If so,
the location of the multifunction button changes.
[0089] Step 806: when the user stops the movement operation,
obtaining the moved location of the multifunction button after it
is moved.
[0090] In this step, the processor may obtain the current moved
location of the multifunction button when the user stops the
movement operation. Because the user may move the multifunction
button at will on the whole screen of the touchscreen device, the
processor may monitor the current location of the multifunction
button after the user has stopped the movement operation.
[0091] Step 807: when the user stops the touch operation,
determining the docking location of the multifunction button
according to the area of the moved location on the user
interface.
[0092] If the user stops touching the multifunction button after
stopping its movement, the processor may determine the docking
location of the multifunction button according to its moved
location as determined in step 806. For example, when the user
stops dragging the multifunction button, the button may or may not
stop exactly at a docking location. Thus the processor may
determine whether the multifunction button should dock on the left,
middle, or right side of the user interface according to the area
of the user interface in which its moved location falls.
[0093] Step 808: unfolding the two or more single-function buttons
one after another around the docking location of the multifunction
button.
[0094] After the docking location of the multifunction button is
determined, the two or more single-function buttons may be unfolded
one after another around the docking location of the multifunction
button. For example, the processor may unfold evenly the two or
more single-function buttons into an arc according to the preset
radius around the multifunction button, wherein the distances of
any two neighboring single-function buttons of the two or more
single-function buttons may be equal and the distances from any of
the two or more single-function buttons to the multifunction button
may be equal.
[0095] FIG. 9 is a schematic diagram showing unfolding
single-function buttons when the docking location of the
multifunction button is on the left side of the user interface.
FIG. 10 is a schematic diagram showing unfolding single-function
buttons when the docking location of the multifunction button is on
the right side of the user interface. It can be seen from FIGS.
9-10 that after unfolding the single-function buttons, the
single-function buttons are distributed evenly on an arc with the
multifunction button as the center of the arc. The diameter of the
arc is 1/2 of the screen height of the touchscreen device. When the
multifunction button is on the left or right bottom of the screen,
the single-function buttons form a 90-degree arc, and when the
multifunction button is in the middle of the screen, the
single-function buttons form a 180-degree arc. It should be noted
that the diameter of the arc formed by unfolding single-function
buttons may also be 1/3 of the screen height of the touchscreen
device, or other numerical value, or a diameter preset by the
user.
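The arc geometry just described can be sketched as a small lookup. The start angles and function name are illustrative assumptions; the half-screen-height diameter, the 90-degree side arcs, and the 180-degree middle arc follow the example above.

```python
def arc_parameters(dock, screen_height):
    """Arc radius and angular span for unfolding, by docking location.

    The arc's diameter is 1/2 of the screen height (so its radius is
    1/4); side docks use a quarter circle and a middle dock uses a half
    circle, per the example in the text.
    """
    radius = screen_height / 4.0          # diameter = screen_height / 2
    if dock == "left":
        return radius, 0.0, 90.0          # quarter circle opening rightward
    if dock == "right":
        return radius, 90.0, 180.0        # quarter circle opening leftward
    return radius, 0.0, 180.0             # half circle above a middle dock

print(arc_parameters("middle", 800.0))    # (200.0, 0.0, 180.0)
```

A 1/3-screen-height diameter, or any user-preset value, only changes the `radius` line.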
[0096] It should be noted that, the multifunction button may be at
any location on the screen of the touchscreen device and the
single-function buttons may be unfolded in various ways such as
being unfolded vertically or horizontally. The schematic diagrams
in the example embodiment of the present application are only
examples and should not be understood as all of the implementation
models of the embodiment of the present application.
[0097] In this embodiment, the user can move the multifunction
button to the left, middle, or right bottom of the screen by
long-pressing it, and thus unfold the single-function buttons
within a range convenient for operation, which not only increases
the utilization rate of the screen of the touchscreen device but
also enhances the user experience.
[0098] FIG. 11 shows a flow diagram of an embodiment 3 of the first
user interface interactive method for touchscreen devices of the
present application. In this example embodiment, touch operation
type may include Type I touch operation and Type II touch
operation, wherein Type II touch operation may be illustrated as
long press and Type I touch operation may be illustrated as short
press. The embodiment may require the processor to execute sets of
instructions stored in the non-transitory computer-readable medium
to conduct the following steps:
[0099] Step 1101: combining two or more single-function buttons on
the user interface of the touchscreen device into one multifunction
button.
[0100] Step 1102: dividing a movable area (i.e., an area of the
touchscreen within which the multifunction button may be moved) of
the multifunction button on the user interface into three equal
subareas, wherein the movable area may be a horizontal area located
at the bottom of the touchscreen of the device. The movable area
may have a height equal to a preset second threshold value.
[0101] The second threshold value may be modified according to the
size of the multifunction button. The movable area is generally
fixed at the bottom of the touchscreen of the touchscreen device,
for example, within the area with a height of 44 pixels above the
bottom of the screen and a width equal to the screen's bottom
horizontal line; this area is called the movable area of the
multifunction button. The processor may divide the movable area
into three equal subareas A (left side), B (middle), and C (right
side).
[0102] Step 1103: determining the center coordinates of the three
subareas.
[0103] The processor may set the center coordinates of the three
subareas as the final docking locations of the multifunction
function button in the three subareas--a (center coordinate of
subarea A), b (center coordinate of subarea B) and c (center
coordinate of subarea C).
[0104] Step 1104: monitoring the real-time touch operation of the
multifunction button by the user.
[0105] Step 1105: judging whether the time of the touch operation
is greater than the preset first threshold value.
[0106] Step 1106: determining the touch operation as long
press.
[0107] Step 1107: monitoring whether the user has moved the
multifunction button. If so, the processor may proceed to step
1108.
[0108] Step 1108: when the user stops the movement operation,
detecting whether the multifunction button is out of the movable
area in the vertical direction during the movement. If it is, the
processor may proceed to step 1109.
[0109] Because the user may move the multifunction button at will,
the processor may detect whether the multifunction button is out of
the movable area in the vertical direction after the user stops
moving the multifunction button. If yes, the processor may
automatically correct the location of the multifunction button
later.
[0110] Step 1109: correcting the vertical coordinate of the moved
location of the multifunction button to within the second threshold
value of the movable area, while keeping the horizontal coordinate
of the moved location unchanged.
[0111] Because the location of the multifunction button will not
just stop at a, b or c when the user stops dragging the
multifunction button, the processor may check which area of A, B or
C the coordinate of the multifunction button is in, and its final
docking point may be determined as the center coordinate point of
the current area. When the coordinate of the multifunction button
is not out of the movable area, the center coordinate of the
multifunction button may be the coordinate of the multifunction
button in its current move location; when the coordinate of the
multifunction button is out of the movable area, the X value (i.e.,
the value of the horizontal coordinate) in the horizontal direction
of the center coordinate of the multifunction button may be kept
the same as the X value of the current move location, but the Y
value (i.e., the value of the vertical coordinate) in the vertical
direction may be the Y value that the move location of the
multifunction button maps on the upper boundary of the movable
area. Accordingly, the processor may automatically correct the
deviation when the multifunction button exceeds its movable range
as the user long-presses and moves it. This means that, when the
multifunction button moves along the bottom of the touchscreen
device, the move direction depends on the move direction of the
multifunction button along the horizontal line, while the move
distance depends on the straight-line distance moved by the
multifunction button in the horizontal direction.
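The correction and snapping behavior of steps 1108 through 1110 can be sketched as follows. The coordinate convention (origin at the screen's bottom-left, y increasing upward) and the function name are illustrative assumptions; the 44-pixel strip height follows the example in step 1102.

```python
def dock_multifunction_button(x, y, screen_width, area_height=44.0):
    """Snap a dragged multifunction button to its docking point.

    The movable area is a strip `area_height` pixels tall along the
    screen bottom, split into three equal subareas A, B, C. The drop
    point keeps its horizontal coordinate, its vertical coordinate is
    clamped into the strip (step 1109), and the final dock is the center
    coordinate of the subarea containing it (step 1110).
    """
    # Step 1109: clamp the vertical coordinate into the movable strip,
    # keeping the horizontal coordinate unchanged.
    corrected = (x, min(max(y, 0.0), area_height))
    # Step 1110: identify subarea A, B, or C from the horizontal
    # coordinate and dock at that subarea's center point a, b, or c.
    third = screen_width / 3.0
    index = min(max(int(corrected[0] // third), 0), 2)
    return ((index + 0.5) * third, area_height / 2.0)

# A drop high above the strip on the left third docks at point a:
print(dock_multifunction_button(30.0, 120.0, 320.0))
```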
[0112] It should be noted that the horizontal and vertical
directions specified in the embodiment of the present invention are
directions relative to the bottom horizontal line when the
touchscreen device is held upright.
[0113] Step 1110: determining the docking location of the
multifunction button as the center coordinate point of the current
subarea according to the current subarea where the move location is
located.
[0114] For example, if the current subarea where the move location
of the multifunction button is located is subarea A, the docking
location of the multifunction button may be determined as point
a.
[0115] Step 1111: unfolding the two or more single-function buttons
one after another around the docking location of the multifunction
button.
[0116] It should be noted that, when unfolding the single-function
buttons in this step, three corresponding animations may be set
according to the docking location of the multifunction button and
each may include the movement of the locations of the
single-function buttons and the rotation of the angle of the
single-function buttons as well as the dissolving effect of the
single-function buttons. Each single-function button may correspond
to an animation path from its starting location to its ending
location, and the animation path may be accompanied by the spinning
effect of the single-function button. A simple animation may also
be used to present the process that the multifunction button moves
from the move location to the docking location. The duration may be
limited to 0.2 s. It should be understood that, while the animation
is playing, the multifunction button may not respond to touch
operations conducted by the user and may respond again only after
the animation has ended.
[0117] It should be noted that each unfolded single-function button
may have a starting location and an ending location that vary with
the position of the multifunction button. The starting location of
each single-function button may be fixed at the center coordinate
of the multifunction button, while the ending location follows the
arc arrangement specified above.
The animation of the single-function buttons may be the combination
of the animation of each single-function button from its starting
location to its ending location. The animation mainly may include
two parts: one is the movement of the location and the other is the
spinning of the single-function button itself. The time of the
animation may be set as being equally split, i.e., the content of
the animation may be equally distributed over the time. For
example, the spinning starts when the single-function button leaves
its starting location and ends when it arrives at the ending point.
The animation of the single-function button may also be set as an
animation style that supports customized definition, in which the
user only needs to define an AnimationGroup object in iOS.
[0118] It should be understood that the play time of the animation
of the single-function buttons may be set as 0.5 s, and the
interval between the animation starting times of successive
single-function buttons may be 0.1 s, which may ensure that the
complete animation of all single-function buttons ends within 1 s.
It thus does not detract from the user experience and remains
convenient for user operation.
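The staggered timing just described can be sketched as a simple schedule. The function name and scheduling model are illustrative assumptions; the 0.5 s play time and 0.1 s stagger follow the example values above.

```python
def animation_schedule(count, duration=0.5, stagger=0.1):
    """Start/end times for each single-function button's unfold animation.

    Each button's animation plays for `duration` seconds, and successive
    buttons start `stagger` seconds apart; with the example values, the
    animations of up to six buttons all finish within one second.
    """
    return [(i * stagger, i * stagger + duration) for i in range(count)]

schedule = animation_schedule(5)
print(schedule[0], schedule[-1])  # last button starts at ~0.4 s, ends at ~0.9 s
```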
[0119] The numerical values in the embodiment of the present
invention are examples made for easy understanding by those skilled
in the art. Those skilled in the art may choose other numerical
values without any creative effort.
[0120] A second user interface interactive method for touchscreen
devices may also be provided in the embodiment of the present
application, as shown in FIG. 12. This embodiment may require the
processor to execute sets of instructions stored in the
non-transitory computer-readable medium to conduct the following
steps:
[0121] Step 1201: combining two or more single-function buttons on
the user interface of the touchscreen device into one multifunction
button, and monitoring the real-time touch operation on the
multifunction button by the user.
[0122] Step 1202: determining the type of the touch operation
according to the time of the touch operation.
[0123] In this embodiment, for the implementation of step 1201 and
step 1202, reference may be made to embodiment 1, embodiment 2, and
embodiment 3 of the first user interface interactive method for
touchscreen devices as described above.
[0124] Step 1203: judging whether the type of the touch operation
satisfies the preset conditions to move the multifunction button.
If it does, the processor may proceed to step 1204; if not, the
processor may proceed to step 1205.
[0125] In this step, the difference from the first user interface
interactive method for touchscreen devices is judging whether the
type of the touch operation satisfies the preset conditions to move
the multifunction button. For example, when the type of the touch
operation is long press, the preset conditions to move the
multifunction button are deemed satisfied. However, those skilled
in the art and/or a user may change the preset conditions according
to the actual scenario. For example, it may be set such that the
preset conditions to move the multifunction button are deemed
satisfied when the user double-clicks the multifunction button.
[0126] Step 1204: determining the docking location of the
multifunction button according to the movement operation of the
multifunction button by the user.
[0127] If the preset conditions to move the multifunction button
are satisfied, the processor may determine the final docking
location of the multifunction button according to the movement
operation of the multifunction button by the user; for the solution
on how to determine the docking location, reference may be made to
embodiment 2 and embodiment 3 of the first user interface
interactive method for touchscreen devices, as set forth above.
[0128] Step 1205: unfolding the two or more single-function buttons
one after another around the multifunction button.
[0129] If the preset conditions to move the multifunction button
are not satisfied, the processor may unfold the two or more
single-function buttons around the multifunction button directly.
It should be noted that, for the implementation of the steps in
this embodiment, reference may be made to embodiment 1, embodiment
2, and embodiment 3 of the first user interface interactive method
for touchscreen devices, so details are not repeated here.
[0130] Through the description of the embodiments above, those
skilled in the art may understand that the present application may
be embodied in the form of a software and/or hardware product. The
computer software product may be stored in a computer-readable
storage medium including a number of instructions that cause
computer equipment (which may be a PC, a server, or network
equipment, and so on) to implement all or part of the steps of the
methods in the embodiments of the present application. The storage
medium may include, but is not limited to, a ROM, a RAM, a magnetic
disk, an optical disk, or another medium that can store program
code.
[0131] Corresponding to the method embodiment above, the embodiment
of the present application also provides a first user interface
interactive device for touchscreen devices. FIG. 13 shows a
structural schematic diagram of an embodiment 1 of the first user
interface interactive device for touchscreen devices, which may
comprise:
[0132] a combination module 1301 being configured to combine two or
more single-function buttons on the user interface of the
touchscreen device into one multifunction button;
[0133] a monitoring module 1302 being configured to monitor the
real-time touch operation of the multifunction button by the user;
and
[0134] a first determination module 1303 being configured to
determine the type of the touch operation according to the time of
the touch operation;
[0135] wherein the type of the touch operation may include the Type
I touch operation and the Type II touch operation. As shown in FIG.
14, the first determination module 1303 may comprise:
[0136] a first judgment submodule 1401 being configured to judge
whether the time of the touch operation satisfies the preset
conditions of Type I touch operation;
[0137] The Type I touch operation may be a short press and the Type
II touch operation may be a long press. The first judgment submodule
may be configured to:
[0138] judge whether the time of the touch operation is shorter than
a preset first threshold value;
[0139] a first determination submodule 1402, which may be
configured to determine the type of the touch operation as the Type
I touch operation when the result of the judgment submodule is
yes;
[0140] a second determination submodule 1403, which may be
configured to determine the type of the touch operation as the Type
II touch operation when the result of the judgment submodule is
no;
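The time-based judgment performed by submodules 1401 through 1403 may be sketched as follows (the 500 ms default is an illustrative assumption; the text requires only a preset first threshold value):

```python
def classify_touch(duration_ms, first_threshold_ms=500):
    """Judge the touch time against the preset first threshold: a
    duration under the threshold is a Type I touch operation (short
    press); otherwise it is a Type II touch operation (long press).
    The 500 ms threshold is an assumption for illustration."""
    return "Type I" if duration_ms < first_threshold_ms else "Type II"
```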
[0141] an interaction module 1304, which may be configured to
unfold the two or more single-function buttons one after another
around the multifunction button according to the type of the touch
operation.
[0142] As shown in FIG. 15, the interaction module 1304 may
comprise a first interaction submodule 1501, an adjustment
submodule 1502, and a second interaction submodule 1503, wherein
when the touch operation is Type I touch operation, the first
interaction submodule 1501 may be configured to unfold the two or
more single-function buttons one after another in the user touch
area around the multifunction button;
[0143] The first interaction submodule 1501 may be configured to
unfold the two or more single-function buttons evenly into an arc
with a preset radius around the multifunction button, wherein the
distance between any two neighboring single-function buttons of the
two or more single-function buttons may be equal and the distance
from any of the two or more single-function buttons to the
multifunction button may be equal;
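The even arc layout described in this paragraph may be sketched with elementary trigonometry (a sketch under assumed angles; the text fixes only the preset radius and the equal spacing):

```python
import math

def unfold_into_arc(center, count, radius, start_deg=150.0, end_deg=30.0):
    """Place `count` single-function buttons evenly on an arc of the
    preset `radius` around the multifunction button at `center`.
    The upward-facing 150-to-30 degree span is an assumption made
    for illustration."""
    cx, cy = center
    if count == 1:
        angles = [(start_deg + end_deg) / 2.0]
    else:
        step = (end_deg - start_deg) / (count - 1)
        angles = [start_deg + i * step for i in range(count)]
    # Screen coordinates grow downward, so subtract the sine term.
    return [(cx + radius * math.cos(math.radians(a)),
             cy - radius * math.sin(math.radians(a))) for a in angles]
```

Because every button sits at the same radius from the center and neighboring buttons are separated by the same angular step, both equal-distance properties stated above hold by construction.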
[0144] When the touch operation is Type II touch operation, the
adjustment submodule 1502 may be configured to adjust the docking
location of the multifunction button according to the movement
operation of the multifunction button by the user; and
[0145] The second interaction submodule 1503 may be configured to
unfold the two or more single-function buttons one after another in
the user touch area around the docking location of the
multifunction button after it is moved.
[0146] In this embodiment, when operating the touchscreen device,
users no longer need to consider the location of every
single-function button, as in the current technology; instead, they
only need to operate the multifunction button. In addition, as the
combined single-function buttons unfold around the multifunction
button, users may also control where on the screen the
single-function buttons they need appear by moving the
multifunction button, which makes it convenient for users to use
the single-function buttons on the touchscreen. Furthermore, being
convenient, this operation does not increase the number of
operations on the touchscreen and thus reduces the wear on the
touchscreen of the touchscreen device from users' operations.
[0147] FIG. 16 shows a structural schematic diagram of an
embodiment 2 of the first user interface interactive device for
touchscreen devices. The touch operation type may include Type I
touch operation and Type II touch operation, wherein Type II touch
operation may be illustrated as long press. The device in the
embodiment may comprise:
[0148] a combination module 1301, which may be configured to
combine two or more single-function buttons on the interface of the
touchscreen device into one multifunction button;
[0149] a monitoring module 1302, which may be configured to monitor
the real-time touch operation of the multifunction button by the
user;
[0150] a first determination module 1303, which may be configured
to determine the type of the touch operation according to the time
of the touch operation.
[0151] When the touch operation is Type II touch operation, the
adjustment submodule 1502 may comprise:
[0152] a monitoring submodule 1601, which may be configured to
monitor whether the user has moved the multifunction button;
[0153] an obtaining submodule 1602, which may be configured to
obtain the location of the multifunction button after it is moved,
when the result of the monitoring submodule 1601 is yes and the user
stops the movement operation;
[0154] a third determination submodule 1603, which may be
configured to determine the docking location of the multifunction
button according to the area of the move location on the user
interface when the user stops the touch operation.
[0155] The second interaction submodule 1503 may be configured to
unfold the two or more single-function buttons evenly into an arc
with a preset radius in the user touch area around the docking
location of the multifunction button, wherein the distance between
any two neighboring single-function buttons of the two or more
single-function buttons may be equal and the distance from any of
the two or more single-function buttons to the multifunction button
may be equal.
[0156] In this embodiment, the user may move the multifunction
button to the bottom left, bottom middle, or bottom right of the
screen by long-pressing the multifunction button, so that the user
may unfold the single-function buttons within a range convenient
for operation, which not only increases the utilization rate of the
screen of the touchscreen device but also enhances the user
experience.
[0157] FIG. 17 shows a structural schematic diagram of an
embodiment 3 of the first user interface interactive device for
touchscreen devices. The touch operation type may include Type I
touch operation and Type II touch operation, wherein Type II touch
operation may be illustrated as long press. The device in the
embodiment may comprise:
[0158] a combination module 1301, which may be configured to
combine two or more single-function buttons on the user interface
of the touchscreen device into one multifunction button;
[0159] a trisection module 1701, which may be configured to divide
the movable area of the multifunction button on the user interface
into three equal subareas, wherein the movable area may be a
horizontal area located at the bottom of the touchscreen device and
its height may equal the preset second threshold value;
[0160] a second determination module 1702, which may be configured
to determine the center coordinate points of the three subareas;
[0161] a monitoring module 1302, which may be configured to monitor
the real-time touch operation of the multifunction button by the
user;
[0162] a monitoring submodule 1601, which may be configured to
monitor whether the user has moved the multifunction button;
[0163] when the result of the monitoring submodule 1601 is yes, in
this embodiment, the obtaining submodule 1602 may comprise:
[0164] a detection submodule 1703, which may be configured to
detect whether the current multifunction button is out of the
movable area in the vertical direction during its movement;
[0165] a correction submodule 1704, which may be configured to
correct the vertical coordinate of the move location of the
multifunction button, after it is moved, to the second threshold
value of the movable area, while keeping the horizontal coordinate
of the move location the same as that of the location after the
movement;
[0166] a third determination submodule 1603, which may be
configured to determine the docking location of the multifunction
button as the center coordinate point of the current subarea
according to the current subarea where the move location is
located; and
[0167] an interaction module 1304, which may be configured to
unfold the two or more single-function buttons one after another
around the docking location of the multifunction button according
to the type of the touch operation.
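One concrete reading of the correction and docking logic of this embodiment may be sketched as follows (the bottom-left coordinate origin, the screen width, and all numeric values are assumptions; the text specifies only the three equal subareas, their center coordinate points, and the second threshold value as the height of the movable strip):

```python
def correct_move_location(move_x, move_y, second_threshold):
    """Detection/correction submodules 1703 and 1704: if the button has
    left the movable strip vertically, pull its vertical coordinate back
    to the strip height (the second threshold) and keep the horizontal
    coordinate unchanged. Origin assumed at the bottom-left corner."""
    if move_y > second_threshold:          # out of the movable area?
        move_y = second_threshold          # correct to the strip height
    return move_x, move_y

def docking_location(move_x, screen_width, second_threshold):
    """Third determination submodule 1603: dock the button at the center
    coordinate point of the subarea (left, middle, or right third of the
    strip) in which the corrected horizontal coordinate lies."""
    third = screen_width / 3.0
    subarea = min(int(move_x // third), 2)  # clamp to subarea 0..2
    return (subarea * third + third / 2.0, second_threshold / 2.0)
```

For instance, on an assumed 1080-pixel-wide screen with a 160-pixel strip, releasing the button at x = 500 docks it at the center of the middle subarea, (540, 80).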
[0168] FIG. 18 shows a structural schematic diagram of an
embodiment of the second user interface interactive device for
touchscreen devices which may comprise:
[0169] a combination module 1301, which may be configured to
combine two or more single-function buttons on the user interface
of the touchscreen device into one multifunction button;
[0170] a monitoring module 1302, which may be configured to monitor
the real-time touch operation of the multifunction button by the
user;
[0171] a first determination module 1303, which may be configured
to determine the type of the touch operation according to the time
of the touch operation;
[0172] a judgment module 1801, which may be configured to judge
whether the type of the touch operation satisfies the preset
conditions to move the multifunction button;
[0173] a second determination module 1802, which may be configured
to determine the docking location of the multifunction button
according to the user's movement operation of the multifunction
button when the result of the judgment module is yes; and
[0174] an unfolding module 1803, which may be configured to unfold
one after another the two or more single-function buttons around
the multifunction button when the result of the judgment module is
no.
[0175] It should be noted that the modules set forth above may be
hardware structures in the touchscreen devices of the present
application. The modules may also be hardware modules embedded in
the processor. The modules may also be instructions stored in a
non-transitory computer-readable storage medium and executed by the
processor.
[0176] As the device embodiments basically correspond to the method
embodiments, please refer to the description of the method
embodiments for the relevant steps. The device embodiments
described above are merely schematic, wherein the units illustrated
as separate parts may or may not be physically separated, and the
components displayed as units may or may not be physical units;
that is, they may be located in the same place or distributed over
several network elements. Part or all of the modules may be
selected to achieve the goal of the embodiment according to actual
demand. Those skilled in the art may understand and implement the
embodiments without any creative labor.
[0177] A touchscreen device is also disclosed in the embodiments of
the present invention. The touchscreen device may include any one
of the user interface interactive devices disclosed above.
[0178] It should be understood that the present invention can be
used in many general-purpose or special-purpose computer system
environments or configurations, such as PCs, server computers,
handheld or portable devices, tablet devices, multiprocessor
systems, microprocessor-based systems, set-top boxes, programmable
consumer electronics, network PCs, minicomputers, mainframe
computers, and distributed computing environments that include any
of the above systems or equipment.
[0179] The present invention can be described in the general
context of computer-executable instructions, such as program
modules. Generally, program modules include routines, programs,
objects, components, data structures, and so on that execute
specific tasks or implement specific abstract data types. The
present invention can also be practiced in distributed computing
environments, in which tasks are executed by remote processing
equipment connected through a communication network. In distributed
computing environments, program modules can be located in both
local and remote computer storage media, including storage devices.
[0180] It should be noted that relational terms such as first and
second, as used herein, are only used to distinguish one entity or
operation from another and do not necessarily require or imply any
actual relation or order between those entities or operations.
Moreover, the terms "include," "comprise," or any other variant
thereof denote non-exclusive inclusion, so a process, method,
object, or piece of equipment comprising a series of elements
includes not only those elements but also other elements that are
not expressly listed, or elements inherent to such a process,
method, object, or equipment. Without more restrictions, an element
limited by the phrase "including one . . . " does not rule out the
possibility that other identical elements exist in the process,
method, object, or equipment that includes the element.
[0181] The invention has been described in terms of various
specific embodiments. It should be pointed out that those skilled
in the art may make various changes and modifications to those
embodiments without deviating from the principles of the present
invention, and those changes and modifications should fall within
the scope of protection of the present invention.
* * * * *