U.S. patent application number 13/785370 was filed with the patent office on 2013-03-05 for a system and method for linking and controlling terminals.
This patent application is currently assigned to INDUSTRY-UNIVERSITY COOPERATION FOUNDATION HANYANG UNIVERSITY. The applicant listed for this patent is INDUSTRY-UNIVERSITY COOPERATION FOUNDATION HANYANG UNIVERSITY. Invention is credited to Chang-sik YOO.
United States Patent Application 20130234959
Kind Code: A1
Inventor: YOO, Chang-sik
Application Number: 13/785370
Family ID: 49113653
Publication Date: September 12, 2013
SYSTEM AND METHOD FOR LINKING AND CONTROLLING TERMINALS
Abstract
A system for linking and controlling terminals is disclosed. A
user terminal in the system includes a decoding unit configured to
decode image data received from a receiving terminal; a touch display unit
configured to display the decoded image; and an information
transmitting unit configured to transmit selection information for
an event-executing entity included in the decoded image to the
receiving terminal.
Inventors: YOO, Chang-sik (Seoul, KR)
Applicant: INDUSTRY-UNIVERSITY COOPERATION FOUNDATION HANYANG UNIVERSITY, Seoul, KR
Assignee: INDUSTRY-UNIVERSITY COOPERATION FOUNDATION HANYANG UNIVERSITY, Seoul, KR
Family ID: 49113653
Appl. No.: 13/785370
Filed: March 5, 2013
Current U.S. Class: 345/173
Current CPC Class: G06F 3/0416 (20130101); G06F 2203/0384 (20130101); G06F 3/0412 (20130101); H04N 21/42209 (20130101); H04N 21/41265 (20200801); G06F 3/0488 (20130101); G06F 9/452 (20180201); H04N 21/4222 (20130101); H04N 21/42204 (20130101); H04N 21/42224 (20130101); G06F 3/038 (20130101); G06F 3/1423 (20130101); G06F 3/16 (20130101)
Class at Publication: 345/173
International Class: G06F 3/0488 (20060101) G06F003/0488
Foreign Application Data

Date         | Code | Application Number
Mar 6, 2012  | KR   | 10-2012-0022984
Mar 6, 2012  | KR   | 10-2012-0022986
Mar 6, 2012  | KR   | 10-2012-0022988
Mar 6, 2012  | KR   | 10-2012-0023012
Mar 8, 2012  | KR   | 10-2012-0024073
Mar 8, 2012  | KR   | 10-2012-0024092
Mar 30, 2012 | KR   | 10-2012-0032982
Mar 30, 2012 | KR   | 10-2012-0033047
Apr 25, 2012 | KR   | 10-2012-0043148
May 31, 2012 | KR   | 10-2012-0057996
May 31, 2012 | KR   | 10-2012-0057998
May 31, 2012 | KR   | 10-2012-0058000
Claims
1. A user terminal comprising: a decoding unit configured to decode
image data received from a receiving terminal; a touch display unit
configured to display the decoded image; and an information
transmitting unit configured to transmit selection information for
an event-executing entity included in the decoded image to the
receiving terminal.
2. The user terminal of claim 1, wherein the selection information
includes position information corresponding to a touch input for
selecting the event-executing entity.
3. The user terminal of claim 1, wherein the selection information
includes control information for executing the event-executing
entity selected in accordance with a touch input.
4. The user terminal of claim 1, wherein the image data is image
data for an entirety of or a portion of an image displayed on the
receiving terminal.
5. The user terminal of claim 1, further comprising: an image
generating unit configured to generate a position image for a touch
means positioned within a preset distance from the touch display
unit, wherein the touch display unit displays the position
image.
6. The user terminal of claim 5, wherein the image generating unit
generates a position image for a touch means touching the touch
display unit with a pressure or an area smaller than or equal to a
preset pressure level or area level.
7. The user terminal of claim 5, further comprising: a mode changer
unit configured to change a touch mode according to a user's input,
wherein the touch mode includes: a first touch mode in which the
position image is displayed; and a second touch mode in which the
position image is not displayed.
8. The user terminal of claim 1, wherein a position image
indicating a position of a touch means has a different shape
according to whether or not the touch means is positioned at the
event-executing entity or at a preset position.
9. A receiving terminal comprising: an information transmitting
unit configured to transmit an entirety of or a portion of an image
to a user terminal; and an information receiving unit configured to
receive selection information for an event-executing entity
included in the image from the user terminal.
10. The receiving terminal of claim 9, further comprising: an
operation executing unit configured to execute an operation
corresponding to the received selection information.
11. A terminal linkage method comprising: sharing an image with a
receiving terminal; receiving input in a form of selection
information for an event-executing entity included in the image;
and transmitting the selection information to the receiving
terminal.
12. The terminal linkage method of claim 11, wherein the selection
information includes position information corresponding to a touch
input for selecting the event-executing entity.
13. The terminal linkage method of claim 11, wherein the selection
information includes control information for executing the
event-executing entity selected in accordance with a touch
input.
14. The terminal linkage method of claim 11, wherein the image is
an entirety of or a portion of an image displayed on the receiving
terminal.
15. The terminal linkage method of claim 11, further comprising:
generating a position image for a touch means positioned within a
preset distance from a user terminal; and displaying the generated
position image.
16. The terminal linkage method of claim 15, wherein the position
image is an image for a touch means touching the user terminal with
a pressure or an area smaller than or equal to a preset pressure
level or area level.
17. The terminal linkage method of claim 15, further comprising:
changing a touch mode according to a user's input, wherein the
touch mode comprises: a first touch mode in which the position
image is displayed; and a second touch mode in which the position
image is not displayed.
18. The terminal linkage method of claim 11, further comprising:
changing a shape of a position image indicating a position of a
touch means according to whether or not the touch means is
positioned at the event-executing entity or at a preset
position.
19. A terminal linkage method comprising: transmitting an entirety of
or a portion of an image to a user terminal; and receiving selection
information for an event-executing entity included in the image
from the user terminal.
20. The terminal linkage method of claim 19, further comprising:
executing an operation corresponding to the received selection
information.
21. A recorded medium readable by a digital processing device
tangibly embodying a program of instructions executable by the
digital processing device to perform a method for linking a user
terminal with a receiving terminal, the method comprising: sharing
an image with the receiving terminal; receiving input in a form of
selection information for an event-executing entity included in the
image; and transmitting the selection information to the receiving
terminal.
22. The recorded medium of claim 21, wherein the image is an
entirety of or a portion of an image displayed on the receiving
terminal.
23. A recorded medium readable by a digital processing device
tangibly embodying a program of instructions executable by the
digital processing device to perform a method for linking a user
terminal with a receiving terminal, the method comprising:
transmitting an entirety of or a portion of an image to the user
terminal; and receiving selection information for an
event-executing entity included in the image from the user
terminal.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of Korean Patent
Applications No. 10-2012-0023012, No. 10-2012-0022986, No.
10-2012-0022988, No. 10-2012-0022984, No. 10-2012-0024073, No.
10-2012-0024092, No. 10-2012-0032982, No. 10-2012-0033047, No.
10-2012-0043148, No. 10-2012-0057996, No. 10-2012-0057998, No.
10-2012-0058000, filed respectively with the Korean Intellectual
Property Office on Mar. 6, 2012, Mar. 6, 2012, Mar. 6, 2012, Mar.
6, 2012, Mar. 8, 2012, Mar. 8, 2012, Mar. 30, 2012, Mar. 30, 2012,
Apr. 25, 2012, May 31, 2012, May 31, 2012, May 31, 2012, the
disclosures of which are incorporated herein by reference in their
entireties.
BACKGROUND
[0002] 1. Technical Field
[0003] The present invention relates to a system and method for
linking and controlling terminals.
[0004] 2. Description of the Related Art
[0005] The smart TV has become very popular in recent times, and
various methods have been proposed to increase convenience in
performing operations on a smart TV; one such method is disclosed,
for example, in Korean Patent Publication No. 2011-0078656. However,
it is inconvenient for a user to control a smart TV using touch
methods, and remote controls for controlling smart TVs are not
very efficient.
SUMMARY
[0006] Embodiments include a system and method with which to
efficiently control a receiving terminal, such as a smart TV, etc.,
by using a user terminal, such as a smart phone, etc.
[0007] An embodiment includes a user terminal that includes: a
decoding unit configured to decode image data received from a
receiving terminal; a touch display unit configured to display the
decoded image; and an information transmitting unit configured to
transmit selection information for an event-executing entity
included in the decoded image to the receiving terminal.
[0008] Another embodiment includes a receiving terminal that
includes: an information transmitting unit configured to transmit
an entirety of or a portion of an image to a user terminal; and an
information receiving unit configured to receive selection
information for an event-executing entity included in the image
from the user terminal.
[0009] A terminal linkage method according to an embodiment
includes: sharing an image with a receiving terminal; receiving
input in the form of selection information for an event-executing
entity included in the image; and transmitting the selection
information to the receiving terminal.
[0010] A terminal linkage method according to another embodiment
includes: transmitting an entirety of or a portion of an image to a
user terminal; and receiving selection information for an
event-executing entity included in the image from the user
terminal.
[0011] A program of instructions that can be executed to link a
user terminal and a receiving terminal according to an embodiment
can be tangibly embodied in a recorded medium readable by a digital
processing device, where the program of instructions are for a
method that includes: sharing an image with the receiving terminal;
receiving input in a form of selection information for an
event-executing entity included in the image; and transmitting the
selection information to the receiving terminal.
[0012] A program of instructions that can be executed to link a
user terminal and a receiving terminal according to an embodiment
can be tangibly embodied in a recorded medium readable by a digital
processing device, where the program of instructions are for a
method that includes: transmitting an entirety of or a portion of
an image to the user terminal; and receiving selection information
for an event-executing entity included in the image from the user
terminal.
[0013] In a method and system for linking and controlling terminals
according to an embodiment, the receiving terminal can provide the
user terminal with an entire or a portion of a displayed image, the
user can select an event-executing entity included in the image via
the user terminal, and the user terminal can transmit the selection
information of the event-executing entity to the receiving
terminal. Thus, the user can efficiently control the operations of
the receiving terminal by using the user terminal.
[0014] Since the image of the receiving terminal may be displayed
on the user terminal, the user can control the receiving terminal
while looking only at the user terminal, without looking at the
receiving terminal. Thus, controlling the receiving terminal can be
freed from the limitation of place.
[0015] Also, a position image can be shown on the user terminal,
indicating the position of a touch means that is nearby or is
touching with a light pressure or area, so that the user can
conveniently control the operation of the user terminal.
[0016] Furthermore, the position image can also be shown on the
receiving terminal, in which case the user can easily control the
operation of the receiving terminal while viewing only the
receiving terminal without looking at the user terminal.
[0017] Additional aspects and advantages of the present invention
will be set forth in part in the description which follows, and in
part will be obvious from the description, or may be learned by
practice.
BRIEF DESCRIPTION OF THE DRAWINGS
[0018] FIG. 1A, FIG. 1B, FIG. 1C, and FIG. 1D schematically
illustrate a system for linking and controlling terminals according
to an embodiment.
[0019] FIG. 2A and FIG. 2B illustrate a method of linking and
controlling terminals according to a first embodiment.
[0020] FIG. 3 illustrates a method of linking and controlling
terminals according to a second embodiment.
[0021] FIG. 4A and FIG. 4B illustrate a method of linking and
controlling terminals according to a third embodiment.
[0022] FIG. 5 illustrates a method of linking and controlling
terminals according to a fourth embodiment.
[0023] FIG. 6 illustrates a method of linking and controlling
terminals according to a fifth embodiment.
[0024] FIG. 7A, FIG. 7B, and FIG. 7C illustrate a control method
used for the method of linking and controlling terminals in FIG.
6.
[0025] FIG. 8A, FIG. 8B, FIG. 8C, FIG. 8D, and FIG. 8E illustrate a
linking operation when different sensing levels are set in
accordance with an embodiment.
[0026] FIG. 9 is a flowchart illustrating a method of linking and
controlling terminals according to a sixth embodiment.
[0027] FIG. 10 is a flowchart illustrating a method of linking and
controlling terminals according to a seventh embodiment.
[0028] FIG. 11 illustrates a system for linking and controlling
terminals according to another embodiment.
[0029] FIG. 12 is a flowchart illustrating a method of linking and
controlling terminals according to an eighth embodiment.
[0030] FIG. 13 illustrates a method of sensing a touch means
according to a first embodiment.
[0031] FIG. 14 illustrates a method of sensing a touch means
according to a second embodiment.
[0032] FIG. 15A and FIG. 15B illustrate a method of sensing a touch
means according to a third embodiment.
[0033] FIG. 16A and FIG. 16B illustrate a method of sensing a touch
means according to a fourth embodiment.
[0034] FIG. 17 is a block diagram illustrating the structure of a
user terminal according to an embodiment.
[0035] FIG. 18 is a block diagram illustrating the structure of a
user terminal according to another embodiment.
[0036] FIG. 19 is a block diagram illustrating the structure of a
receiving terminal according to an embodiment.
DETAILED DESCRIPTION
[0037] Certain embodiments of the present invention will be
described below in more detail with reference to the accompanying
drawings.
[0038] A system for linking and controlling terminals according to
an embodiment relates to linking and controlling terminals,
especially by sharing an image and controlling the operation of a
terminal based on the shared image. In particular, a system for
linking and controlling terminals according to an embodiment can
link a smaller terminal (e.g. smart phone, tablet PC, etc.), which
is controlled by a touch method, with a larger terminal (e.g. TV,
etc.) and enable various methods for controlling the larger
terminal with the smaller terminal. That is, a system for linking
and controlling terminals according to an embodiment can control a
larger terminal with a smaller terminal utilized as a remote
control.
[0039] For the sake of convenience, the smaller terminal controlled
directly by the user will be referred to as the user terminal or a
mobile terminal, and the larger terminal that receives the position
information of the touch means from the user terminal will be
referred to as the receiving terminal or a display device. Although
the user terminal may preferably have a smaller size compared to
the receiving terminal, the sizes of the terminals are not thus
limited.
[0040] The terminals used in a system according to an embodiment
may be provided with a function for displaying images and a function
for wired/wireless communication, but the communication function is
not necessarily essential to the invention. Considering real-life
applications, however, it may be preferable if each of the
terminals is equipped with an image display function and a
communication function. The terminal is not limited to a particular
type of device, as long as it is capable of displaying images and
exchanging signals with another terminal, and various devices,
such as a smart phone, smart TV, remote control, PC,
tablet PC, laptop, touch pad, game console, cloud PC, etc., can be
used as the terminal in an embodiment. However, it may be
preferable if the smaller terminal is equipped with a touch
function.
[0041] A system for linking and controlling terminals according to
various embodiments of the present invention will be described
below in detail with reference to the accompanying drawings. For
convenience, the image shared by the terminals will be referred to
as a first image, and the image representing the position
information of the touch means will be referred to as a second
image (position image).
[0042] FIGS. 1A through 1D schematically illustrate a system for
linking and controlling terminals according to an embodiment.
[0043] Referring to FIG. 1A, a system for linking and controlling
terminals according to this embodiment can include a user terminal
100 and a receiving terminal 102, where the terminals 100 and 102
can be computing apparatuses.
[0044] The user terminal 100 may be a terminal that can be directly
controlled by the user and can be, for example, a smart phone,
remote control, etc., that is capable of sharing an image with
another device. Also, the user terminal 100 can be a terminal having
a relatively small size and having a touch function, and for example
can be a mobile terminal.
[0045] The receiving terminal 102 may be a terminal that is not
directly manipulated by the user but is linked with the user
terminal 100, and can be a display-enabled terminal. That is, the
receiving terminal 102 may be any device capable of displaying an
image, and from the perspective of displaying an image, can also be
referred to as a display device. The receiving terminal 102 can be
a device used for a different purpose from that of the user
terminal 100, and for example can be a TV for showing broadcast
programs such as a drama series. In an embodiment, the receiving
terminal 102 may be a terminal having a relatively larger size,
although it may not necessarily have a touch function.
[0046] The overall size of the receiving terminal 102 can be larger
than the overall size of the user terminal 100, but it may be
sufficient if the size of the display unit on the receiving
terminal 102 is larger than the size of the display unit on the
user terminal 100. In the latter case, the overall sizes of the
user terminal 100 and receiving terminal 102 need not be
considered.
[0047] The terminals 100 and 102 can be connected directly in a
wired or wireless manner or indirectly using another device as a
medium. In one embodiment, the user may use the user terminal 100
to control the operation of the receiving terminal 102, more
specifically the operation of a program displayed on the receiving
terminal 102. Thus, the user terminal 100 can be located at a
distance that allows near-field communication with the receiving
terminal 102, and can be located for example at a distance from
which the user can view the receiving terminal 102. Of course, the
communication between the terminals 100 and 102 is not limited to
near-field communication; for example, the user can control the
operation of a receiving terminal 102 inside the home by using a
user terminal 100 from outside the home. This is possible because
the user terminal 100 shares at least a portion of the first image
displayed on the receiving terminal 102. Consequently, the control
of the receiving terminal can be freed from the limitation of
place.
[0048] According to an embodiment, the receiving terminal 102 can
transmit to the user terminal 100 the image data corresponding to
at least a portion of the first image 110 displayed on the
receiving terminal 102. The user terminal 100 may display a first
image 110 corresponding to the transmitted image data. That is, the
user terminal 100 and the receiving terminal 102 can share the
first image 110, as illustrated in FIG. 1B.
[0049] Of course, although the first image 110 displayed on the
user terminal 100 may be substantially the same as the first image
110 displayed on the receiving terminal 102, the contrast or
display proportion, etc., may differ according to the properties of
the user terminal 100. The user terminal 100 can display the first
image 110 as is, without particularly processing the transmitted
image data, or the user terminal 100 can convert the
resolution, etc., of the transmitted image data and then display
the first image 110 corresponding to the converted image data. In
another example, a separate device connected with the user terminal
100 can convert the image data transmitted from the receiving
terminal 102 to a format suitable for the user terminal 100 and
then transmit the converted image data to the user terminal
100.
[0050] That is, as long as the receiving terminal 102 and the user
terminal 100 display substantially the same first image 110, the
method of processing the image data at the user terminal 100 can be
modified in various ways.
[0051] According to another embodiment, the user terminal 100 can
transmit the image data corresponding to the shared first image 110
to the receiving terminal 102 and thus share the first image
110.
[0052] When the first image 110 on the receiving terminal 102 is
changed, the receiving terminal 102 can transmit image data
corresponding to the changed first image 110 to the user terminal
100. Consequently, the user terminal 100 and the receiving terminal
102 can continuously share the first image 110.
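The continuous-sharing behavior above might be sketched as follows. This is a minimal illustration, not an implementation from the disclosure; the function name, the use of a content digest to detect a changed first image, and the transport callable are all hypothetical.

```python
import hashlib

def maybe_send_frame(frame_bytes, last_digest, send):
    """Transmit the shared (first) image to the user terminal only
    when it has changed on the receiving terminal.

    frame_bytes: encoded image data currently displayed;
    last_digest: digest of the last frame sent (None initially);
    send: transport callable (hypothetical).
    """
    digest = hashlib.sha256(frame_bytes).hexdigest()
    if digest != last_digest:      # first image changed
        send(frame_bytes)          # share updated image data
    return digest

# Usage: only changed frames are (re)transmitted.
sent = []
d = maybe_send_frame(b"frame-1", None, sent.append)
d = maybe_send_frame(b"frame-1", d, sent.append)   # unchanged: skipped
d = maybe_send_frame(b"frame-2", d, sent.append)   # changed: sent
```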
[0053] Also, when a user selects a particular event-executing
entity, for example, from among the first image 110 displayed on
the user terminal 100, the user terminal 100 can transmit the
information on the selection of the event-executing entity (the
selection information) to the receiving terminal 102. In this case,
the receiving terminal 102 can execute a corresponding operation in
accordance with the transmitted information, and as a result, the
first image 110 can be changed. Of course, image data corresponding
to the changed first image 110 may be transmitted to the user
terminal 100. That is, the user can use the user terminal 100 to
control the operation of the receiving terminal 102, particularly
the operation of a program executed by the receiving terminal
102.
[0054] The selection information can include position information
corresponding to a touch input for selecting an event-executing
entity. Alternatively, the selection information can include
control information for executing an event-executing entity
selected by the touch input. The user terminal 100 can generate the
position information or control information in accordance with the
touch input.
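The two forms of selection information described above (position information versus control information) could be represented as in the following sketch. The field names and message layout are hypothetical; the disclosure does not specify a wire format.

```python
def make_selection_info(touch_x, touch_y, entity_id=None):
    """Build selection information for the receiving terminal.

    Without an entity_id, the message carries position information
    corresponding to the touch input; with one, it carries control
    information for executing the selected event-executing entity.
    All field names are illustrative only.
    """
    if entity_id is None:
        return {"type": "position", "x": touch_x, "y": touch_y}
    return {"type": "control", "entity": entity_id, "action": "execute"}

# Usage: the user terminal generates either form from the touch input.
pos = make_selection_info(120, 340)
ctl = make_selection_info(120, 340, entity_id="play_button")
```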
[0055] The receiving terminal 102 can execute an operation
corresponding to the event-executing entity by using the position
information corresponding to the touch input. Alternatively, the
receiving terminal 102 can execute the operation corresponding to
the event-executing entity by using the control information.
[0056] The selection of the event-executing entity can be
determined according to the pressure or area by which the touch
means touches the user terminal 100, and as will be described later
on, the event-executing entity can be selected when the user
touches the user terminal 100 by a pressure or area exceeding a
preset pressure or area. This will be described later in more
detail.
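The pressure/area rule for distinguishing hovering from selection might look like the following sketch. The threshold values and units are hypothetical; the disclosure only says that exceeding a preset pressure or area selects the event-executing entity.

```python
PRESSURE_THRESHOLD = 0.5   # preset pressure level (hypothetical units)
AREA_THRESHOLD = 40.0      # preset contact-area level (hypothetical units)

def classify_touch(pressure, area):
    """Light contact merely positions the pointer (position image
    shown); exceeding either preset level counts as selecting the
    event-executing entity."""
    if pressure > PRESSURE_THRESHOLD or area > AREA_THRESHOLD:
        return "select"    # event-executing entity is selected
    return "hover"         # only the position image is displayed
```

In this scheme, the same physical touch surface supports both a pointing mode and a selection mode without any extra hardware.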
[0057] A description will now be provided on various embodiments by
which a user can control the operation of the receiving terminal
102 by using the user terminal 100.
[0058] According to an embodiment, the user terminal 100 can
display a third image (a position image) 114 indicating the
position information of a touch means that is in a touch-sensitive
region of the user terminal 100, for example a touch means that is
near the display unit or is touching the display unit with a
pressure or area smaller than or equal to the preset pressure or
area, as illustrated in FIG. 1C. That is, the position of a touch
means can be shown on the user terminal 100, so that the user may
control the user terminal 100 conveniently. The touch means is not
limited to a particular form and can be a finger, a touch pen, etc.
Here, being near may refer to the touch means being positioned within
a preset distance from the user terminal 100, including the case of
the touch means contacting the user terminal 100.
[0059] According to another embodiment, if a touch means is brought
near to the user terminal 100 or is touching with a pressure or
area smaller than or equal to a preset pressure or area, a second
image 112 such as a pointer, a shadow image, etc., corresponding to
the position information of the touch means can be displayed on the
receiving terminal 102 together with the first image 110, as
illustrated in FIGS. 1B through 1D. The second image 112 can be
substantially the same as the third image 114 or can have a
different shape or color.
[0060] That is, the position of the touch means can be specified on
the user terminal 100 and the receiving terminal 102. As the second
image 112 is displayed on the receiving terminal 102, the user can
control the operation of the receiving terminal 102 while viewing
only the receiving terminal 102 and without looking at the user
terminal 100.
[0061] In an embodiment, the second image 112 or the third image
114 corresponding to the position of the touch means can be changed
according to the touch pressure or the touch area of the touch
means contacting the user terminal 100. For example, if the touch
means touches the user terminal 100 with a pressure or an area
smaller than or equal to a preset value, then a third image 114
corresponding to the position of the touch means may be shown on
the user terminal 100, whereas if the touch means touches the user
terminal 100 with a pressure or an area exceeding the preset value,
then the third image 114 corresponding to the position of the touch
means may not be shown on the user terminal 100 or may be changed
to a different shape. This operation can also apply in a similar
manner to the second image 112 displayed on the receiving terminal
102.
[0062] According to another embodiment, the shape of the second
image 112 or third image 114 can be varied according to the
distance between the touch means and the user terminal 100. For
example, the image 112 or 114 can be shaped as a shadow if the
distance between the touch means and the user terminal 100 is
within a first distance and can be shaped as a finger if the
distance between the touch means and the user terminal 100 is
within a second distance smaller than the first distance.
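The distance-dependent shape selection described above can be sketched as follows. The distance values and shape names are illustrative assumptions; the disclosure only fixes the ordering (the second distance is smaller than the first).

```python
FIRST_DISTANCE = 30.0    # first hover distance (hypothetical, mm)
SECOND_DISTANCE = 10.0   # second, smaller distance (hypothetical, mm)

def position_image_shape(distance_mm):
    """Choose the second/third image shape from the hover distance:
    a finger shape within the second distance, a shadow within the
    first distance, and no position image beyond the first distance."""
    if distance_mm <= SECOND_DISTANCE:
        return "finger"
    if distance_mm <= FIRST_DISTANCE:
        return "shadow"
    return None
```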
[0063] In various different embodiments, at least one of the
position of a touch means near to the user terminal 100, the
position of a touch means touching the user terminal 100 with a
touch pressure smaller than or equal to a preset value, and the
position of a touch means touching the user terminal 100 with a
touch area smaller than or equal to a preset value can be indicated
by a second image 112 or a third image 114.
[0064] In short, the user can control the operation of the
receiving terminal 102 conveniently by using the user terminal 100.
As a second image 112 indicating the position information of the
touch means is shown on the receiving terminal 102, the user can
use the user terminal 100 to control the receiving terminal 102
while viewing only the receiving terminal 102, without having to
look at the user terminal 100. In particular, even when the user
does not have a separate control means, the user can control the
receiving terminal 102 by using a smart phone, for example, carried
by the user.
[0065] Since the user can control the user terminal 100 by using a
touch means, not only can the user select an event-executing entity
to execute a particular operation, but also the user can input
particular characters, etc., and even work on documents on the user
terminal 100. Such operations on the user terminal 100 can be
reflected immediately on the receiving terminal 102.
[0066] Also, since the touch means can be utilized as a mouse for a
computer, etc., to perform various operations, the user can control
the receiving terminal 102 with greater convenience. For example,
the user can remotely scroll, copy, or search articles on a portal
site via the user terminal 100, with the results reflected on
the receiving terminal 102. Of course, the second image 112
indicating the position information of the touch means may be shown
on the receiving terminal 102, allowing the user to perform a
desired operation by looking at the second image 112 displayed on
the larger-sized receiving terminal 102. That is, embodiments can
involve showing a second image 112 on the receiving terminal 102 to
allow the user to control the receiving terminal 102 conveniently,
using the user terminal 100 to implement various functions
performed by a remote control, a mouse, a touch pen, etc. Thus, a
receiving terminal 102 such as a smart TV, which can perform
various functions, can be controlled in a convenient manner by
using a user terminal 100 typically carried by the user.
[0067] Although the descriptions above refer to the user terminal
100 as transmitting the image data for the first image 110 and the
position information of the touch means separately to the receiving
terminal 102, the user terminal 100 can just as well indicate the
position of the touch means in the first image 110 and transmit the
first image 110, in which the position of the touch means is
indicated, to the receiving terminal 102. For example, the user
terminal 100 can superimpose the position image for the touch means
over the first image 110 and transmit the image data, with the
position image of the touch means superimposed, to the receiving
terminal 102.
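The superimposing variant above might be sketched as a simple compositing step performed before transmission. Here images are plain 2-D lists of pixel values standing in for real image buffers, and `None` marks transparent pointer pixels; the disclosure does not fix any particular representation.

```python
def superimpose(first_image, pointer, x, y):
    """Overlay a small position image (pointer) onto the shared first
    image at (x, y), so that a single composited image can be sent
    to the receiving terminal. Pixel-grid representation is a
    stand-in for a real image buffer."""
    out = [row[:] for row in first_image]   # copy; leave original intact
    for dy, prow in enumerate(pointer):
        for dx, p in enumerate(prow):
            if p is not None:               # None = transparent pixel
                out[y + dy][x + dx] = p
    return out

# Usage: a 2x2 cursor composited onto a 4x4 frame at position (1, 1).
frame = [[0] * 4 for _ in range(4)]
cursor = [[9, None], [9, 9]]
composited = superimpose(frame, cursor, 1, 1)
```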
[0068] In another example, the user terminal 100 can modify a
region corresponding to the position of the touch means in the
first image 110 to a position image for the touch means. That is,
the user terminal 100 can modify the first image 110 itself and
create a new image to indicate the position of the touch means, and
can then transmit the created image to the receiving terminal
102.
[0069] According to another embodiment, if the user does not wish
to show the second image 112 during linked operation, the user can
select a menu item or input a button manipulation, etc., on the
user terminal 100 or on the receiving terminal 102 to stop showing
the second image 112 or the third image 114. In this case, the user
terminal 100 can continuously transmit the position information of
the touch means to the receiving terminal 102 and the receiving
terminal 102 can store the transmitted position information, so
that the receiving terminal 102 may display the second image 112

corresponding to the position information upon the user's
request.
[0070] According to yet another embodiment, the linkage and control
system can include a user terminal, a receiving terminal, and a
linkage server. The linkage server can transmit image data
corresponding to the first image to the user terminal and the
receiving terminal, and the user terminal can transmit the position
information of the touch means to the receiving terminal, so that
the second image indicating the position information of the touch
means may be shown together with the first image at the receiving
terminal. Here, the linkage server can pre-store the resolution
information, etc., of the user terminal and the receiving terminal
to process the image data based on the stored resolution
information, etc., before transmitting it to the user terminal and
the receiving terminal.
[0071] Terminals such as smart phones, TVs, etc., use wired
communication standards such as High-Definition Multimedia
Interface (HDMI), Mobile High-definition Link (MHL), DisplayPort,
etc., and wireless communication standards such as Digital Living
Network Alliance (DLNA), Wi-Fi, etc., and are each provided not
only with data channels for transmitting data but also with a
separate channel for exchanging control signals. For example, HDMI
uses Consumer Electronics Control (CEC), the Display Data Channel
(DDC), Utility, and SCL/SDA as control channels; MHL uses CBUS as a
control channel; and DisplayPort uses the auxiliary channel as a
control channel. Thus, it may not be necessary to establish
separate channels for linking the terminals 100 and 102, and the
channels already available on the terminals 100 and 102 can be
utilized as a channel by which to transmit the position information
of the touch means according to an embodiment.
[0072] The terminals 100 and 102 can exchange data in various forms
according to the communication method used for the linkage
system.
[0073] A description will now be provided of a method for
specifying the touch position of the touch means at the receiving
terminal 102.
[0074] In an embodiment, the position information of the touch
means can be the position information on the first image displayed
on the user terminal 100. Consequently, the position of the touch
means on the first image of the user terminal 100 can be reflected
in the first image 110 of the receiving terminal 102 as the second
image 112.
[0075] In another embodiment, the position information of the touch
means can be coordinate information that is in accordance with the
resolution or screen size of the display unit of the user terminal
100. That is, the position information of the touch means can be
the coordinate information of the touch means with respect to the
display unit of the user terminal 100, rather than the coordinate
information of the touch means on the image displayed on the
screen.
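When the position information is coordinate information in accordance with the user terminal's resolution, as in paragraph [0075], the receiving terminal (or the user terminal) would need to rescale it. A minimal sketch, assuming simple proportional scaling between two rectangular resolutions (the function name is illustrative):

```python
def map_touch_position(touch, src_res, dst_res):
    """Scale a touch coordinate given with respect to the user terminal's
    resolution `src_res` (width, height) to the receiving terminal's
    resolution `dst_res`, preserving the relative position on screen."""
    x, y = touch
    src_w, src_h = src_res
    dst_w, dst_h = dst_res
    return (x * dst_w / src_w, y * dst_h / src_h)
```

For example, a touch at the center of a 1080x1920 display maps to the center of a 3840x2160 display.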
[0076] In still another embodiment, when the user terminal 100 is
to show the image data transmitted from the receiving terminal 102
in order that the user terminal 100 and the receiving terminal 102
may share the first image 110, the user terminal 100 can display
the first image 110 corresponding to the image data after setting
the screen to the same resolution as the receiving terminal 102.
Thus, the user terminal 100 can accurately represent the position
of the touch means by displaying the second image 112 with the
position information, e.g. the coordinate information, of the touch
means unmodified.
[0077] That is, the present invention can employ various methods
for specifying the position of the touch means at the receiving
terminal 102. The user terminal 100 can transmit the position
information of the touch means to the receiving terminal 102, or
generate second image data corresponding to the second image
representing the position information and send this second image
data to the receiving terminal 102, or transmit the first image to
the receiving terminal 102 with the region corresponding to the
position of the touch means modified.
[0078] The method for linking and controlling terminals according
to various embodiments will be described below in more detail with
reference to the accompanying drawings.
[0079] FIG. 2A and FIG. 2B illustrate a method of linking and
controlling terminals according to a first embodiment.
[0080] Referring to FIG. 2A, the receiving terminal 102 can
transmit image data corresponding to a first image 110 that
includes an event-executing entity 200, such as a UI, icon,
application program, link, etc., for example, to the user terminal
100. Consequently, the first image 110 displayed on the user
terminal 100 can include the event-executing entity 200.
[0081] According to an embodiment, when the user touches the
event-executing entity 200 with a touch means, the user terminal
100 can transmit selection information, which notifies that the
event-executing entity 200 was selected, to the receiving terminal
102. The selection information can include position information
corresponding to a touch input for selecting the event-executing
entity 200 or, depending on the touch input, control information
for executing the event-executing entity 200.
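The two kinds of selection information described in paragraph [0081] can be sketched as a single message builder. This is an assumption-laden illustration: the message fields and the pressure threshold are invented, and the specification does not prescribe any particular encoding.

```python
def make_selection_info(x, y, pressure, threshold=0.5):
    """Build the message the user terminal might send to the receiving
    terminal: below the pressure threshold the touch only reports a
    position (shown as the second image); above it, the touch selects
    the event-executing entity at that position."""
    if pressure > threshold:
        return {"type": "select", "x": x, "y": y}
    return {"type": "position", "x": x, "y": y}
```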
[0082] The receiving terminal 102 can recognize that the
event-executing entity 200 was selected, based on the information
corresponding to the touch input, and can execute an operation
related to the selection of the event-executing entity 200.
Alternatively, the receiving terminal 102 can execute the related
operation by using the control information for executing the
event-executing entity 200. Thus, the user can control the
operation of the receiving terminal 102 by using the user terminal
100.
[0083] Here, if the user brings a touch means near the
event-executing entity 200 of the user terminal 100 or touches it
with a pressure or an area smaller than or equal to a preset value,
then a second image 112 can be shown on the receiving terminal 102
in a position corresponding to the position of the touch means. Of
course, a third image 114 indicating the position of the touch
means can be shown on the user terminal 100 as well. If the user
touches the event-executing entity 200 with the touch means with a
pressure greater than the preset pressure or an area greater than
the preset area, then the receiving terminal 102 can perform an
operation corresponding to the event-executing entity 200, for
example by activating a game, etc. In this case, the first image
110 displayed on the receiving terminal 102 can be changed.
[0084] This method of linking terminals can be very useful not only
for games, navigation, etc., but also for controlling a smart TV by
using a user terminal 100, as illustrated in FIG. 2B.
[0085] FIG. 3 illustrates a method of linking and controlling
terminals according to a second embodiment.
[0086] Referring to FIG. 3, it is also possible to transmit to the
user terminal 100 only the image 300 corresponding to a UI for
control from among the first image 110 displayed on the receiving
terminal 102, so that the user terminal 100 can display a
corresponding image 302. That is, the user terminal 100 and the
receiving terminal 102 can share just a portion of the image,
especially a portion including an event-executing entity. In this
case, a second image 112 indicating the position of the touch means
may be displayed on the receiving terminal 102. Of course, a third
image 114 indicating the position of the touch means can also be
displayed on the user terminal 100, where the third image 114 can
be substantially the same as the second image 112.
[0087] In such a linkage system, when the user touches an
event-executing entity in the image 302 displayed on the user
terminal 100 with a touch means, the user terminal 100 can transmit
the selection information of the event-executing entity to the
receiving terminal 102, and the receiving terminal 102 can execute
the operation corresponding to the selection information.
Generally, the first image 110 on the receiving terminal 102 may
be changed. In this case, if the image 300 of the event-executing
entity at the receiving terminal 102 is not changed, the user
terminal 100 may maintain the image 302, whereas if the image 300
of the event-executing entity at the receiving terminal 102 is
changed, an altered image 302 of the event-executing entity may be
shown at the user terminal 100.
[0088] If a touch means is brought near the display unit of the
user terminal 100, the second image 112 or third image 114 may be
shown. In particular, if the touch means is positioned over an
event-executing entity, event occurrence information can be
outputted, for example in the form of a sound, etc. This will be
described later in more detail with reference to FIG. 18.
[0089] In short, embodiments can involve using the user terminal
100 as a means for controlling the receiving terminal 102. This can
be particularly efficient for games, electronic commerce, smart TV,
etc.
[0090] FIG. 4A and FIG. 4B illustrate a method of linking and
controlling terminals according to a third embodiment.
[0091] Referring to FIG. 4A and FIG. 4B, the image for the second
image 112a indicating the position of the touch means can be
changed to a different shape. Also, the second image 112a can be
changed if the touch means is present at a preset position while it
is near the user terminal 100 or is touching the user terminal 100
with a pressure or an area smaller than or equal to a preset
pressure or area level.
[0092] For example, if the touch means is not positioned over an
event-executing entity, the second image 112a can be represented as
a shadow image as illustrated in FIG. 4A, but if the touch means is
positioned over an event-executing entity, it can be represented by
a finger image as illustrated in FIG. 4B. The second image 112 can
be changed when the touch means is positioned not only over an
icon, but also over an Internet address input window, the bottom of
the screen, a search window, a folder, and the like.
[0093] In this case, the receiving terminal 102 can output event
occurrence information, such as in the form of sound, light,
vibration, etc., according to the event associated with the
event-executing entity over which the touch means is positioned.
This will be described later in further detail. Changing the second
image 112 and outputting the event occurrence information can be
performed simultaneously. The method described above can also apply
in a similar manner to the third image 114.
[0094] According to another embodiment, the image 112 or 114 can be
changed according to the number of times or the touch duration of
the touch means touching the user terminal 100. For example, the
image 112 or 114 can be a shadow image when a touch means touches
the user terminal 100 once, but can be changed to an arrow image if
the touch means makes a touch twice in a row. In another example,
the image 112 or 114 can be a shadow image if the touch means
touches the user terminal 100 for a duration shorter than or equal
to a preset value, and can be changed to a different image if the
preset duration is exceeded.
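The rules of paragraph [0094] for changing the image 112 or 114 according to touch count and duration can be sketched as follows; the image names and the duration threshold are illustrative assumptions, following the examples given in the text.

```python
def cursor_image(touch_count, duration=0.0, max_light_duration=0.3):
    """Pick an image for the second/third image from the touch pattern:
    one touch -> shadow image, two touches in a row -> arrow image,
    a touch longer than the preset duration -> a different image
    (called 'highlight' here for illustration)."""
    if touch_count >= 2:
        return "arrow"
    if duration > max_light_duration:
        return "highlight"
    return "shadow"
```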
[0095] In another example, the image 112 or 114 shown when a touch
means is near the user terminal 100 can be different from the image
112 or 114 shown when the touch means is touching the user terminal
100.
[0096] While the second image 112 and the third image 114 can be
changed simultaneously, it is also possible to change just one of
them or change them into images that are different from each
other.
[0097] The operations of the system for linking and controlling
terminals according to an embodiment will be described below in
more detail with reference to the accompanying drawings.
[0098] FIG. 5 illustrates a method of linking and controlling
terminals according to a fourth embodiment.
[0099] Referring to FIG. 5, a user terminal 100 and a receiving
terminal 102 may be connected to begin linked operation (S500). To
be more specific, if the user terminal 100 or the receiving
terminal 102 requests linkage to the counterpart terminal after the
user terminal 100 and the receiving terminal 102 are connected by
the user for linked operation, or if the user terminal 100 requests
linkage to the receiving terminal 102 and the linkage is accepted,
a channel can be formed between the terminals 100 and 102 capable
of transmitting the position information of a touch means. Here,
the linkage of the terminals 100 and 102 can be performed according
to a user's request or can be performed automatically by a terminal
100 or 102.
[0100] According to an embodiment, the user terminal 100 and the
receiving terminal 102 can be connected by two channels, i.e. a
data channel and a control channel, and the control signal for
requesting or accepting linkage can be exchanged through the
control channel.
[0101] Next, the receiving terminal 102 may transmit image data
corresponding to at least a portion of a first image that is
currently being displayed or about to be displayed to the user
terminal 100, and the user terminal 100 may display the first image
corresponding to the transmitted image data (S502). That is, the
user terminal 100 and the receiving terminal 102 may share at least
a portion of the first image.
[0102] Then, when a user brings a touch means, such as a finger or
a touch pen, etc., near the user terminal 100 or contacts the touch
means with the user terminal 100 with a pressure or an area smaller
than or equal to a preset value, then the user terminal 100 may
sense the touch means (S504). The user terminal 100 can sense the
touch means using various methods such as capacitive sensing,
electromagnetic sensing, etc.
[0103] Next, the user terminal 100 may transmit the position
information of the touch means obtained according to the sensing
result to the receiving terminal 102, and the receiving terminal
102 may display a second image, which represents the position
information of the touch means transmitted thus, together with the
first image 110 (S506).
[0104] According to another embodiment, the user terminal 100 can
also transmit image data (or combined image data) that includes
data corresponding to the first image and data corresponding to the
second image to the receiving terminal 102. However, since this
embodiment basically involves the receiving terminal 102
transmitting the image data corresponding to the first image to the
user terminal 100, the former method of the user terminal 100
transmitting only the position information to the receiving
terminal 102 may be more efficient.
[0105] Then, when the user makes a touch with or moves the touch
means, the user terminal 100 may transmit the position information
of the touch means to the receiving terminal 102 to display the
second image on the receiving terminal 102 (S508). That is, the
movement of the touch means can be reflected on the receiving
terminal 102.
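The flow of steps S500 through S508 can be modeled as a minimal sketch of the two terminals' interaction. The classes, method names, and image representation are illustrative assumptions; they are not part of the specification.

```python
class ReceivingTerminal:
    def __init__(self, first_image):
        self.first_image = first_image
        self.cursor = None                    # last position of the touch means

    def share_image(self):                    # S502: transmit the first image
        return self.first_image

    def on_position(self, pos):               # S506/S508: display second image
        self.cursor = pos


class UserTerminal:
    def __init__(self, receiver):
        self.receiver = receiver              # S500: linked operation begins
        self.image = receiver.share_image()   # S502: display the shared image

    def on_touch_sensed(self, pos):           # S504: touch means sensed
        self.receiver.on_position(pos)        # S506: transmit position info
```

Each movement of the touch means triggers `on_touch_sensed` again, so the movement is reflected on the receiving terminal (S508).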
[0106] FIG. 6 illustrates a method of linking and controlling
terminals according to a fifth embodiment, FIGS. 7A through 7C
illustrate a control method used for the method of linking and
controlling terminals in FIG. 6, and FIGS. 8A through 8E illustrate
a linking operation when different sensing levels are set in
accordance with an embodiment.
[0107] Referring to FIG. 6, the user terminal 100 or the receiving
terminal 102 may request linkage to begin linked operation
(S600).
[0108] According to an embodiment, the user terminal 100 and the
receiving terminal 102 can be connected by two channels, i.e. a
data channel and a control channel, as illustrated in FIG. 7A, and
the control signal for requesting or accepting linkage can be
exchanged through the control channel.
[0109] According to another embodiment, the data can be exchanged
through one channel. For example, the transmission periods for the
channel can include data periods E1 and E2 for transmitting image
data, and a control period C between the data periods E1 and E2 for
transmitting the position information or the selection information
for an event-executing entity, as illustrated in FIG. 7C. In
particular, the position information or selection information can
be transmitted by utilizing the blank periods C in-between the
periods for transmitting image data. For example, the user terminal
100 can transmit the image data to the receiving terminal 102
through one channel, and can transmit the position information or
selection information to the receiving terminal 102 during the
blank periods C existing in-between the data periods E1 and E2 for
transmitting image data.
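The single-channel scheme of paragraph [0109], in which control information is carried in the blank periods between data periods, can be sketched as a transmission schedule. The tuple format is an invented illustration of the E1, C, E2 ordering in FIG. 7C.

```python
def interleave(data_frames, control_messages):
    """Build a transmission schedule that places each available control
    message (position or selection information) into the blank period C
    immediately following a data period (E1, E2, ...)."""
    schedule = []
    controls = iter(control_messages)
    for frame in data_frames:
        schedule.append(("data", frame))
        ctrl = next(controls, None)
        if ctrl is not None:
            schedule.append(("control", ctrl))  # blank period C
    return schedule
```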
[0110] Next, the receiving terminal 102 may transmit image data
corresponding to at least a portion of the first image 110 to the
user terminal 100, to thus share the first image 110 with the user
terminal 100 (S602).
[0111] Then, the user terminal 100 may sense the touch pressure or
the touch area of the touch means (S604).
[0112] For example, in sensing the touch pressure or touch area,
the user terminal 100 can set multiple levels, e.g. two levels, for
the sensing levels, as illustrated in FIG. 7B. If it is not
performing a terminal-linked process, i.e. if the user is using
only the user terminal 100, the sensing level for the user terminal
100 can be set to a higher level (L2) such that the touch position
is sensed only when there is a light touch made by the user.
[0113] Conversely, if the user is performing a terminal-linked
process, i.e. if the receiving terminal 102 is also being used, the
user terminal 100 may set the sensing level to a lower level (L1).
Thus, the user terminal 100 can sense the touch means even when the
touch means is near and not touching or when the touch pressure or
touch area is smaller than or equal to a preset pressure or area.
In other embodiments, the user terminal 100 can employ various
methods other than the capacitance-based method, such as methods
based on electromagnetic induction, methods using a resistive
overlay, optical methods, ultrasonic methods, etc., and the
settings can vary according to the method employed.
[0114] Next, the user terminal 100 may transmit the position
information of the touch means obtained according to the sensing
result to the receiving terminal 102, and the receiving terminal
102 may display a second image representing the position
information, for example a position image 112 indicating the
position of the touch means, such as that illustrated in FIG. 1B,
together with the first image 110 (S606).
[0115] Here, the position information of the touch means can be
information in an image form or information in a coordinate form.
That is, the user terminal 100 can generate the position
information for the second image 112 directly in the form of image
data and transmit it to the receiving terminal 102, or transmit
only the position information of the touch means to the receiving
terminal 102 in the form of a control signal. Alternatively, the
user terminal 100 can generate a position image (second image) for
the touch means and transmit image data with the first image and
the second image included to the receiving terminal 102, so that
the second image 112 can be displayed on the receiving terminal 102
concurrently with the sharing of the first image 110.
[0116] For example, the position of the touch means can be
displayed on the receiving terminal 102 if the touch pressure of
the touch means is smaller than or equal to a preset value or the
touch area is smaller than or equal to a preset value. That is, if
the touch means touches the user terminal 100 lightly, the second
image 112 can be displayed on the receiving terminal 102. If the
touch means moves while touching the user terminal 100 lightly, the
second image 112 may reflect the movement of the touch means, and
the second image 112 on the receiving terminal 102 may also move
continuously (S608).
[0117] According to an embodiment, if the touch means touches the
user terminal 100 lightly, the user terminal 100 may not recognize
the touch of the touch means as a touch input. If the touch means
touches the user terminal 100 with a strong pressure, i.e. if the
touch pressure exceeds a preset pressure or the touch area exceeds
a preset touch area, the user terminal 100 may recognize the touch
of the touch means as a touch input. Thus, if the touch means
touches the user terminal 100 lightly, the icon may not be
executed, but if the touch means touches the icon strongly, the
icon can be executed.
[0118] As described above, the embodiment shown in FIG. 5 and the
embodiment shown in FIG. 6 can be applied together. That is, the
second image 112 can be displayed on the receiving terminal 102
when the touch means is positioned within a preset distance from
the user terminal 100 and when the touch means lightly touches the
user terminal 100.
[0119] Referring to FIGS. 8A to 8E, a more detailed description is
provided below on setting the sensing levels for the embodiments
illustrated in FIG. 5 and FIG. 6.
[0120] The user terminal 100 can be set to have multiple sensing
levels. For example, a first level, a second level, and a third
level can be set at the user terminal 100; a first level for
sensing a nearness of a touch means 804, a second level for sensing
the touch means 804 touching with a level equal to or smaller than
a preset level (pressure level or area level), and a third level
for sensing the touch means 804 touching with a level equal to or
greater than a preset level.
[0121] A description is provided below of the operations of the
user terminal 100 and the receiving terminal 102 when there are
multiple levels set as above.
[0122] At the receiving terminal 102, the first image 110 can be
displayed as illustrated in FIG. 8A. Of course, a first image 110
that is substantially the same can also be displayed at the user
terminal 100. The first image 110 can include event-executing
entities 800 such as icons, application programs, links, etc., and
when an event-executing entity 800 is selected, a racing game, for
example, can be executed as illustrated in FIG. 8B. In the
descriptions that follow, it will be assumed that the touch
position of the touch means 804 corresponds to a particular point
on the event-executing entity 800.
[0123] First, when the touch means 804 is brought near the display
unit 806 of the user terminal 100 as illustrated in FIG. 8C, the
second image 112 indicating the position of the touch means 804 can
be displayed on the receiving terminal 102 as illustrated in FIG.
8A. In addition, the third image 114 can also be shown on the user
terminal 100.
[0124] Next, if the touch means 804 touches the display unit 806
with a level smaller than or equal to a preset level as illustrated
in FIG. 8D, the image 112 or 114 indicating the position of the
touch means 804 can remain as is, and the event-executing entity
800 may remain unexecuted.
[0125] If the touch means 804 touches the display unit 806 with a
level greater than or equal to the preset level as illustrated in
FIG. 8E, the user terminal can recognize this as a selection of the
event-executing entity 800 and execute the game. The distal end 810
of the touch means 804 can be structured such that it can be
inserted inside, and when the user makes a touch with the touch
means 804 with a level greater than or equal to a preset level, the
display unit 806 may be pressed with the distal end 810 inserted
inside, as illustrated in FIG. 8E.
[0126] In short, a linkage and control system based on this
embodiment can perform different operations according to sensing
levels.
[0127] In another embodiment, if the touch means 804 is brought
near, an image 112 or 114 indicating the position of the touch
means 804 can be shown, and if the touch means 804 touches the
display unit 806, the event-executing entity 800 can be
executed.
[0128] FIG. 9 is a flowchart illustrating a method of linking and
controlling terminals according to a sixth embodiment.
[0129] Referring to FIG. 9, the user terminal 100 and the receiving
terminal 102 may begin linked operation (S900).
[0130] Next, the receiving terminal 102 may transmit image data
corresponding to at least a portion of the displayed first image to
the user terminal 100, and the user terminal 100 may display a
first image corresponding to the image data (S902). That is, the
user terminal 100 and the receiving terminal 102 may share the
first image.
[0131] Then, the user terminal 100 may sense a touch means, such as
a finger, a touch pen, etc., through any of a variety of methods
(S904). The user terminal 100 can sense the position of a touch
means that is near the user terminal 100 or lightly touching the
user terminal 100.
[0132] Next, the user terminal 100 may generate a combined image,
including the currently displayed first image together with the
second image corresponding to the sensed position of the touch
means, and may transmit combined image data (combination
information) corresponding to the combined image to the receiving
terminal 102, and the receiving terminal 102 may display the
combined image corresponding to the combined image data (S906).
Consequently, the second image together with the first image may be
displayed on the receiving terminal 102. In this case, the user
terminal 100 can display the first image only or display the first
image and the second image together.
[0133] If the user makes a touch with the touch means or moves
while touching, the user terminal 100 may transmit the position
information of the touch means to the receiving terminal 102, and
the receiving terminal 102 may display the second image, which
indicates the position of the touch means in accordance with the
position information, together with the corresponding first image
(S908). That is, the movement of the touch means may be reflected
in the screen of the receiving terminal 102. The first image on the
receiving terminal 102 may be the same image as the previous image
or may be a different image from the previous image.
[0134] In short, this embodiment has the user terminal 100 generate
a combined image that includes the first image and the second image
and transmit the combined image thus generated to the receiving
terminal 102, so that the receiving terminal 102 may consequently
display the second image together with the first image.
[0135] Although the descriptions above use the expression "combined
image," it is possible to modify the first image itself such that
the first image indicates the position of the touch means. To be
more specific, the user terminal 100 can modify the first image
such that a region corresponding to the position of the touch means
is changed to a shadow image, etc., that is, the first image itself
can be modified to create a new image, after which the image thus
created can be transmitted to the receiving terminal 102. Here, the
modified first image can be substantially the same as the combined
image.
[0136] FIG. 10 is a flowchart illustrating a method of linking and
controlling terminals according to a seventh embodiment.
[0137] Referring to FIG. 10, the user terminal 100 and the
receiving terminal 102 may begin linked operation (S1000).
[0138] Next, the user or the user terminal 100 may set multiple
sensing levels for sensing the touch means (S1002). As described
above, various levels can be set for different embodiments. Such
settings may be established at the beginning of the linked
operation or may be established beforehand in the user terminal 100
prior to linked operation.
[0139] Then, the user terminal 100 may sense the touch means, and
the receiving terminal 102 may display the second image, which
represents the sensed position of the touch means (S1004).
[0140] Next, it may be determined whether or not there was a
request by the user to stop linked operation (S1006). The stopping
of linked operation can be requested by the user by a method such
as a menu selection, etc., or can also be requested by turning off
the connection between the user terminal 100 and the receiving
terminal 102. The user, controlling the user terminal 100 while
viewing the receiving terminal 102, may wish to use the user
terminal 100 only or may wish to view the receiving terminal 102
only, in which case the user can request a stopping of linked
operation while the user terminal 100 and the receiving terminal
102 are in a connected state.
[0141] If there is no request from the user to stop the linked
operation, then step S1004 may be performed again.
[0142] Conversely, if there is a request from the user to stop
linked operation, then the user terminal 100 may initialize the
multiple levels such that only one level is available or change the
levels to sensing levels which only sense touches (S1008). That
is, the user terminal 100 may change the levels such that the touch
means is not sensed if the touch means does not make a touch, and
that the touch means is sensed only when the touch means makes a
touch.
[0143] In short, a method of linking and controlling terminals
according to an embodiment can allow a user to arbitrarily request
linked operation and request a stopping of the linked operation and
to freely set and change sensing levels.
[0144] According to another embodiment, a preliminary sensing level
can be set at the beginning of linked operation between the user
terminal 100 and receiving terminal 102, and the sensing levels can
be set differently during linked operation. For example, the user
can set different sensing levels during linked operation according
to the nearness distance and touch strength of the touch means and
can change the sensing level settings to sense the touch means only
when it is in contact.
[0145] FIG. 11 illustrates a system for linking and controlling
terminals according to another embodiment.
[0146] Referring to FIG. 11, a system for linking and controlling
terminals based on this embodiment can include a user terminal 100,
a receiving terminal 102, and a transceiver device 1100 (e.g. a
dongle).
[0147] The transceiver device 1100 can connect the communications
between the user terminal 100 and the receiving terminal 102. To be
more specific, when the user terminal 100 transmits the position
information of a touch means, the combined image data, or the
selection information for an event-executing entity to the
transceiver device 1100, the transceiver device 1100 can transmit
the position information of the touch means, the combined image
data, or the selection information for the event-executing entity
to the receiving terminal 102. Also, image data transmitted from
the receiving terminal 102 can be transmitted by the transceiver
device 1100 to the user terminal 100.
[0148] The transceiver device 1100 can be connected to the user
terminal 100 or the receiving terminal 102, or can exist separately
without being connected to the user terminal 100 and receiving
terminal 102. The transceiver device 1100 can be, for example, a
dongle, a set-top, etc.
[0149] According to an embodiment, the position information,
combined image data, or selection information transmitted from the
user terminal 100 can be forwarded by the transceiver device 1100
as is, without modification, to the receiving terminal 102.
[0150] According to another embodiment, the transceiver device 1100
may convert the position information, combined image data, or
selection information transmitted from the user terminal 100 to a
format suitable for the receiving terminal 102 and then transmit
the converted position information or combined image data to the
receiving terminal 102. Since many companies currently manufacture
the receiving terminal 102 in the form of a smart TV, etc., it may
be necessary to match the position information, combined image
data, or selection information with the format of the receiving
terminal 102 according to manufacturer. An embodiment can use the
transceiver device 1100 to convert the position information,
combined image data, or selection information to fit the format of
the receiving terminal 102, so that the terminals 100 and 102 can
be linked regardless of manufacturer.
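The format matching described above might be sketched as follows. This is purely illustrative: every function name, resolution value, and payload field below is a hypothetical assumption and is not part of this application.

```python
# Illustrative sketch: the transceiver device converts position
# information from the user terminal's coordinate space into a
# format suited to a given receiving-terminal manufacturer.
# All names, formats, and values here are assumptions.

def convert_position(position, src_resolution, dst_resolution):
    """Scale an (x, y) position from the user terminal's resolution
    to the receiving terminal's resolution."""
    sx, sy = src_resolution
    dx, dy = dst_resolution
    x, y = position
    return (x * dx // sx, y * dy // sy)

def to_manufacturer_format(position, manufacturer):
    """Wrap the converted position in a manufacturer-specific payload
    (hypothetical vendor names and field layouts)."""
    x, y = position
    if manufacturer == "vendor_a":
        return {"type": "touch", "x": x, "y": y}
    # fallback format for unknown manufacturers
    return {"pos": [x, y]}
```

For instance, under these assumptions, a touch at (540, 960) on a 1080x1920 user terminal would map to (1920, 1080) on a 3840x2160 receiving terminal.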
[0151] Although it is not illustrated in the drawings, the
transceiver device 1100 can include a communication unit, a signal
unit, and a format changer unit.
[0152] The communication unit may connect the user terminal 100 and
the receiving terminal 102.
[0153] The signal unit can transmit to the receiving terminal 102
the position information of the touch means, the combined image
data, or the selection information transmitted from the user
terminal 100, and can transmit the first image received from the
receiving terminal 102 to the user terminal 100.
[0154] The format changer unit can modify the position information,
combined image data, or selection information to the format of the
receiving terminal 102 or modify the first image to the format of
the user terminal 100.
[0155] In short, the user terminal 100 and the receiving terminal
102 can send or receive the position information, image data,
combined image data, or selection information by way of the
transceiver device 1100. In this case, the user terminal 100 or the
receiving terminal 102 connected to the transceiver device 1100
need not have a communication function.
[0156] According to another embodiment, the transceiver device 1100
may not only provide communication between the user terminal 100
and the receiving terminal 102 but may also sense the position of a
touch means 1102.
[0157] According to yet another embodiment, the transceiver device
1100 may serve as a communication means for the user terminal 100
and can provide a particular communication function, such as WiBro
communication, for example, and can also convert one communication
function into another, such as by converting WiBro to Wi-Fi, for use
by the user terminal 100.
[0158] FIG. 12 is a flowchart illustrating a method of linking and
controlling terminals according to an eighth embodiment.
[0159] Referring to FIG. 12, a receiver may be installed on the
receiving terminal 102 (S1200). In another embodiment, the receiver
can be built into the receiving terminal 102.
[0160] Next, the user terminal 100 and the receiving terminal 102
may begin linked operation (S1202).
[0161] Then, the user terminal 100 may transmit image data
corresponding to the first image to the receiving terminal 102 to
share the first image 110, or the receiving terminal 102 may
transmit the image data to the user terminal 100 to share the first
image 110 (S1204).
[0162] Next, when a touch means is brought near the user terminal
100 or touches the display unit of the user terminal 100 with a
pressure or an area smaller than or equal to a preset pressure or
preset area, the receiver may sense the position of the touch means
by way of infrared rays and ultrasonic waves emitted from the touch
means and transmit the sensed position of the touch means to the
receiving terminal 102, and the receiving terminal 102 may display
the second image, which represents the position thus obtained by
sensing, together with the first image (S1206).
[0163] Then, the user terminal 100 may transmit the position
information based on the movement of the touch means to the
receiving terminal 102, and the receiving terminal 102 can show the
movement of the touch means as a second image 112 or a third image
different from the second image.
[0164] Various methods for sensing a touch means will now be
described in more detail with reference to the accompanying
drawings.
[0165] FIG. 13 illustrates a method of sensing a touch means
according to a first embodiment.
[0166] Referring to FIG. 13, a touch pen 1300 can be used as the
touch means intended for touching the user terminal 100. A
capacitance-based touch panel may be used for the user terminal
100.
[0167] The touch pen 1300 may be composed of a body 1310 and a
touch part 1312. The body 1310 may be made of an electrically
non-conducting material, while the touch part 1312 may be a
conductor. Thus, because of the touch part 1312, a change in
capacitance may occur when the touch pen 1300 is brought near the
user terminal 100 or is touching the user terminal 100, and the
user terminal 100 can sense the touch pen 1300 based on the change
in capacitance.
[0168] FIG. 14 illustrates a method of sensing a touch means
according to a second embodiment.
[0169] Referring to FIG. 14, the user terminal 100 can include a
touch panel 1400 and an electromagnetic field generator unit
1402.
[0170] The electromagnetic field generator unit 1402 can be
connected to a rear surface of the touch panel 1400 and can be made
of a thin metal film to generate an electromagnetic field when
electricity is applied.
[0171] The touch pen 1404 may include a body 1410 and a touch part
1412, where the touch part 1412 can preferably be made of a small
metal coil. Consequently, when the touch pen 1404 is brought near
the touch panel 1400, electromagnetic induction may occur in the
touch part 1412, and as a result, an alteration may occur in the
electromagnetic field created by the electromagnetic field
generator unit 1402. Thus, the user terminal 100 may recognize the
position of the touch pen 1404 by sensing this alteration in the
electromagnetic field. In particular, since the alteration of the
electromagnetic field would differ according to the nearness and
touch strength of the touch pen 1404, this method of sensing the
touch means can finely sense the degree of proximity and the
touch pressure of the touch pen 1404 with respect to the touch
panel 1400.
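The proximity/pressure discrimination described above could be sketched as follows. The thresholds and the normalized scale are illustrative assumptions only; none of these values appear in the application.

```python
# Illustrative sketch: the alteration of the electromagnetic field
# differs with the pen's nearness and touch strength, so a measured
# alteration can be classified as hover vs. touch and mapped to a
# proximity or pressure level. All thresholds are assumptions.

HOVER_THRESHOLD = 0.1   # minimum alteration to register the pen at all
TOUCH_THRESHOLD = 0.5   # alteration at which contact is assumed

def classify_pen_state(field_alteration):
    """Return (state, level) from a normalized field alteration in [0, 1]."""
    if field_alteration < HOVER_THRESHOLD:
        return ("none", 0.0)
    if field_alteration < TOUCH_THRESHOLD:
        # hover: scale the hover range to 0..1 (1 = almost touching)
        level = (field_alteration - HOVER_THRESHOLD) / (TOUCH_THRESHOLD - HOVER_THRESHOLD)
        return ("hover", level)
    # touch: scale the touch range to 0..1 as a pressure estimate
    level = (field_alteration - TOUCH_THRESHOLD) / (1.0 - TOUCH_THRESHOLD)
    return ("touch", min(level, 1.0))
```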
[0172] FIG. 15A and FIG. 15B illustrate a method of sensing a touch
means according to a third embodiment.
[0173] Referring to FIG. 15A, a receiver 1500 can be installed on a
portion of the user terminal 100, and a touch pen 1502 can be
used.
[0174] The receiver 1500 can include an infrared sensor and two
ultrasonic sensors to sense the movement of the touch pen 1502 by
receiving the infrared rays and ultrasonic waves emitted from the
touch part (pen tip) of the touch pen 1502, and can transmit the
position information of the touch pen 1502 obtained in accordance
with the sensing results to the receiving terminal 102. The
receiving terminal 102 may display a second image that represents
the transmitted position information. Consequently, the second
image may be displayed together with the first image. Here, the
position information can be transmitted to the receiving terminal
102 by the receiver 1500 or by the user terminal 100.
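The arrangement above, one infrared sensor plus two ultrasonic sensors, admits a classic time-of-flight trilateration sketch: the infrared pulse arrives effectively instantly and marks the emission time, while the ultrasonic delay at each sensor gives a distance. The speed of sound and the sensor layout below are illustrative assumptions.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s at room temperature (assumption)

def distance_from_delay(dt_seconds):
    """The infrared pulse arrives effectively instantly; the ultrasonic
    delay dt then gives the pen-to-sensor distance."""
    return SPEED_OF_SOUND * dt_seconds

def pen_position(d1, d2, sensor_spacing):
    """Trilaterate the pen tip from its distances to two ultrasonic
    sensors assumed to sit at (0, 0) and (sensor_spacing, 0)."""
    L = sensor_spacing
    x = (d1 ** 2 - d2 ** 2 + L ** 2) / (2 * L)
    y = math.sqrt(max(d1 ** 2 - x ** 2, 0.0))
    return (x, y)
```

For example, with sensors 0.1 m apart and measured distances of 0.05 m and sqrt(0.0065) m, this sketch places the pen at (0.03, 0.04).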
[0175] According to another embodiment, the receiver 1500 can
perform not only the function of sensing the position of the touch
pen 1502 but also the function of transmitting image data and
position information to the receiving terminal 102. To be more
specific, the receiver 1500 can include a touch control unit 1510,
an image signal unit 1512, a control signal unit 1514, and a
transceiver unit 1516, as illustrated in FIG. 15B.
[0176] The touch control unit 1510 may serve to sense the position
of the touch pen 1502 by using the received infrared rays and
ultrasonic waves and provide the user terminal 100 with the
information on the sensed position. The user terminal 100 may show
the position of the touch pen 1502 or perform a related operation
in accordance with the information thus provided.
[0177] The image signal unit 1512 can be provided with image data
from the user terminal 100 and transmit the image data thus
provided to the receiving terminal 102 via the transceiver unit
1516.
[0178] The control signal unit 1514 may serve to transmit a control
signal, which includes the position information of the touch pen
1502 obtained above by sensing, to the receiving terminal 102. That
is, since the receiver 1500 transmits the image data transmitted
from the user terminal 100 and the position information of the
touch pen 1502 to the receiving terminal 102, the user terminal 100
does not have to include a communication function. Therefore, even
with a terminal that does not have a communication function or a
terminal that has a communication function but is unable to use the
related communication facilities, it is possible to recognize the
position and action of the touch pen 1502 using the receiver 1500
as well as to employ a linkage method according to an embodiment
for sharing images, displaying the second image, etc.
[0179] In another embodiment, the receiver can be incorporated into
the user terminal 100 to be implemented as a single body.
[0180] FIG. 16A and FIG. 16B illustrate a method of sensing a touch
means according to a fourth embodiment.
[0181] Referring to FIG. 16A, in a system for linking and
controlling terminals according to an embodiment, a receiver 1600
can be installed on or built into the receiving terminal 102 rather
than the user terminal 100.
[0182] When a touch pen 1602 serving as the touch means for the
user terminal 100 emits ultrasonic waves and infrared rays to the
receiver 1600 installed on the receiving terminal 102, the receiver
1600 may receive the ultrasonic waves and infrared rays to sense
the position of the touch pen 1602.
[0183] The receiver 1600 may transmit information regarding the
position of the touch pen 1602 thus sensed to the receiving
terminal 102, and the receiving terminal 102 may display the second
image representing the position of the touch pen 1602 together with
the first image. Of course, the user terminal 100 and the receiving
terminal 102 may display the second image while sharing the first
image.
[0184] According to another embodiment, the receiver 1600 can serve
not only to sense the position of the touch pen 1602 but also to
perform communication. To be more specific, the receiver 1600 can
include a touch means sensing unit 1610, an image signal unit 1612,
a control signal unit 1614, and a transceiver unit 1616.
[0185] The touch means sensing unit 1610 may serve to sense the
position of the touch pen 1602.
[0186] The image signal unit 1612 may receive the combined image
data transmitted from the user terminal 100 by way of the
transceiver unit 1616 and transmit the received image data by way
of the transceiver unit 1616 to the receiving terminal 102, or may
also receive image data from the receiving terminal 102 and
transmit the received image data to the user terminal 100.
[0187] The control signal unit 1614 can receive a control signal
transmitted from the user terminal 100 related to the linked
operation, etc., and can transmit the received control signal to
the receiving terminal 102. Of course, the position information of
the touch pen 1602 need not be transmitted from the user terminal
100 to the receiver 1600. As such, the receiver 1600 may provide
not only the function of sensing the position of the touch pen 1602
but also a communication function. Thus, a linking and control
method according to an embodiment can be used even when the
receiving terminal 102 does not have a communication function.
[0188] FIG. 17 is a block diagram illustrating the structure of a
user terminal according to an embodiment.
[0189] Referring to FIG. 17, the user terminal 100 of this
embodiment can include a control unit 1700, a linkage unit 1702, an
image unit 1704, a sensing unit 1706, a settings unit 1708, a display
unit 1710, a signal unit 1712, a transceiver unit 1714, a decoding
unit 1716, and a storage unit 1718.
[0190] The linkage unit 1702 may manage all functions related to
linkage with the receiving terminal 102.
[0191] The image unit 1704 can include an image generator unit and
an image changer unit and can display, via the display unit 1710,
an image corresponding to the image data transmitted from the
receiving terminal 102.
[0192] The image generator unit can generate a position image,
which may represent the position of a touch means that is
positioned within a preset distance from the display unit 1710,
generate a combined image, which may include the second image and
the first image, or generate a position image of a touch means that
is touching the display unit 1710 with a pressure or an area
smaller than or equal to a preset value.
[0193] The image changer unit can change the position image
according to the distance between the touch means and the user
terminal 100, and can change the position image according to the
pressure or area with which the touch means contacts the user
terminal 100. Also, the image changer unit can change the position
image according to whether or not the touch means is over an
event-executing entity or a preset position.
[0194] The sensing unit 1706 may serve to sense a touch means, such
as a finger or a touch pen, etc. More specifically, the sensing
unit 1706 can sense the position of the touch means, distinguishing
when the touch means is near and when it is touching. The method of
sensing is not limited to a particular method and can be a
capacitance-based method, an electromagnetic induction-based
method, etc. The information on the position of the touch means as
sensed by the sensing unit 1706 can also be generated by a position
information generating unit (not shown).
[0195] The settings unit 1708 may manage the settings of various
functions, such as linkage function settings, sensing level
settings, etc.
[0196] The display unit 1710 can be implemented in various ways
such as by using capacitance-based types, resistive overlay types,
electromagnetic induction types, etc. The display unit 1710 can be
a touch display unit equipped with a touch function.
[0197] The signal unit 1712 can include an information transmitting
unit 1720 and an information receiving unit 1722.
[0198] The information transmitting unit 1720 can transmit the
position information of a touch means or the selection information
for an event-executing entity to the receiving terminal 102. The
selection information can include position information
corresponding to a touch input for selecting an event-executing
entity or control information for executing an event-executing
entity according to a touch input.
[0199] The information receiving unit 1722 may receive image data
from the receiving terminal 102 corresponding to the entirety or a
portion of the first image.
[0200] The transceiver unit 1714 may serve as a communication
passageway to the receiving terminal 102.
[0201] The decoding unit 1716 may decode the image data received
from the receiving terminal 102.
[0202] The storage unit 1718 may store various data, such as the
first image, image signals, position information, control signals,
application programs, etc.
[0203] The control unit 1700 may control the overall operations of
the components of the user terminal 100.
[0204] Although it has not been described above, the user terminal
100 can further include a touch means unit, a receiver unit, an
electromagnetic field generator unit, or a touch means operating
unit.
[0205] The touch means unit may receive and manage information on
the position of the touch means when the receiver is connected with
the user terminal 100. That is, the receiver may sense the position
of the touch means, while the touch means unit may analyze the
signals transmitted from the receiver to detect the position of the
touch means.
[0206] The receiver unit may receive infrared rays and ultrasonic
waves transmitted from the touch means when the touch means is
brought near to or in contact with the user terminal 100, and may
analyze the infrared rays and ultrasonic waves thus received to
detect the position of the touch means.
[0207] The electromagnetic field generator unit may serve to create
an electromagnetic field for sensing the touch means by
electromagnetic induction, and may preferably be formed on a rear
surface of the display unit.
[0208] The touch means operating unit can perform a particular
operation when the touch means is brought near the user terminal
100 or is touching the user terminal 100 with a pressure or an area
smaller than or equal to a preset value. For example, a scroll
function can be performed if the touch means is brought near a
lower part of the display unit 1710 on the user terminal 100. In
this case, the position information of the touch means can be
transmitted to the receiving terminal 102 or a third image 114
corresponding to the position information can be displayed on the
display unit 1710. That is, the user terminal 100 can transmit the
position information of the touch means or display a third image
114 on the user terminal 100, in response to the touch means
approaching near or lightly touching it, to bring about a particular
operation such as scrolling, etc.
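The scroll example above might be sketched as follows. The screen height, the height of the bottom scroll zone, and the scroll step are all hypothetical values chosen for illustration.

```python
# Illustrative sketch: hovering near the lower part of the display
# unit triggers a scroll operation on the user terminal.
# Screen size, zone height, and step size are assumptions.

SCREEN_HEIGHT = 1920
SCROLL_ZONE = 150        # bottom strip of the screen, in pixels
SCROLL_STEP = 40         # pixels scrolled per hover event

def handle_hover(y, scroll_offset, content_height):
    """Scroll down while the touch means hovers in the bottom zone;
    otherwise keep the current offset unchanged."""
    if y >= SCREEN_HEIGHT - SCROLL_ZONE:
        max_offset = max(content_height - SCREEN_HEIGHT, 0)
        return min(scroll_offset + SCROLL_STEP, max_offset)
    return scroll_offset
```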
[0209] FIG. 18 is a block diagram illustrating the structure of a
user terminal according to another embodiment.
[0210] Referring to FIG. 18, the user terminal 100 of this
embodiment can include a control unit 1800, a display unit 1802, a
sensing unit 1804, a signal unit 1806, an image unit 1808, an
information provider unit 1810, and a mode changer unit 1812. The
user terminal 100 can include all or just some of the components
above. Also, the user terminal 100 can additionally include
components other than the components above.
[0211] The display unit 1802 may display a first image that is
shared by the user terminal 100 and the receiving terminal 102.
Also, the display unit 1802 can display a menu, etc., from which to
select a touch mode.
[0212] The sensing unit 1804 may sense the position of a touch
means by way of various methods such as those described above, when
the touch means is near or is touching the user terminal 100. Here,
the position information representing the position of the touch
means can be the position information of the touch means that is
positioned within a preset distance from the user terminal 100.
[0213] The signal unit 1806 may transmit the position information
of the touch means obtained by the sensing above to the receiving
terminal 102 and may transmit image data corresponding to the first
image shared by the user terminal 100 and the receiving terminal
102 to the receiving terminal 102.
[0214] The image unit 1808 may generate combined image data that is
to be shared by the user terminal 100 and the receiving terminal
102 or the second image that indicates the position of the touch
means.
[0215] The information provider unit 1810 can output information
according to the sensing results of the sensing unit 1804. That is,
nearness information can be outputted if a touch means is brought
near the user terminal 100 and sensed by the sensing unit 1804. The
nearness information can be in the form of vibration, sound, or
light, so as to stimulate the user's tactile, auditory, or visual
senses.
[0216] The information provider unit 1810 can provide the user with
tactile, auditory, or visual sensations in various forms according
to the state of nearness of the touch means with respect to the
user terminal 100, allowing the user of the user terminal 100 to
perceive a variety of such sensations.
[0217] For example, with the user terminal 100 having recognized a
near touch of the touch means, the information provider unit 1810
can provide a continuous vibration, sound, or light during a
movement of the touch means. That is, a short vibration can be
provided once when a near touch of the touch means is first
recognized, after which continuous vibrations can be provided when
the touch means moves. In other words, when a near touch of the
touch means is first recognized, a vibration can be provided for a
first duration, and afterwards when the touch means moves, a
vibration can be provided for a second duration. Here, the second
duration can be longer than the first duration.
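The two-duration rule above might be sketched as follows. The specific millisecond values are illustrative assumptions; the application only requires that the second duration can be longer than the first.

```python
# Illustrative sketch: a short vibration (first duration) when a near
# touch is first recognized, and a longer vibration (second duration)
# while the touch means subsequently moves. Durations are assumptions.

FIRST_DURATION_MS = 100    # short pulse on first recognition
SECOND_DURATION_MS = 400   # longer feedback during movement

def nearness_feedback(event, already_recognized):
    """Return (vibration_ms, now_recognized) for a sensing event."""
    if event == "near" and not already_recognized:
        return (FIRST_DURATION_MS, True)      # first recognition
    if event == "move" and already_recognized:
        return (SECOND_DURATION_MS, True)     # movement while near
    if event == "leave":
        return (0, False)                     # nearness state ended
    return (0, already_recognized)
```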
[0218] Alternatively, the information provider unit 1810 can
provide a vibration when a near touch is first recognized, and
afterwards provide a sound when the touch means is moving.
[0219] In another example, the information provider unit 1810 can
provide the user with nearness information in the form of sound,
etc., when the touch means is brought near a preset entity such as
a folder, control UI, etc., from among the images shown on the
screen of the user terminal 100. That is, as described above with
reference to FIGS. 6A and 6B, the position image of the touch means
can change when the touch means is placed at a preset position, and
at this time, the information provider unit 1810 can output
nearness information, where the nearness information can correspond
to event occurrence information. In this way, the user can perceive
the entity immediately.
[0220] The information provider unit 1810 can provide the nearness
information for a first duration when the sensing unit 1804
recognizes a nearness state of the touch means, and can provide the
nearness information for a second duration when the touch means
moves while the sensing unit 1804 is aware of the nearness state of
the touch means. The second duration can be a duration
corresponding to the duration for which the touch means moves while
the sensing unit 1804 is aware of the nearness state of the touch
means.
[0221] Also, the information provider unit 1810 can provide the
nearness information in different forms for a first case in which
the sensing unit 1804 recognizes a nearness state of a touch means
and a second case in which the touch means is moved while the
sensing unit 1804 is aware of the nearness state of the touch
means. As described above, a vibration can be provided for the
first case and a sound can be provided for the second case, or
vibrations of a first pattern can be provided for the first case
and vibrations of a second pattern can be provided for the second
case, so as to allow the user to perceive the movement of the touch
means.
[0222] If the touch means touches the display unit 1802 after the
sensing unit 1804 has recognized a nearness state, the information
provider unit 1810 may not provide the nearness information,
allowing the user of the user terminal to differentiate between a
near state and a direct touch.
[0223] After the touch means has touched the display unit 1802, if
the sensing unit 1804 recognizes a nearness state of the touch
means for a second time, the information provider unit 1810 can
provide the nearness information. Once the touch means is brought
near the user terminal 100 and touches the user terminal 100, the
touch means may be separated from the user terminal 100. That is,
since the purpose of the touch has been fulfilled, the touch means
may be separated from the user terminal 100 to proceed with the
next operation, at which time a nearness state may occur again. In
this case, since the touch means is put in a nearness state for the
first time after touching the display unit 1802 and was not
intentionally placed in a nearness state by the user, the
information provider unit 1810 may not provide nearness
information. Then, when a nearness state occurs for the second
time, i.e. the user intentionally triggers a nearness state, the
information provider unit 1810 can provide nearness
information.
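The suppression rule of paragraph [0223] can be captured as a small state machine: after a direct touch, the first nearness state is treated as the pen withdrawing (no feedback), and only a second nearness state produces nearness information again. The class and event names below are hypothetical.

```python
# Illustrative sketch of the post-touch suppression rule.
# Event names ("touch", "near") are assumptions for illustration.

class NearnessFeedback:
    def __init__(self):
        self.suppress_next_near = False

    def on_event(self, event):
        """Return True if nearness information should be provided."""
        if event == "touch":
            # Direct touch: no nearness info, and suppress the
            # nearness state that follows pen lift-off.
            self.suppress_next_near = True
            return False
        if event == "near":
            if self.suppress_next_near:
                self.suppress_next_near = False
                return False           # unintentional post-touch nearness
            return True                # intentional nearness
        return False
```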
[0224] The mode changer unit 1812 may change the touch mode
according to the user's input, where the touch mode can include a
near touch mode 1820 and a direct touch mode 1822. Here, the user
input can be made by way of a switch, which can be configured in the
manner of a ringer/vibrate toggle switch.
[0225] In the near touch mode 1820, the position image of a nearby
touch means can be displayed on the user terminal 100 or the
receiving terminal. Also, the position image of a touch means
touching the user terminal 100 with a pressure level or area
smaller than or equal to a preset value can be displayed on the
user terminal 100 or the receiving terminal.
[0226] In the near touch mode, the user terminal 100 may recognize
the position of a touch means that is near the user terminal 100
for a touch input of the touch means, and in the direct touch mode,
the user terminal 100 may recognize a touch by the touch means as a
touch input.
[0227] According to an embodiment, the mode changer unit 1812 can
change the touch mode at the beginning of linked operation between
the user terminal 100 and the receiving terminal 102 or change the
touch mode according to the user's command, e.g. the user's touch,
voice data, or visual data. For example, the user can change the
touch mode by selecting on a menu shown on the user terminal 100,
or the touch mode can be changed by the user's voice or a visual
sequence such as motion.
[0228] The touch mode change function described above can be
provided when the linked operation of the user terminal 100 and the
receiving terminal 102 begins, or can be provided in the user
terminal 100 regardless of linked operation. Also, the touch mode
change function for linked operation can be provided automatically
when the linked operation begins or can be provided after linking
when the user selects the function. With a small screen as on a
smart phone, it can be useful to sense a touch means that is nearby
and show a corresponding image on the smart phone, and in such a
device, it may be advantageous to provide the touch mode change
function regardless of linked operation.
[0229] According to another embodiment, the user terminal 100 can
be provided with a link mode in addition to the near touch mode and
direct touch mode. For example, if a user carrying a user terminal
100 such as a smart phone or a tablet PC, etc., wishes to link it
to a receiving terminal 102, the user can select the link mode.
When the user selects the link mode, the user terminal 100 can
search for display apparatuses close by, begin linking with a
searched receiving terminal 102, and display a menu from which to
select a near touch mode and a direct touch mode on the user
terminal 100 at the beginning of the linked operation.
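The link-mode sequence just described, searching for nearby display apparatuses, linking with one that is found, then offering the touch-mode menu, might be sketched as below. The device functions are injected as parameters because discovery, linking, and menu display are platform-specific; all names are hypothetical.

```python
# Illustrative sketch of the link-mode flow of this embodiment.
# discover_displays, link, and show_menu stand in for
# platform-specific operations; all names are assumptions.

def enter_link_mode(discover_displays, link, show_menu):
    """Search for display apparatuses, link with a found receiving
    terminal, then let the user pick a touch mode."""
    displays = discover_displays()
    if not displays:
        return None                  # nothing nearby to link with
    target = displays[0]             # e.g. the nearest searched terminal
    link(target)
    mode = show_menu(["near touch mode", "direct touch mode"])
    return (target, mode)
```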
[0230] While it is not described above, the selection of the touch
mode change can be achieved by methods other than selecting a menu
displayed on the user terminal 100, such as by pressing a button
provided on a side surface, a front surface, etc., of a smart
phone.
[0231] The control unit 1800 may control the overall operations of
the components of the user terminal 100.
[0232] FIG. 19 is a block diagram illustrating the structure of a
receiving terminal according to an embodiment.
[0233] Referring to FIG. 19, a receiving terminal 102 based on this
embodiment can include a control unit 1900, a linkage unit 1902, a
transceiver unit 1904, a signal unit 1906, an image unit 1908, a
display unit 1910, an operation executing unit 1912, and a storage
unit 1914.
[0234] The linkage unit 1902 may manage the function of linking to
the user terminal 100.
[0235] The transceiver unit 1904 may serve as a communication
passageway to the user terminal 100.
[0236] The signal unit 1906 can include an information transmitting
unit 1920 and an information receiving unit 1922.
[0237] The information transmitting unit 1920 may transmit image
data corresponding to the first image to the user terminal 100.
[0238] The information receiving unit 1922 can receive the position
information of the touch means or the selection information of the
event-executing entity that is transmitted from the user terminal
100.
[0239] The image unit 1908 may display the first image through the
display unit 1910, and may display the second image corresponding
to the position information of the touch means received above,
together with the first image. According to another embodiment, the
image unit 1908 can combine the first image with the second image
and display the combined image, i.e. the result of the combining,
through the display unit 1910.
[0240] The display unit 1910 is not limited to a particular type as
long as it is capable of displaying images, and can be implemented,
for example, as an LCD, OLED, PDP, etc. The display unit 1910 does
not necessarily require a touch function.
[0241] The operation executing unit 1912 can execute the operation
corresponding to the selection information of the event-executing
entity.
[0242] The storage unit 1914 may store various data such as the
first image, the second image, a combined image, application
programs, etc.
[0243] The control unit 1900 may control the overall operations of
the components of the receiving terminal 102.
[0244] Components in the embodiments described above can be easily
understood from the perspective of processes. That is, each
component can also be understood as an individual process.
Likewise, processes in the embodiments described above can be
easily understood from the perspective of components.
[0245] Also, the technical features described above can be
implemented in the form of program instructions that may be
performed using various computer means and can be recorded in a
computer-readable medium. Such a computer-readable medium can
include program instructions, data files, data structures, etc.,
alone or in combination. The program instructions recorded on the
medium can be designed and configured specifically for the present
invention or can be of a kind known to and used by those skilled in
the field of computer software. Examples of a
computer-readable medium may include magnetic media such as hard
disks, floppy disks, magnetic tapes, etc., optical media such as
CD-ROMs, DVDs, etc., magneto-optical media such as floptical
disks, etc., and hardware devices such as ROM, RAM, flash memory,
etc. Examples of the program of instructions may include not only
machine language codes produced by a compiler but also high-level
language codes that can be executed by a computer through the use
of an interpreter, etc. The hardware mentioned above can be made to
operate as one or more software modules that perform the actions of
the embodiments, and vice versa.
[0246] The embodiments described above are disclosed only for
illustrative purposes. A person having ordinary skill in the art
would be able to make various modifications, alterations, and
additions without departing from the spirit and scope, but it is to
be appreciated that such modifications, alterations, and additions
are encompassed by the scope of claims set forth below.
* * * * *