U.S. patent application number 17/573,192 was published by the patent office on 2022-06-30 as publication number 20220207178 for privacy enforcement via localized personalization. The applicant listed for this patent is Intel Corporation. The invention is credited to Nathan Heldt-Sheller, Ned M. Smith, and Thomas G. Willis.

United States Patent Application 20220207178
Kind Code: A1
Smith; Ned M.; et al.
June 30, 2022
PRIVACY ENFORCEMENT VIA LOCALIZED PERSONALIZATION
Abstract
This disclosure is directed to privacy enforcement via localized
personalization. An example device may comprise at least a user
interface to present content. A message may be received into a
trusted execution environment (TEE) situated within the device or
remotely, the message including at least metadata and content. The
TEE may determine relevance of the content to a user based on the
metadata and user data. Based on the relevance, the TEE may cause
the content to be presented to the user via the user interface. In
one embodiment, the TEE may be able to personalize the content
based on the user data prior to presentation. If the content
includes an offer, the TEE may also be able to present
counteroffers to the user based on user interaction with the
content. The TEE may also be able to cause feedback data to be
transmitted to at least the content provider.
Inventors: Smith; Ned M. (Beaverton, OR); Heldt-Sheller; Nathan (Portland, OR); Willis; Thomas G. (Portland, OR)

Applicant: Intel Corporation, Santa Clara, CA, US

Appl. No.: 17/573,192

Filed: January 11, 2022
Related U.S. Patent Documents

Application Number    Filing Date     Patent Number
15/039,021            May 24, 2016    11,244,068
PCT/US2013/077653     Dec 24, 2013
International Class: G06F 21/62 (20060101); G06F 17/00 (20060101); G06F 21/60 (20060101); G06Q 30/02 (20060101)
Claims
1. A device configured for privacy enforcement, comprising: a
communication module to interact with at least a content provider;
a user interface module to present content to a user; and a trusted
execution environment (TEE) to: receive a message from the content
provider via the communication module, the message including at
least metadata and content; determine relevance of the content to
the user based on at least one of the metadata and user data; and
cause the content to be presented to the user via the user
interface module based on the relevance of the content.
Description
PRIORITY APPLICATION
[0001] This application is a continuation of U.S. application Ser.
No. 15/039,021, filed May 24, 2016, which is a U.S. National Stage
Application under 35 U.S.C. 371 from International Application No.
PCT/US2013/077653, filed on Dec. 24, 2013, published as WO
2015/099697, all of which are incorporated herein by reference in
their entirety.
TECHNICAL FIELD
[0002] The present disclosure relates to data security, and more
particularly, to a scheme for allowing message reception,
personalization, interaction, etc. while protecting personal
data.
BACKGROUND
[0003] Electronic communication has become well-integrated in
various aspects of modern society. A user may not simply benefit
from being able to access different types of content using various
devices at virtually any time, but this convenience may develop
into reliance. This level of attention to a particular information
source may be attractive to certain content providers. For example,
governmental entities may broadcast important information to their
constituents, educational institutions may provide information to
students and parents, and of course, business concerns may seek to
deliver advertisements and similar content to potential consumers.
These institutions may not desire to indiscriminately blanket all
existing devices with a variety of messages. In addition to the
potential to alienate their target audience with a barrage of
irrelevant information, the additional traffic flowing through
wired and/or wireless communication mediums may cause performance
issues that may further enrage the audience they wish to capture.
Thus, many content providers attempt to direct their communications
to specific parties that may have interest in the content or in
products advertised in the content.
[0004] In tailoring content delivery to certain audiences, content
providers may require some information about the people using these
devices. However, users may desire the ability to exercise control
over how their personal information is disseminated (e.g., to guard
against being overwhelmed by an avalanche of offers,
advertisements, etc.). More importantly, as users become more
reliant upon their various electronic devices, there is a
correspondingly increasing concern about private data getting into
the wrong hands. For example, users may store a large amount of
private information on their devices including data that identifies
the user, where the user lives, where the user works, the user's
medical conditions, the user's financial accounts, the user's
relatives, friends, etc. The fear of this information possibly
being obtained by people having mischievous or even criminal
intentions may cause users to resist receiving content that they
may otherwise have enjoyed or otherwise benefited from.
BRIEF DESCRIPTION OF THE DRAWINGS
[0005] Features and advantages of various embodiments of the claimed subject matter will become apparent as the following Detailed Description proceeds, and upon reference to the Drawings, wherein like numerals designate like parts, and in which:
[0007] FIG. 1 illustrates an example system configured for privacy
enforcement via localized personalization in accordance with at
least one embodiment of the present disclosure;
[0008] FIG. 2 illustrates an example configuration for a device in
accordance with at least one embodiment of the present
disclosure;
[0009] FIG. 3 illustrates an example message in accordance with at
least one embodiment of the present disclosure;
[0010] FIG. 4 illustrates example instructions for dimension
matching in accordance with at least one embodiment of the present
disclosure;
[0011] FIG. 5 illustrates example instructions for content
personalization in accordance with at least one embodiment of the
present disclosure;
[0012] FIG. 6 illustrates example feedback in accordance with at
least one embodiment of the present disclosure;
[0013] FIG. 7 illustrates example operations for message generation
and feedback reception in accordance with at least one embodiment
of the present disclosure; and
[0014] FIG. 8 illustrates example operations for privacy
enforcement via localized personalization in accordance with at
least one embodiment of the present disclosure.
[0015] Although the following Detailed Description will proceed
with reference being made to illustrative embodiments, many
alternatives, modifications and variations thereof will be apparent
to those skilled in the art.
DETAILED DESCRIPTION
[0016] This disclosure is directed to privacy enforcement via
localized personalization. An example device may comprise at least
a user interface to present content. A message may be received into
a trusted execution environment (TEE) situated within the device or
remotely, the message including at least metadata and content. The
TEE may determine relevance of the content to a user based on the
metadata and user data. Based on the relevance, the TEE may cause
the content to be presented to the user via the user interface. In
one embodiment, the TEE may be able to personalize the content
based on the user data prior to presentation. If the content
includes an offer, the TEE may also be able to present
counteroffers to the user based on user interaction with the
content. The TEE may also be able to cause feedback data to be
transmitted to at least the content provider.
[0017] In one embodiment, an example device configured for privacy
enforcement may comprise at least a communication module, a user
interface module and a TEE. The communication module may be to
interact with at least a content provider. The user interface
module may be to present content to a user. The TEE may be to
receive a message from the content provider via the communication
module, the message including at least metadata and content, to
determine relevance of the content to the user based on at least
one of the metadata and user data and to cause the content to be
presented to the user via the user interface module based on the
relevance of the content.
[0018] For example, the TEE may be situated in the device or
outside of the device in at least one computing device. The TEE may
comprise a secure memory space accessible to only applications
verified as safe by the TEE. The metadata may comprise at least
public routing data and private criteria. In one example
implementation, at least the private criteria are encrypted and the
TEE is further to decrypt the private criteria. The private
criteria may comprise dimension matching criteria including
instructions for determining the relevance of the content. The TEE
may further be to personalize the content prior to presentation
based on personalization criteria also included in the private
criteria, the personalization criteria including instructions for
altering the content based on the user data. The TEE may further be
to cause additional content to be presented via the user interface
module based on counter offer criteria also included in the private
criteria, the counter offer criteria including instructions for
presenting additional content based on the interaction between the
user and the presented content. The private criteria may also
comprise feedback criteria including instructions for collecting
the feedback data based on at least one of the user data and
interaction between the user and the presented content. In this
regard, the TEE may further be to cause the feedback data to be
collected based on the interaction and cause the feedback data to
be transmitted to at least the content provider. The feedback data
may comprise, for example, at least privacy protected data
resulting from the interaction and sanitized user data, the TEE
being further to cause the communication module to transmit the
privacy protected data to the content provider and to transmit the
sanitized user data to an anonymous data accumulator.
[0019] The device may further comprise a data collection module to
collect the user data from at least one of user interaction with
the device, sensors in the device or data sources outside the
device. The TEE may further be to cause the user interface module
to present a notification informing the user regarding availability
of the content. An example method consistent with embodiments of
the present disclosure may comprise receiving a message in a TEE
from a content provider, the message including at least metadata
and content, determining relevance of the content to a user based
on at least one of the metadata and user data and causing the
content to be presented to the user based on the relevance of the
content.
[0020] FIG. 1 illustrates an example system configured for privacy
enforcement via localized personalization in accordance with at
least one embodiment of the present disclosure. System 100 may
comprise, for example, device 102 and content provider 104.
Examples of device 102 may include, but are not limited to, a mobile communication device such as a cellular handset or a smartphone based on the Android® OS, iOS®, Windows® OS, Blackberry® OS, Palm® OS, Symbian® OS, etc., a mobile computing device such as a tablet computer like an iPad®, Surface®, Galaxy Tab®, Kindle Fire®, etc., an Ultrabook® including a low-power chipset manufactured by Intel Corporation, a netbook, a notebook, a laptop, a palmtop, etc., or a typically stationary computing device such as a desktop computer, a smart television, etc. Content provider 104 may be situated
remotely from device 102, and may comprise at least one computing
device accessible via a local area network (LAN) or a wide area
network (WAN) such as the Internet. An example of content provider
104 may include one or more servers organized in a cloud computing
configuration.
[0021] System 100 may further comprise, for example, TEE module
106, data collection module 108, user data module 110, context data
module 112 and user interface module 114. User interface module 114
may be in device 102 (e.g., content may be presented to a user of
device 102 via user interface module 114). However, as indicated by dashed
line 136, modules 106 to 112 may be flexibly arranged consistent
with the present disclosure. For example, while any or all of
modules 106 to 112 may be located in device 102, it is also
possible for any of these modules to be located remotely from
device 102 (e.g., supported by at least one server in a
cloud-computing configuration similar to content provider 104).
There are advantages to both configurations. Having modules 106 to
112 located within device 102 may improve the security of the data
handled by these modules (e.g., there is no need to expose data in
cloud-based servers, during transmission to device 102, etc.).
However, moving the functionality associated with modules 106 to
112 to a remote device may reduce the data processing load on
device 102 and allow for implementation of system 100 using a
broader range of devices.
[0022] TEE module 106 may be a secure workspace in which known-good
programs may execute, confidential information may be stored in a
secure manner, etc. In general, TEE module 106 may comprise a set
of computing resources that are secure such that programs executing
within TEE module 106, and any data associated with the executing
programs, are isolated. The programs/data cannot be interfered with
or observed during program execution with the exception that the
program may be started or stopped and the associated data may be
inserted or deleted. The insertion of data may be unobserved, and
thus not interfered with, and any data leaving TEE module 106 is
released in a controlled manner. Consistent with the present
disclosure, at least one known-good program executing within TEE
module 106 may perform any or all operations disclosed herein in
regard to TEE module 106. In one example implementation, TEE module
106 may utilize Software Guard Extensions (SGX) technology
developed by the Intel Corporation. SGX may provide a secure and
hardware-encrypted computation and storage area inside of the
system memory, the contents of which cannot be deciphered by
privileged code or even through the application of hardware probes
to the memory bus. When TEE module 106 is protected by SGX, embodiments
consistent with the present disclosure make it impossible for an
intruder to decipher the contents of TEE module 106. Protected data
cannot be observed outside of SGX, and thus, is inaccessible
outside of SGX.
[0023] In an example implementation wherein TEE module 106 resides
within SGX, the identity of programs (e.g., based on a
cryptographic hash measurement of each program's contents) may be
signed and stored inside each program. When the programs are then
loaded, the processor verifies that the measurement of the program
(e.g., as computed by the processor) is identical to the
measurement previously embedded inside the program. The signature
used to sign the embedded measurement is also verifiable because
the processor is provided with a public key used to verify the
signature at program load time. In this way, malware cannot tamper with
the program without also altering its verifiable measurement.
Malware also cannot spoof the signature because the signing key is
secure with the program's author. Thus, the software may not be
read, written to or altered by any malware. Moreover, data may also
be protected in TEE module 106. For example, known-good programs in
TEE module 106 may encrypt data such as keys, passwords, licenses,
etc. so that only verified good programs may decrypt this
information. While only one TEE module 106 is disclosed in device
102, it is also possible for a plurality of TEE modules 106 to
exist. The use of a plurality of TEE modules 106 may increase
security in device 102 in that if one TEE module 106 is compromised
the security of the remaining separate TEE modules 106 remains
intact.
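The load-time check described above can be condensed into a short sketch. The following Python fragment is illustrative only: SGX performs the measurement and signature verification in hardware, and the Ed25519 primitives from the third-party cryptography package merely stand in for the processor's built-in verification logic.

```python
# Illustrative sketch of the measurement-and-signature check described
# above; SGX performs this in hardware, not in Python.
import hashlib
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

# The author measures (hashes) the program contents and signs the
# measurement with a private key that never leaves the author.
author_key = ed25519.Ed25519PrivateKey.generate()
program = b"...known-good program contents..."
measurement = hashlib.sha256(program).digest()
signature = author_key.sign(measurement)

def verify_at_load(program: bytes, measurement: bytes, signature: bytes,
                   author_public_key: ed25519.Ed25519PublicKey) -> bool:
    """Recompute the measurement and verify the author's signature."""
    if hashlib.sha256(program).digest() != measurement:
        return False  # program altered after signing
    try:
        author_public_key.verify(signature, measurement)
        return True
    except InvalidSignature:
        return False  # signature forged or measurement substituted

assert verify_at_load(program, measurement, signature,
                      author_key.public_key())
```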
[0024] Data collection module 108 may be configured to collect data
regarding the status of device 102, a user of device 102, an
environment in which the device is operating, etc. This data may be
provided by various resources including, but not limited to, data
stored within the device, sensors in the device, a LAN or WAN like
the Internet, etc. For example, data collection module 108 may
collect user data including, but not limited to, data identifying
at least one user of device 102, background information regarding
the at least one user's gender, age, ethnicity, education,
residence, employment, interests, marital status, relations (e.g.,
relatives, friends, business associates, etc.), clubs, affiliations
and any other data that may be used to, for example, target content
distributed by content provider 104. Data collection module 108 may
also collect data regarding the context of device 102 including
device statistics (e.g., utilization, power level, running and/or
loaded applications, etc.), environmental information regarding
current and/or historical location data for device 102 (e.g., as
determined by Global Positioning System (GPS) coordinates, cellular
network registration, access points (APs) sensed in proximity to
device 102, etc.), other devices sensed in proximity to device 102,
etc.
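The categories enumerated above suggest, roughly, two records: one for user data and one for device context. The sketch below is an illustrative subset; the disclosure does not fix a schema for user data 122 or context data 124, and every field name here is an assumption.

```python
# Illustrative subset of the data categories enumerated above; the
# disclosure does not fix a schema for user data 122 or context data
# 124, and every field name here is an assumption.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class UserData:                      # output of user data module 110
    user_id: str = ""
    gender: str = ""
    age: int = 0
    residence: str = ""
    interests: list[str] = field(default_factory=list)

@dataclass
class ContextData:                   # output of context data module 112
    battery_level: float = 1.0
    running_apps: list[str] = field(default_factory=list)
    gps_fix: Optional[tuple[float, float]] = None  # current location, if known
    nearby_aps: list[str] = field(default_factory=list)  # sensed access points
```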
[0025] User data module 110 may receive entered data 118 from user
interface module 114 (e.g., data manually entered by the user,
sensed biometric data, etc.) and collected data 120A from data
collection module 108 (e.g., collected from local or remote data
sources, sensed by sensors in device 102, etc.). User
identification may be important where, for example, there is more
than one user for device 102 (e.g., where device 102 may be shared
between family members, coworkers, etc.). User data module 110 may
process received data 118 and 120A to generate user data 122.
Context data module 112 may receive collected data 120B from data
collection module 108 (e.g., data pertaining to the current
condition of device 102, the environment in which device 102 is
operating, etc.). Context data module 112 may process data 120B to
generate context data 124. TEE module 106 may utilize user data 122
and/or context data 124 when processing message 116 into
personalized content 126. Personalized content may be content
delivered to device 102 from content provider 104 via message 116
that has been modified based on user data 122 and/or context data
124. Personalized content 126 may then be provided to user
interface module 114 for presentation and/or interaction.
[0026] User interface module 114 may be capable of more than one
mode of presentation and/or interaction in regard to personalized
content 126. In one embodiment, user interface module 114 may
present notification 130 to a user of device 102 informing the user
that message 116 was received and/or personalized content 126 is
available. Notification 130 may be a visible or audible
notification to the user, and may be as simple as a small indicator
on the display of device 102, a modification to an object already
displayed on the display (e.g., superimposing an indicator over an
object, changing the appearance of an object, etc.), an audible
alert to the user, etc. In one embodiment, notification 130 may
automatically or manually (e.g., via user interaction) cause
presentation application 128 to be activated in device 102 or
selected for interaction if already active. It may also be possible
for notification 130 to be presented on device 102 (e.g., a smart
phone or other mobile device), which prompts the user to activate
presentation application 128 on another device (e.g., a more
powerful device such as tablet computer, laptop computer, etc.).
Presentation application 128 may be any program capable of
presenting personalized content 126 including, but not limited to,
browser applications, multimedia applications, a proprietary viewer
associated with content provider 104, etc. The user may then
interact with personalized content 126 as shown at 132. Content
interaction 132 may comprise, for example, the user reading the
content and then interacting with user interface module 114 to
answer questions presented by the content, place purchase orders
for goods described in the content, receive counteroffers if
initially presented offers are declined, etc. TEE module 106 may then
provide feedback 134 to content provider 104. Feedback 134 may
comprise at least the result of content interaction 132 including,
for example, the responses of the user to queries in personalized
content 126, responses to offers/counteroffers proposed by
personalized content 126, metrics regarding the user's interaction
with personalized content 126 (e.g., duration of the interaction,
sensed biometric information such as user eye focus on personalized
content 126, sensed sounds during the interaction, etc.). In one
embodiment, feedback 134 may further comprise user data 122 and/or
context data 124. This data may be employed by content providers
104 for targeting message 116, for optimizing the content in
message 116, etc. Due to privacy and/or safety concerns, the user
data 122 and/or context data 124 provided in feedback 134 may be
filtered and/or sanitized prior to transmission.
[0027] At least one benefit that may be realized from system 100 is
the capability for content provider 104 to deliver personalized
content 126 to a user of device 102 without causing the user to fear for their personal/confidential data. Since the
personalization may occur on the terms of the user (e.g., within
the device, within a cloud solution configured by the user), the
level of security enforcement is totally within the user's control.
Moreover, content provider 104 may also get feedback 134, but again
this interaction may be controlled entirely by the user. For
example, the user may establish rules dictating what categories of
data may be divulged to content provider 104, how much data, how
the data is filtered/sanitized, etc.
[0028] FIG. 2 illustrates an example configuration for a device in
accordance with at least one embodiment of the present disclosure.
In particular, device 102' may be able to perform example
functionality such as disclosed in FIG. 1. However, device 102' is
meant only as an example of equipment usable in embodiments
consistent with the present disclosure, and is not meant to limit
these various embodiments to any particular manner of
implementation.
[0029] Device 102' may comprise system module 200 to manage device
operations. System module 200 may include, for example, processing
module 202, memory module 204, power module 206, user interface
module 114' and communication interface module 208. Device 102' may
also include at least communication module 210 and TEE module 106'.
While communication module 210 and TEE module 106' have been shown
separately from system module 200, the example implementation of
device 102' has been provided merely for the sake of explanation
herein. Some or all of the functionality associated with
communication module 210 and/or TEE module 106' may also be
incorporated within system module 200. In device 102', processing
module 202 may comprise one or more processors situated in separate
components, or alternatively, may comprise one or more processing
cores embodied in a single component (e.g., in a System-on-a-Chip
(SoC) configuration) and any processor-related support circuitry
(e.g., bridging interfaces, etc.). Example processors may include,
but are not limited to, various x86-based microprocessors available
from the Intel Corporation including those in the Pentium, Xeon,
Itanium, Celeron, Atom, Core i-series product families, Advanced
RISC (e.g., Reduced Instruction Set Computing) Machine or "ARM"
processors, etc. Examples of support circuitry may include chipsets
(e.g., Northbridge, Southbridge, etc. available from the Intel
Corporation) configured to provide an interface through which
processing module 202 may interact with other system components
that may be operating at different speeds, on different buses, etc.
in device 102'. Some or all of the functionality commonly
associated with the support circuitry may also be included in the
same physical package as the processor (e.g., such as in the Sandy
Bridge family of processors available from the Intel
Corporation).
[0030] Processing module 202 may be configured to execute various
instructions in device 102'. Instructions may include program code
configured to cause processing module 202 to perform activities
related to reading data, writing data, processing data, formulating
data, converting data, transforming data, etc. Information (e.g.,
instructions, data, etc.) may be stored in memory module 204.
Memory module 204 may comprise random access memory (RAM) and/or
read-only memory (ROM) in a fixed or removable format. RAM may
include memory to hold information during the operation of device
102' such as, for example, static RAM (SRAM) or dynamic RAM (DRAM).
ROM may comprise memories utilizing a Basic Input/output System
(BIOS) or Unified Extensible Firmware Interface (UEFI) for
performing boot operations, programmable memories such as, for
example, electronic programmable ROMs (EPROMS), Flash, etc. Memory
module 204 may also comprise magnetic memories including, for
example, floppy disks, fixed/removable hard drives, etc.,
electronic memories including, for example, solid state flash
memory (e.g., embedded multimedia card (eMMC), etc.), removable
cards/sticks (e.g., micro storage devices (uSD), USB, etc.),
optical memories including, for example, compact disc ROM (CD-ROM),
digital video disc (DVD), etc.
[0031] Power module 206 may include internal power sources (e.g., a
battery, fuel cell, etc.) and/or external power sources (e.g.,
electromechanical or solar generation, power grid, etc.), and
related circuitry configured to supply device 102' with the energy
needed to operate. User interface module 114' may include equipment
and/or software to allow users to interact with device 102' such
as, for example, various input mechanisms (e.g., microphones,
switches, buttons, knobs, keyboards, speakers, touch-sensitive
surfaces, one or more sensors configured to capture images, video
and/or to sense proximity, distance, motion, gestures, orientation,
etc.) and various output mechanisms (e.g., speakers, displays,
lighted/flashing indicators, electromechanical components for
vibration, motion, etc.). The above example equipment associated
with user interface module 114' may be incorporated within device
102' and/or may be coupled to device 102' via a wired or wireless
communication medium.
[0032] Communication interface module 208 may handle packet routing
and other control functions for communication module 210, which may
include resources configured to support wired and/or wireless
communications. Wired communications may include serial and
parallel wired mediums such as, for example, Ethernet, Universal
Serial Bus (USB), Firewire, Digital Video Interface (DVI), High-Definition Multimedia Interface
(HDMI), etc. Wireless communications may include, for example,
close-proximity wireless mediums (e.g., radio frequency (RF) such
as based on the Near Field Communications (NFC) standard, infrared
(IR), optical character recognition (OCR), magnetic character
sensing, etc.), short-range wireless mediums (e.g., Bluetooth,
WLAN, Wi-Fi, etc.) and long range wireless mediums (e.g., cellular
wide-area radio communication technology, satellite-based
communications, etc.). In one embodiment, communication interface
module 208 may prevent interference between different active
wireless links in communication module 210. In performing this
function, communication interface module 208 may schedule
activities for communication module 210 based on, for example, the
relative priority of messages awaiting transmission.
[0034] In the embodiment illustrated in FIG. 2, TEE module 106' may
interact with at least processing module 202, memory module 204 and
communication module 210. For example, processing module 202 and/or memory module 204 may perform the operations associated with data
collection module 108, user data module 110 and context data module
112. Collected data 120A and 120B may be processed into user data 122 and/or context data 124 by processing module 202, and the results may be stored in memory module 204. Message 116 may be received into TEE module 106' via communication module 210, and at least one application 128' in TEE module 106' may generate personalized content 126 by personalizing the content in message 116 based on the user data 122 and/or context data 124 stored in memory module 204. Personalized
content 126 may then be stored in memory module 204 in preparation
for presentation to a user of device 102' (e.g., after the user
receives notification 130 as to the availability of personalized
content 126 triggered, for example by TEE module 106').
[0035] FIG. 3 illustrates an example message in accordance with at
least one embodiment of the present disclosure. In one embodiment,
message 116' may comprise at least metadata 300 and content 302.
Content 302 may comprise text, images, video, audio, user interface
objects, etc. to be presented to a user of device 102. Metadata 300
may comprise data for routing message 116' and/or data regarding
how functionality should be carried out with respect to content 302
and/or collecting data for feedback 134. A more detailed example of
metadata 300 is illustrated at 300' in FIG. 3. Public routing data
304 in metadata 300' may comprise general information for getting
message 116' to device 102. For example, public routing data 304
may comprise a broad category of devices to receive message 116'
such as, for example, a certain type of device, devices
communicating on a certain network, devices associated with users
in a broad category defined by gender, profession, etc. In
practice, it may be beneficial for content provider 104 to allow
nothing meaningful to be gained from public routing data 304 in
regard to the strategy for disseminating content 302.
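As a rough illustration of the structure shown in FIG. 3, message 116' can be modeled as a pair of nested records, for example as below. The field names are hypothetical; the disclosure does not prescribe a wire format.

```python
# Hypothetical in-memory model of message 116' from FIG. 3; the field
# names are illustrative, not a wire format defined by the disclosure.
from dataclasses import dataclass, field

@dataclass
class PublicRoutingData:             # element 304: broad, non-sensitive routing only
    device_types: list[str] = field(default_factory=list)
    networks: list[str] = field(default_factory=list)
    broad_audience: str = ""         # e.g., a coarse demographic label

@dataclass
class Metadata:                      # element 300
    public_routing: PublicRoutingData
    private_criteria: bytes          # element 306: opaque until decrypted in the TEE

@dataclass
class Message:                       # message 116'
    metadata: Metadata
    content: bytes                   # element 302: text, images, video, UI objects
```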
[0036] In one embodiment, at least some of metadata 300' may be
encrypted in a manner that only device 102 may decrypt.
Traditionally, TEE module 106 would be required to produce a public
key compatible with the key the content provider 104 used to
encrypt private criteria 306. This traditional approach has the
problem of the public key uniquely identifying device 102 to at
least content provider 104 (and likewise the users associated with
device 102, which may be undesirable for these users in the
instance that content provider 104 is an advertiser or marketer).
Moreover, private criteria 306 would have to be
customized/re-encrypted for each device 102. Such customization may
prove to be a waste of resources as many messages 116 may be
filtered out before presentation by TEE module 106' based on public
routing data 304. Alternatively, when TEE module 106 interacts with
content provider 104 (and/or with anonymous data accumulator 600 as
disclosed in FIG. 6), it may employ an Enhanced Privacy ID (EPID)
signed SIGMA (Sign-and-MAC) communication session. The EPID signed
SIGMA session facilitates anonymous interaction between device 102
and at least content provider 104 during which TEE module 106 may
transmit dimension data (e.g., "sanitized" user data devoid of data
identifying the corresponding user) and may then receive at least
one key for decrypting private criteria 306. The decryption keys
may be symmetric (e.g., based on the Advanced Encryption Standard (AES), Rivest Cipher 4 (RC4), etc.) or asymmetric public keys, wherein the private key may wrap a symmetric key that is then delivered to TEE module 106. Protecting public keys within private keys runs counter to the traditional use of asymmetric cryptography, in which public keys are used for encryption and are freely distributed rather than kept secret. Keeping public keys secret may help to prevent Man-In-The-Middle (MITM) attacks from intercepting the public keys.
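EPID and SIGMA are Intel-specific protocols without standard library bindings, so the sketch below models only the final step of the exchange described above: TEE module 106 has already received a symmetric key over the anonymous session and uses it to decrypt private criteria 306. AES-GCM and the nonce-prefix layout are assumptions; the disclosure names AES and RC4 only as examples.

```python
# Sketch of the final step of the anonymous key exchange described
# above: decrypting private criteria 306 with a symmetric key that was
# delivered over an EPID-signed SIGMA session (the session itself is
# out of scope here). AES-GCM is an assumed cipher choice.
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
import os

def decrypt_private_criteria(session_key: bytes, blob: bytes) -> bytes:
    # Assumed layout: a 12-byte nonce prepended to the GCM ciphertext.
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(session_key).decrypt(nonce, ciphertext, None)

# Round-trip demonstration standing in for content provider 104.
key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
criteria = b'{"dimension_matching": "...", "personalization": "..."}'
blob = nonce + AESGCM(key).encrypt(nonce, criteria, None)
assert decrypt_private_criteria(key, blob) == criteria
```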
[0037] Private criteria 306 may be encrypted to, for example,
prevent competitors of content provider 104 from determining
proprietary information with respect to their strategy for
disseminating content 302. For example, content provider 104 may
market products to the user of device 102, and a strategy for
marketing these products may be readily determinable from private
criteria 306. Thus, content provider 104 may only participate in
system 100 if there is some assurance that their marketing strategy
is kept secret. The data in private criteria 306 may perform a
variety of functionality, examples of which are presented in FIG.
3. For example, dimension matching criteria 308 may further expand
upon the broad categories defined in public routing data 304 to
determine if content 302 is applicable to the current user of
device 102. Dimension matching criteria 308 may include
instructions, rules, etc. that further refine whether content 302
should be presented to the current user of device 102. Example code
(e.g., a set of instructions) that may be included in dimension
matching criteria 308 is disclosed in FIG. 4. Dimension matching
example 400 is written in Extensible Access Control Markup Language
(XACML), but may also be composed using other basic encoding rules
(BER) depending upon, for example, the requirements/characteristics
of the particular implementation. Examples of other BERs may
include JavaScript Object Notation (JSON), Abstract Syntax Notation
One (ASN.1), etc. Example 400 defines an example policy (e.g., a
user is within a certain age range, has a certain level of
education, already uses a certain product, has a certain familial
makeup, has manually configured user preferences in device 102 to
allow content 302 to be presented, etc.) and may then query whether
the current user of the device satisfies this policy (e.g., based
on user data 122 and/or context data 124). If the current user of the device fits within the policies defined in dimension matching
criteria 308, then a determination may be made that the
presentation of content 302 is appropriate.
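FIG. 4 expresses dimension matching in XACML; the sketch below captures the same idea, a policy of predicates evaluated against user data 122 and context data 124, in Python rather than XML. The specific attributes tested are invented for illustration.

```python
# Illustrative analogue of the XACML policy in FIG. 4: dimension
# matching criteria 308 as a list of predicates over user data 122 and
# context data 124. The attributes tested here are invented examples.
from typing import Callable

Policy = list[Callable[[dict], bool]]

dimension_matching_policy: Policy = [
    lambda d: 25 <= d.get("age", 0) <= 45,            # target age range
    lambda d: d.get("education") in ("college", "graduate"),
    lambda d: "product_x" in d.get("products_used", ()),
    lambda d: d.get("allow_offers", False),           # user preference set on device 102
]

def dimensions_match(user_and_context: dict, policy: Policy) -> bool:
    """Return True if content 302 should be presented (operation 806)."""
    return all(rule(user_and_context) for rule in policy)

user = {"age": 34, "education": "college",
        "products_used": ("product_x",), "allow_offers": True}
assert dimensions_match(user, dimension_matching_policy)
```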
[0038] If it is determined that content 302 is appropriate for the
current user of device 102 based on dimension matching criteria
308, then personalization criteria 310 may describe how to
personalize content 302 for the current user of device 102 (e.g.,
based on user data 122, context data 124, etc.). For example,
personalization criteria 310 may define areas of content 302 that
may be altered to reflect the user, the perspective of the user,
the location of the device/user, etc. Example code corresponding to
functionality that may be performed by personalization criteria 310
is disclosed in FIG. 5. Personalization example 500 comprises
example XACML code to insert the title (e.g., Mr., Ms., Mrs., etc.)
and the name of the user of device 102 into content 302. In this
manner, content 302 may appeal more to the current user, which may
help to better maintain the attention of the user of device
102.
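A minimal analogue of the personalization in FIG. 5 might splice the user's title and name into marked regions of content 302, for example as below. The placeholder syntax is an assumption, not taken from the disclosure.

```python
# Minimal analogue of the personalization in FIG. 5: splice the user's
# title and name from user data 122 into marked regions of content 302.
# The $title/$name placeholders are assumed, not taken from the patent.
from string import Template

content_302 = Template("Dear $title $name, we thought you might enjoy...")

def personalize(content: Template, user_data: dict) -> str:
    return content.safe_substitute(
        title=user_data.get("title", ""),   # e.g., Mr., Ms., Mrs.
        name=user_data.get("name", ""),
    )

print(personalize(content_302, {"title": "Ms.", "name": "Rivera"}))
# -> "Dear Ms. Rivera, we thought you might enjoy..."
```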
[0039] Counter offer criteria 312 may be optional in private
criteria 306 in that it may only be required in certain scenarios
(e.g., when content 302 comprises an advertisement including at
least one offer to which the user of device 102 may respond).
Counter offer criteria 312 may comprise at least one other offer
that may be presented to the user if an offer included in content
302 is declined, not of interest to the user, etc. Counter offer
criteria 312 may present counter offers to the user automatically
(e.g., after an initial offer is declined during content
interaction 132). The number, type, etc. of counter offers
available to a user may depend on the particular implementation of
system 100. Regardless of whether an offer or counter offer is
accepted by the user, feedback criteria 314 may include
instructions as to how to generate feedback 134. In one embodiment,
feedback 134 may comprise data derived from content interaction
132, which may then be transmitted back to content provider 104.
Data in feedback 134 may comprise the identification of the user,
user answers to questions posed during content interaction 132,
user acceptance/refusal information regarding offers made during
content interaction 132, user payment/delivery information if an
offer was accepted, etc. In the same or a different embodiment,
feedback 134 may also comprise user data 122 and/or context data 124 for use by content provider 104 for determining the
attractiveness, effectiveness, etc. of content 302. A more detailed
example disclosing how feedback 134 may be provided to ensure that
the privacy of the user is protected is disclosed in FIG. 6.
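One way to read the counter offer logic above is as a simple fallback chain: if the user declines the offer in content 302, counter offer criteria 312 supply the next offer to present, and the outcome feeds into feedback criteria 314. The sketch below is a guess at that control flow, not an implementation from the disclosure.

```python
# Hedged sketch of the counter-offer fallback described above: present
# offers in order until one is accepted or the list is exhausted; the
# result then feeds the feedback step (criteria 314).
def run_offer_interaction(offers: list[str], user_accepts) -> dict:
    for offer in offers:                       # offers[0] is the initial offer
        if user_accepts(offer):                # content interaction 132
            return {"accepted": offer,
                    "declined": offers[:offers.index(offer)]}
    return {"accepted": None, "declined": offers}

result = run_offer_interaction(
    ["10% off", "15% off", "free shipping"],   # initial offer + counter offers
    user_accepts=lambda offer: offer == "15% off",
)
assert result["accepted"] == "15% off"
```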
[0040] FIG. 6 illustrates example feedback in accordance with at
least one embodiment of the present disclosure. In one embodiment,
feedback 134' may be provided to different entities that
participate in the content creation process. Anonymous data
accumulator 600 may be part of content provider 104 or may be a
totally independent entity. Anonymous data accumulator 600 may
comprise at least one computing device (e.g., a server) accessible
via a LAN or WAN like the Internet (e.g., in a cloud-computing
configuration) to accumulate data from a variety of devices 102.
Anonymous data accumulator 600 may process the collected data to
determine statistics, distributions, trends, etc. within the data,
and may then provide the processed data (e.g., targeting data 602)
to content provider 104. Content provider 104 may utilize targeting
data 602 to, for example, improve existing content 302, to generate
new content 302, etc.
[0041] Given the example presented in FIG. 6, feedback 134' may
comprise at least two data flows. Privacy protected telemetry data
604 may include, for example, the results of content interaction
132. It may be important to deliver this information directly to
content provider 104 as it may contain offer acceptance information
regarding an offer (or counter offer) that was presented in content
302. Privacy protected telemetry data 604 may be filtered by TEE
module 106 prior to transmission to ensure only necessary data is
being provided to content provider 104. Filtering in TEE module 106
may help to ensure that any data intended to be kept private by the
user is filtered in a safe environment (e.g., not susceptible to
compromise from other programs, outside influences, etc.) prior to
privacy protected telemetry data 604 being sent. Alternatively,
privacy protected telemetry data 604 may be delivered to content
provider 104 via an anonymous interaction protocol such as an EPID
signed SIGMA session as described above with respect to FIG. 3.
Sanitized user data 606 may comprise user data 122 and/or context data 124 (e.g., user gender, age, location, familial makeup,
profession, interests, etc.) that may be provided to anonymous data
accumulator 600 in a similar manner to privacy protected telemetry
data 604 in that TEE module 106 may remove private/confidential
data prior to transmission or via an anonymous interaction
protocol. In one embodiment, the rules for determining the data
that may be transmitted in privacy protected telemetry data 604
and/or sanitized user data 606 may be set by the manufacturer of
device 102, may be configured by users of device 102 (e.g., via
user interface module 114), etc.
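The two data flows of FIG. 6 amount to a split-and-filter step inside TEE module 106, sketched below. Which fields count as private is assumed here for illustration; as noted above, the disclosure leaves that determination to device manufacturer rules or user configuration.

```python
# Sketch of the FIG. 6 feedback split: telemetry 604 goes to content
# provider 104, sanitized user data 606 goes to anonymous data
# accumulator 600. The choice of private fields is an assumption.
PRIVATE_FIELDS = {"name", "address", "account_number"}

def split_feedback(interaction_results: dict, user_data: dict):
    telemetry_604 = {k: v for k, v in interaction_results.items()
                     if k not in PRIVATE_FIELDS}
    sanitized_606 = {k: v for k, v in user_data.items()
                     if k not in PRIVATE_FIELDS}
    return telemetry_604, sanitized_606

telemetry, sanitized = split_feedback(
    {"offer_accepted": True, "view_seconds": 42, "name": "A. User"},
    {"gender": "f", "age": 34, "address": "123 Main St."},
)
assert "name" not in telemetry and "address" not in sanitized
```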
[0042] FIG. 7 illustrates example operations for message generation
and feedback reception in accordance with at least one embodiment
of the present disclosure. The operations shown in FIG. 7 may be
performed by, for example, a content provider. In operation 700 the
content provider may identify target parameters for content (e.g.,
to be transmitted to target users in a message). Private criteria
may then be determined based at least on the target parameters in
operation 702. Private criteria may include instructions that, in
view of the strategy of the content provider, determine whether to
present the content to a user, customize the content, present
counter offers, collect data, etc. The content provider may then
proceed to generate metadata based at least on the private criteria
in operation 704. The metadata generated in operation 704 may be
combined with the content to generate a message in operation 706
that may then be transmitted in operation 708 (e.g., based on
public routing data included in the message). Feedback may then be received in operation 710 (e.g., from the devices to which the message was transmitted, from a separate entity such as an anonymous
data accumulator that collects feedback from the devices,
etc.).
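The provider-side operations 700 to 710 form a small pipeline: derive private criteria from targeting parameters, wrap them in metadata, attach content, transmit, and await feedback. The sketch below is a schematic rendering; every structure and value in it is an invented placeholder, and the private criteria are left unencrypted for brevity.

```python
# Schematic rendering of operations 700-708 in FIG. 7; every structure
# here is an invented placeholder, not logic from the disclosure.
import json

def generate_message(target_params: dict, content: bytes) -> dict:
    private_criteria = {                       # operation 702
        "dimension_matching": target_params,
        "personalization": {"insert": ["title", "name"]},
    }
    metadata = {                               # operation 704
        "public_routing": {"broad_audience": target_params.get("audience", "any")},
        # In the real scheme this field would be encrypted for the TEE.
        "private_criteria": json.dumps(private_criteria),
    }
    return {"metadata": metadata, "content": content.decode()}  # operation 706

message_116 = generate_message({"audience": "adults", "age_min": 25},
                               b"Introducing product X...")
print(json.dumps(message_116, indent=2))       # stand-in for transmit (708)
```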
[0043] FIG. 8 illustrates example operations for privacy
enforcement via localized personalization in accordance with at
least one embodiment of the present disclosure. The operations
shown in FIG. 8 may be performed by, for example, devices that
receive messages from content providers, at least one computing
device accessible via a LAN or WAN like the Internet (e.g., in a
cloud-computing configuration), etc. A message may be received in a
TEE module in operation 800 and any private data in the message may
be decrypted in operation 802 (e.g., operation 802 may only be
necessary if the message includes encrypted private criteria). In
operation 804 dimension matching may be performed to determine if
content in the message should be presented to the user (e.g., based
on at least one policy encoded within the private criteria). A
determination may be made in operation 806 as to whether dimension
matching was achieved between the user and the content. If in
operation 806 it is determined that the dimension matching failed,
then in operation 808 presentation of the content may be
aborted.
[0044] If in operation 806 it is determined that dimension matching
was successful (e.g., that the content should be presented), then
in operation 810 the content may be customized based on at least
one of user data or context data. A determination may then be made
in operation 812 as to whether the device of the user is enabled
for notification regarding the availability of the content. If it
is determined in operation 812 that content notification is
enabled, then in operation 814 the notification may be presented on
the device. A determination in operation 812 that content
notification is not enabled in the device or operation 814 may be
followed by operation 816 wherein a determination may be made as to
whether the device is enabled for user interaction with the
content. If it is determined that the device is not enabled for
user interaction with the content (e.g., an application for
interacting with the content is not active or may not be
installed), then in operation 818 the content may optionally be
stored for later presentation (e.g., if the device supports this
functionality) and presentation of the content may be aborted in
operation 808. If in operation 816 it is determined that the device
is enabled for user interaction with the content, then the content
may be presented to the user in operation 820 (e.g., via a user
interface in the device).
[0045] Operations 822 to 824 in FIG. 8 may be optional as they may
only be applicable when the content presented to the user in
operation 820 comprises an offer. In operation 822 a determination
may be made as to whether an offer presented in the content is
accepted by the user. If it is determined in operation 822 that the
offer was not accepted by the user, then in operation 824 any
counter offers included in the message may then be presented to the
user. A determination in operation 822 that the offer was accepted
or operation 824 may then be followed by operation 826 wherein
feedback may be prepared for transmission to at least the content
provider (e.g., and possibly other entities such as an anonymous
data accumulator, etc.).
[0046] Feedback may include, for example, the results of the user
interaction with the content, data about the user, about the
context of the user/device, etc. In operation 828 the feedback may
be analyzed by the TEE module to determine if any user data in the
feedback is private and/or confidential. A determination may then
be made in operation 830 as to whether the feedback comprises
private and/or confidential user data. If in operation 830 it is
determined that the feedback comprises private and/or confidential
user data, then in operation 834 the feedback may be filtered
and/or sanitized to remove private and/or confidential user data.
In instances where the feedback includes a large amount of data, an
alternative option may be to establish a communication session that
links the TEE module to the content provider, the anonymous data
accumulator, etc. via an anonymous interaction protocol such as an
EPID signed SIGMA communication session that allows data to be
transmitted without identifying the source of the data. A
determination in operation 830 that the feedback does not comprise
private and/or confidential information or operation 834 may be
followed by transmission of the feedback in operation 832.
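Taken together, operations 800 to 834 describe a single receive-to-feedback path on the device. The sketch below strings that path into one hedged control flow with trivial stub logic; none of the helper behavior is prescribed by the disclosure.

```python
# Hedged end-to-end rendering of FIG. 8 (operations 800-832), with
# trivial stub logic so the control flow runs; none of the stub
# behavior is prescribed by the disclosure.
def handle_message(message, user):
    criteria = message["private_criteria"]                  # 802 (decryption elided)
    if user.get("age", 0) < criteria["age_min"]:            # 804-806 dimension match
        return None                                         # 808 abort presentation
    content = message["content"].replace("{name}", user["name"])  # 810 personalize
    print("notification:", content[:20], "...")             # 812-814 notify
    accepted = user["will_accept"](criteria["offer"])       # 816-822 present/interact
    if not accepted and criteria["counter_offers"]:         # 824 counter offers
        accepted = user["will_accept"](criteria["counter_offers"][0])
    feedback = {"accepted": accepted, "age": user["age"],
                "name": user["name"]}                       # 826 prepare feedback
    feedback.pop("name", None)                              # 828-834 filter private fields
    return feedback                                         # 832 transmit (elided)

msg = {"private_criteria": {"age_min": 25, "offer": "10% off",
                            "counter_offers": ["15% off"]},
       "content": "Hello {name}, here is an offer."}
user = {"name": "A. User", "age": 34,
        "will_accept": lambda offer: offer == "15% off"}
print(handle_message(msg, user))   # -> {'accepted': True, 'age': 34}
```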
[0047] While FIGS. 7 and 8 illustrate operations according to
different embodiments, it is to be understood that not all of the
operations depicted in FIGS. 7 and 8 are necessary for other
embodiments. Indeed, it is fully contemplated herein that in other
embodiments of the present disclosure, the operations depicted in
FIGS. 7 and 8, and/or other operations described herein, may be
combined in a manner not specifically shown in any of the drawings,
but still fully consistent with the present disclosure. Thus,
claims directed to features and/or operations that are not exactly
shown in one drawing are deemed within the scope and content of the
present disclosure.
[0048] As used in this application and in the claims, a list of
items joined by the term "and/or" can mean any combination of the
listed items. For example, the phrase "A, B and/or C" can mean A;
B; C; A and B; A and C; B and C; or A, B and C. As used in this
application and in the claims, a list of items joined by the term
"at least one of" can mean any combination of the listed terms. For
example, the phrases "at least one of A, B or C" can mean A; B; C;
A and B; A and C; B and C; or A, B and C.
[0049] As used in any embodiment herein, the term "module" may
refer to software, firmware and/or circuitry configured to perform
any of the aforementioned operations. Software may be embodied as a
software package, code, instructions, instruction sets and/or data
recorded on non-transitory computer readable storage mediums.
Firmware may be embodied as code, instructions or instruction sets
and/or data that are hard-coded (e.g., nonvolatile) in memory
devices. "Circuitry", as used in any embodiment herein, may
comprise, for example, singly or in any combination, hardwired
circuitry, programmable circuitry such as computer processors
comprising one or more individual instruction processing cores,
state machine circuitry, and/or firmware that stores instructions
executed by programmable circuitry. The modules may, collectively
or individually, be embodied as circuitry that forms part of a
larger system, for example, an integrated circuit (IC), system
on-chip (SoC), desktop computers, laptop computers, tablet
computers, servers, smartphones, etc.
[0050] Any of the operations described herein may be implemented in
a system that includes one or more storage mediums (e.g.,
non-transitory storage mediums) having stored thereon, individually
or in combination, instructions that when executed by one or more
processors perform the methods. Here, the processor may include,
for example, a server CPU, a mobile device CPU, and/or other
programmable circuitry. Also, it is intended that operations
described herein may be distributed across a plurality of physical
devices, such as processing structures at more than one different
physical location. The storage medium may include any type of
tangible medium, for example, any type of disk including hard
disks, floppy disks, optical disks, compact disk read-only memories
(CD-ROMs), compact disk rewritables (CD-RWs), and magneto-optical
disks, semiconductor devices such as read-only memories (ROMs),
random access memories (RAMs) such as dynamic and static RAMs,
erasable programmable read-only memories (EPROMs), electrically
erasable programmable read-only memories (EEPROMs), flash memories,
Solid State Disks (SSDs), embedded multimedia cards (eMMCs), secure
digital input/output (SDIO) cards, magnetic or optical cards, or
any type of media suitable for storing electronic instructions.
Other embodiments may be implemented as software modules executed
by a programmable control device.
[0051] Thus, this disclosure is directed to privacy enforcement via
localized personalization. An example device may comprise at least
a user interface to present content. A message may be received into
a trusted execution environment (TEE) situated within the device or
remotely, the message including at least metadata and content. The
TEE may determine relevance of the content to a user based on the
metadata and user data. Based on the relevance, the TEE may cause
the content to be presented to the user via the user interface. In
one embodiment, the TEE may be able to personalize the content
based on the user data prior to presentation. If the content
includes an offer, the TEE may also be able to present
counteroffers to the user based on user interaction with the
content. The TEE may also be able to cause feedback data to be
transmitted to at least the content provider.
[0052] The following examples pertain to further embodiments. The
following examples of the present disclosure may comprise subject
material such as a device, a method, at least one machine-readable
medium for storing instructions that when executed cause a machine
to perform acts based on the method, means for performing acts
based on the method and/or a system for privacy enforcement via
localized personalization, as provided below.
EXAMPLE 1
[0053] According to this example there is provided a device for
privacy enforcement. The device may comprise at least a
communication module to interact with a content provider, a user
interface module to present content to a user and a trusted
execution environment (TEE) to receive a message from the content
provider via the communication module, the message including at
least metadata and content, determine relevance of the content to
the user based on at least one of the metadata and user data and
cause the content to be presented to the user via the user
interface module based on the relevance of the content.
EXAMPLE 2
[0054] This example includes the elements of example 1, wherein the
TEE is situated in the device or outside of the device in at least
one computing device.
EXAMPLE 3
[0055] This example includes the elements of any of examples 1 to
2, wherein the TEE comprises a secure memory space accessible to
only applications verified as safe by the TEE.
EXAMPLE 4
[0056] This example includes the elements of any of examples 1 to
2, wherein the metadata comprises at least public routing data and
private criteria.
EXAMPLE 5
[0057] This example includes the elements of example 4, wherein at
least the private criteria are encrypted and the TEE is further to
decrypt the private criteria.
EXAMPLE 6
[0058] This example includes the elements of example 4, wherein the
private criteria are formulated using basic encoding rules
including at least one of Extensible Access Control Markup Language
(XACML), JavaScript Object Notation (JSON) or Abstract Syntax
Notation One (ASN.1).
EXAMPLE 7
[0059] This example includes the elements of example 4, wherein the
private criteria comprises dimension matching criteria including
instructions for determining the relevance of the content.
EXAMPLE 8
[0060] This example includes the elements of example 7, wherein the
dimension matching criteria comprises considering any user
preferences regarding the presentation of content that are
configured in the device.
EXAMPLE 9
[0061] This example includes the elements of example 4, wherein the
TEE is further to personalize the content prior to presentation
based on personalization criteria included in the private criteria,
the personalization criteria including instructions for altering
the content based on the user data.
EXAMPLE 10
[0062] This example includes the elements of example 4, wherein the
TEE is further to cause additional content to be presented via the
user interface module based on counter offer criteria included in
the private criteria, the counter offer criteria including
instructions for presenting additional content based on the
interaction between the user and the presented content.
EXAMPLE 11
[0063] This example includes the elements of example 4, wherein the
private criteria comprises feedback criteria including instructions
for collecting the feedback data based on at least one of the user
data and interaction between the user and the presented
content.
EXAMPLE 12
[0064] This example includes the elements of example 11, wherein
the TEE is further to cause the feedback data to be collected based
on the interaction and to cause the feedback data to be transmitted
to at least the content provider.
EXAMPLE 13
[0065] This example includes the elements of example 11, wherein
the feedback data comprises at least privacy protected data
resulting from the interaction and sanitized user data, the TEE
being further to cause the communication module to transmit the
privacy protected data to the content provider and to transmit the
sanitized user data to an anonymous data accumulator.
EXAMPLE 14
[0066] This example includes the elements of example 13, wherein at
least one of the privacy protected data or the sanitized user data
may be transmitted using an anonymous interaction protocol.
EXAMPLE 15
[0067] This example includes the elements of any of examples 1 to
2, further comprising a data collection module to collect the user
data from at least one of user interaction with the device, sensors
in the device or data sources outside of the device.
EXAMPLE 16
[0068] This example includes the elements of any of examples 1 to
2, wherein the TEE is further to cause the user interface module to
present a notification informing the user regarding availability of
the content.
EXAMPLE 17
[0069] This example includes the elements of example 16, wherein
the notification is at least one of an indicator or icon displayed
on the device or a sound generated by the device.
EXAMPLE 18
[0070] This example includes the elements of any of examples 1 to
2, wherein the TEE module is further to cause the message to be
stored for later presentation by the device.
EXAMPLE 19
[0071] This example includes the elements of any of examples 1 to
2, wherein the metadata comprises at least public routing data and
encrypted private criteria, the TEE being further to decrypt the
private criteria.
EXAMPLE 20
[0072] This example includes the elements of example 19, wherein
the private criteria comprises feedback criteria including
instructions for collecting the feedback data based on at least one
of the user data and interaction between the user and the presented
content, the TEE being further to cause the feedback data to be
collected based on the interaction and to cause the feedback data
to be transmitted to at least the content provider.
EXAMPLE 21
[0073] According to this example there is provided a method for
privacy enforcement. The method may comprise receiving a message in
a trusted execution environment (TEE) from a content provider, the
message including at least metadata and content, determining
relevance of the content to a user based on at least one of the
metadata and user data and causing the content to be presented to
the user based on the relevance of the content.
EXAMPLE 22
[0074] This example includes the elements of example 21, wherein
the metadata comprises at least public routing data and private
criteria.
EXAMPLE 23
[0075] This example includes the elements of example 22, wherein at
least the private criteria are encrypted and the method further
comprises decrypting the private criteria.
EXAMPLE 24
[0076] This example includes the elements of example 23, wherein
the private criteria are formulated using basic encoding rules
including at least one of Extensible Access Control Markup Language
(XACML), JavaScript Object Notation (JSON) or Abstract Syntax
Notation One (ASN.1).
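[0076a] Example 24 permits several encodings; JSON is the easiest to show. The schema below is invented for illustration, with one key per criteria type used in the surrounding examples.

```python
import json

raw = b'''{
  "dimension_matching": [{"dimension": "age_group",
                          "accepted_values": ["18-34"]}],
  "personalization": {"template_fields": ["first_name"]},
  "counter_offer": {"on_decline": "offer_10_percent_coupon"},
  "feedback": {"interaction_fields": ["viewed", "accepted"]}
}'''
private_criteria = json.loads(raw)  # parsed inside the TEE after decryption
```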
EXAMPLE 25
[0077] This example includes the elements of any of examples 22 to
24, wherein the private criteria comprises dimension matching
criteria including instructions for determining the relevance of
the content.
EXAMPLE 26
[0078] This example includes the elements of example 25, wherein
the dimension matching criteria comprises considering any user
preferences regarding the presentation of content that are
configured in the device.
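[0078a] A minimal sketch of examples 25 and 26, assuming the criteria schema from the JSON sketch above; device-configured user preferences are consulted before any matching runs.

```python
def is_relevant(private_criteria, user_data, preferences=None):
    # Example 26: user preferences configured in the device can
    # suppress presentation entirely.
    preferences = preferences or {}
    if preferences.get("suppress_offers"):
        return False
    # Example 25: every named dimension must match the local user data.
    return all(user_data.get(c["dimension"]) in c["accepted_values"]
               for c in private_criteria.get("dimension_matching", []))
```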
EXAMPLE 27
[0079] This example includes the elements of any of examples 22 to
24, and further comprises personalizing the content prior to
presentation based on personalization criteria included in the
private criteria, the personalization criteria including
instructions for altering the content based on the user data.
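[0079a] Example 27 can be illustrated with simple template substitution; the template_fields whitelist and the placeholder syntax are assumptions for the sketch, not part of the disclosure.

```python
from string import Template

def personalize(content: str, user_data: dict, personalization: dict) -> str:
    # Alter content inside the TEE using user data the provider never
    # sees; only whitelisted fields may be substituted (example 27).
    allowed = personalization.get("template_fields", [])
    values = {k: user_data[k] for k in allowed if k in user_data}
    # safe_substitute leaves unknown placeholders intact instead of raising.
    return Template(content).safe_substitute(values)

# e.g. personalize("Hi $first_name, a sale is on nearby.",
#                  {"first_name": "Ada"}, {"template_fields": ["first_name"]})
```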
EXAMPLE 28
[0080] This example includes the elements of any of examples 22 to
24, and further comprises causing additional content to be
presented based on counter offer criteria included in the private
criteria, the counter offer criteria including instructions for
presenting additional content based on the interaction between the
user and the presented content.
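[0080a] For example 28, a hypothetical sketch: if the recorded interaction is a decline, the counter offer criteria name fallback content that the TEE can present locally, with no round trip to the provider.

```python
def maybe_counter_offer(interaction, counter_offer_criteria, ui):
    # Example 28: the user's reaction to the presented content can
    # trigger additional content named in the counter offer criteria.
    if interaction.get("outcome") == "declined":
        follow_up = counter_offer_criteria.get("on_decline")
        if follow_up is not None:
            ui.present(follow_up)
```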
EXAMPLE 29
[0081] This example includes the elements of any of examples 22 to
24, wherein the private criteria comprises feedback criteria
including instructions for collecting feedback data based on at
least one of the user data and interaction between the user and the
presented content.
EXAMPLE 30
[0082] This example includes the elements of example 29, and
further comprises causing the feedback data to be collected based
on the interaction and causing the feedback data to be transmitted
to at least the content provider.
EXAMPLE 31
[0083] This example includes the elements of example 29, wherein
the feedback data comprises at least privacy protected data
resulting from the interaction and sanitized user data, the method
further comprising causing the privacy protected data to be
transmitted to the content provider and the sanitized user data to
be transmitted to an anonymous data accumulator.
EXAMPLE 32
[0084] This example includes the elements of example 31, wherein at
least one of the privacy protected data or the sanitized user data
may be transmitted using an anonymous interaction protocol.
EXAMPLE 33
[0085] This example includes the elements of any of examples 21 to
24, and further comprises collecting the user data from at least
one of user interaction, sensors or data sources outside the
device.
EXAMPLE 34
[0086] This example includes the elements of any of examples 21 to
24, and further comprises causing a notification to be presented
informing the user regarding availability of the content.
EXAMPLE 35
[0087] This example includes the elements of example 34, wherein
the notification is at least one of an indicator or icon displayed
on the device or a sound generated by the device.
EXAMPLE 36
[0088] This example includes the elements of any of examples 21 to
24, wherein the metadata comprises at least public routing data and
encrypted private criteria, the method further comprising
decrypting the private criteria.
EXAMPLE 37
[0089] This example includes the elements of example 36, wherein
the private criteria comprises feedback criteria including
instructions for collecting feedback data based on at least one of
the user data and interaction between the user and the presented
content, the method further comprising causing the feedback data to
be collected based on the interaction and causing the feedback
data to be transmitted to at least the content provider.
EXAMPLE 38
[0090] According to this example there is provided a system
including at least one device, the system being arranged to perform
the method of any of the above examples 21 to 37.
EXAMPLE 39
[0091] According to this example there is provided a chipset
arranged to perform the method of any of the above examples 21 to
37.
EXAMPLE 40
[0092] According to this example there is provided at least one
machine readable medium comprising a plurality of instructions
that, in response to being executed on a computing device, cause
the computing device to carry out the method according to any of
the above examples 21 to 37.
EXAMPLE 41
[0093] According to this example there is provided at least one
device configured for privacy enforcement via localized
personalization, the device being arranged to perform the method of
any of the above examples 21 to 37.
EXAMPLE 42
[0094] According to this example there is provided a system for
privacy enforcement. The system may comprise means for receiving a
message in a trusted execution environment (TEE) from a content
provider, the message including at least metadata and content,
means for determining relevance of the content to a user based on
at least one of the metadata and user data and means for causing
the content to be presented to the user based on the relevance of
the content.
EXAMPLE 43
[0095] This example includes the elements of example 42, wherein
the metadata comprises at least public routing data and encrypted
private criteria, the system further comprising means for
decrypting the private criteria.
EXAMPLE 44
[0096] This example includes the elements of example 43, wherein
the private criteria comprises dimension matching criteria
including instructions for determining the relevance of the
content.
EXAMPLE 45
[0097] This example includes the elements of any of examples 43 to
44, and further comprises means for personalizing the content prior
to presentation based on personalization criteria included in the
private criteria, the personalization criteria including
instructions for altering the content based on the user data.
EXAMPLE 46
[0098] This example includes the elements of any of examples 43 to
44, and further comprises means for causing additional content to
be presented based on counter offer criteria included in the
private criteria, the counter offer criteria including instructions
for presenting additional content based on the interaction between
the user and the presented content.
EXAMPLE 47
[0099] This example includes the elements of any of examples 43 to
44, wherein the private criteria comprises feedback criteria
including instructions for collecting feedback data based on at
least one of the user data and interaction between the user and the
presented content, the system further comprising means for causing
the feedback data to be collected based on the interaction and
means for causing the feedback data to be transmitted to at least
the content provider.
EXAMPLE 48
[0100] This example includes the elements of example 47, wherein
the feedback data comprises at least privacy protected data
resulting from the interaction and sanitized user data, the system
further comprising means for causing the privacy protected data to
be transmitted to the content provider and the sanitized user data
to be transmitted to an anonymous data accumulator.
EXAMPLE 49
[0101] This example includes the elements of example 42, and
further comprises means for causing a notification to be presented
informing the user regarding availability of the content.
[0102] The terms and expressions which have been employed herein
are used as terms of description and not of limitation, and there
is no intention, in the use of such terms and expressions, of
excluding any equivalents of the features shown and described (or
portions thereof), and it is recognized that various modifications
are possible within the scope of the claims. Accordingly, the
claims are intended to cover all such equivalents.
* * * * *