SIT719 Security and Privacy Issues in Analytics
Distinction Task 9.1 Survey on Differential Privacy for Industrial IoT
Overview
The objective of machine learning is to extract useful information from data, such as how to classify
data, how to predict a quantity, or how to find clusters of similar samples. Privacy, on the other
hand, concerns the protection of private data from leakage, especially the information of
individuals.
An attacker can “de-anonymize” an anonymized dataset. Numerous examples, most prominently
the case study of an anonymized Netflix dataset, reveal how little auxiliary information is needed
to re-identify a person from an “anonymized” dataset. Therefore, there is a need for better privacy.
Differential privacy uses random noise to ensure that the publicly visible information does not
change much if one individual in the dataset changes. As no individual sample can significantly
affect the output, attackers cannot confidently infer the private information corresponding to any
individual sample.
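As a minimal sketch of this idea (the count query, the privacy budget ε = 0.5, and the names below are illustrative assumptions, not taken from the unit resources), the standard Laplace mechanism can be applied to a simple count so that the released value looks almost the same whether or not any single individual is present:

import numpy as np

def laplace_count(records, epsilon, rng):
    # A count changes by at most 1 when one individual is added or removed
    # (sensitivity = 1), so Laplace noise with scale 1/epsilon gives
    # epsilon-differential privacy for this query.
    sensitivity = 1.0
    return len(records) + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

rng = np.random.default_rng(seed=0)
epsilon = 0.5                 # assumed privacy budget, chosen only for illustration
records = list(range(1000))   # a dataset of 1000 individuals
neighbour = records[:-1]      # the same dataset with one individual removed

print(laplace_count(records, epsilon, rng))
print(laplace_count(neighbour, epsilon, rng))
# The two noisy counts are statistically hard to distinguish, so an observer
# cannot confidently tell whether the removed individual was in the dataset.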
On the other hand, the internet of things, or IoT, is a system of interrelated computing devices,
mechanical and digital machines, objects, animals or people that are provided with unique
identifiers (UIDs) and the ability to transfer data over a network without requiring human-to-human
or human-to-computer interaction. Privacy is a major concern for IoT devices and data sharing.
In this Distinction Task, you have to review the differential privacy options for Industrial IoT
networks. Please see more details in the Task Description below. Before attempting this task, please
make sure you are already up to date with all Credit and Pass tasks and with Task 5.1 (D/HD).
Task Description
Instructions:
Part 1:
1. Download the resource zip file from OnTrack.
2. Read the Section VI “DIFFERENTIAL PRIVACY IN INDUSTRIAL INTERNET OF THINGS” of
the “main_article” titled “Differential Privacy Techniques for Cyber Physical Systems: A
Survey”.
3. This Section has the following sub-sections: A. Industrial System, B. Distributed Control
System, C. Industrial Database Systems
Netflix de-anonymization case study: https://www.cs.utexas.edu/%7Eshmat/shmat_oak08netflix.pdf
Geelong Students:
You need to follow only subsection A “Industrial System”. Read the articles mentioned in the
“main_document” (PDF titled articles_secA in the ZIP folder). You will gain knowledge on
consumer data preservation and location privacy.
Cloud Students:
You need to follow only subsection B “Distributed Control System”. Read the articles
mentioned in the “main_document” (PDF titled articles_secB in the ZIP folder). You will gain
knowledge on privacy protection for control systems.
Burwood Students:
You need to follow only subsection C “Industrial Database Systems”. Read the articles
mentioned in the “main_document” (PDF titled articles_secC in the ZIP folder). You will gain
knowledge on privacy in database systems and ML for cloud technologies.
Note: Alternatively, you are also free to choose three technical articles on
“differential privacy” for your allocated topic, which is one of these: Industrial
System, Distributed Control System, or Industrial Database Systems. In that case,
you do not need to follow the articles we downloaded for you; you choose your own.
You can access technical articles from the internet or through the Deakin web library.
Some good venues for technical articles include IEEE, ACM, ScienceDirect, Google Scholar, etc.
Part 2:
Write a report of at least 1200 words answering the following questions:
i. What is the privacy challenge mentioned in those articles?
ii. How can differential privacy (DP) solve that problem? Discuss the
methodology and fundamental working principles of DP (a standard formal
definition is sketched after this list for reference).
iii. What is the outcome?
iv. Discuss the overall summary.
v. Add figures, tables and references if necessary.
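For question (ii), a standard formulation of ε-differential privacy from the general differential
privacy literature (not specific to the supplied articles) may help frame the discussion. A randomised
mechanism \mathcal{M} is ε-differentially private if, for every pair of neighbouring datasets D and D'
that differ in a single record and for every set of outputs S,

\Pr[\mathcal{M}(D) \in S] \;\le\; e^{\varepsilon} \, \Pr[\mathcal{M}(D') \in S].

The Laplace mechanism satisfies this by releasing f(D) + \mathrm{Lap}(\Delta f / \varepsilon), where
\Delta f is the global sensitivity of the query f; a smaller ε gives stronger privacy at the cost of
more noise.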
Convert the report into a PDF document and check it for plagiarism via the unit site on
CloudDeakin. Ensure that the similarity score is less than 20%. Take a screenshot of the
similarity score, attach it at the end of the PDF, and submit using the OnTrack system.
Differential Privacy Techniques for Cyber Physical
Systems: A Survey
Muneeb Ul Hassan , Mubashir Husain Rehmani , Senior Member, IEEE, and Jinjun Chen
Abstract—Modern cyber physical systems (CPSs) are being
widely used in our daily lives because of the development of
information and communication technologies (ICT). With the
provision of CPSs, the security and privacy threats associated
with these systems are also increasing. Passive attacks are being
used by intruders to get access to private information of CPSs.
In order to make CPSs data more secure, certain privacy preser-
vation strategies such as encryption, and k-anonymity have been
presented in the past. However, with the advances in CPSs
architecture, these techniques also need certain modifications.
Meanwhile, differential privacy emerged as an efficient technique
to protect CPSs data privacy. In this paper, we present a com-
prehensive survey of differential privacy techniques for CPSs.
In particular, we survey the application and implementation of
differential privacy in four major applications of CPSs named as
energy systems, transportation systems, healthcare and medical
systems, and industrial Internet of things (IIoT). Furthermore,
we present open issues, challenges, and future research directions
for differential privacy techniques for CPSs. This survey can
serve as a basis for the development of modern differential pri-
vacy techniques to address various problems and data privacy
scenarios of CPSs.
Index Terms—Differential privacy, cyber physical systems
(CPSs), smart grid (SG), health care systems, transporta-
tion systems, industrial Internet of Things (IIoT), privacy
preservation.
I. INTRODUCTION
PREVIOUSLY, embedded computers were used to control and monitor the physical processes via feedback loop con-
trol [1]. With the passage of time, integration of computation
technologies with traditional embedded physical systems laid
the foundation of a new class of systems known as cyber physical
systems (CPSs) [2]. The advances in CPSs have gathered con-
siderable attention over the last ten years [3]. The major reason
behind this stupendous attention is the dual nature of CPSs,
via which they integrate the dynamic properties of embedded
computers with those of information and communication tech-
nologies (ICT) [4]. Similarly, the merger of ICT and embedded
systems spread to a number of physical domains of dynamic
nature, including energy, transportation, healthcare, medical,
industrial, and manufacturing systems [5]. The majority of CPSs
are deployed in life support devices, critical infrastructures
(CI), or are vital to our everyday lives. Therefore, CPS
users expect them to be free from every type of vulnerability.
One of the critical issues in the deployment of CPSs in the
real world is their privacy, as any type of information leak-
age can result in very serious consequences [6]. Particularly,
the complex architecture of CPSs makes it difficult to assess
the privacy threats, and new privacy issues arise. It is also
strenuous to trace, identify, examine, and eliminate privacy
attacks that may target multiple components of CPSs such as
real-time sensors, wearable health devices, industrial control
systems, etc. [6]. Similarly, CPSs basically rely on a diverse
number of sensors and data centres containing a huge
amount of personal and private data. For example, wearable
devices of patients are continuously reporting their real-time
data to consulting doctors [7]. However, if one does not use
a strong privacy preservation scheme during this commu-
nication, then any adversary can try to hack this personal
information and can use it for illegal benefits such as black-
mailing, false information injection, etc. [8]. Therefore, there
is a strong possibility of compromising the personal privacy
of CPS users in the absence of a proper privacy protection
strategy [9].
Attacks on CPSs can be classified into passive (privacy
oriented) or active (security oriented). The basic objective
of passive attacks is to access a certain amount of private
data being shared in the network, or to infer any crit-
ical information from a public dataset [5]. Many researchers
proposed cryptographic techniques to preserve data pri-
vacy [13]–[15]. However, these cryptographic techniques are
computationally expensive, because users need to main-
tain the set of encryption keys. Moreover, it becomes more
difficult to ensure privacy in a situation when public shar-
ing of data is required. Similarly, anonymization techniques
such as k-anonymity [16] are also proposed by researchers to
address privacy issues. However, these anonymization strate-
gies do not guarantee a complete level of protection from
adversaries because the chances of re-identification increase if
the number of attributes in the dataset increases [11]. An adversary try-
ing to infer the data can match the non-anonymized data with
anonymized data, which in turn will lead to a privacy
breach.
Another important privacy scheme named as differential pri-
vacy was introduced in 2006 to overcome these privacy issues.
Currently, the use of differential privacy is emerging as a future
of privacy [17].

[Table I: Comparison of privacy preservation strategies on the basis of method, merits, weaknesses, and computational overhead]

Differential privacy protects statistical or real-
time data by adding a desirable amount of noise while
maintaining a healthy trade-off between privacy and accuracy.
In differential privacy, the user can control the level of privacy
or indistinguishability,
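A minimal sketch of this privacy-accuracy trade-off (assuming a numeric query whose true answer is 42.0 with sensitivity 1; these values and the code below are illustrative and not taken from the survey): smaller values of the privacy parameter ε give stronger indistinguishability but noisier released answers.

import numpy as np

rng = np.random.default_rng(seed=1)
true_answer = 42.0   # assumed answer of some numeric query, for illustration
sensitivity = 1.0    # assumed global sensitivity of that query

for epsilon in (0.1, 1.0, 10.0):
    # The Laplace noise scale grows as epsilon shrinks: stronger privacy, lower accuracy.
    noisy = true_answer + rng.laplace(0.0, sensitivity / epsilon, size=10_000)
    mae = np.mean(np.abs(noisy - true_answer))
    print(f"epsilon={epsilon}: mean absolute error approx {mae:.2f}")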