What is Plagiarism and How Can I Avoid It?
Introduction
The purpose of an introduction is to lead the reader from the general research area to your specific area of research.
Put your research in context by explaining the significance of the research.
Start by summarizing current research and background information about the topic.
State the purpose of your research.
Introduction
Project Goals and Background
[Description of the problem you researched. Generally known
information about the topic. Prior studies that give historical context to your
work. Requirements, your hypothesis, and an overview of expected
results.]
Analysis of Requirements
[General description of the functional and non-functional requirements,
which should be testable and measurable.]
Technology and Solution Survey
[Survey academically or commercially available technologies and
solutions that might meet the requirements.]
Literature Survey of Existing Research
[Survey the current state of the art related to your proposed work.]
Project Goals and Background
“Recent years have witnessed a rapid increase in video
streaming services, as a result of the massive content
published by content providers, high-speed Internet, and the
number of devices connected to the Internet. Thus, content
providers have resorted to employ one or more content
distribution networks (CDNs) to handle scalability, and
improve the quality of experience (QoE) for users. By 2021,
77% of the Internet video traffic is expected to cross CDNs [1].
Hence, adding cache storage space at routers becomes of
utmost importance to handle this massive growth, and
improve the network performance as well as user’s QoE.
Consequently, information centric networks (ICNs) (e.g.,
NDN [11], DONA [12], CONIA [16]) have been developed as an
emerging architecture for content delivery”
Analysis of Requirements
“One of the most difficult decisions is which object to cache and
evict, given the limited capacity of the cache network and large
number of objects to cache. Caching algorithms can be classified
based on which entity controls the caching decision, and the
available information to make these decisions. … The cache
performance depends on the prediction accuracy, and how it is
being utilized to make decisions. This task has many challenges,
(1) the future object characteristics need to be forecasted to be
available at the time of making cache decisions; (2) these
characteristics change over time, and hence, this forecasting
needs to run continuously; (3) finally, the caching mechanism
needs to carefully utilize these predicted object characteristics to
improve cache performance”
Technology and Solution Survey
“Caching algorithms can be classified based on which entity controls
the caching decision, and the available information to make these
decisions. Least Recently Used (LRU), Least Frequently Used (LFU),
and their variants are examples of reactive caching, in which individual
caches decide which objects to cache purely based on the recent locally
observed object access patterns. They are easy to implement and widely
used in today’s CDNs [18]. On the other hand, static caching is
proactive caching, in which centralized controllers have global view of
user demands and object access patterns. They decide which objects to
cache, and push these objects to cache nodes. Reactive caching reacts
faster to changes in object access patterns, but leads to caching non-
popular objects, which are evicted before receiving their next request,
due to their lack of knowledge about future object popularity. This
leads to thrashing problem and wasting cache resources (see (§3) for
more details). Proactive caching is the optimal solution only if the
object access pattern is stationary. Thus, it cannot cope with sudden
changes in object popularity as reactive caching.”
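To make the reactive caching idea in the excerpt above concrete, the following is a minimal illustrative sketch of the LRU policy it mentions (Python; this is not code from the quoted paper, and the class name LRUCache and its interface are assumptions):

# Minimal LRU cache sketch: evicts the least recently used object when
# capacity is exceeded (illustrative only, not the quoted paper's code).
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()  # keys ordered from least to most recently used

    def get(self, key):
        if key not in self.store:
            return None                  # cache miss
        self.store.move_to_end(key)      # mark as most recently used
        return self.store[key]

    def put(self, key, value):
        if key in self.store:
            self.store.move_to_end(key)
        self.store[key] = value
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)   # evict the least recently used object

# Example: with capacity 2, "b" is evicted once "a" is touched and "c" arrives.
cache = LRUCache(2)
cache.put("a", 1); cache.put("b", 2)
cache.get("a")                            # "a" becomes most recently used
cache.put("c", 3)                         # evicts "b"
assert cache.get("b") is None

Such a policy reacts only to locally observed access patterns, which is exactly why, as the excerpt notes, it can waste cache space on objects that are never requested again.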
Literature Survey of Existing Research
“Our goal is to develop a self-adaptive caching mechanism,
which automatically learns the changes in request trace patterns,
especially bursty and non-stationary trace, and predicts future
content popularity, then decides which objects to cache and evict
accordingly to maximize the cache hit. In recent years, recurrent
neural networks (RNN) have become the cornerstone for
sequence prediction. RNNs have shown their unchallenged
dominance in the area of natural language processing [15],
machine language translation [2], speech recognition [7], and
image captioning [8]. Many variants of RNN exist in literature,
among which Long Short-Term Memory (LSTM) [10], Gated
Recurrent Unit (GRU) [5] are the most popular ones for sequence
prediction. Thus, it is natural to wonder their ability to predict
content popularity where content requests arrive in a form of a
sequence”
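As a concrete illustration of the sequence-prediction idea in the excerpt above, here is a minimal sketch of an LSTM that predicts the next requested object ID from a window of past requests (assuming PyTorch; the model name RequestModel and all dimensions are illustrative assumptions, not taken from the quoted work):

import torch
import torch.nn as nn

class RequestModel(nn.Module):
    # Illustrative sketch: treat the request trace as a sequence of object IDs
    # and learn to predict the next ID, as a proxy for near-term popularity.
    def __init__(self, vocab_size, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)  # object ID -> vector
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, vocab_size)     # scores over all objects

    def forward(self, requests):          # requests: (batch, window_len) of object IDs
        x = self.embed(requests)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])   # logits for the next requested object

# Usage sketch: train with cross-entropy on sliding windows of the request trace;
# the predicted distribution can then rank objects for caching decisions.
model = RequestModel(vocab_size=1000)
logits = model(torch.randint(0, 1000, (8, 20)))  # 8 windows of 20 past requests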
What to Submit?
1. Submit your own introduction for your research paper.
2. You will review two introductions on Canvas (randomly selected; two blind reviews).
3. Focus on the 5 sections in the introduction.
4. Are they understandable, complete, and cohesive?