
I need 150 words for each reply; there are 2 replies, which means 300 words.

Definition of Big Data

Big data is a collection of structured, semi-structured, and unstructured data that can be mined for information and used in machine learning, predictive modeling, and other advanced analytics initiatives.

What are the most important drivers of big data?

The five driving factors of big data, according to Multi-tech, are:

  1. Have a certain market in mind: It must be obvious what the project's outcomes will be, who will benefit, and how they will benefit.
  2. Data imperfection and immature technology: Data in big data analytics will be incomplete and imperfect (duplicates, missing values, etc.), and the software will have to deal with the data in its current state. By the time a project gets started, there is usually around 30% more data available, in both quantity and variety, than initially anticipated. Because you know little when you begin, the analytics need to be enhanced and altered as they grow, so the development framework and deployment mechanism should encourage iterative development. A minimal cleanup sketch follows this list.
  3. Professionals with expertise in big data: Many of these projects stumble because big data appears to be a disruptive technology, and the system's previous users are unprepared for these new notions and may go to great lengths to see it fail. On the architectural side there are usually two distinct skill sets: technical skills and a collection of analytical talents needed to develop an information-based practice. To have a chance, you will need a strong team covering both. If you don't, you have squandered time, money, and effort, and jeopardized your chances of landing future projects.
  4. Work with the right talent and processes: Poor planning and teamwork are linked to poor organization and project management, but the nature of most big data initiatives requires the ability to read and process a wide range of data types quickly, as well as high-volume data sources, which demands coordination.
  5. Managing the big data process, which involves five dimensions (the 5 Vs):
  • Volume - Terabytes, Records, Transactions, Tables, Files
  • Variety - Structured, Unstructured, Multi-factor, Probabilistic
  • Velocity - Batch, Near time, Processes, Streams
  • Veracity - Trustworthiness, Authenticity, Origin, Reputation, Availability, Accountability
  • Value - Money
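
As a rough illustration of the data-imperfection point above (item 2), the snippet below deduplicates records and fills gaps in a tiny made-up table with pandas. The column names and values are assumptions chosen only for the example, not anything from the discussion.

    # Minimal sketch of cleaning imperfect data: duplicates and missing values.
    # The DataFrame and its columns are invented for illustration.
    import pandas as pd

    raw = pd.DataFrame({
        "customer_id": [101, 101, 102, 103, None],
        "purchase_amt": [25.0, 25.0, None, 40.5, 12.0],
        "channel": ["web", "web", "store", None, "web"],
    })

    # Drop exact duplicate records.
    deduped = raw.drop_duplicates()

    # Discard rows missing the key identifier; impute neutral defaults elsewhere.
    cleaned = (
        deduped.dropna(subset=["customer_id"])
               .fillna({"purchase_amt": deduped["purchase_amt"].median(),
                        "channel": "unknown"})
    )

    print(cleaned)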

What could happen if companies focus solely on the quantitative (i.e., "numbers") aspects of big data?

Several wildly successful companies built their business on the numbers, calculations, and algorithms of quantitative data analysis.

Quantitative data analysis is so common that it is rare to interact with a brand that isn't utilizing it. One company that used this type of data analysis to achieve massive success is Netflix.

Quantitative data opens major possibilities for your company. Tools like regression analysis, simulations, and hypothesis testing will show you patterns that you may not have seen otherwise. You can use this information to pinpoint areas where your business can function more optimally.

With mathematical modeling, you'll be able to make decisions with more confidence because the numbers can show you the likelihood of several possible outcomes. Every successful brand should be using quantitative data if it wants to compete in today's marketplace.
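
To make the tools named above a little more concrete, here is a small, self-contained sketch of a regression and a hypothesis test on synthetic numbers using scipy. The variables, sample sizes, and effect sizes are invented for illustration only.

    # Sketch of two common quantitative tools on synthetic data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Regression: does ad spend predict weekly revenue (made-up relationship)?
    ad_spend = rng.uniform(1, 10, size=50)
    revenue = 3.0 * ad_spend + rng.normal(0, 1.5, size=50)
    slope, intercept, r_value, p_value, std_err = stats.linregress(ad_spend, revenue)
    print(f"estimated slope={slope:.2f}, R^2={r_value**2:.2f}, p={p_value:.3g}")

    # Hypothesis test: did a new checkout flow change average checkout time?
    old_flow = rng.normal(120, 15, size=40)
    new_flow = rng.normal(112, 15, size=40)
    t_stat, p_val = stats.ttest_ind(old_flow, new_flow)
    print(f"t={t_stat:.2f}, p={p_val:.3f}")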


This is the second one:

Mohammed Alalion

According to the IBM website, "Big data analytics is the use of advanced analytic techniques against very large, diverse big data sets that include structured, semi-structured and unstructured data, from different sources, and in different sizes from terabytes to zettabytes." Data has value in any business, and several important steps are needed to use the data to improve the business. The first step is to collect it. In many cases it requires huge work to determine the segment and target it. Additionally, the quality of the data that needs to be collected is very important. For example, when running a survey, it is very important to make it short and ensure that the target segment has the time to read it, understand it, and give honest answers.

The other factor is how to process the data for storage. It is very important to work on the data and translate it into electronic form for use when needed, such as transferring it from paper to electronic records and dividing it based on the factors we will need in the future. After processing, the data will be stored based on the importance of the information and how much it is worth. In some cases there are laws governing how to deal with the data and what to do if the data is hacked. For example, the HIPAA law in the health care system governs how medical providers should deal with patients' data and what to do if there is a breach.

The cost of storing the data should be considered before starting the process, because nowadays it is not cheap to keep sensitive information such as patients' medical information. A company shouldn't start collecting data without a complete plan for dealing with it and a clear understanding of the value of the information it is collecting. After all, data is the main factor in any project. For example, during the pandemic, patients' data was divided based on age, health conditions, race, and country. This helped when researchers recruited segments to trial the vaccines and determined which vaccine was more effective against COVID-19.
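
As a rough sketch of that kind of segmentation (not the actual pandemic datasets), the snippet below groups a few synthetic records by age band, condition, and country with pandas. Every column name and value here is made up for illustration.

    # Segmenting records by the factors mentioned above; all data is synthetic.
    import pandas as pd

    patients = pd.DataFrame({
        "age":        [34, 71, 52, 67, 29, 80],
        "condition":  ["none", "diabetes", "asthma", "none", "none", "diabetes"],
        "country":    ["US", "US", "DE", "DE", "BR", "BR"],
        "vaccinated": [True, True, False, True, False, True],
    })

    # Bucket ages, then summarize each segment.
    patients["age_band"] = pd.cut(patients["age"], bins=[0, 40, 65, 120],
                                  labels=["<40", "40-65", "65+"])
    segments = (patients.groupby(["age_band", "condition", "country"], observed=True)
                        .agg(n=("vaccinated", "size"),
                             vaccinated_rate=("vaccinated", "mean")))
    print(segments)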

According to our textbook, the data is divided into two parts. The first is voluminous data, which conventional computing methods are unable to process and manage efficiently. The second is described along five dimensions called the 5 Vs: Volume, Variety, Velocity, Veracity, and Value. Other online resources, such as the article Veracity: The Most Important "V" of Big Data, published on Aug 29, 2019, mention just four dimensions (Volume, Variety, Velocity, Veracity).


Solution

Shubham answered on Mar 13 2022
MANAGEMENT
Table of Contents
Reply 1
Reply 2
References
Reply 1
The quantitative aspect of Big Data has its own set of limitations. Big Data is not suited to complex environments; it performs best in single-format environments. Apart from values, there are other factors which impact the environment. Data quickly becomes obsolete, i.e., data has diminishing value, so outdated data needs to be removed from storage periodically (Herrera Estrada, 2019). Deciding the labels for the data sets is important to consider while using them, because it makes their usage convenient and helps them match with parameters.
The created Big Data standards are getting deregulated due to...
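
As a loose illustration of the retention point in the reply above (removing data whose value has diminished and labeling what remains), here is a minimal sketch. The 90-day window, field names, and records are assumptions made up for the example.

    # Periodic sweep: drop records older than a retention window, label the rest.
    from datetime import datetime, timedelta, timezone

    records = [
        {"id": 1, "captured_at": datetime(2021, 11, 1, tzinfo=timezone.utc)},
        {"id": 2, "captured_at": datetime(2022, 3, 1, tzinfo=timezone.utc)},
    ]

    RETENTION = timedelta(days=90)
    now = datetime(2022, 3, 12, tzinfo=timezone.utc)  # fixed "now" so the example is reproducible

    def sweep(records, now, retention):
        """Keep only records still inside the retention window and label them."""
        kept = []
        for rec in records:
            if now - rec["captured_at"] <= retention:
                kept.append({**rec, "label": "active"})
        return kept

    print(sweep(records, now, RETENTION))  # record 1 is past the window and is dropped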