
ASSIGNMENT: 1

TOPIC: BIG DATA


                                                                                                
HITESH KUMAR

2K15/MC/027

A proper definition of “big data” is difficult to achieve because projects, vendors, developers, and business professionals all use the term differently. With that in mind, generally speaking, big data is:

- a collection of very large datasets, or
- the category of computing strategies and technologies used to handle such datasets,

where a “large dataset” is one too large to reasonably process on a single computer or store with traditional tooling. This means that the common scale of big datasets is constantly shifting and may vary significantly from organization to organization.
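As a minimal sketch of the idea above, the code below shows one way a single machine copes with a dataset too large to hold in memory: processing it in fixed-size chunks. The record layout and the `value` field are hypothetical placeholders, not part of any real system.

```python
# Sketch: summing a field over a dataset chunk by chunk, so that at most
# one chunk of records is held in memory at a time. Field names are
# illustrative assumptions.

def sum_in_chunks(rows, chunk_size=1000):
    """Sum the 'value' field of an iterable of records, chunk by chunk."""
    total = 0
    chunk = []
    for row in rows:
        chunk.append(row)
        if len(chunk) == chunk_size:
            total += sum(r["value"] for r in chunk)
            chunk = []  # release the chunk before reading more rows
    total += sum(r["value"] for r in chunk)  # leftover partial chunk
    return total

# Simulated "large" dataset: a generator produces records lazily instead
# of materializing them all at once.
records = ({"value": i} for i in range(10_000))
print(sum_in_chunks(records))  # 0 + 1 + ... + 9999 = 49995000
```

When even chunked processing on one machine is too slow, the same split-then-combine pattern is what distributed big data tooling applies across many machines.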

Technology has taken over every field today, resulting in enormous data growth, and all of this data is valuable. Vast quantities of new data are generated every day. A single machine cannot store and process such a huge amount of data, hence the need to understand big data and the methods for storing it. Big data is a huge amount of data that cannot be processed by a traditional computer system in a given time frame.

In business, big data is used to better understand customers and their behaviors and preferences. Companies are keen to expand their traditional data sets with social media data, browser logs, text analytics and sensor data to get a more complete picture of their customers.

There are specific attributes that define big data. In most big data circles, these are the seven V’s: volume, variety, velocity, veracity, visibility, validity and variability.

Now, how big does this data need to be? There is a common misconception about the term big data: that there is some size threshold above which data counts as big data, whether gigabytes, terabytes, petabytes, exabytes or even larger. That definition is wrong. Whether data is “big” depends purely on the context in which it is used; even a small amount of data can be referred to as big data. For example, if an email service does not let you attach a file larger than 50 MB, then for that email service, 50 MB is big data.

A real-world example is what goes on in air traffic control. Controllers are the personnel responsible for managing the routes and altitudes of different aircraft. Their main goal is to monitor each aircraft’s speed, altitude, location and so on, and to contact the crew when something goes wrong. They receive huge amounts of data every minute from different aircraft and must make sense of that data in time to avoid any accident. The data is too large and the time constraints too tight; in such conditions traditional techniques fail to deliver results, and something more powerful is required.
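The screening step described above can be sketched as a stream check: as each reading arrives it is compared against the latest known state of the other aircraft. The field names, identifiers, and the 1000 ft threshold are illustrative assumptions, not real air traffic control rules.

```python
# Hypothetical sketch: flagging pairs of aircraft whose reported altitudes
# are dangerously close, as readings arrive one at a time. The threshold
# and record layout are assumptions for illustration only.

MIN_SEPARATION_FT = 1000  # assumed vertical separation threshold

def flag_conflicts(readings):
    """Yield (new_aircraft, other_aircraft) pairs that violate separation."""
    last_seen = {}  # aircraft id -> most recently reported altitude
    for r in readings:
        for other_id, other_alt in last_seen.items():
            if other_id != r["id"] and abs(other_alt - r["altitude"]) < MIN_SEPARATION_FT:
                yield (r["id"], other_id)
        last_seen[r["id"]] = r["altitude"]

stream = [
    {"id": "AC1", "altitude": 30000},
    {"id": "AC2", "altitude": 35000},
    {"id": "AC3", "altitude": 30400},  # within 1000 ft of AC1
]
print(list(flag_conflicts(stream)))  # [('AC3', 'AC1')]
```

Because the check runs per incoming reading rather than over the whole dataset at once, it reflects the velocity constraint in the example: answers are needed as the data arrives, not after it has all been collected.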

That is why big data analytics technology is so important to health care. By quickly analyzing large amounts of information, both structured and unstructured, health care providers can offer lifesaving diagnoses or treatment options almost immediately.