
Showing posts from March, 2021

Characteristics of Big Data Analysis

  As mentioned before, Gartner refers to the 3 Vs when describing Big Data. With millions of different sources and structures available to organisations, the variety of big data is extensive. The drastic increase in available big data creates an unimaginable volume of information to be collected, stored, processed and analysed. The velocity at which data is generated and used in real time is only increasing, which creates exciting possibilities for the future of big data. However, it also means new analysis techniques have had to be created to handle Big Data. One typical characteristic of big data analysis is visualising the data in a way that is easier to understand and process, for example through graphics and eye-catching images that highlight important trends and facts. Another characteristic of big data analysis is artificial intelligence and machine learning, which transfers the task of seeking trends and patterns...
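To make the idea of automated trend-seeking a little more concrete, here is a minimal sketch (plain Python, with made-up daily figures) of the simplest kind of pattern detection: smoothing a noisy series with a moving average so that an underlying upward trend stands out, rather than a person having to eyeball the raw numbers.

```python
# A minimal sketch of automated trend-seeking: a moving average
# smooths noisy values so a rising trend becomes obvious.
# The daily_visits figures below are invented for illustration.

def moving_average(values, window):
    """Return the average of each sliding window over the series."""
    return [
        sum(values[i:i + window]) / window
        for i in range(len(values) - window + 1)
    ]

daily_visits = [100, 98, 105, 110, 120, 118, 130, 140]  # hypothetical data
smoothed = moving_average(daily_visits, window=3)

# The raw series dips up and down, but the smoothed series
# rises steadily, revealing the upward trend.
print(smoothed)
```

A real big data pipeline would apply far more sophisticated statistical and machine learning models, but the principle, letting software surface the pattern, is the same.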

Limitations of Traditional Data Analysis

  As reliable as they have been, conventional data analysis systems struggle with the sheer amount of data generated in recent decades, with the limits of computer storage and memory, and with the complexity of some of that data. The costs associated with these issues also limit traditional analysis techniques. Furthermore, big data is normally stored and processed across multiple computer systems and requires a great deal of computing power on a shared network, whereas traditional data analysis is typically carried out on a single computer system.

References: Palak Kumar, Advantages and Limitations of Data Analytics, Sigma Magic Analysis Software (https://www.sigmamagic.com/blogs/analytics-advantages-and-limitations/)

Traditional Data Analysis

  Traditional data analysis refers to the original methods used to process data, which first became popular in the 1970s and 80s. These include relational databases and data centres constructed to store and organise structured data. Structured data (i.e. data and information that has a clear definition and can be easily browsed) is typically easier to store and process, whereas modern data analysis techniques deal with a great deal of unstructured data and require various filtering techniques.

References: Priya Chetty, Difference Between Traditional Data and Big Data, projectguru.in (https://www.projectguru.in/difference-traditional-data-big-data/)
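The difference is easiest to see side by side. The sketch below (plain Python, with invented records) recovers the same payment figures twice: first from structured CSV data, where the schema lets fields be read directly, and then from unstructured free text, which needs a filtering step (here a regular expression) before anything useful comes out.

```python
import csv
import io
import re

# Structured data: a fixed schema means fields can be read by name.
structured = "customer_id,amount\n42,19.99\n43,5.00\n"
rows = list(csv.DictReader(io.StringIO(structured)))
total = sum(float(r["amount"]) for r in rows)  # direct field access

# Unstructured data: the same facts buried in free text, which must
# be filtered (a regex here) before the figures can be extracted.
unstructured = "Customer 42 paid 19.99 today; customer 43 sent 5.00 later."
amounts = [float(m) for m in re.findall(r"\d+\.\d{2}", unstructured)]

print(total, sum(amounts))  # both routes recover the same figures
```

At big data scale the "unstructured" column of this comparison dominates, which is why modern techniques invest so heavily in filtering and extraction.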

The Value of Data

  According to the Global Partnership for Sustainable Development Data, "Five of the top six companies in the world by market valuation are data companies." Not only is data itself growing rapidly, but so are the market for data and its value. It is difficult to place even an estimate on the value of all the data available in the world, because data or information belonging to a single individual is not very useful or valuable on its own. It is data belonging to a large group of people, in which trends and patterns can be noticed, that is extremely valuable, and its worth varies depending on whose data it is and what it can be used for. The fact that governments around the world pay hackers or partners millions of pounds to collect and process data is a good illustration of how much data is worth today. The value of Big Data will only increase as new innovations and developments are made in the fields of Big Data and Software Deve...

Reasons for the Growth of Big Data

  One explanation for the exponential growth of Big Data is that storing data has become far cheaper. For example, according to the Simplicable Business Guide, the average hard drive capacity rose from 1 GB to 1 TB between 2000 and 2010. Another reason is the constant transformation of the software development field. New developments like NoSQL and Hadoop allow users to process the vast amounts of unstructured data that become available, such as content on social media and audio and video files.

References: Anna Mar, The 10 Reasons for the Rise of Big Data, arch.simplicable.com (https://arch.simplicable.com/arch/new/the-10-reasons-for-the-rise-of-big-data)
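To give a flavour of the style of processing Hadoop popularised, here is a toy MapReduce-style word count in plain Python. The posts are invented stand-ins for social media content, and this is only a single-machine sketch of the idea; real Hadoop distributes the map and reduce phases across many machines.

```python
from collections import Counter
from itertools import chain

# Invented stand-ins for unstructured social media posts.
posts = [
    "big data is growing",
    "data storage is cheap",
    "big data needs new tools",
]

def map_phase(text):
    # Map: turn each chunk of text into (word, 1) pairs.
    return [(word, 1) for word in text.split()]

def reduce_phase(pairs):
    # Reduce: sum the counts for each word.
    counts = Counter()
    for word, n in pairs:
        counts[word] += n
    return counts

counts = reduce_phase(chain.from_iterable(map_phase(p) for p in posts))
print(counts["data"])  # "data" appears once in each of the three posts
```

Because each map call is independent and the reduce step only merges counts, the same logic scales out naturally when the posts number in the billions, which is exactly the property that made frameworks like Hadoop useful for unstructured data.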

The Growth of Big Data

  The average person processes and utilises millions of bytes of data per day, and the internet transmits and processes trillions upon trillions of bytes per minute. All of this data allows companies and organisations to make important decisions after analysing the trends and patterns within it. Yet the data companies actually analyse is only a small fraction of what is available to them: according to Inc, only 27% of the data made available to enterprises is ever analysed. One particular example of the growth of big data that captured my attention was that "90% of the world's data has been created in the last 2 years alone", according to the Big Data Made Simple blog. This is an eye-opening statistic that clearly shows the rapid exponential growth of Big Data, and is an overwhelming insight into its future.

References: Chad Pollitt, Big Data Made Simple - kunocreative.com (https://www.kunocreative.com/blog/bid/76907/big-...

The Historical Development of Big Data

  The first instance of data collection on a larger scale dates back to 1965, when the first ever data centre was constructed by the American government to keep records of fingerprints and tax returns. The relational database was created and developed in the 1970s, and since then people within the field of data have been pursuing more efficient and effective ways of collecting, storing and analysing vast amounts of (mostly digital) information. In 1977, the development of big data sky-rocketed after the invention of the microcomputer. The microcomputer made technological data more personal: each computer or device belonging to an individual in their home provided a variety of personal information, and later required huge amounts of data to be transmitted across the globe when the world wide web was created in 1989. During the technology age, and specifically the last two decades, a variety of new technologies and applications have been develop...

What is Big Data?

Manageable amounts of data are normally processed or analysed in a number of different ways, including mechanical, manual or electronic processing. Big Data is a field that deals with amounts of data so large that new methods had to be created to store and process the information, improving on conventional processing techniques. Some modern examples of information that forms Big Data and requires these improved techniques include the shopping patterns and trends of consumers, social media platforms handling billions of users' content and personal information, and health care records belonging to hospital patients. According to Gartner (2001): "Big Data is data that contains greater variety arriving in increasing volumes and with ever-higher velocity."

References: What Is Big Data? | Oracle United Kingdom, Oracle.com (https://www.oracle.com/uk/big-data/what-is-big-data/)