Organisation of the Chapter

The chapter is organised into the following parts: the first section introduces the ZAMBiDM and states the problem it addresses. The Literature Review section discusses various Big Data technologies employed in industry. The third section introduces the proposed ZAMBiDM model and explains the functions of its system components. The fourth section discusses the implementation of the model. The last section concludes by summarising the ZAMBiDM's operations and highlighting the benefits of envisaging such a model.

LITERATURE REVIEW

The emergence, handling and benefits of Big Data have been discussed by other scholars. This work examines Big Data with a focus on four areas: management, strategies, analytics and virtualisation.

Big Data

Philip Russom (2013) characterised Big Data as very large data sets that take several forms: structured, which includes relational data; unstructured, which involves human-language text; semi-structured, which comprises RFID and XML; and streaming, which includes data from machines, sensors, Web applications and social media. Russom further cited examples, such as insurance companies that processed unstructured Big Data using Natural Language Processing technologies in the form of text analytics, where the output fed into older applications for risk, fraud or actuarial calculations that benefited from large data samples. A second example was Big Data streaming from sensors, which enabled companies to manage mobile assets, deliver products, identify non-compliant operations, spot vehicles needing maintenance and assure quality in manufacturing, to cite a few.
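
As a rough illustration of these four data forms, the short Python sketch below builds one toy record of each kind. The field names, values and the sensor generator are the author's own hypothetical examples and are not drawn from Russom's work.

# Illustrative sketch (hypothetical records): the four data forms Russom
# distinguishes -- structured, semi-structured, unstructured and streaming.
import time
import xml.etree.ElementTree as ET

# Structured: a relational-style row with a fixed schema.
policy_row = {"policy_id": 1001, "holder": "J. Banda", "premium": 350.00}

# Semi-structured: an XML fragment such as an RFID read event.
rfid_xml = "<read><tag>E200-3412</tag><location>Depot-7</location></read>"
rfid_event = {child.tag: child.text for child in ET.fromstring(rfid_xml)}

# Unstructured: free human-language text, e.g. an insurance claim note.
claim_note = "Vehicle skidded on a wet road near Kafue; minor damage to bumper."

# Streaming: a generator standing in for a continuous sensor feed.
def sensor_stream(n_readings=3):
    for i in range(n_readings):
        yield {"sensor_id": "S-12", "reading": 20.0 + i, "ts": time.time()}

if __name__ == "__main__":
    print("structured:", policy_row)
    print("semi-structured:", rfid_event)
    print("unstructured:", claim_note[:40], "...")
    for event in sensor_stream():
        print("streaming:", event)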

An Oracle White Paper (2014) defined Big Data as a concept people apply to describe data sets whose size is beyond the capability of commonly used software tools to capture, manage and process. The Report stressed that the sheer size of the data, combined with the complexity of analysis and the commercial imperative to create value from it, had led to a new class of technologies and tools to tackle it. The Report further narrated that the concept tended to be used in multiple ways, often referring both to the type of data being managed and to the technology used to store and process it. The Report cited some companies where such technologies originated, such as Google, Amazon, Facebook and LinkedIn, to name a few, where they were developed to analyse the massive amounts of social media data those companies were dealing with. The Report summed up the description of Big Data by expressing that the concept was described by four (4) V's: first, volume, which classified the size of the data as larger quantities than the organisation had ever encountered; second, velocity, the rate at which the data was being generated and processed; third, variety, which expressed the syntax and semantics that determined the extent to which data could be reliably structured into a relational database and its content exposed for analysis; and fourth, value, the extent to which the commercial value of the data could be predicted ahead of time so that ROI could be calculated and the project budget secured by the company.
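
To make three of these dimensions concrete, the following minimal Python sketch profiles a small batch of records for volume, velocity and variety. The function name, the sample records and the schema-counting heuristic are illustrative assumptions rather than anything prescribed by the Oracle paper, and value is omitted because it is a business judgement rather than a direct measurement.

# Hedged sketch (author's own illustration): rough profiling of a record
# batch against three of the four V's described above.
import json

def profile_batch(records, elapsed_seconds):
    """Estimate volume, velocity and variety for a list of dict records."""
    # Volume: total serialised size of the batch in bytes.
    volume_bytes = sum(len(json.dumps(r).encode("utf-8")) for r in records)
    # Velocity: records arriving per second over the observed interval.
    velocity_rps = len(records) / elapsed_seconds if elapsed_seconds else 0.0
    # Variety: approximated as the number of distinct field sets observed.
    variety = len({tuple(sorted(r.keys())) for r in records})
    return {"volume_bytes": volume_bytes,
            "velocity_records_per_sec": velocity_rps,
            "distinct_schemas": variety}

if __name__ == "__main__":
    batch = [
        {"sensor_id": "S-1", "reading": 21.4},
        {"sensor_id": "S-2", "reading": 19.8, "unit": "C"},
        {"user": "j.banda", "post": "Service was quick today."},
    ]
    print(profile_batch(batch, elapsed_seconds=0.5))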

 