Wednesday 29 August 2018

This Is The Right Time To Standardise Big Data Technologies

Is it time to standardise big data technologies? Data that was once siloed, inaccessible, and largely underutilised has become crucial to enterprise success, and experts say there is still considerable room to improve interoperability between the available tools. But how does one define ‘standardisation’? Northeastern University’s College of Computer and Information Science defines a “standard” as a formal agreement on the meaning of a collection of concepts among groups, in this case technology companies and enterprises.

According to a report from ISO (a standards development organisation like the IEEE), the big data paradigm consists of distributing data systems across horizontally coupled, independent resources to achieve the scalability needed to process extensive data sets.

One of the first pieces of documentation on big data standardisation came from the International Telecommunication Union (ITU), which defined big data as “a paradigm for collection, storage, management, analysis and visualisation of extensive datasets with heterogeneous characteristics, that include high-volume, high-velocity and high-variety.” ISO defined big data engineering as storage and data manipulation technologies that leverage a collection of horizontally coupled resources to achieve nearly linear scalability in performance.
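To make the “horizontally coupled resources” idea concrete, here is a minimal, purely illustrative Python sketch (not drawn from the ISO or ITU documents): a data set is split into shards that independent worker processes analyse in parallel, so adding workers increases throughput roughly linearly, which is the near-linear scalability the ISO definition refers to.

    # Illustrative only: "horizontally coupled" processing spreads a large
    # data set across independent workers that each analyse their own shard.
    from multiprocessing import Pool

    def count_even(shard):
        """Process one shard independently of the others (stand-in for real analysis)."""
        return sum(1 for record in shard if record % 2 == 0)

    if __name__ == "__main__":
        data = list(range(1_000_000))        # stand-in for an extensive data set
        n_workers = 4                        # independent, horizontally coupled resources
        shard_size = len(data) // n_workers
        shards = [data[i:i + shard_size] for i in range(0, len(data), shard_size)]

        with Pool(n_workers) as pool:
            partial_results = pool.map(count_even, shards)  # each worker handles its shard

        print(sum(partial_results))          # combine the per-shard results

In this toy setup, doubling n_workers roughly halves the wall-clock time, provided the shards can be processed independently.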

Standardisation Can Increase Interoperability Between Tools

According to a recent IDC study, we are fast approaching the data age, with the global datasphere growing to 163 ZB by 2025, ten times the 16.1 ZB of data generated in 2016. IDC also stresses that around 20% of the data in the world will be critical to our daily lives by 2025. As data and data-related tools proliferate, how can business leaders determine which data matters most?

In the last decade, there has been a spurt in the number of big data and data analytics tools developed by different organisations, but as Jim Melton, editor of ISO/IEC 9075 (SQL), pointed out to ISO, these tools lack interoperability. According to Dr Klaus-Peter Eckert, an expert on interoperability in distributed systems, all the technical building blocks are present, but the key piece missing is interoperability. Eckert told ISO: “There is a distinct lack of an architecture which can combine the many components of big data solutions. This is where standards will come into play.” It is widely believed that the development of standards will pave the way for interoperability.


Article Credit: Analytics India





source http://news.statii.co.uk/this-is-the-right-time-to-standardise-big-data-technologies/
