The final eulogy for ‘Man is the Master of the Machine’ has been written.
In the movie Terminator 3, machines take over the Earth, until the very end, when the machine itself has a change of heart. However ominous those signs may be, what is undeniable is that the age of machines is upon us.
From mere input mongers, machines have moved on to making sense of mountains of data: cataloguing it, analysing it, and delivering a seemingly analogous interpretation of it. The machine has become the new indispensable smartphone of today’s enterprise. Within this paradigm, the original input feeder, man, is now relegated to building strategies on top of the results the machine spews out. The shift from Man-to-Machine to Machine-to-Machine is here to stay.
The component that has built itself into an indispensable position in this entire equation is the Data Center. Not the legacy colocation version, but the new-age, intelligent data player that computes, stores, and analyses, and houses the Cloud and applications within itself. One that is intelligent and elastic enough to accommodate the growing data demands of tech-dictated enterprises.
The Data Center, named rather insipidly after the very reason for its existence, is now a chameleon in the entire IT structure. In some cases, it is the eventual residency point for the data. In others, it is the starting point of information that has been decoded and awaits a decision. And in between, it is the binding agent in an ever-spawning network.
Come to think of it: what if it were removed from the equation? Or, perhaps more benevolently, scaled down to just a rudimentary Data Center? Before we answer that question, here’s an analogy.
Imagine if all your data were not retrievable one not-so-fine morning. Would we see a repeat of the dark ages? Perhaps so. It is therefore no exaggeration when Data is referred to as the new economy.
So, what scale of data is the world heading towards? Here’s a curtain raiser.
Shantanu Gupta, director of Connected Intelligent Solutions at Intel, introduces the next-generation prefixes for going beyond the yottabyte: the brontobyte and the gegobyte.
A brontobyte, which isn’t an official SI prefix but is apparently recognized by some in the measurement community, is a 1 followed by 27 zeros. Gupta uses it to describe the type of sensor data we’ll get from the internet of things. From there, a gegobyte (10 to the power of 30) is just a short distance away.
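To put these prefixes in perspective, here is a small illustrative sketch laying out the ladder of data-volume units as powers of ten; the official SI prefixes stop at the yottabyte, and “brontobyte” and “gegobyte” are informal proposals, as the article notes.

```python
# Illustrative only: data-volume prefixes as powers of 10.
# "Brontobyte" and "gegobyte" are informal, not official SI prefixes.
PREFIXES = {
    "terabyte": 10**12,
    "petabyte": 10**15,
    "exabyte": 10**18,
    "zettabyte": 10**21,
    "yottabyte": 10**24,   # last official SI prefix (yotta-)
    "brontobyte": 10**27,  # informal: a 1 followed by 27 zeros
    "gegobyte": 10**30,    # informal: the proposed next step
}

for name, size in PREFIXES.items():
    # the exponent is the number of zeros after the leading 1
    print(f"1 {name} = 10^{len(str(size)) - 1} bytes")
```

Each step up the ladder is a factor of a thousand, which is why sensor data from the internet of things can leap so quickly from one prefix to the next.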
Now imagine the computational strength required to make sense of this volume. Companies will hence need to have a future-proof strategy in place for collecting, organizing, cleansing, storing, and securing data – and for applying analytics to derive real-time insights to transform their businesses.
A story in Information Management highlights “Big Data Analytics: The Currency of the 21st Century Enterprise.” Quite an interesting read. The gist of the argument: Personal data has an economic value that can be bought, sold, and traded.
Emerging technologies are driving transformation within organizations. The year 2019 will see Artificial Intelligence (AI) and Machine Learning (ML) driving change in enterprises. We already see numerous use cases of these emerging technologies in industries such as BFSI, healthcare, telecom, manufacturing, and home automation. These technologies can mine data for real-time insights about the business and offer timely solutions or corrective action, often without human intervention. AI-backed automation and predictive analytics will help predict challenges before they arise; they will streamline operations, save costs, enhance customer experience, and perform repetitive tasks. While the adoption of ML technologies will lead to exponential growth of enterprise data, the accuracy of the outputs is a function of the sanctity of the inputs.
That calls for a trustworthy data center partner, not only to store the data but also to analyze and manage it. The ideal data center partner should do both — cater to current requirements and also adapt to the changing IT landscape.
According to a Frost & Sullivan report, the APAC data center services market will grow at a compound annual growth rate (CAGR) of 14.7% from 2015 to 2022, reaching US$31.95 billion by the end of 2022. Specifically, the India data center market is expected to reach approximately $4 billion by 2024, growing at a CAGR of around 9% during 2018-2024. Major cities such as Mumbai, Bangalore, and Hyderabad are witnessing heavy investment by local and international operators in the Indian market. The increasing construction of hyperscale facilities with power capacities of over 50 MW will fuel the need for innovative infrastructure in the market over the next few years.
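To make the compounding concrete, here is a short arithmetic sketch of how CAGR works. The 2015 base value is not given in the report as quoted, so the back-calculated figure below is illustrative, derived purely from the 14.7% rate and the US$31.95bn endpoint.

```python
# Illustrative arithmetic: how a CAGR relates start value, end value, and years.
def cagr(start_value, end_value, years):
    """Compound annual growth rate over the given number of years."""
    return (end_value / start_value) ** (1 / years) - 1

# Working backwards: what 2015 base grows to US$31.95bn by 2022 at 14.7% CAGR?
end_2022 = 31.95  # US$ billion, per the Frost & Sullivan figure
years = 7         # 2015 to 2022
implied_base_2015 = end_2022 / (1 + 0.147) ** years
print(f"Implied 2015 base: US${implied_base_2015:.2f}bn")  # roughly US$12.2bn
```

The same formula explains why the Indian market’s ~9% CAGR over 2018-2024 is enough to roughly halve the time it takes the market to add its next billion dollars.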
A recent IBM study of 500 international data centers surfaced key insights into what constitutes a well-thought-out Data Center strategy, and what ticks the right boxes for an enterprise when selecting a DC partner.
It is therefore evident that the Data Center should be built to solve a business problem, both current and future; it should have the flexibility to adapt to changing demands and be agile enough to accommodate newer dynamics of the business. The paradox is that as the data center grows, the density of the data within it will also expand, and all of this on hardware that will significantly shrink. Computing power therefore becomes the differentiator, and will help negate any pushback that volume brings.
It is not lost on DC players that security is the other differentiator. If this data falls into the wrong hands, it could create havoc, resulting in million-dollar losses for corporations and damaging the credibility of trusted institutions entrusted with sensitive consumer data. Here are two recent incidents.
- In January 2019, the HIV-positive status of 14,200 people in Singapore was leaked online. Details including identification numbers, contact details, and addresses were available in the public domain.
- In December 2018, a cyber-attack exposed the records of 500 million guests of the hotel giant Marriott International. The attack occurred over a period of four years and was traced back to a Chinese spy agency.
The emphasis on security and compliance is even stronger now with the European Union’s General Data Protection Regulation (GDPR). In fact, GDPR is hailed as one of the most critical pivots in data privacy rules in the past two decades. It is going to fundamentally change how data is handled, stored, and processed.
Given the geography-agnostic nature of such attacks, Indian IT companies too are wary of an impending attack. The Government-steered Personal Data Protection Bill mandates stringent rules for security, customer consent, data privacy, and data localization. Indian businesses will need to realign their data center strategies to comply with this Bill, which could eventually become law. It will push business leaders to rethink identity and access security, encryption, data systems and application security, cloud security, and DDoS protection, among other things. And that is where machine-to-machine will score higher. Little wonder that CIOs favour automating all, or at least a majority, of the work chain.
Machine-to-machine allows for predictable, systemic patterns, enabling hyperscale computing, deep-dive analytics, trend spotting, vulnerability recognition and elimination, risk mitigation, and even alternate computing, without the vulnerabilities of man-to-machine instructions. The choice in front of the CIO, therefore, is to go with a service provider, an SI or an IT architect, who has provisioned the entire landscape and can hence implement machine-derived, predictably automated results.
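To illustrate the kind of loop this describes, here is a minimal, hypothetical sketch of machine-to-machine monitoring: one system flags anomalous metrics so that another can act on them with no human in the loop. The function name, window size, and traffic figures are all invented for illustration; real vulnerability-recognition systems are far more sophisticated than this simple z-score check.

```python
# Hypothetical sketch: automated anomaly flagging, machine to machine.
from statistics import mean, stdev

def flag_anomalies(readings, window=5, threshold=3.0):
    """Flag readings more than `threshold` standard deviations
    away from the trailing window's mean (a simple z-score check)."""
    flagged = []
    for i in range(window, len(readings)):
        history = readings[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma and abs(readings[i] - mu) > threshold * sigma:
            flagged.append(i)
    return flagged

# e.g. a sudden traffic spike that a DDoS-mitigation system
# might throttle automatically, without a human in the loop
traffic = [100, 102, 99, 101, 100, 103, 98, 500, 101, 100]
print(flag_anomalies(traffic))  # → [7]: the spike at index 7
```

The point of the sketch is the division of labour: the detecting machine emits a signal, and a downstream machine consumes it directly, which is exactly the predictability that man-to-machine hand-offs lack.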
Does this mean the end of human thinking? Quite the contrary: it all started because of human thinking.
M P Vijay Kumar
Source: Sify Technologies