Artificial Intelligence: A Definition for Colocation Providers


If you keep track of industry trends at all, then I bet your newsfeed has been filled with exciting stories and bold predictions about artificial intelligence (AI), machine learning (ML), and neural networks. With hyperbolic headlines such as “How Artificial Intelligence Will Self-manage the Data Center” and “Is 2018 When Machines Take Over?”, I’m sure many people are mentally rolling their eyes as they click to the next story. With any new trend, of course, there’s hype, confusion, and misleading claims, and companies sometimes want to grab on to a trend and claim it for their own before things are fully baked. But this doesn’t mean there isn’t substance behind all the talk. I believe in the power of AI to make data centers better. Here in the Data Center Science Center, we believe AI’s biggest impact for colo data centers will be on reliability and less so on efficiency. We also believe it will take a bit longer than some might think before AI’s value is really felt; there are some key challenges to overcome before this trend hits its stride. I will address those challenges in my next blog on this subject.

What is AI?

Artificial Intelligence (AI) and Machine Learning (ML) are two terms often used interchangeably, as if they were synonyms. AI refers generally to the concept that a machine or system can be “smart” in carrying out tasks and operations based on programming and the input of data about itself or its environment. ML, on the other hand, is an approach or method for making a machine or system more intelligent: enabling it to be more autonomous and self-adjusting as conditions change. ML is fundamentally the ability of a machine or system to automatically learn and improve its operation or functions without being explicitly programmed for each condition. ML can be thought of as the current state-of-the-art approach to imbuing a machine with AI.
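To make that distinction concrete, here is a minimal sketch of the ML idea in Python (using scikit-learn): instead of hand-coding a rule, the system fits a model to historical data and improves as new data arrives. The power-to-temperature framing and the readings themselves are hypothetical, invented purely for illustration.

```python
# Minimal sketch of machine learning vs. explicit programming.
# Hypothetical example: predicting server-inlet temperature from
# rack power draw, instead of hard-coding a fixed rule.
import numpy as np
from sklearn.linear_model import LinearRegression

# Historical (hypothetical) data: rack power in kW -> inlet temp in C
power_kw = np.array([[2.0], [4.0], [6.0], [8.0], [10.0]])
inlet_temp_c = np.array([21.0, 22.5, 24.1, 25.4, 27.0])

model = LinearRegression()
model.fit(power_kw, inlet_temp_c)          # the "learning" step

# The model now generalizes to conditions it was never explicitly
# programmed for; retraining on new data is how it "improves".
print(model.predict(np.array([[7.0]])))    # e.g. ~24.7 C
```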

One technique for implementing ML that is said to be driving much of the current advancement in AI is Deep Learning (DL). DL is a much more compute-intensive form of ML. Deep Learning, also known as deep structured learning or hierarchical learning, involves the algorithmic analysis of vast numbers of data points over multiple levels, where the output of one level is fed to the next in succession. This layered framework is often referred to as an artificial “neural network” because of its intended resemblance to the neural networks of the human brain. This approach reduces error and speeds up the learning process.
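As a rough illustration of that “output of one level feeds the next” idea, here is a minimal sketch of a feedforward neural network in plain Python/NumPy. The layer sizes and input are made up for the example, and the training loop that real deep learning frameworks provide is omitted; this only shows the layered forward pass.

```python
# Minimal sketch of the layered structure of a neural network:
# each layer transforms its input and passes the result to the next.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

# Three layers of (weights, biases); sizes are arbitrary for illustration.
layers = [
    (rng.standard_normal((4, 8)), np.zeros(8)),   # input -> hidden 1
    (rng.standard_normal((8, 8)), np.zeros(8)),   # hidden 1 -> hidden 2
    (rng.standard_normal((8, 1)), np.zeros(1)),   # hidden 2 -> output
]

def forward(x, layers):
    # The defining "deep" property: the output of one level is fed
    # to the next in succession.
    for w, b in layers:
        x = relu(x @ w + b)
    return x

sample = rng.standard_normal(4)   # one hypothetical 4-feature data point
print(forward(sample, layers))
```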

Artificial intelligence represents a very broad spectrum of capabilities. Mechanical system controls using PLCs and automation servers, which have existed for years, are a form of artificial intelligence, for example. But that’s not what most people are talking about when they use the term today. Today it’s about using machine learning algorithms, deep learning, and neural networks to automate operations in increasingly self-sufficient, reliable, efficient, and adaptive ways, even as the environment changes in real time.
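To show where those two ends of the spectrum differ, here is a hedged sketch: the first function is a fixed, hand-programmed PLC-style rule; the second adjusts its threshold from recent data (a crude stand-in for a learned policy, not a real ML controller). The function names, readings, and thresholds are all hypothetical.

```python
# Two points on the AI spectrum, sketched for illustration only;
# names and thresholds are hypothetical.

# 1) PLC-style automation: a fixed rule, hand-programmed once.
def plc_fan_control(inlet_temp_c: float) -> str:
    return "fan_on" if inlet_temp_c > 25.0 else "fan_off"

# 2) Adaptive control (crude stand-in for a learned policy):
#    the threshold tracks recent conditions instead of being fixed.
def adaptive_fan_control(inlet_temp_c: float, history: list) -> str:
    history.append(inlet_temp_c)
    recent = history[-100:]                       # rolling window
    threshold = sum(recent) / len(recent) + 1.5   # adapts as conditions drift
    return "fan_on" if inlet_temp_c > threshold else "fan_off"

history = []
for reading in [22.0, 23.5, 26.0, 27.5]:
    print(plc_fan_control(reading), adaptive_fan_control(reading, history))
```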

Technology Advances Strengthen the Foundation for AI

These methods of developing AI – machine learning, deep learning, neural networks, etc. – have been around for years, but technology limitations held back their progress. In the last few years, IoT, Big Data, and the availability of graphics processing units (GPUs) have greatly accelerated AI’s development and application. More advanced artificial intelligence depends, in part, on massive amounts of related, correlated data from which algorithms are developed and then used to enable machines to learn and make decisions. The amount of device and environmental data has exploded thanks to decreasing costs for sensors, network connectivity, storage, and bandwidth. The growth of Big Data analysis means this rich data can be handled and its value extracted more quickly, and with fewer resources, than in the past. Furthermore, processing this data in real time with a low error rate requires powerful parallel processing capabilities that are now possible with today’s GPUs. These trends have created a technological foundation for applying AI in data centers.

Current machine learning-based AI methods are very strong at doing two fundamental things:

  1. Recognizing patterns in very large and well-labeled datasets – image recognition and natural language processing, for example (see the sketch after this list).
  2. Automating current processes and services that require data for decision making – services, maintenance, and hardware replacement, for example.
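
As a toy illustration of the first strength, pattern recognition over labeled data, here is a minimal sketch of a classifier trained on hypothetical, labeled telemetry. The features, labels, and fault framing are invented purely for the example.

```python
# Toy sketch of strength #1: recognizing patterns in labeled data.
# Hypothetical labeled telemetry: [fan_speed_pct, temp_c] -> fault? (0/1)
from sklearn.ensemble import RandomForestClassifier

X = [
    [40, 22.0], [45, 23.1], [50, 24.0],   # normal operation
    [90, 31.5], [95, 33.0], [85, 30.2],   # labeled fault conditions
]
y = [0, 0, 0, 1, 1, 1]

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X, y)

# With enough well-labeled examples, the model flags similar patterns
# in new readings -- the essence of ML-based pattern recognition.
print(clf.predict([[88, 30.8]]))   # likely [1] (fault-like pattern)
```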

But there are some key data-related challenges that must be addressed before machine learning-based AI is widely developed in the industry and adopted by colocation providers. My next blog will explain what they are.

AI Will Improve Performance for Colocation Providers

Nonetheless, I’m confident the industry will solve these challenges, and in many ways, they are being addressed now. Colocation data centers, of course, strive to be efficient with resources and quick to deploy new capacity, all without compromising availability for their tenants. Data analytics, and increasingly AI, will become tools for providers to incrementally improve their performance on these points. Read this other blog that speaks to Schneider’s early understanding and predictions of AI.


Conversation

  • rajdutt sharma

    6 years ago

    Nice blog. But AI and deep learning will make tasks easier to execute, and we will see much better results in the coming days.

    • Patrick Donovan

      6 years ago

      I agree AI will simplify operations, but at least in the nearer term, it will provide incremental value. It won’t make a Tier II data center a Tier IV-level data center, for example… it won’t replace the need to follow the fundamental best practices for power and cooling infrastructure and operations.

  • Kaplan and Haenlein define artificial intelligence as a “system’s ability to correctly interpret external data, to learn from such data, and to use those learnings to achieve specific goals and tasks through flexible adaptation”. What do you think? In their article (Kaplan, Andreas; Haenlein, Michael (2018), “Siri, Siri, in my Hand: Who’s the Fairest in the Land? On the Interpretations, Illustrations and Implications of Artificial Intelligence,” Business Horizons), they furthermore analyze how AI is different from related concepts, such as the Internet of Things and big data, and suggest that AI is not one monolithic term but instead needs to be seen in a more nuanced way. This can either be achieved by looking at AI through the lens of evolutionary stages (artificial narrow intelligence, artificial general intelligence, and artificial super intelligence) or by focusing on different types of AI systems (analytical AI, human-inspired AI, and humanized AI). Based on this classification, they show the potential and risk of AI using a series of case studies regarding universities, corporations, and governments. Finally, they present a framework that helps organizations think about the internal and external implications of AI, which they label the Three C Model of Confidence, Change, and Control.

    • Patrick Donovan

      6 years ago

      I agree… there are many different interpretations of what exactly is meant by AI. It is a very broad term, and what it is, is evolving quickly over time. It is an exciting time for this technology.
