Why Market Collaboration is Imperative in the Hyperscale Data Center Era


Hyperscale capex hit a record $27B in Q1, an 80 percent increase year-over-year. Investment at that level is reshaping the complicated web of players in this growing data center market. We explored the changing role of the “value chain” when delivering at scale during a recent 7×24 Exchange panel session. The overarching takeaway: collaboration is imperative for success.

7×24 is a not-for-profit organization promoting industry dialogue around the many challenges facing those who design, build, operate and maintain mission critical infrastructures. 7×24 Exchange allows members to work together to advance the state-of-the-art by sharing best practices, lessons learned and evolving strategies to address the challenges of reliability and energy efficiency.

The panel included Addam Friedl (kW Mission Critical), Bill Mazzetti (Rosendin Electric), Greg Botteon (Whiting-Turner Contracting) and Nathan Hazelwood (QTS) — representing what we call the data center value chain — design engineers, construction managers, electrical contractors, manufacturers and end users.

https://www.youtube.com/watch?v=b_yGjfTBHHM

 

A look at the changing characteristics of data centers

The changing business model, customer requirements and purchasing behavior over the years have led up to the changes we’re seeing today in the value chain. From 2000-2005 — the days of build it and they will come — we saw the prevalence of enterprise and retail colocation data centers. Top requirements were reliability, speed-to-market and consolidation. For the most part, purchasing was made through general contractors.

In 2010, the size of data centers increased to 5-10 MW, enterprise and retail colocation data centers were holding their own, and the wholesale real estate market was gaining legs.

The emphasis remained on reliability and speed, while cost became more of a focus. Direct purchasing increased, but the trades were still very much involved.

The cloud wars heated up in 2015 with a huge shift away from individual companies building data centers. The emphasis on speed and cost grew, and the needs for scale and standardization emerged. Direct purchasing started to become the norm.

Today, in the hyperscale era, we see a rise in cloud, edge and wholesale deployments, with standardization, scale and speed to market at lower cost as the key drivers. More subsystems than ever are being procured directly by end users.

How have market changes affected the value chain in the hyperscale market?

Unlike in previous market waves, suppliers, contractors, manufacturers and end users are working side by side from the start. Everyone is at the table on day one. In this hyperscale era, standardization, scale and speed of deployment are the drivers. A higher number of subsystems is altering operations. We must all work together to navigate the changes and complexity, otherwise, we won’t be able to keep up with the speed of hyperscale.

In fact, the faster the market goes, the closer the stakeholders need to be. With scale we’ve never seen before comes a degree of cooperation that’s also unprecedented. Getting used to this new dynamic means getting past the tendency to guard the “secret sauce” everyone believes they have.

Considering how we all operated in the past, collaboration of this type is a big leap of faith.

Bye-bye bid and buy

From a provider perspective, the traditional bid-and-buy process has gone by the wayside in order to support the new speed to market. Old methods took somewhere between seven and 21 days for approval — that’s a lifetime in the context of hyperscale.

New tools have helped automate the process, and virtual technology allows for project modeling and improved coordination with end users, trade partners, constructors, etc. Prefabrication is another way to expedite.

Here again, collaboration is key to success. For example, a couple of months can be shaved off the schedule by integrating early on with design and buying equipment sooner. Forming partnerships early also leads to standardization — now a given for hyperscale.

New possibilities come with new pitfalls

Of course, the sheer size of hyperscale builds often involves thousands of pieces of equipment, so control is essential. One minor issue can rapidly become systemic. The magnitude also calls for utilities to now be at the table.

The labor shortage is becoming a bigger obstacle, which makes planning more important than ever — not just for scheduling preceding and succeeding tasks, but for what can be done outside of the field. Getting direct labor off the job site into the shop helps — even better if you can go from the shop to the factory.

With the condensed build time, hyperscale is turning out to be so lucrative for contractors that many are making enough money to work only 10 months of the year. Every construction market is busy right now, and projections show a deficit of six million trade laborers by 2024. Supervisors are even harder to find.

To meet the demands of a healthy 3-6 month pipeline, it’s now necessary to ramp up labor forces and engage trade unions well ahead of the project start.

Partnership is the only way

The hyperscale market has taken value chain relationships to a new level. Collaboration is imperative. Panelists declared a new paradigm, one without “vendors” but, instead, with “partners in the trenches.” The new business model calls for a great deal of communication and high levels of trust.

“The most successful meetings include the customer and representatives from design, engineering, supplier, contractor, etc. acting as one; the customer doesn’t even realize they are from different companies,” said Nathan Hazelwood of QTS.

To learn more about how QTS is navigating the hyperscale landscape by partnering with Schneider Electric and increasing their speed to market, watch the video, Standardization for Speed.
