The IT industry has been the reigning superstar of the global economy for the last 40 years.
The industry gave us 25-year-old billionaires and “long-term” success stories like Microsoft, Intel, SAP and Oracle. It transformed computers from intimidating machines in the basement of university research centers to something you can pick up on sale at Wal-Mart. IT created new professions (web designer, network administrator, CIO) and decimated the ranks of others (file clerk, travel agent, record store slacker). You could even argue that jobs like CMO wouldn’t exist if the Internet hadn’t created a way for consumers and companies to interact more directly. This is the world I’ve grown up in.
It’s been a great run.
And in five to ten years, we’re going to be wondering what the big deal was.
Are we at the beginning of the end for IT? Yes. It won’t disappear. The companies that specialize in enterprise software, storage systems and all of the other necessary ingredients for a digital economy will still be around—just like mainframes still exist. But, they won’t be as compelling. Call it the paradox of success: an idea or technology becomes so pervasive that it starts to vanish in plain sight. It evolves into a component of everything else. Margins decline. Brand becomes less important. Wall Street gets bored.
Enter the cloud. As consumers and businesses move toward the cloud, the picture gets cloudy for enterprise software vendors and others. The overall need for IT technology might be rising, but a growing portion of the demand is being driven by cloud providers that don’t need third-party technology or the security of a brand. In other words, now you can get fired for buying IBM. Combine this with a shift in behavior among the large enterprise customer base. One of the more interesting trends in IT over the past few years has been the willingness of large buyers to develop their own technology. Google started it—it wanted robust, inexpensive, easy-to-swap servers for its data centers, designed with essentially a single purpose in mind. The company is now the fifth-largest manufacturer of servers in the world.
Likewise, Amazon, Facebook and others have opted for in-house solutions: take a look at some of the job listings at Facebook. I’ve heard that these companies are quickly becoming significant customers of EDA tools for chip design. Designs for some of these products have been open-sourced.
Meanwhile, the opportunities to interact with the ultimate end-user are declining. Most end-users, and there are millions more of them than cloud providers, won’t know what technology or brand of technology their business is leveraging. They may not miss the IT industry much either: the antics and licensing practices of many IT companies are often the fodder for complaints among buyers. It’s no coincidence that Salesforce.com succeeded by positioning itself as the company that wanted to eliminate software.
You can also detect a danger sign for IT in the sudden rise of Uber and other companies in the sharing economy. These companies attract some of the top young talent on the market, the same way Microsoft did ten to fifteen years ago. But these companies aren’t technology companies, per se. Their business model strongly leverages technology, but their goal is to deliver a unique customer experience. When people talk about Uber, they talk about the service, not the technology—think of how far Amazon has traveled down the same path. It’s the same way you talk about a new home or an office building: you’ll admire the design and the layout, not the plumbing.
To its credit, the IT industry has confounded expectations in the past and shown a remarkable ability to adapt itself to new challenges and opportunities. But we’re heading into an era where the solutions, skills and go-to-market strategies of most IT companies don’t seem well adapted to the future.
So, maybe it’s not the end of IT. But somehow “storing, retrieving, transmitting, and manipulating data” sounds more like a capability than an industry. What do you think?