Your IT infrastructure is the backbone of your data centre operations. But it can also be fraught with hurdles that stand in the way of moving your data centre into the future. This article examines the stages of transformation, outlines some of the roadblocks that might thwart your progress, and spells out the fundamentals of IT transformation.
By Patrick Hermes, Services Director – Dimension Data
Where Are You In Your Data Centre Journey?
The evolution of technology is nothing short of dizzying. Everything from automobiles to vacuums to televisions to toothbrushes employs cutting-edge technologies, sometimes barely resembling the devices familiar from childhood. Telephones and computers, of course, dash from one generation to the next so quickly that it’s difficult to keep up. Successful organisations must be able to keep up with this ever-changing environment, and that includes not just products but IT infrastructure.
What kind of journey is your data centre on? For a lot of organisations, the journey is anything but fast-paced. There’s certainly no shame in that. After all, a data centre has traditionally been a huge capital expenditure, and an incredibly complicated endeavour, with countless interconnecting parts and overlapping processes.
Evolving the data centre to keep up with the growing volumes of data and the changing needs of the business has been a costly proposition, which means change has had to be carefully planned and painstakingly implemented. That’s not always compatible with lightning speed — at least it hasn’t been in the past. Here’s a glimpse of where a lot of IT people in the data centre business find themselves and their organisations today:
Consolidation
Consolidating and connecting storage and servers, achieving economies of scale, and establishing facilities with adequate power and cooling capabilities is a major task. Most businesses are still running on their tier 2 data centre infrastructure.
Virtualisation
Many companies are in the midst of virtualising facilities, local area networks, storage area networks, and the like. Research indicates that roughly half of all servers were virtualised as 2015 began, with the expectation that the number would grow to nearly 60 percent by the end of the year.
Automation
This is an area of real promise in the data centre journey, but for many organisations, it will be a while before data centre policies and operations are significantly automated. It will likely be five to ten years before automation and IT management process maturity are truly mainstream. A key point to remember about automation is that it must work hand in hand with process change. Those who don’t invest in process aren’t likely to realise many of the gains that automation and managed options can bring.
So What’s the Problem?
There are a number of potential roadblocks in a data centre’s journey into the future, hurdles that can cause frustration and second-guessing of direction.
Aging infrastructure is slowing you down
For many organisations, the network and data centre infrastructure has not been keeping pace with the aforementioned dizzying change. That’s hardly surprising in an environment where the data centre has been largely a capital expenditure — and a large one at that.
The decision-making process for large capital expenditures is necessarily slow, but with technology change accelerating, a capex IT investment can be well on its way to obsolescence even as it’s being implemented.
The consequences of an aging infrastructure are significant. To begin with, it costs more to manage. Even worse, though, aging data centre infrastructure makes it much more difficult to capitalise on the industry’s most transformational trends, including cloud, mobility, and the software-defined data centre.
Applications don’t perform as well, and that adds up to a less satisfactory user experience.
Orphan applications are stirring up trouble
Not unlike the problems with aging infrastructure, yesterday’s ways of developing and acquiring software applications are increasingly putting up roadblocks along the data centre journey. Buying an app has meant investing a large sum of money upfront, and even if the developer regularly updates the application, that creates upgrade expenses for you.
What if the developer abandons the application and gives up on updating? You’re stuck with an aging application running on your aging infrastructure, and that’s certainly not a recipe for a youthful sprint into the future.
Compatibility issues weigh you down
Compatibility is vital in marital relationships, in building effective work teams, in pairing wine with food. And it’s a major key to a well-oiled data centre operation. Too bad there are so many ways that incompatibility can rear its ugly head in the data centre. You need interoperability between technologies and data, between the physical and virtual worlds, between your own infrastructure and the cloud. The more entrenched older technologies are, the harder that interoperability is to achieve.
Your ability to examine new technologies is hampered
The move toward the next-generation data centre requires an open mind and a new way of thinking. At its very core, it requires a focus on and alignment with the business: what’s needed, what’s desired, and what amazing new opportunities are possible, with less of a focus on the infrastructure itself.
A narrow focus on the actual infrastructure can get bogged down with an emphasis on what can’t be done, or it might stray into a dream of shiny new products without adequate regard for whether those new products really deliver what the business needs. In other words, there’s a danger that technology may drive strategy, when the real driver should be a holistic view of the organisation, its goals, and its path forward.
A flood of data is fueling major storage growth
Storage requirements are exploding, no question about that, thanks to new applications and workloads tied to video, voice, and other rich content. Big Data is getting bigger all the time, with structured and unstructured data pouring into data centres. With the growth of Internet-enabled objects from toasters to doorknobs to thermostats, that flood is only going to accelerate.
Without a major change in approach, it’s hard to see how this reality is sustainable. Fortunately, the next-generation data centre offers solutions, not just for effectively managing all that growing data, but for truly extracting all of the value that lies therein.
Power and cooling are perpetual hassles
As data gets bigger, technology gets ever-hotter and more power-hungry. If all of that energy-devouring equipment is sitting there in the building with you, power and cooling are your problem. It would sure be nice to unload some of those issues on someone else, so you can focus more of your attention on using technology, not just feeding it.
Virtualisation isn’t living up to its promises
The concept of virtualisation seems so appealing, and the returns so promising. How could your organisation not come out ahead by replacing a collection of individual physical servers with virtual servers? Truth is, the return on investment varies greatly with the size of the data centre, and smaller installations may have a harder time passing the break-even point.
But it’s also worth pointing out that virtualisation isn’t in and of itself a destination on the data centre journey — it’s only part of a bigger picture, so deriving value from virtualisation is not necessarily the primary driver.
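The size-dependent break-even point described above can be sketched as a toy calculation. Every figure below is an illustrative assumption, not a vendor number, chosen only to show why a smaller estate takes longer to pay back a fixed virtualisation investment:

```python
# Hypothetical break-even estimate for server virtualisation.
# All figures are illustrative assumptions, not vendor pricing.

PLATFORM_COST = 50_000.0      # fixed cost: hypervisor licences, migration, training
SAVING_PER_SERVER = 2_000.0   # assumed annual saving per consolidated physical server
CONSOLIDATION_RATIO = 0.7     # assumed fraction of servers that can be consolidated

def years_to_break_even(physical_servers: int) -> float:
    """Years until cumulative savings cover the fixed platform cost."""
    annual_saving = physical_servers * CONSOLIDATION_RATIO * SAVING_PER_SERVER
    return PLATFORM_COST / annual_saving

# The same fixed cost is amortised far faster across a larger estate.
for servers in (10, 50, 200):
    print(f"{servers} servers: {years_to_break_even(servers):.1f} years to break even")
```

Under these assumptions a ten-server shop waits several years to break even, while a two-hundred-server estate recovers the cost within months, which is the shape of the ROI gap the text describes.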
Skilled team members are hard to find and keep
If you’re reading this article, you’re clearly interested in bringing your data centre operation into the future, and most everyone knows that future includes more and more automation, and potentially less physical infrastructure onsite. But those concepts can make your IT staff a bit nervous about their own job security. For many organisations, the move toward the next-generation data centre includes the challenge of hanging onto the team members you need for a successful future.
Driving IT Transformation
The roadblocks outlined above are among the many reasons most organisations are making the move toward consuming IT as a service. Doing so unlocks a wide range of problem-solving ideas, and paves the way to harnessing game-changing capabilities that have never before been available.
The consumption model
The name of this game is “pay as you consume.” No longer must you build out your own IT infrastructure with the worst-case scenario in mind. Now you can create a system that scales up and down as needed, which means that if your needs really only hit their peak about 10 percent of the year, you don’t have to pay for unused capacity the other 90 percent of the year. What you need is there, on-demand, whenever you need it.
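The capacity arithmetic above can be sketched in a few lines. All of the figures below are hypothetical, chosen only to illustrate why paying for actual consumption beats building for the worst case:

```python
# Illustrative comparison of fixed-capacity vs. pay-as-you-consume costs.
# Every figure here is a hypothetical assumption, used only to show the arithmetic.

PEAK_UNITS = 100        # capacity needed during peak periods
BASELINE_UNITS = 30     # capacity needed the rest of the year
PEAK_FRACTION = 0.10    # peaks occur roughly 10% of the year
UNIT_COST = 1.0         # normalised cost per unit of capacity per year

# Owned infrastructure must be sized for the worst case, all year round.
owned_cost = PEAK_UNITS * UNIT_COST

# A consumption model pays for peak capacity only while the peak lasts.
consumed_cost = UNIT_COST * (
    PEAK_UNITS * PEAK_FRACTION + BASELINE_UNITS * (1 - PEAK_FRACTION)
)

print(f"Fixed build-out cost:   {owned_cost:.0f}")
print(f"Pay-as-you-consume cost: {consumed_cost:.0f}")
print(f"Saving: {100 * (1 - consumed_cost / owned_cost):.0f}%")
```

With these assumed numbers, the consumption model costs roughly a third of the worst-case build-out; the point is the shape of the calculation, not the specific percentages.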
What’s good for hardware is excellent for software, too. As you reap the benefits of Infrastructure as a Service, Platform as a Service, and Storage as a Service, your organisation will gain significant flexibility, simplicity, and savings through the Software as a Service model.
The applications you need are out there, in a catalogue form, and in many cases leaders on the business side can do their own “app store” shopping without having to trouble the IT department.
The consumption model, needless to say, brings with it an entirely different budgeting situation. Things that were once capital expenditures increasingly become operational expenditures. That frequently means less spending overall, usually means lower upfront expense, and generally brings spending accountability closer to the end users and aligns it more tightly with specific business ventures.
The technology model
Tomorrow’s technology model promises to ease a lot of IT headaches and deliver smoother operations. Technology that’s arranged on a “pay as you consume” basis tends to be more standardised, and standardisation typically adds efficiency and reduces complications. Abstraction further reduces complexities and simplifies the process for users. It’s important to note that within the technology standardisation and abstraction, there’s still plenty of flexibility to adapt to the unique needs of the organisation.
Tomorrow’s technology model won’t work if the required technology isn’t highly available when and where it’s needed. Under this model, additional or upgraded technology is more readily accessible than it would be if you had to go out and buy it, navigating the capex process along the way. This newer approach would also fail if security were lacking. The good news is that your data can be just as secure under a pay-as-you-consume approach as it would be sitting in a physical server right next to your desk. In fact, just because that server is right there before your eyes doesn’t mean it’s secure. Pick a reputable, experienced pay-as-you-consume partner and you can be assured that you’re tapping into high-end security expertise.
The IT operations model
In the transformed picture, the IT operations model is quite a bit different from what IT staffers and customers have known for so long. There’s a whole lot more automation and self-service. For example, provisioning has been a labour-intensive pursuit, but it can largely be automated in the latest operational model, as can many other operations. Noncore functions can be outsourced, for lower costs and often greater efficiency.
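As a toy illustration of the self-service provisioning described above, the sketch below automates the kind of catalogue and quota checks an operator once performed by hand. The catalogue items, core counts, and quota figure are all hypothetical:

```python
# Minimal sketch of automated self-service provisioning.
# Catalogue entries, sizes, and the quota are hypothetical illustrations.

from dataclasses import dataclass

CATALOGUE = {"small-vm": 2, "medium-vm": 4, "large-vm": 8}  # item -> CPU cores

@dataclass
class Request:
    user: str
    item: str

def provision(request: Request, used_cores: int, quota: int = 32) -> str:
    """Automate the checks a human operator once performed by hand."""
    if request.item not in CATALOGUE:
        return f"rejected: {request.item} is not in the catalogue"
    cores = CATALOGUE[request.item]
    if used_cores + cores > quota:
        return "rejected: over quota, escalate to IT"
    # In a real system this is where an orchestration API would be called.
    return f"provisioned {request.item} ({cores} cores) for {request.user}"

print(provision(Request("alice", "small-vm"), used_cores=10))
print(provision(Request("bob", "large-vm"), used_cores=30))
```

The routine checks run instantly and consistently every time, while anything outside policy is escalated to a human, which is the division of labour self-service automation aims for.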
Best practices ensure that the work is done right, whether it’s done in-house, by an outside resource, or through an automated process. So what’s left for the IT department to do? The fun stuff! By unloading a lot of the mundane infrastructure responsibility, the IT department has more freedom to work closely with those on the business side, innovating new ways to truly meet the needs of the organisation.
The other critical work left after automation involves process. Automation can open the door to process change, and in fact, it’s the process that in many cases makes all the difference. Automate without addressing process, and you’ll end up wondering what the point was.