
Autonomic platforms: Revolution in 2017

IT may soon become a self-managing service provider without technical limitations of capacity, performance, and scale. By adopting a “build once, deploy anywhere” approach, retooled IT workforces—working with new architectures built upon virtualized assets, containers, and advanced management and monitoring tools—could seamlessly move workloads among traditional on-premises stacks, private cloud platforms, and public cloud services.

Autonomic platforms build upon and bring together two important trends in IT: software-defined everything’s¹ climb up the tech stack, and the overhaul of IT operating and delivery models under the DevOps² movement. With more and more of IT becoming expressible as code—from underlying infrastructure to IT department tasks—organizations now have a chance to apply new architecture patterns and disciplines. In doing so, they can remove dependencies between business outcomes and underlying solutions, while also redeploying IT talent from rote, low-value work to the higher-order capabilities needed to deliver right-speed IT.³ To truly harness autonomic platforms’ potential, one must first explore their foundational elements.
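
The “expressible as code” idea can be made concrete with a small sketch. The Python below is a minimal, hypothetical illustration—the service name, deployment targets, and reconcile logic are all assumptions, not any particular product’s API. It shows the declarative pattern autonomic platforms rely on: desired state is written as data, and a reconciliation loop drives the running environment toward it, whether the target is an on-premises stack, a private cloud, or a public cloud.

# Illustrative sketch of the declarative, self-managing pattern behind
# autonomic platforms: desired state is expressed as data ("everything
# as code"), and a reconciler converges the running system toward it.
# All names and numbers here are hypothetical.

from dataclasses import dataclass

@dataclass
class DesiredState:
    service: str
    replicas: int        # how many instances should be running
    target: str          # "on_prem", "private_cloud", or "public_cloud"

# Current runtime state, keyed by service name: (replicas, target).
running = {"order-api": (2, "on_prem")}

def reconcile(desired: DesiredState) -> None:
    """Drive the running environment toward the declared state."""
    replicas, target = running.get(desired.service, (0, None))
    if target != desired.target:
        print(f"moving {desired.service}: {target} -> {desired.target}")
    if replicas != desired.replicas:
        print(f"scaling {desired.service}: {replicas} -> {desired.replicas}")
    running[desired.service] = (desired.replicas, desired.target)

# The same declaration deploys anywhere; only the target changes.
reconcile(DesiredState("order-api", replicas=4, target="public_cloud"))

This is the same control-loop pattern that container orchestrators and configuration-management tools apply at scale: the workload’s definition stays constant, while the deployment target becomes just another parameter.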

Virtualization up

The kernel of software-defined everything (SDE), virtualization, has been a rallying cry in IT for decades. The idea is straightforward: Hardware assets like servers, network switches, storage arrays, and desktop facilities are expensive, difficult to manage, and often woefully suboptimized. Often procured and configured for single-purpose usage, they are sized to withstand the most extreme estimates of potential load. Moreover, they are typically surrounded by duplicate environments (most of them idle) that were created to support new development and ongoing change.

In the mainframe days, such inefficiencies were mitigated somewhat by logical partitions: segments of a computer’s hardware resources that can be virtualized and managed independently as separate machines. Over the years, this led to virtualization, the creation of software-based, logical abstractions of underlying physical environments in which IT assets can be shared. Notably, virtualization shortened the time required to provision environments; rather than waiting months while new servers were ordered, racked, and configured, organizations could set up a virtual environment in a matter of hours.

Though virtualization started at the server level, it has since expanded up, down, and across the technology stack, and it is now widely adopted for network, storage, and even data center facilities. Virtualization alone now accounts for more than 50 percent of all server workloads and is projected to account for 86 percent of workloads by the end of 2016.
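
To see why single-purpose sizing is so wasteful, consider some back-of-the-envelope arithmetic. All figures below are hypothetical assumptions, not measurements from the text: if dedicated servers average low utilization because each is sized for its own peak, consolidating those workloads as virtual machines onto shared hosts collapses the physical footprint.

# Back-of-the-envelope consolidation math (all figures hypothetical):
# single-purpose servers sized for peak load sit mostly idle, while
# virtualization lets many such workloads share one physical host.

workloads = 40            # hypothetical single-purpose servers today
avg_utilization = 0.10    # each averages ~10% of its capacity
host_capacity = 1.0       # one physical host, normalized
headroom = 0.70           # keep shared hosts at most ~70% busy on average

demand = workloads * avg_utilization                     # total average load: 4.0
hosts_needed = -(-demand // (host_capacity * headroom))  # ceiling division

print(f"physical servers before virtualization: {workloads}")
print(f"shared hosts after virtualization:      {int(hosts_needed)}")
# -> 40 servers collapse to 6 shared hosts under these assumptions.

Under these assumed numbers, 40 dedicated servers consolidate onto 6 shared hosts; the real ratio depends entirely on actual utilization profiles.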
