February 20, 2018

The Competitive Cloud Data Center

By Enzo Greco

The Competitive Cloud

The corporate data center was long the de facto vehicle for all application deployment across an enterprise. Whether it reported to Real Estate, Finance, or IT, this arrangement served both data centers and their users well: it allowed the organization to invest in specialized skills and facilities and provided the enterprise with reliable, trusted services.

The first major shift in this structure came with the success of Salesforce.com in the early 2000s. Salesforce removed a significant hurdle for its key buyer, the VP of Sales, by providing the application “off premise”; because VPs of Sales typically have no significant relationship with their IT organization, the hosted offering let them authorize the purchase largely on their own. Fast forward to today, and nearly every function, from HR to Finance to ERP, is offered “as a Service”, further pressuring the corporate data center.


A recent report from Gartner estimated that 60% of workloads run on-premises today, a share expected to drop to 33% by 2021 and 20% by 2024. All major software companies, from Microsoft to IBM to Oracle, are focused on offering their software as a Service, and in fact report this growth as a key metric in their financials. Microsoft, in particular, noted “that by FY ’19 we’ll have two-thirds of commercial office in Office 365, and Exchange will be north of that, 70 percent.” These applications have traditionally represented some of the largest enterprise workloads, so the trend is unmistakable.

There are many reasons enterprise data centers will still be required: location, security, financial considerations, and regulatory requirements, to name a few, and certain organizations will continue to use them as strategic, differentiating assets. Increasingly, however, enterprise data centers will need to justify themselves against alternatives ranging from Applications-as-a-Service to Amazon Web Services to colocation providers such as Digital Realty and Equinix.

The market has accepted that most organizations will settle on a hybrid middle ground, combining on-prem facilities with cloud or colocation options. An important inflection point occurs in this journey: the moment an organization extends its footprint beyond its in-house data centers, it must immediately address a host of new issues, including:

  • Management by SLA: Availability will always be key, but when the facilities are not its own, an organization must rely on the provider’s SLAs and escalation procedures. This is usually one of the biggest leaps in the journey to colocation, as many procedures and scenarios must be rethought and redefined, now with an outside party.
  • Application placement: Now that there are options, where does an application run best? On-prem? In a colo? In AWS? What are the right metrics to manage this placement: cost, security, availability? How dynamic should workload placement be? This is a nascent area that many organizations overlook; vendors from startups to established players are investing heavily in tools and intelligence to assist.
  • DevOps: Organizations have developed detailed DevOps procedures; when a platform such as AWS is brought into the mix, those procedures must be reworked and tailored specifically for that platform.
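The application-placement question above is, at bottom, a scoring problem: rate each candidate venue against the metrics that matter and weight them by the workload’s priorities. The sketch below illustrates the idea in Python; the venue names, metric ratings, and weights are hypothetical examples for illustration, not a standard model, and real placement tools weigh far more factors.

```python
# Hypothetical sketch of weighted placement scoring.
# All ratings and weights below are illustrative assumptions.

def score_placement(option, weights):
    """Weighted score for one placement option (higher is better)."""
    return sum(weights[metric] * option[metric] for metric in weights)

def best_placement(options, weights):
    """Return the name of the option with the highest weighted score."""
    return max(options, key=lambda name: score_placement(options[name], weights))

# Normalized 0-1 ratings per metric for each candidate venue (assumed values).
options = {
    "on-prem": {"cost": 0.4, "security": 0.9, "availability": 0.7},
    "colo":    {"cost": 0.6, "security": 0.7, "availability": 0.8},
    "aws":     {"cost": 0.8, "security": 0.6, "availability": 0.9},
}

# Relative importance of each metric for this particular workload.
weights = {"cost": 0.3, "security": 0.5, "availability": 0.2}

print(best_placement(options, weights))
```

Changing the weights changes the answer: a security-dominated workload tilts toward on-prem, a cost-dominated one toward public cloud, which is exactly why the metrics must be defined explicitly before placement decisions are made.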

The trend toward hybrid clouds is unmistakable, with commensurate benefits, but an organization must balance and justify this hybridization. One of the best examples of this consideration process comes from the U.S. Federal Government’s Data Center Optimization Initiative (https://datacenters.cio.gov/). DCOI is a federal mandate that “requires agencies to develop and report on data center strategies to

  • Consolidate inefficient infrastructure,
  • Optimize existing facilities,
  • Improve security posture,
  • Achieve cost savings,
  • and transition to more efficient infrastructure, such as cloud services and inter-agency shared services.”

Like numerous organizations, the U.S. Federal Government has a cloud-first policy, but it has instituted a rigorous reporting and oversight process to prudently manage its data center footprint and computing strategy. Other organizations, in both the public and private sectors, should consider similar processes and transparency as they weigh their many hybrid cloud-computing options.

Existing enterprise data centers represent significant assets; how should their role in the overall cloud-computing fabric be determined? Availability, cost, and security will always be the dominant factors for any physical computing topology, with agility a recent addition. Define the key metrics and drivers, transparently and consistently, for the overall environment, and the best set of options will present itself.

Enzo Greco

Enzo Greco is Chief Strategy Officer for Nlyte Software, where he is responsible for setting Nlyte’s strategy and direction based on market trends and dynamics, partnerships, and adjacent markets. He has deep knowledge of software and the data center market; his current focus is on colocation providers, hybrid cloud implementations, and the application of analytics across the Critical Infrastructure, Data Center, and Application Performance Management markets.

Most recently, Enzo was the VP and GM for Software within Emerson Network Power, where he was responsible for the entire Data Center software portfolio, from strategy to development to deployment. While at Emerson, he aggressively repositioned the portfolio and strategy, and led the transition efforts for Software as Emerson Network Power was sold to Platinum Equity.

Enzo started his career at AT&T Bell Laboratories, where he was part of the software research group and one of the original founders of the TUXEDO System, an enterprise grade transaction processing system; he received a fellowship for this work.

After AT&T Bell Laboratories, Enzo transitioned to Wall Street, where he ran a software market and strategy consultancy with blue-chip clients ranging from Goldman Sachs to AIG. During this period, he founded several companies, the largest of which, Planetworks, was acquired by IBM in April 1999.

Enzo then worked in IBM’s headquarters for 14 years, where he was heavily involved in IBM’s entering and growing several key markets, including Portal, Business Process Management and Smarter Commerce.

Enzo has a BS from Manhattan College in Riverdale, NY and an MS from Stanford University in Stanford, CA.
