Ever wonder how much IT infrastructure capacity you have and what it’s costing you?

It’s not something many people outside of IT think about, but it is something nearly everyone in an organization relies on in the form of IT services, both internal and external, whether they know it or not. All organizations and, more specifically, all organizational initiatives and projects rely on IT resources at some level today. Which brings up some basic questions:

  • How much IT infrastructure capacity do you have?
  • How much IT infrastructure capacity do you need?
  • Does your IT infrastructure have the capacity to support the next IT resource-intensive project?
It also brings up some more complex questions:
  • How do you get clarity on historical demand, predicted demand, performance, utilization, etc., for your IT service delivery?
  • How do you really know if your IT infrastructure utilization rates are improving?
  • Can you measure and improve your ROI on IT investments?
  • Can you properly measure the cost impact of an insource versus outsource decision?

There’s a good discussion on this topic, specifically as it relates to IT capacity planning, in this LinkedIn group thread. The first and most important step is understanding where you are today before you can make rational decisions about where you need to go and how to get there – as the old Yogi Berra quote goes, “if you don’t know where you’re going, you might not get there.” But how do you determine where you are? What metrics do you use?

At 6fusion, we approach these questions from the perspective of IT as a utility. (Side note: John Cowan, 6fusion co-founder and CEO, has a great four-part series of posts on this topic, and GigaOm analyst Paul Miller has also recently published a white paper called “Metered IT: the path to the IT utility” if you are interested in more detail.) The takeaways from this approach, as they relate to IT infrastructure and IT management in general, are as follows:

  1. Pick a single unit of measure for IT resource consumption and stick with it. We like to use the WAC, but whatever you choose, stay with it: historical comparisons are key (see number 4 below), and if you change metrics, your historical data loses relevance (a rough sketch of this idea follows the list).
  2. Measure across all environments in use, whether those are virtual, physical, public cloud, private cloud, or hybrid cloud environments.
  3. Measure across all users, groups, business units, divisions – however your organization is segmented.
  4. Continue to measure at intervals appropriate for your business and technical requirements – could be every minute, could be every week, could be every month – the specific interval isn’t necessarily critical. What matters is that you continue to measure over time so you can compare, contrast and improve.
  5. Benchmark outside your organization – internal comparisons are good, but external comparisons and benchmarking are even better, particularly when you can benchmark against leaders in the field.
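
To make point 1 more concrete, here is a minimal sketch of what a single, consistently applied unit of measure can look like when it is rolled up across environments and business units. It is purely illustrative: the resource blend, baseline values and equal weights are assumptions chosen for the example, not the actual WAC formula, and the names (Sample, consumption_units, rollup) are hypothetical.

```python
# Illustrative sketch only -- NOT the WAC formula. The blend, baselines and
# weights below are assumptions chosen for the example.
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class Sample:
    environment: str   # e.g. "physical", "private-cloud", "public-cloud"
    group: str         # business unit, division, team, etc.
    cpu_ghz: float     # CPU consumed during the measurement interval
    ram_gb: float      # memory consumed during the interval
    storage_gb: float  # storage consumed during the interval
    net_gb: float      # network traffic transferred during the interval

# Assumed normalization baselines; whatever you pick, keep it constant over time.
BASELINE = {"cpu_ghz": 2.0, "ram_gb": 4.0, "storage_gb": 100.0, "net_gb": 10.0}

def consumption_units(s: Sample) -> float:
    """Blend normalized resource consumption into one comparable number."""
    return sum(getattr(s, key) / BASELINE[key] for key in BASELINE) / len(BASELINE)

def rollup(samples):
    """Total units per (environment, group), so every segment is measured the same way."""
    totals = defaultdict(float)
    for s in samples:
        totals[(s.environment, s.group)] += consumption_units(s)
    return dict(totals)

if __name__ == "__main__":
    interval = [
        Sample("private-cloud", "finance", cpu_ghz=1.5, ram_gb=8, storage_gb=250, net_gb=4),
        Sample("public-cloud", "marketing", cpu_ghz=3.0, ram_gb=16, storage_gb=500, net_gb=20),
    ]
    print(rollup(interval))  # one unit everywhere -> comparable over time and across segments
```

Because the unit and baselines stay fixed, the totals from one interval can be compared directly with the totals from the next, which is exactly what points 4 and 5 depend on.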

The IT industry is changing rapidly with the move to virtualization and the proliferation of public and private IT service providers. As organizations consider moving their corporate infrastructure to a public or private cloud solution, they first need a fundamental understanding of how they use IT infrastructure today. By adopting a utility-metered approach to IT services, those organizations can gain tremendous insight into how their resources are being used, who is using them, and what they actually cost to run. With this knowledge in hand, the enterprise can make educated decisions about which services are needed, what they cost to produce internally, and whether they can be delivered externally at more competitive rates or service levels. This allows IT to take a truly business-focused approach and drive value to the organization.
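
Once consumption is metered in a single unit, the insource-versus-outsource question above reduces to a per-unit cost comparison. The sketch below is again illustrative; the figures are hypothetical, not real rates.

```python
# Illustrative only: compare internal cost per unit of consumption with an
# external provider's quoted rate. All figures are hypothetical.
def cost_per_unit(total_cost: float, total_units: float) -> float:
    return total_cost / total_units

internal = cost_per_unit(total_cost=120_000.0, total_units=400_000.0)  # e.g. monthly run rate
external_rate = 0.35                                                   # provider's price per unit

print(f"internal: ${internal:.2f}/unit, external: ${external_rate:.2f}/unit")
if internal > external_rate:
    print("External delivery looks cheaper per unit -- weigh that against service levels.")
else:
    print("Internal delivery is competitive on cost per unit.")
```

The same comparison works in the other direction when evaluating whether a service delivered externally today could be brought back in house at a lower cost per unit.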

Shameless plug: 6fusion has been working with organizations to answer these questions for years with a combination of experience, expertise and innovative tools. We recently formalized a service that brings this approach to IT infrastructure capacity and cost analysis to market. You can download a sample output by clicking the image below. Reach out to us if you want to explore further.

IT Infrastructure Analysis
How are you approaching IT infrastructure capacity and cost analysis? We welcome your comments and thoughts below.


More Stories By John Cowan

John Cowan is co-founder and CEO of 6fusion. John is credited as 6fusion's business model visionary, bridging concepts and services behind cloud computing to the IT service channel. In 2008, he and his 6fusion collaborators successfully launched the industry's first single unit of measurement for x86 computing, known as the Workload Allocation Cube (WAC). John is a 12-year veteran of business and product development within the IT and telecommunications sectors and a graduate of Queen's University at Kingston.
