You are not alone if you are still trying to get your team to participate in cloud cost management activities. The FinOps Foundation cites Empowering Engineers to Take Action and Organizational Adoption as the top two challenges cloud FinOps practices faced in 2021 and 2022. It’s no surprise that most cost-reduction programs stall after quick wins (such as reservation purchases) run dry. Businesses everywhere struggle with the organizational and cultural behavior changes needed to increase ongoing participation in cost control. While there is no silver bullet for these changes, we want to share a common theme among organizations that are finding success: their teams have access to the right tools!
Your Team May Not Have What They Need to Manage Their Cloud Costs
Most tenured public cloud leaders have seen first-hand the pitfalls and risks around using native cloud financial management tooling at scale. If you haven’t, consider the following comparison: Imagine you did not have the proper software to manage your household finances and had to manually review your checking, savings, & credit card accounts, home & auto loans, and investment portfolios…
- How much effort and time would it take to get a complete picture of your finances?
- Which datasets would you need to consolidate and normalize to view your spending trends and analyze anomalies?
- What negative consequences might occur due to a lack of budgets or timely notifications?
Now, multiply your single household by the number of cost centers in your IT organization. Then, factor in that cloud costs, unlike household charges, meter in real time and fluctuate every second. See how this problem snowballs at scale? When you have dozens, hundreds, or thousands of leaders not actively focused on cutting costs, it quickly becomes a major issue.
Exposing the Limitations of Native Cloud Tools
Just like your household finances, your IT leaders require an approachable, centralized mechanism to manage their cloud usage. Even so, many IT leaders are quick to argue that native tools suffice for their cost management process. This is one decision where the adage “you get what you pay for” is fitting. Native cost management tools require an abundance of manual effort and overhead to use at scale. Every one of your users must maintain knowledge of each cloud platform and hold sufficient access to perform even the simplest activity. That is a lot of overhead just to generate a one-time report, especially when most cost owners don’t use the cloud often.
If your company has a complex multi-cloud environment, producing that single report will likely require pulling data from multiple platforms and reports, each of which comes with its own billing models, metadata, structure, and processes. Even if your users manage to operationalize this process, it doesn’t solve for multi-cloud budgets, forecasting, and anomaly detection; each cloud has its own version of those. Managing these components at scale will inevitably lead to frustration for your team. What is the ultimate outcome of this frustration? You guessed it! Low participation in key cost management activities.
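To make the multi-cloud reporting overhead concrete, here is a minimal, hypothetical sketch of what normalizing two providers’ billing exports into a single schema involves. The column names are simplified stand-ins based on AWS Cost and Usage Report and Azure cost export conventions, not complete or authoritative schemas; a real pipeline would also handle currencies, credits, amortization, and tag metadata.

```python
# Hypothetical sketch: flattening billing rows from two clouds into one schema.
# Field names are simplified stand-ins, not full export formats.

def normalize_aws(row):
    """Map an AWS CUR-style row to a common schema."""
    return {
        "cloud": "aws",
        "service": row["product/ProductName"],
        "cost_usd": float(row["lineItem/UnblendedCost"]),
        "date": row["lineItem/UsageStartDate"][:10],  # keep YYYY-MM-DD only
    }

def normalize_azure(row):
    """Map an Azure cost-export-style row to the same common schema."""
    return {
        "cloud": "azure",
        "service": row["MeterCategory"],
        "cost_usd": float(row["CostInBillingCurrency"]),
        "date": row["Date"],
    }

def combine(aws_rows, azure_rows):
    """Produce one normalized list suitable for a single cross-cloud report."""
    return [normalize_aws(r) for r in aws_rows] + [normalize_azure(r) for r in azure_rows]
```

Even this toy version shows the problem: every provider needs its own mapping, and each mapping breaks whenever the upstream export format changes.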
BI Tools are Not Your Savior
You may be asking yourself, what about options like Power BI or Tableau? Business Intelligence (BI) tools such as these have earned their stripes in the past few years. Their versatility and integrations bring data analytics to the masses, and organizations consider using them to offset the gaps in native cost management functionality. Unlike cloud consoles, BI tools can ingest and normalize multiple datasets in a single, centralized location. Then, with some quick finessing, you can build spiffy visualizations that support analyzing costs across clouds. For point-in-time analysis and one-time reports, these types of tools have unmatched flexibility. (I use Power BI daily.) It is only when organizations try to leverage them at scale that things start to unravel quickly.
Deciding to leverage a BI tool over an enterprise-ready solution is the same as making a build versus buy decision for any other COTS application. You may not realize it, but achieving success with BI tools at scale will require a dedicated team to oversee operations. This team will need to manage the design, publishing, and maintenance of the BI instances and reports and keep tabs on the constant changes to cloud data models, APIs, and integrations. Like any software team, they will also be busy reacting to evolving user requirements and fixing things that break. We have seen first-hand the investment this requires and have the ROI models to prove it.
What is most surprising about BI is not the resource cost. Instead, it’s the lost functionality that comes with the choice. Cost visibility is a key component of any FinOps program, and it is one that BI tools can solve. But your teams likely need much more to be successful, such as the ability to create cost budgets, forecast spend, detect anomalies, view resource optimization recommendations, manage reservations and savings plans, and automate savings activities. Unlike enterprise solutions, BI tools simply cannot provide these capabilities. If you are serious about reducing your cloud costs, these are functions your team will need for success.
The Not-So-Secret Secret to Getting People Onboard
If you have read this far, I implore you to take a moment to ask yourself, “Do my leaders have an accessible and easy way to visualize, analyze, track, manage, and protect their cloud spending?” If the answer is no, it may be time to consider investing in purpose-built cost management tooling as part of your cost management plan. This decision alone can lead to higher involvement from your organization. Armed with reporting, most leaders will take steps to implement the cloud optimization opportunities they uncover. Users will typically make better deployment decisions once they are well informed, with cloud waste reduction and prevention happening organically. Pair all this with a healthy dose of process, a dash of enablement, and a handful of guardrails, and BANG! You have yourself a recipe for FinOps success.
Want to learn how CloudBolt can help ensure everyone has the tools to manage cloud spending? Learn more and request your demo today.
Despite the recent hype around cloud repatriation and companies moving back to private data centers, the reality is that cloud is here to stay and still growing fast. What the recent interest in cloud repatriation does show is that companies are starting to figure out that not all data should be in the cloud, and optimizing cloud costs requires a carefully planned strategy that involves both public and private clouds.
Let’s explore why companies are moving some workloads back to their private data centers, the pros and cons of repatriation, and how to best approach cloud repatriation.
Increasing cloud costs is the main reason for cloud repatriation
While there are many reasons for cloud repatriation, in a recent Dell survey, 96% of respondents (out of 136 decision makers) cited cost efficiency as the main driver in choosing to repatriate at least some of their data from the cloud. With public cloud costs in 2023 poised to increase by almost 33% in Europe and 20% in the US, there doesn’t seem to be any relief in sight.
So, why is public cloud so expensive?
Public cloud isn’t necessarily more expensive than on-prem, especially early in a company’s cloud journey. It can provide immediate infrastructure and a lower barrier to entry. However, as a company grows and as the infrastructure becomes more complex, certain inefficiencies can get out of control very quickly.
- Unaccounted-for cloud spend – As a company’s cloud infrastructure grows, it often ends up with numerous clouds, run by numerous dev teams, each with their own preferred tools and applications to help them automate and manage their data, processes, and applications. With so many clouds, teams, and tools, it becomes extremely complex and time-consuming to track all cloud spend, and proper cost allocation and optimization can become almost impossible when done manually. Furthermore, the cost of having teams of experts managing and rationalizing all the cloud costs and activities becomes untenable.
- Opaque, complex, and unpredictable billing – Public cloud billing is famously complex. To optimize costs, companies have to consider how much data is stored, the number of resources used, the number of operations taken, and how often it is accessed. There’s also no standardization across different providers, making it difficult to shop around for optimal pricing. For a medium-sized enterprise, a single public cloud bill can sometimes be 50 pages long. Multiply that by the number of different public clouds that are utilized and it’s no wonder why companies are overspending on cloud.
- Lack of planning – Migrating to the cloud as fast as possible (aka lift and shift) without proper planning or modifications to make the best use of cloud optimization tools leads to cloud costs that are needlessly inflated. Instead of a simple migration, it’s often more cost-efficient to keep the resource in the private cloud.
- Native Tools – Native tools are especially useful at the early stages of the cloud journey. But when cloud initiatives begin to scale, and you move to more than one cloud provider, the management tools offered are not enough. In this case, business owners may start considering repatriating some of their applications back to on-prem to reduce the costs of resources needed to manage multi-cloud deployments.
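The billing-complexity point above is easiest to see with numbers. Here is a toy storage cost model illustrating how a single service’s bill depends on several interacting dimensions (data stored, operations performed, data retrieved). The rates are invented for illustration and do not reflect any provider’s actual pricing.

```python
# Toy illustration of multi-dimensional cloud billing.
# All rates are made up; real providers also tier rates by volume and region.

def monthly_storage_cost(gb_stored, write_ops, read_ops, gb_retrieved,
                         storage_rate=0.023,        # $ per GB-month (illustrative)
                         write_rate=0.005 / 1000,   # $ per write operation
                         read_rate=0.0004 / 1000,   # $ per read operation
                         retrieval_rate=0.01):      # $ per GB retrieved
    """Estimate one service's monthly storage bill from four usage dimensions."""
    return (gb_stored * storage_rate
            + write_ops * write_rate
            + read_ops * read_rate
            + gb_retrieved * retrieval_rate)
```

Two workloads storing the same amount of data can produce very different bills depending on access patterns, which is one reason a 50-page invoice resists manual analysis.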
Challenges of repatriation
Repatriation does come with its own share of challenges that companies will have to address:
- Cloud lock-in – Once you’ve properly configured your processes and applications to work within a specific cloud and its native tools, it can be a daunting process to reconfigure that in a different cloud, private or public. Not to mention the expertise that your dev team has built up utilizing said cloud that will have to be relearned on another platform.
- Lack of a user interface – Public clouds have user interfaces that consumers of IT can use to order and manage their resources. Data centers, on the other hand, often lack a simple interface or self-service capabilities, and if you need a VM, network, or storage from your company’s data center you’ll have to enter a ticket with your IT team and wait for them to manually build it out.
The right way to approach repatriation
Companies obviously aren’t moving everything from their public cloud providers to a rack-mounted server in their basement. Public cloud offers amazing benefits and for most companies, public cloud will remain a powerful tool that should be balanced with a private cloud infrastructure depending on the company’s needs.
Managing numerous public clouds on top of a private data center, however, can be a daunting process, especially as a company scales. Everything from keeping track of resource usage to maintaining the tools and expertise necessary to keep all the instances and VMs running can be a challenge for IT teams of any size.
To approach a complex multi-cloud and hybrid-cloud environment, consider a Cloud Management Platform that sits above all cloud resources and acts as a “manager of managers.” A good tool should help you with:
- Optimizing public cloud costs – Ensure that you get the visibility, actionable insights, and cost optimization features (setting quotas on users and groups, threshold-based approvals, enforcing power schedules on resources, etc.) across all your clouds. This can sometimes be a first step in reducing the need for a time-consuming repatriation effort or at least help you evaluate where the most potential savings are.
- Ease of migration – A good tool can help you easily migrate your resources across different public clouds or to a private cloud while optimizing for where each resource is deployed, ideally in an automated process. As public cloud costs fluctuate, easily migrating resources between clouds will be key in avoiding cloud lock-in and optimizing for unpredictable cloud costs.
- A single place to manage both private and public clouds with self-service – A single place for consumers of IT resources to see, order, and manage their resources both for public and private clouds is crucial in ensuring a successful repatriation effort. Allowing consumers to get what they need when they want it without a long, drawn-out process with the IT team will ensure that consumers don’t lose the functionality they expect from public clouds.
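As a small illustration of the kind of guardrail such a platform automates, here is a hypothetical threshold-based approval check: requests that would push a group past its quota, or that exceed an auto-approval limit, get routed to a human approver. The function name, thresholds, and logic are assumptions for the sketch, not any product’s actual behavior.

```python
# Hypothetical guardrail sketch: threshold-based approval for resource requests.
# Names and limits are illustrative, not a real product's policy engine.

def needs_approval(request_cost, group_spend, group_quota, auto_approve_limit=500.0):
    """Return True if a resource request should be routed to a human approver.

    request_cost       -- estimated monthly cost of the requested resource
    group_spend        -- the group's current monthly spend
    group_quota        -- the group's monthly spending quota
    auto_approve_limit -- largest single request approved automatically
    """
    over_quota = group_spend + request_cost > group_quota
    too_large = request_cost > auto_approve_limit
    return over_quota or too_large
```

Codifying rules like this is what lets self-service stay fast without letting spend drift unchecked.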
Ready to see how you can optimize your cloud costs? Book your demo and see how CloudBolt can help you gain complete cloud control in 30 days or less.
We are thrilled to announce that CloudBolt Software is a Partner Member of the FinOps Foundation! This partnership is a strategic one that brings many benefits to our customer and partner community. By joining, we have opened up a whole new level of access and knowledge in financial operations (FinOps). Our team can now provide resources and expertise from within our own walls and the collective intelligence of the entire FinOps Foundation community.
FinOps is an approach to cloud financial operations focusing on cost management, optimization, governance, and operations. As such, it has become paramount for organizations running in the cloud to understand and apply these principles. The FinOps Foundation’s mission is “to advance every individual who manages the value of cloud wherever they are.” Our goal in becoming a partner member was to give the CloudBolt team and customers access to the evolving best practices in cloud financial operations while aligning with industry standards.
“CloudBolt’s value proposition aligns closely with the mission of the FinOps Foundation and our products solve many key challenges in the space. This partnership reaffirms our commitment to providing customers with best-in-class service and offers our team the opportunity to be active in a growing community of FinOps leaders.” – Michael Salleo, Chief Product Officer at CloudBolt
CloudBolt’s partnership with the FinOps Foundation enables us to broaden our knowledge, advance our product capabilities, and better serve the market.
As our team grows our partnership with the foundation, CloudBolt’s goal is to broaden our:
FinOps Expertise
- Customers will benefit from knowing that when they work with CloudBolt, they are working with a team well-versed in cloud financial operations. Our customer-facing leaders have been hard at work in the past few months achieving the FinOps Certified Practitioner certification. This means that as customers build, grow, and mature their organizational FinOps strategy with CloudBolt technologies, they will always have a FinOps expert by their side.
FinOps Capabilities
- CloudBolt product teams have always leveraged our customer base to prioritize how we develop our solutions. With access to the more than 8,000 active members of the FinOps Foundation community, we can hear first-hand what the FinOps user base needs. Leveraging this feedback at scale means more relevant and timely cost management features for our users.
Thought Leadership
- Our customers can look forward to increased participation in the FinOps community. This will include hosting upcoming events with the foundation, such as webinars and trade shows, and involvement in dedicated FinOps events such as FinOps X in June. We will continue to find ways to advance the cloud management body of knowledge through webinars, industry insights, and more. (P.S. Here is a link to last month’s webinar – 4 Critical Steps for Evolving Your FinOps Program.)
The FinOps Foundation and CloudBolt will continue to help organizations get the most from their cloud investment. See how CloudBolt’s unique approach to FinOps can help you.
Data#3, a leading Australian IT services and solutions provider, was looking to proactively control and protect their cloud investments. When a well-known platform they were leveraging to support cloud resale and cost management was not hitting the mark, they turned to CloudBolt to meet their needs, with a focus on automation across their cloud management systems.