What you need to know about the new cloud egress fees

A brief history of cloud egress fees

Traditionally, cloud egress fees were unavoidable costs for moving data out of a cloud environment. These fees were often likened to death and taxes – an inevitable part of doing business in the cloud. Whenever an organization wanted to move data from one cloud provider to another, or back to an on-premises environment, it faced a financial penalty. These fees varied by provider but generally amounted to a significant expense, especially for large volumes of data.

Historically, cloud providers justified egress fees as a means to cover the bandwidth and operational costs associated with data transfer. For years, these fees dissuaded companies from moving data frequently, effectively locking them into long-term commitments with their initial cloud providers. The financial burden of egress fees often made companies hesitant to switch providers, even if another offered better services or pricing.

What the recent changes in cloud egress fees mean

In January 2024, Google Cloud announced it would no longer charge for data egress when customers migrate out of its cloud platform – aiming to reduce the barriers for customers looking to change their cloud strategies or repatriate data on-premises. Amazon Web Services (AWS) and Microsoft Azure followed suit in March 2024.

These changes are influenced by regulations such as the European Data Act, which seeks to foster data mobility and reduce vendor lock-in. Cloud providers moved toward eliminating egress fees to comply with these regulations and offer more flexibility to their customers. This potentially benefits organizations that need to migrate data without incurring substantial costs, which in turn fosters a more competitive and dynamic cloud market.

Despite these changes, eliminating egress fees comes with specific downsides and limitations. Free egress is only applicable if customers migrate off the platforms entirely. If data is moved out of the cloud and then back again, standard egress fees apply.

Certain specialized services, such as Microsoft’s Azure ExpressRoute and Azure Content Delivery Network (CDN), still incur egress fees. This partial elimination means that while some cost barriers are removed, organizations must still navigate the complex pricing structures of cloud services.

Egress fees have always been a relatively small portion of overall cloud spending. Organizations also incur costs for data storage, access, and management – meaning that even with free egress, total cloud expenses can still be substantial.

What the new egress fees mean for your cloud strategy

Why your cloud savings might be smaller than expected

Eliminating egress fees does not necessarily translate to major storage cost savings. Cloud service pricing is multifaceted, with various charges on top of egress fees, such as those for data storage, access, and management. These fees are often intricately detailed in service agreements and can add up quickly.

While removing egress fees might provide some relief, it doesn’t address the broader cost structure associated with cloud services.

For instance, data storage itself incurs costs that vary depending on the type and tier of storage used. Accessing and modifying data also carries charges, which can accumulate quickly, especially for organizations with heavy data usage.

Logging, monitoring, and security services further contribute to the total bill. As such, adopting a holistic approach to managing cloud expenses is key to achieving meaningful cost reductions.
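To see why egress is rarely the dominant line item, here is a rough, back-of-the-envelope cost model in Python. The per-unit rates are placeholder assumptions, not any provider’s actual prices; substitute figures from your own bill:

```python
# Illustrative monthly cost model. The rates below are placeholders,
# not current list prices -- check your provider's pricing and your bill.
STORAGE_PER_GB = 0.023      # assumed standard-tier storage, USD per GB-month
REQUESTS_PER_10K = 0.005    # assumed cost per 10,000 access requests
EGRESS_PER_GB = 0.09        # assumed egress rate for ordinary transfers

def monthly_cost(stored_gb, requests, egress_gb, egress_waived=False):
    storage = stored_gb * STORAGE_PER_GB
    access = (requests / 10_000) * REQUESTS_PER_10K
    egress = 0.0 if egress_waived else egress_gb * EGRESS_PER_GB
    return storage + access + egress

# Example: 50 TB stored, 20M requests, 2 TB egress per month.
print(f"with egress:   ${monthly_cost(50_000, 20_000_000, 2_000):,.2f}")
print(f"egress waived: ${monthly_cost(50_000, 20_000_000, 2_000, True):,.2f}")
# Storage dominates: waiving egress removes only a modest slice of the bill.
```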

Capitalize on data lifecycle management for real cost savings

Data value and usage change throughout its lifecycle, requiring dynamic data placement and right-sizing of storage and protection. Effective data management is key to optimizing cloud costs. Without ongoing data management, moving data to or from the cloud only provides temporary cost relief.

Organizations must understand the lifecycle of their data, which typically involves categorizing data into active, warm, and cold tiers. Active data is frequently accessed and requires high-performance storage, while warm data is accessed less often and can be stored in medium-performance tiers. Cold data, seldom accessed, can be placed in lower-cost, long-term storage solutions.

Dynamic data placement involves regularly assessing data usage patterns and migrating data to appropriate storage tiers based on its activity level. For example, moving rarely accessed data from high-cost storage like Amazon FSx to lower-cost alternatives like Amazon S3 Glacier can yield substantial savings.
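As a minimal sketch of what this looks like in practice, the boto3 snippet below attaches a lifecycle rule to a hypothetical S3 bucket so that objects under an assumed archive/ prefix tier down automatically as they age (FSx data would first need to be exported or copied into S3 for a rule like this to apply):

```python
import boto3

# Minimal sketch: a lifecycle rule that steps cooling data down to
# Glacier-class storage. Bucket name, prefix, and day thresholds are
# hypothetical -- tune them to your own access patterns.
s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-data-bucket",  # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-down-cold-data",
                "Filter": {"Prefix": "archive/"},
                "Status": "Enabled",
                "Transitions": [
                    # After 30 days, move to Glacier Instant Retrieval.
                    {"Days": 30, "StorageClass": "GLACIER_IR"},
                    # After 180 days, move to Glacier Deep Archive.
                    {"Days": 180, "StorageClass": "DEEP_ARCHIVE"},
                ],
            }
        ]
    },
)
```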

Identifying and deleting obsolete or redundant data reduces storage volumes and costs.

Organizations should implement data governance policies that include regular audits and data hygiene practices, so that data is stored efficiently and cost-effectively and storage expenses stay aligned with actual usage patterns.

Smart strategies to cut your cloud data costs

How an analytics-driven approach can save cloud costs

Organizations benefit from using analytics to gain deep insights into their data assets. Analytics provide a clear understanding of data usage patterns, growth rates, and storage needs.

Companies must leverage detailed data analytics to make well-informed decisions about where and how to store their data, ensuring it resides in the most cost-effective environment.

Constantly moving data between platforms in search of the best deal is inefficient and drains resources. Frequent migrations consume time and strain IT teams, often without yielding worthwhile cost benefits. Instead, a strategic, analytics-driven approach makes sure that data placement decisions are based on real-time usage patterns and future needs, optimizing both cost and performance.

Analytics tools can identify underutilized resources, redundant data, and inefficient storage practices – enabling organizations to reconfigure their storage solutions, so that active data is easily accessible while infrequently accessed data is stored in cheaper, long-term storage options. Companies must continuously monitor and analyze data to more accurately adapt to changing demands, maintaining an optimal balance between cost and performance.
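As a starting point, even a simple inventory script can surface these patterns. The sketch below (bucket name hypothetical) totals object sizes by storage class and by last-modified age band; note that last-modified is only a proxy for access frequency, and at scale you would lean on reporting services such as S3 Inventory rather than live listings:

```python
import boto3
from collections import defaultdict
from datetime import datetime, timezone

# Minimal analytics sketch over a live bucket listing: total bytes per
# storage class and per last-modified age band.

def age_band(days: int) -> str:
    if days < 30:
        return "hot (<30d)"
    if days < 180:
        return "warm (30-180d)"
    return "cold (>180d)"

s3 = boto3.client("s3")
now = datetime.now(timezone.utc)
by_class: dict[str, int] = defaultdict(int)
by_band: dict[str, int] = defaultdict(int)

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket="example-data-bucket"):
    for obj in page.get("Contents", []):
        by_class[obj.get("StorageClass", "STANDARD")] += obj["Size"]
        by_band[age_band((now - obj["LastModified"]).days)] += obj["Size"]

for band, size in by_band.items():
    print(f"{band}: {size / 1024**3:,.1f} GiB")
```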

Leverage dynamic tiering to cut storage expenses

Dynamic tiering typically involves categorizing data based on its usage and moving it to the appropriate storage tier – an important practice for reducing storage expenses.

For example, cold data, which is rarely accessed, can be moved from high-cost storage like Amazon FSx to lower-cost options such as Amazon S3 Glacier – greatly cutting storage costs while keeping data accessible when needed.

Dynamic tiering requires continuous monitoring of data activity to ensure that data is stored in the most cost-effective tier. Automated tools can handle this process, regularly evaluating data usage patterns and migrating data accordingly.

Proactive management makes sure that data storage costs are minimized without compromising accessibility or performance.

Effective dynamic tiering also involves setting policies for data movement, which define criteria for when and how data should be migrated between tiers, considering factors such as access frequency, data size, and compliance requirements. Organizations must adhere to these policies to maintain control over their storage environment.
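One way to express such a policy directly to the platform is S3 Intelligent-Tiering, which archives objects after a configured period without access. The sketch below is illustrative; the bucket name, configuration ID, prefix, and day thresholds are all assumptions:

```python
import boto3

# Minimal sketch of a tiering policy enforced by the platform itself:
# S3 Intelligent-Tiering archives objects automatically once they go
# unaccessed for the configured number of days.
s3 = boto3.client("s3")

s3.put_bucket_intelligent_tiering_configuration(
    Bucket="example-data-bucket",   # hypothetical bucket
    Id="archive-cold-objects",
    IntelligentTieringConfiguration={
        "Id": "archive-cold-objects",
        "Status": "Enabled",
        "Filter": {"Prefix": "projects/"},  # hypothetical prefix
        "Tierings": [
            # Unaccessed for 90 days -> Archive Access tier.
            {"Days": 90, "AccessTier": "ARCHIVE_ACCESS"},
            # Unaccessed for 270 days -> Deep Archive Access tier.
            {"Days": 270, "AccessTier": "DEEP_ARCHIVE_ACCESS"},
        ],
    },
)
```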

Save big by deleting unnecessary data

Regularly assessing and deleting obsolete or redundant data can greatly reduce overall data volume and storage costs. Many organizations accumulate large volumes of data over time, much of which becomes outdated or irrelevant. Identifying and removing this unnecessary data frees up valuable storage space and reduces costs associated with storing and managing inactive data.

Implementing a solid data governance framework gives organizations control over their data lifecycle, typically through policies and procedures for data retention, archiving, and deletion. Regular audits and data hygiene practices make sure that only relevant and valuable data is retained, while obsolete data is systematically removed.
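A retention audit can start as simply as the following sketch, which lists objects under a hypothetical tmp/ prefix, flags anything older than an assumed 365-day retention window, and deletes it in batches. A production version would honor legal holds and log every deletion before removing anything:

```python
import boto3
from datetime import datetime, timedelta, timezone

# Minimal retention-audit sketch, assuming a policy that objects under
# "tmp/" may be deleted after 365 days. Bucket, prefix, and window are
# hypothetical placeholders.
s3 = boto3.client("s3")
cutoff = datetime.now(timezone.utc) - timedelta(days=365)

expired = []
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket="example-data-bucket", Prefix="tmp/"):
    for obj in page.get("Contents", []):
        if obj["LastModified"] < cutoff:
            expired.append({"Key": obj["Key"]})

# delete_objects accepts at most 1,000 keys per call.
for i in range(0, len(expired), 1000):
    s3.delete_objects(
        Bucket="example-data-bucket",
        Delete={"Objects": expired[i : i + 1000]},
    )
print(f"removed {len(expired)} expired objects")
```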

Organizations can then streamline their storage infrastructure, lowering costs and improving efficiency. This, in turn, also improves data management by reducing clutter and simplifying access to relevant data. Ultimately, adopting a disciplined approach to data deletion supports cost savings and operational efficiency.

Plan your cloud future: Key long-term strategies

Long-term cloud strategies should align with broader business goals, such as AI investments, data growth projections, and potential acquisitions. As organizations plan their cloud journey, it’s important to consider how these factors will influence data storage and management needs.

Investing in AI and machine learning requires significant data storage and processing capabilities. Organizations should plan for these needs by choosing scalable and flexible cloud solutions that can accommodate future growth.

Predicting data growth trends helps in selecting storage options that offer cost-effective scalability.
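The underlying arithmetic is simple compound growth. The sketch below assumes an illustrative 50 TB footprint growing 30% per year; substitute your own measured trend:

```python
# Compound-growth projection for capacity planning. Starting footprint
# and growth rate are assumptions chosen to illustrate the arithmetic.
current_tb = 50
annual_growth = 0.30

for year in range(1, 6):
    projected = current_tb * (1 + annual_growth) ** year
    print(f"year {year}: ~{projected:,.0f} TB")
# At 30% growth, storage roughly doubles in under three years (50 -> ~110 TB)
# and nearly quadruples by year five (~186 TB).
```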

Potential acquisitions add another layer of complexity, as integrating new data and systems can strain existing infrastructure. Planning for these scenarios makes sure that organizations can incorporate new data without incurring excessive costs or operational disruptions. A forward-thinking strategy here also helps organizations stay agile and responsive to today’s rapidly shifting business conditions.

Integrate cloud-native solutions for ultimate flexibility

Adopting unstructured data management solutions that support cloud-native access provides flexibility in managing and moving data. These solutions let organizations access and move their data without facing licensing penalties or vendor lock-in, so that data management practices stay adaptable and cost-effective.

Cloud-native solutions come with several advantages, including improved scalability, superior performance, and reduced costs. Organizations can leverage these solutions to optimize their data management processes, so that data is always stored in the most appropriate and cost-effective environment.

Adopt a holistic approach to managing your cloud data

Focusing on data management rather than just storage is key to optimizing cloud costs. Understanding data ownership, growth patterns, and the heat of data (hot, warm, cold) is essential for accurate planning.

Organizations should consider how data usage changes over time and plan their storage strategies accordingly.

A holistic approach typically involves evaluating the entire data lifecycle, from creation to archival or deletion – helping to identify opportunities for cost savings and efficiency improvements. Organizations can then make sure that data is stored in the most appropriate and cost-effective environment throughout its lifecycle.

Use analytics and automation for maximum savings

Using analytics and policy-driven automation lets organizations move data to the best storage tier at the right time, minimizing storage licensing, egress, and other penalties. This in turn optimizes long-term savings while meeting diverse user needs and corporate requirements.

Automated tools can continuously monitor data activity and apply predefined policies to manage data movement, making sure that data is always stored in the most cost-effective tier. Analytics provide valuable insights into data usage patterns, supporting informed decision-making and strategic planning.
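Tying the two together can be lightweight. The sketch below shows a declarative policy mapping access recency to a target tier, with hypothetical tier names and thresholds; a scheduled job would evaluate each dataset against it and invoke the storage platform’s API (for example, an S3 copy with a new storage class) to perform the move:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Minimal sketch of policy-driven placement: declarative rules map access
# recency to a target tier. Tier names and thresholds are hypothetical;
# the actual move would call your storage platform's API.

@dataclass
class TierRule:
    max_idle_days: int
    target_tier: str

POLICY = [
    TierRule(30, "standard"),     # touched within 30 days: keep hot
    TierRule(180, "infrequent"),  # idle up to 180 days: mid tier
]
ARCHIVE_TIER = "archive"          # anything older falls through to archive

def target_tier(last_accessed: datetime) -> str:
    idle_days = (datetime.now(timezone.utc) - last_accessed).days
    for rule in POLICY:
        if idle_days <= rule.max_idle_days:
            return rule.target_tier
    return ARCHIVE_TIER

# Example: a dataset last touched 200 days ago lands in the archive tier.
print(target_tier(datetime.now(timezone.utc) - timedelta(days=200)))
```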

Tim Boesen

July 3, 2024
