The NSA’s Adoption of Hybrid Cloud Signals Greater Openness to Public Cloud Across Verticals: Opportunities Exist for Vendors
By Shadine Taufik |
19 Nov 2024 |
IN-7607
The U.S. NSA Debuts a Hybrid Cloud for the Intelligence Community |
NEWS |
The U.S. National Security Agency (NSA) recently announced the deployment of its Amazon Web Services (AWS)-built hybrid cloud platform. The platform emerges from the agency's reported US$10 billion Hybrid Compute Initiative contract with the hyperscaler and is hosted on AWS GovCloud, the offering built to house controlled unclassified information in the United States. The new addition enables commercial cloud offerings on the shared platform, which, in addition to serving the NSA, supports the U.S. Intelligence Community and the Department of Defense. The agency explained that the addition arose from the mass accumulation of data collected across public bodies, which created a growing need for stronger analytics capabilities. It aims to provide more compute options across missions, giving users the flexibility to choose distinct solutions for varied purposes and enabling higher processing speeds where required.
Similar to a sovereign cloud, AWS GovCloud is operated by U.S. citizens and adheres to tight regulations outlined in the International Traffic in Arms Regulations (ITAR), the Federal Risk and Authorization Management Program (FedRAMP), and the Department of Defense (DoD) Cloud Computing Security Requirements Guide (SRG) Impact Levels 2, 4, and 5. These frameworks impose strict controls on connectivity options, personnel requirements, and the security of transmitted information. With data centers in the eastern and northwestern United States, GovCloud lies in its own AWS “partition,” which isolates its data, network, and machine components from other AWS partitions. Public bodies can, however, choose to tap into AWS’s secure commercial cloud offerings as well. The DoD has been particularly bullish on this form, spending 97% of its 2024 budget on commercial cloud solutions. This is largely due to the department’s multi-vendor Joint Warfighting Cloud Capability (JWCC), which aims to give it access to commercial cloud capabilities and services across classification levels.
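As a minimal illustration of how partition isolation works, Amazon Resource Names (ARNs) in GovCloud carry the distinct `aws-us-gov` partition identifier rather than the commercial `aws` one, so cross-partition references fail by construction. The sketch below is plain Python with hypothetical helper names, not an AWS API:

```python
# Sketch: ARNs encode the AWS partition in their second field, so
# resources in GovCloud (partition "aws-us-gov") cannot be referenced
# from the commercial partition ("aws"). Helper names are illustrative.

def arn_partition(arn: str) -> str:
    """Return the partition field of an ARN: arn:<partition>:<service>:..."""
    parts = arn.split(":")
    if len(parts) < 6 or parts[0] != "arn":
        raise ValueError(f"not a valid ARN: {arn!r}")
    return parts[1]

def same_partition(arn_a: str, arn_b: str) -> bool:
    """AWS rejects cross-partition access outright; model that check."""
    return arn_partition(arn_a) == arn_partition(arn_b)

gov_bucket = "arn:aws-us-gov:s3:::agency-cui-archive"      # hypothetical
commercial_role = "arn:aws:iam::123456789012:role/analytics"  # hypothetical

print(arn_partition(gov_bucket))                    # aws-us-gov
print(same_partition(gov_bucket, commercial_role))  # False
```

The bucket and role names above are invented for illustration; only the `aws` and `aws-us-gov` partition identifiers reflect actual AWS naming.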
The Sector-Agnostic Shift to More Open Data Hosting Practices |
IMPACT |
Globally, countries have remained vocal about doubling down on data sovereignty and localized cloud services to power their public programs; a third of Asia-Pacific governments plan to adopt such practices by 2026. In Europe, the U.K. government has been negotiating up to US$9.5 billion in cloud services contracts with local hosts, while separately engaging tech firms to help public bodies transition to the cloud. Clearly, digitalization, driven by migrating data to the cloud, is a top priority for most nations, as cloud technology and its accompanying software have become robust enough to mitigate cyberattacks and related risks. Governments have also become better educated about the capabilities and limitations of the technology, setting out realistic, measured plans to harness cloud practices for more efficient, data-driven operations. This sentiment has been echoed in the private sector, where many client companies are becoming more comfortable hosting proprietary and sensitive data in the public cloud. Purely on-premises and private clouds, although historically perceived as more secure, have become less popular due to a number of limiting factors.
The public cloud has become the obvious, cost-effective choice for most enterprise customers, given the relatively high Capital Expenditure (CAPEX) requirements of private, on-premises cloud infrastructure, compounded by the added labor of scaling such services. Planning for newer, more innovative applications (such as Artificial Intelligence (AI) and Machine Learning (ML)) in a private, on-premises cloud requires purchasing hardware in advance, as well as ongoing investment in maintenance, upgrades, and overall management. A hybrid cloud, by contrast, lets companies expand their cloud capabilities simply by upgrading their public cloud subscriptions, while retaining highly sensitive or integral data in the private cloud.
In a similar vein, on-premises infrastructure has also lost its security edge as public cloud providers have reinforced their offerings. Hyperscaler programs such as AWS Identity & Access Management (IAM) and Security Hub, Microsoft’s Azure Sentinel and Security Center, and Google Cloud’s BeyondCorp and Chronicle Security Operations demonstrate the vendors’ prioritization of compliance and proactive security, making it difficult to justify expenditure on purely private clouds. Further, hybrid clouds serve as a best-of-both solution: more confidential or core data are kept in the private cloud, while more ambient, larger compute tasks are processed and stored in the public cloud.
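The hybrid placement logic described above can be sketched as a simple classification-based routing rule. The tier names and the routing policy below are hypothetical and not drawn from any vendor’s product:

```python
# Sketch of a hybrid cloud placement policy: route datasets by
# sensitivity tier. Tier names and the routing rule are illustrative.

from enum import Enum

class Sensitivity(Enum):
    PUBLIC = 1        # ambient telemetry, anonymized analytics
    INTERNAL = 2      # business data without regulatory constraints
    CONFIDENTIAL = 3  # core or regulated data (e.g., CUI, PHI)

def placement(tier: Sensitivity) -> str:
    """Decide where a dataset lives under a simple hybrid policy."""
    # Confidential or core data stays on the private side; everything
    # else goes to the public cloud for cheaper, elastic compute.
    return "private-cloud" if tier is Sensitivity.CONFIDENTIAL else "public-cloud"

print(placement(Sensitivity.CONFIDENTIAL))  # private-cloud
print(placement(Sensitivity.PUBLIC))        # public-cloud
```

In practice such rules are far more granular (per-field, per-jurisdiction, per-workload), but the core idea is the same: a deterministic policy decides which side of the hybrid boundary each dataset occupies.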
Lastly, a pure private cloud’s lack of dynamic exposure to the larger data ecosystem is a major drawback. Hybrid cloud models can draw on multiple vendors, which helps avoid lock-in, since vendor-specific hardware is no longer a constraint. The configuration also allows for better disaster recovery and business continuity, as data are kept across an array of hosts.
Security Services and Emphasizing Data Safety |
RECOMMENDATIONS |
Overall, a paradigm shift has occurred toward more public, open cloud networking—there is much less reliance on private infrastructure to host data, as it is relatively costly, and limits the scalability and agility of data use and storage. Hybrid cloud models have become the more popular option, bridging the gap between private and public.
However, this model does not come without drawbacks, and several challenges remain.
- Data Management Barriers: Managing data across private and public clouds may be difficult, especially when first adopting a hybrid cloud; determining the best data architecture can be an obstacle, particularly given data silos within and across enterprises.
- Lack of Technical Talent: Understanding the new virtual infrastructure may be complex for enterprise tech operators.
- Slow Adoption Due to ROI Woes: Companies may also find it difficult to see the Return on Investment (ROI) and understand the value of hybrid cloud infrastructure if their firm has been implementing a private cloud for years—especially considering the need to retrain or hire personnel to manage this new system.
- New Vendors and Unfamiliarity: Firms may be hesitant to incorporate a new vendor into the mix out of reliability fears.
Companies are no longer discriminating against public cloud services, as perceptions of risk have shifted, sweetening hybrid cloud propositions. To capitalize on this opportunity, vendors must take decisive steps to simplify and accelerate adoption.
- Data Migration and Hybrid Cloud Transition Services: One of the most difficult parts of shifting between cloud models is the movement of relevant data from on-premises to a public cloud. Hyperscalers must consider offering consultations for data architecture transformation, or provide applications that simplify this process as much as possible. Data unification from intra-enterprise silos must be included within this realm, as firms look to adopt the cloud for digital transformation. Adopting companies cannot expect to fully harness the benefits of a hybrid cloud if data are lacking, which is why it is important to pool all available data.
- Security Services: To retain the image of infallibility, cloud service providers should continue to offer stringent security programs, as AWS, Google Cloud, and Microsoft Azure have done. Through employing popular cybersecurity techniques such as zero trust, IAM, and shared responsibility, vendors can ensure that their offerings are competitive, up-to-par with market incumbents, and continue to be perceived favorably by clients.
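The zero trust and IAM techniques named above share one core mechanic: deny by default, grant only on an explicit match, and let any explicit deny override. A minimal sketch of that evaluation logic, with an invented policy format (not AWS’s actual policy language):

```python
# Sketch of default-deny evaluation, the heart of zero trust and
# IAM-style access control: requests are denied unless an explicit
# allow matches, and any explicit deny wins. Policy format is invented.

def evaluate(policies: list, principal: str, action: str, resource: str) -> str:
    """Return 'Allow' or 'Deny' for a request, deny-by-default."""
    decision = "Deny"  # zero trust: no implicit grants
    for p in policies:
        if (p["principal"], p["action"], p["resource"]) != (principal, action, resource):
            continue
        if p["effect"] == "Deny":
            return "Deny"  # an explicit deny always overrides any allow
        decision = "Allow"
    return decision

policies = [
    {"effect": "Allow", "principal": "analyst", "action": "read", "resource": "dataset"},
]
print(evaluate(policies, "analyst", "read", "dataset"))   # Allow
print(evaluate(policies, "analyst", "write", "dataset"))  # Deny
```

Real IAM engines add wildcards, conditions, and policy inheritance, but the deny-by-default ordering shown here is what makes the model conservative by construction.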
- Industry-Specific Offerings: Hyperscalers should look to capture customers by creating offerings and use cases tailored to the compliance, performance, and security needs of specific verticals. This is especially useful in healthcare and financial services, where it helps enterprises better understand the technology, incentivizes adoption, and provides a clearer way to measure ROI. An interesting example is AWS’s Healthcare & Life Sciences offering, which adheres to over a thousand global compliance requirements, including the Health Insurance Portability and Accountability Act (HIPAA), and contains dozens of related datasets, with applications such as clinical conversation scribing, medical imaging analysis, and health data dashboarding to support medical staff and researchers alike.
- Partnerships, Interoperability, and Fostering an Open Environment: Enterprises, as well as public government bodies, are wary of committing to a single vendor due to potential lock-in. Partnerships that provide multi-cloud solutions may ease this worry while offering a richer, more diverse set of capabilities. Interoperability is equally important in whatever hardware and software is offered. As enterprises adopt a more diverse set of applications, devices, and cloud services, vendors must consider the advantages of pivoting from a closed, proprietary model to an open one that encourages innovation.