Snowflake and Qlik Hold Large AI-Themed Lead-Generating Events
|
NEWS
|
During the autumn of 2024, data platform vendors Qlik and Snowflake toured the world with a series of region-specific events highlighting the importance of data management and integration solutions for today's enterprises. Both vendors stressed the role of Artificial Intelligence (AI)-driven insights in business transformation and underscored the importance of a coherent data strategy fueled by their respective data management solutions.
While these were predominantly lead-generating events, aimed at generating commercial interest among enterprises, they also provided valuable insights into emerging trends and expectations in the broader data fabric landscape, highlighting some important gaps in data fabric providers' offerings and leaving ample room for vendors to expand their activities.
Data Software Vendors Go All-in on Riding the AI Wave
|
IMPACT
|
These two regional event series are indicative of the current state of the wider data management and data fabric market.
First, there is a push toward real-time data integration and analytics. Both Qlik and Snowflake highlighted enterprises' demand for actionable, real-time insights. This can be seen as part of a much bigger trend toward agile, data-driven operations that require low-latency integration.
Second, there is a growing focus on data governance and on guaranteeing data integrity in both public and private clouds. While enterprises are prepared to see their data leave the premises for processing in public and/or private clouds, they place great emphasis on data governance frameworks and data lineage solutions that ensure full data integrity.
Third, and most profoundly, data fabrics are predominantly considered a foundation for enterprise AI applications. To be most impactful, the Large Language Models (LLMs) fueling enterprises' AI use cases rely on a large foundation of, ideally structured, enterprise data. OpenAI's GPT-3 model, for example, was trained on around 570 Gigabytes (GB) of text data. This, in turn, means that enterprises need a solid data strategy and integration solution in place that prepares all of their highly fragmented data to be fed into Generative Artificial Intelligence (Gen AI) models.
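To make this point concrete, the minimal sketch below illustrates the kind of normalization step a data integration layer performs so that fragmented records from different systems arrive in one standardized format suitable for Gen AI ingestion. All source names, formats, and field names here are hypothetical, chosen purely for illustration:

```python
import csv
import io
import json

# Hypothetical fragmented sources: the same customer is described in a CRM
# export (CSV) and a support system export (JSON).
crm_csv = "customer_id,name,segment\n42,Acme Corp,enterprise\n"
tickets_json = '[{"customer_id": 42, "ticket": "Login fails via SSO", "status": "open"}]'

def normalize_csv(raw):
    # Flatten each CSV row into one standardized plain-text record.
    for row in csv.DictReader(io.StringIO(raw)):
        yield "; ".join(f"{key}={value}" for key, value in row.items())

def normalize_json(raw):
    # Apply the same standardization to JSON documents.
    for obj in json.loads(raw):
        yield "; ".join(f"{key}={value}" for key, value in obj.items())

# The unified corpus: one consistent record format, regardless of source
# system, ready to be chunked and fed into a Gen AI pipeline.
corpus = list(normalize_csv(crm_csv)) + list(normalize_json(tickets_json))
for record in corpus:
    print(record)
```

Trivial as it looks, this standardization across sources and file types is precisely the work that data fabrics automate at enterprise scale.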
The heavy reliance on AI, however, can be a double-edged sword. Without a doubt, adopting AI use cases holds great commercial opportunity for cloud-based data integration platform providers like Snowflake, Qlik, Hewlett Packard Enterprise (HPE), Palantir, and others. After all, LLMs need to be fed with standardized and aggregated enterprise data spanning different sources and file types. However, tying the value proposition of tight data integration/data fabrics exclusively to that of enterprise AI not only oversimplifies the matter and leaves out key business benefits of these solutions, but also ties the commercial success of data integration platforms/data fabrics to the adoption of enterprise AI. As a result, any slowdown in the AI hype would inevitably translate into slower demand for data fabric solutions.
Long-Term Sustainable Enterprise Strategy Needs More Than the AI Hype
|
RECOMMENDATIONS
|
Consequently, data integration/data fabric providers should diversify their value proposition and educate their enterprise customers about the benefits of a coherent data strategy underpinned by the right data integration solution. In doing so, these vendors should focus on the following three main aspects that will resonate with enterprises across different verticals:
- Raise output and revenue levels.
- Increase output quality and improve the customer experience.
- Improve operational agility and realize cost savings.
In addition to centering their messaging around these elements, vendors should take a range of decisive steps in structuring their offerings to help enterprises understand the value of data integration as part of a coherent data strategy:
- Offer Data Fabrics in a Platform-as-a-Service (PaaS) Model with Applications in a Modular Approach: Offering the data fabric as a platform, with applications developed by specialist partners in a modular approach, can be a win-win for vendors and enterprises alike. It gives enterprises the flexibility to scale their data integration solutions according to their needs without having to manage physical infrastructure overhead. In addition, they can decide whether to use the platform and design their own software applications around it, or to use pre-built applications from suppliers. At the same time, it offers vendors the standardization needed for replicable sales, maximizing profitability.
- Design Blueprints and Reference Architectures: Reference architectures can be a relatively easy tool for data integration and data fabric vendors to help enterprises understand how to build their data strategy and utilize data fabrics most beneficially. Vendors have already started to do this in part by tying data fabric deployments to AI use cases. To deliver their full benefit, however, these blueprints need to be tied much more tightly to actual software applications and highlight enterprise use cases and business economic benefits.
- Create Clear and Concise Business Cases: ABI Research has determined that financials are the main driver behind enterprises' digitization investments. Consequently, data fabric vendors should quantify the cost of deployment, expressed as Total Cost of Ownership (TCO), and the expected Return on Investment (ROI). In doing so, they are advised to anchor their communication in real-life use cases that illustrate practical applications and benefits, enabling enterprises to learn from each other, as the worked sketch below illustrates.
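As a minimal sketch of such a business case (all figures are invented placeholders, not ABI Research estimates; a real engagement would substitute enterprise-specific costs and benefits):

```python
# Illustrative first-year TCO/ROI calculation for a data fabric deployment.
# All figures are hypothetical and serve only to show the structure.
license_fees = 250_000          # annual platform subscription (US$)
integration_services = 100_000  # one-off deployment and integration work (US$)
operations = 50_000             # annual administration and support (US$)

tco_year_one = license_fees + integration_services + operations

# Quantified benefits, e.g., analyst hours saved plus incremental revenue (US$).
annual_benefit = 600_000

roi = (annual_benefit - tco_year_one) / tco_year_one
print(f"Year-one TCO: ${tco_year_one:,}")  # Year-one TCO: $400,000
print(f"Year-one ROI: {roi:.0%}")          # Year-one ROI: 50%
```

Framing the business case in this simple TCO-versus-benefit structure lets prospective customers plug in their own numbers and compare vendor proposals on equal terms.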