“The data that today’s businesses are built on has become more distributed geographically and across different cloud environments, making it harder to bring together in a meaningful way,” said Jay Kreps, co-founder and CEO, Confluent. “Event streaming has emerged as the platform that connects real-time data from every part of an organization for a company to act as one to build world-class applications.”
As organizations rush to modernize their IT stacks amid pressure to develop the real-time applications today’s customers demand, the use of hybrid and multi-cloud architectures becomes inevitable. According to recent Gartner research, “Nearly half of data management implementations use both on-premises and cloud environments (and sometimes actively manage data across them). More than 80% of responding public cloud users indicate that their organizations use more than one cloud service provider (again, sometimes managing data between them)” (Gartner, “Understanding Cloud Data Management Architectures: Hybrid Cloud, Multicloud and Intercloud,” May 2020, Gartner subscription required). The unintended consequence is often the creation of entirely new data silos that act as barriers to innovation. To remain competitive, companies must liberate data from these silos and build a global data architecture that connects multiple public cloud providers and on-premises environments.
This is why many organizations have turned to Kafka, which has emerged as the best way to share data throughout an organization in real time. However, connecting Kafka clusters between different environments and across long distances is too complex for most organizations, and existing tools do not always replicate data byte for byte, putting hybrid cloud and multi-cloud Kafka deployments out of reach. Further, the full power of Kafka is unlocked only when it serves as a central nervous system for the entire organization, with all of its data available to every application and user. So while an open source solution to this data problem is available to everyone, it remains too complex for most.
Confluent Platform 6.0 Introduces Cluster Linking, Simplifying Hybrid Cloud and Multi-Cloud Deployments for Kafka
With Cluster Linking in Confluent Platform 6.0, organizations have a simple, scalable way to connect data between clouds and across hybrid architectures, quickly and efficiently. Cluster Linking allows two or more Kafka clusters to replicate data without any additional components, eliminating the burden of learning, monitoring, and managing another distributed system. Because it is built on the Kafka brokers’ own highly performant replication protocol, Cluster Linking ensures that the replicated data is an exact mirror of the source data, making hybrid, multi-cloud, and cloud migration use cases easier to implement. Now, organizations can easily bridge Kafka clusters between these different environments and across any distance to build a central nervous system for their global business.
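To make the mirroring behavior concrete, here is a minimal sketch of what consuming replicated data looks like once a link is in place. It assumes a cluster link from a source cluster to a destination cluster has already been created with Confluent’s tooling (not shown), and that a topic named `orders` is mirrored over that link; the broker address, group ID, and topic name are hypothetical. Because the mirror is byte for byte, a standard Kafka consumer pointed at the destination cluster sees the same records at the same offsets as it would at the source:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class MirroredTopicConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Destination cluster, e.g. a cloud deployment linked to an on-prem source.
        // Address and group ID are placeholders for illustration.
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "destination-cluster:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "orders-readers");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // "orders" is a hypothetical mirror topic populated over a cluster link.
            // Since replication is byte for byte, the partition/offset pairs printed
            // here match what a consumer would see on the source cluster.
            consumer.subscribe(List.of("orders"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("partition=%d offset=%d value=%s%n",
                            record.partition(), record.offset(), record.value());
                }
            }
        }
    }
}
```

Because offsets are preserved across the link, a consumer group can, in principle, be moved from one cluster to the other and resume from the same positions, which is what makes migration and failover scenarios straightforward.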
Confluent Platform 6.0 Delivers the First Half of Project Metamorphosis in a Single Platform
As part of its relentless pursuit of ensuring that any organization can make event streaming the central nervous system of its business, Confluent initiated the next generation of event streaming with Project Metamorphosis. Through Project Metamorphosis, Confluent is solving the most pressing issues organizations run into when making event streaming a pervasive part of their business by bringing the foundational traits of cloud-native data systems to Kafka. In the first phase of Project Metamorphosis, Confluent has introduced greater elasticity, improved cost-effectiveness, infinite data retention, and global availability.
Confluent Platform 6.0 delivers the first half of the Project Metamorphosis themes in a single self-managed platform: