1. Confluent recently published its Data Streaming Report – what was the thinking behind this work?
We know that data is incredibly important to understanding a business — how products are performing, how healthy a supply chain is, what customers want, and so on. The faster business leaders can access high-quality data, the more value they can derive from it.
Unfortunately, data is often seen as a resource to pour into a massive repository, cut up into smaller segments, and examine piece-by-piece. That’s not going to paint an accurate picture of a business. Typically, this process means businesses are relying on data that is hours, or even days, out of date. They’re also potentially mixing it with data that isn’t relevant, is improperly governed, or is simply incomplete.
We need to stop thinking about data as something you bring to rest, analyse, and then send somewhere else. That’s where data streaming, and the Data Streaming Report, disrupt things: it’s about getting the maximum possible value out of your data even while it’s in motion. If you stream data rather than batch-process it, you can build a real-time data pipeline that analyses every single data point, properly contextualised, even as it travels to its destination within your infrastructure.
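The contrast with batch processing can be illustrated with a minimal sketch. This is plain Python, not Confluent’s API; the event source, fields, and the high-value rule are all hypothetical stand-ins for a real stream and a real business rule:

```python
import time
from typing import Iterator

def event_stream() -> Iterator[dict]:
    """Simulated source of order events; a stand-in for a real topic."""
    for order_id, amount in [(1, 250.0), (2, 99.5), (3, 1200.0)]:
        yield {"order_id": order_id, "amount": amount, "ts": time.time()}

def enrich(event: dict) -> dict:
    """Contextualise each event in flight (hypothetical business rule)."""
    event["high_value"] = event["amount"] > 1000
    return event

# Streaming: each event is enriched and acted on the moment it arrives,
# rather than waiting for an hours-old batch to land in a repository.
alerts = [e["order_id"] for e in map(enrich, event_stream()) if e["high_value"]]
print(alerts)
```

The key point is that `enrich` runs per event as it flows past, so a downstream consumer never waits for a batch window to close.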
With that in mind, the Data Streaming Report aims to shift the dial on what businesses should expect from their data, and on how they approach handling, analysing, and actioning it.
2. And the key finding of the report is that data streaming plays a pivotal role when it comes to businesses maximising the potential of their data — is it strategically important?
Yes! At its best, data streaming is able to deliver a real-time digital understanding of a business as it lives and breathes — and that has massive implications for the way in which you run your business.
It makes sense, then, that 90% of those surveyed see DSP technologies as enabling continuous, up-to-date visibility right across the business. But then think about what that visibility allows you to do.
For example, 80% cite the rapid detection and management of risks; 79% the ability to model and predict business outcomes; and 73% a reduced time to market for products and services. Giving businesses the ability to tap into richly contextualised, properly protected data in real time is a huge strategic advantage.
3. Another key takeaway from the report is that data streaming platforms power faster AI adoption?
Absolutely. 63% of respondents said that DSPs (data streaming platforms) either “extensively” or “significantly” fuel AI progress. They’re essentially acting as the foundation for real-time data across the business, offering a set framework within which AI can work its magic, deriving insights and actioning them right across the organisation.
What that magic looks like is up to you, but the Data Streaming Report has found that DSPs ease the path to enterprise-level adoption of AI and ML in some specific ways:
● 95% find it easier to broaden access to different data sources, helping to contextualise models
● 93% find it easier to ensure that data ingested meets quality standards
● 89% find it easier to keep AI models up to date with fresh, validated data streams
● 88% are more capable of democratising the use of AI/ML across the business
Ultimately, AI can only ever be as good as the data that powers it. The capacity of DSPs to connect, accelerate, and understand data has a huge impact on that quality threshold.
4. Data streaming simplifies the development of data products and encourages reusability and cost savings?
Data products are, in simple terms, trustworthy datasets that have been designed specifically to be easily shared and reused. Treating data this way means you can guarantee the security, performance, and compatibility of data systems right across a business.
Products vary from sector to sector — so in financial services, for example, you could use data streams in accounts and payments to build a fraud detection system. Then, you could reuse that exact same data product to build a customer payment notification system for a banking application.
According to our data, 81% of decision-makers agree that managing data streams as products enhances their potential to be reused. In eliminating the ‘Spaghetti Junction’ of point-to-point connections, data products can make it far quicker – and much less of a headache – to access standardised, dependable data.
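That reuse pattern can be sketched in a few lines. This is a toy illustration in plain Python, not a Confluent data product: the payments list stands in for one governed stream, and the two functions stand in for two independent consumers (think two consumer groups reading the same topic):

```python
# One well-governed payments stream (the "data product") feeding two
# independent consumers, instead of two point-to-point feeds.
payments = [
    {"account": "A", "amount": 40.0},
    {"account": "B", "amount": 9500.0},
    {"account": "A", "amount": 12.5},
]

def fraud_detector(stream):
    """Flags payments over a hypothetical fraud threshold."""
    return [p for p in stream if p["amount"] > 5000]

def notifier(stream):
    """Builds a customer notification for every payment."""
    return [f"Account {p['account']}: payment of {p['amount']:.2f}" for p in stream]

# The same data product is consumed twice; nothing is copied or re-plumbed.
suspicious = fraud_detector(payments)
messages = notifier(payments)
```

The design point is that each new use case subscribes to the existing product rather than adding another strand to the ‘Spaghetti Junction’.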
5. Do data streaming platforms prevent data silos or solve any data accessibility challenges?
Data silos are a huge roadblock to using enterprise data — especially if you’re using a conventional approach to data management. In our research, data being spread across separate silos was identified as one of the most prominent obstacles to making the most of data, with 66% of IT leaders identifying it as an issue.
That’s right alongside an unwillingness of owners to share data (58%), and an inability to access the data that exists (55%) — both of which are the natural consequences of siloing. When it comes to accelerating the adoption of AI and ML within a business, 64% also identified the fragmented ownership of data across disparate systems as a problem.
DSPs alleviate these concerns because they bring access to data into one place. A DSP can draw from once-disparate silos and provide a ‘single source of truth’ from which decision-makers can analyse data and coordinate the best possible response. It’s these qualities that have led 84% of decision-makers to agree that DSPs help them discover existing data, and 88% to agree that DSPs help them access it.
6. Does data streaming deliver benefits in terms of business investments?
Absolutely. Of the 4,000 decision-makers we surveyed, 84% report a 2x to 10x return on their data streaming investments, and 41% of those report an ROI of 5x or more.
The more mature the use of data streaming in those businesses, the more likely they are to see higher ROI. The report breaks that maturity down into five levels along a curve:
● Level 1 – using data streaming for some experiments in pre-production
● Level 2 – deploying data streaming for some noncritical applications
● Level 3 – multiple deployments in production for a few critical systems, with data and usage siloed across teams
● Level 4 – several deployments in production for critical systems, with data reuse and integrations across business units
● Level 5 – data streaming as a strategy with all qualities of Level 4, plus streams that are managed as a product
64% of businesses at Level 4, for example, are enjoying an ROI of 5-10x, compared to 56% at Level 3. But even 67% of those at Level 2 are achieving 2-5x returns!
7. Confluent has recently launched the Build with Confluent systems integrator (SI) programme – what is this designed to achieve?
Build with Confluent is an initiative that looks to empower systems integrators to accelerate the development of data streaming use cases, and to bring them to their intended audience more smoothly.
What we’re looking to deliver is the ability to power customer experiences and applications with real-time data, which, as everything we’ve discussed so far shows, correlates strongly with business success. We want decision-makers to be able to capitalise on the data streaming market as it grows.
8. And it does this by providing SIs with various assets/tools: specialised software bundles, Build with Confluent certification, and access to sales and marketing resources?
Yes, exactly. We provide SIs with access to a huge library of fast-deployment software, and pre-packaged code for common use cases, both of which mean that no business is starting from scratch. They achieve Build with Confluent certification at the end of the process, which is essentially a marker of quality for their infrastructure and performance.
The initiative also allows us to introduce businesses to Confluent’s wider Accelerate with Confluent programme, so we can supercharge development in specific areas as and when they need it. They can build native integrations through Connect with Confluent; move away from legacy systems and open-source Apache Kafka through Confluent’s Migration Accelerator; and now rapidly develop and deploy data products through Build with Confluent.
9. Finally, what does the Confluent roadmap look like – what can you share in terms of plans for the future – technology-wise as well as developing the Channel and the overall business profile?
We’re introducing new capabilities to Confluent Cloud to make stream processing and data streaming more accessible and secure. Confluent’s new support for the Table API will make Apache Flink® more readily available to Java and Python developers, and we’re also introducing a Confluent extension for Visual Studio Code, which will help accelerate the development of real-time data streaming use cases. The platform will also gain stronger security thanks to our private networking for Flink, which provides enterprise-level protection for use cases with sensitive data. Similarly, our Client-Side Field Level Encryption encrypts sensitive fields for stronger security and privacy.
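The idea behind field-level encryption is simple to sketch: only the tagged fields are encrypted on the client, before a record ever leaves the producer. The snippet below is a toy illustration in plain Python, not Confluent’s CSFLE — the XOR cipher exists only to keep the example self-contained, where a real deployment would use proper authenticated encryption with managed keys:

```python
import hashlib
from itertools import cycle

def toy_encrypt(plaintext: str, key: str) -> bytes:
    """Toy XOR keystream cipher, for illustration only. A real system
    would use authenticated encryption (e.g. AES-GCM) with managed keys."""
    keystream = cycle(hashlib.sha256(key.encode()).digest())
    return bytes(b ^ k for b, k in zip(plaintext.encode(), keystream))

def encrypt_fields(record: dict, sensitive: set, key: str) -> dict:
    """Encrypts only the tagged fields client-side; the rest of the
    record stays in the clear and remains queryable downstream."""
    return {
        k: toy_encrypt(v, key) if k in sensitive else v
        for k, v in record.items()
    }

record = {"order_id": "42", "card_number": "4111111111111111"}
protected = encrypt_fields(record, {"card_number"}, key="demo-key")
```

The useful property is selective protection: brokers and most consumers handle the record normally, while the sensitive field is opaque to anyone without the key.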
Beyond that, our partner programme is getting fresh attention, too. The Confluent OEM Program for managed service providers (MSPs), cloud service providers (CSPs), and independent software vendors (ISVs) makes it easy to launch and enhance customer offerings with a complete data streaming platform for Apache Kafka® and Apache Flink®. Partners can bring real-time products and Kafka offerings to market faster and easily monetise customer demand for data streaming with limited risk.