Confluent, the streaming data platform built on top of the Apache Kafka project, has announced its Q2 product update, featuring several new capabilities, including improved role-based access controls for enterprise data.
Today, organizations worldwide are moving to the cloud to avoid the operational overhead of managing infrastructure. When they migrate, however, data security becomes a major concern. Companies want to ensure that only the right people have access to the right data, but enforcing this all the way down to individual Apache Kafka topics takes significant time and resources, typically requiring teams to write and maintain complex permission scripts.
Last year, Confluent introduced role-based access controls for Confluent Cloud customers to help them streamline this process for critical resources such as production environments, sensitive clusters and billing details. Now, the company is taking the feature a step further, extending it to individual Kafka resources, including topics, consumer groups and transactional IDs.
This lets organizations define clear roles and responsibilities for administrators, operators and developers, granting each access only to the data their job specifically requires, across both the data and control planes.
Along with security, the company is also enhancing the observability side of its platform with new capabilities for the Confluent Cloud Metrics API. These allow organizations to understand how data streams are used across the business and its sub-divisions, see where and how resources are consumed, and get real-time insights that help ensure mission-critical services keep meeting customer expectations.
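To give a rough sense of what a Metrics API query looks like, here is a minimal Python sketch that builds a query body. The metric name, cluster ID, interval and overall payload shape below are illustrative assumptions based on the API's general query style, not details from the announcement; the actual HTTP call (which needs a Confluent Cloud API key) is deliberately omitted.

```python
import json

def build_metrics_query(metric: str, cluster_id: str,
                        granularity: str = "PT1H",
                        interval: str = "2022-06-01T00:00:00Z/2022-06-02T00:00:00Z") -> dict:
    """Assemble an illustrative query body asking for one metric,
    filtered to a single Kafka cluster, bucketed hourly."""
    return {
        "aggregations": [{"metric": metric}],
        "filter": {"field": "resource.kafka.id", "op": "EQ", "value": cluster_id},
        "granularity": granularity,
        "intervals": [interval],
    }

# Hypothetical metric and cluster ID, for illustration only.
query = build_metrics_query("io.confluent.kafka.server/received_bytes", "lkc-12345")
print(json.dumps(query, indent=2))
# In practice this body would be POSTed, with authentication,
# to the Metrics API's query endpoint (e.g. via requests.post).
```

Keeping the payload construction in a small pure function like this makes it easy to template the same query across clusters or business units when aggregating usage.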
The company has also built a first-class integration with Grafana Cloud, giving enterprises visibility into their Confluent Cloud instance from within the monitoring tool. It previously built similar integrations with Datadog and Prometheus.
Finally, Confluent is offering a 99.99% uptime SLA (service-level agreement) for both standard and dedicated fully managed multi-zone clusters. This, the company explained, covers not only infrastructure but also Apache Kafka performance, critical bug fixes and security updates, allowing organizations to run sensitive, mission-critical data streaming workloads in the cloud with confidence. The company has also introduced recipes to help enterprises get started with stream processing and the use cases it enables.
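To put the "four nines" figure in context, a 99.99% uptime commitment implies a downtime budget of roughly 52.6 minutes per year, or about 4.3 minutes in a 30-day month. The arithmetic is straightforward:

```python
# Downtime budget implied by a 99.99% ("four nines") uptime SLA.
uptime = 0.9999
minutes_per_year = 365 * 24 * 60    # 525,600 minutes in a (non-leap) year
minutes_per_month = 30 * 24 * 60    # 43,200 minutes in a 30-day month

yearly_budget = (1 - uptime) * minutes_per_year
monthly_budget = (1 - uptime) * minutes_per_month

print(f"{yearly_budget:.1f} minutes of downtime allowed per year")    # ~52.6
print(f"{monthly_budget:.2f} minutes of downtime allowed per month")  # ~4.32
```

That narrow budget is why the SLA's coverage of Kafka performance and patching, not just raw infrastructure availability, matters for mission-critical workloads.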