CaaS Platform to Accelerate Deployment of Container-Based Applications

This platform allows application development and DevOps teams to easily deploy, manage and scale container-based applications and services.

SUSE has launched SUSE CaaS Platform 3 to deliver Kubernetes innovations in a complete enterprise-class container management solution and to accelerate delivery of modern applications for the digital economy.

CaaS Platform to Expand Choices for Cluster Optimisation

The platform provides new support for more efficient and secure container image management and simplifies the deployment and management of long-running workloads. Backed by SUSE's deep competencies in infrastructure, systems, process integration, platform security, lifecycle management and enterprise-grade support, it helps IT operations teams deliver the power of Kubernetes to their users quickly, securely and efficiently.

It allows customers to optimise cluster configuration with expanded data centre integration and cluster reconfiguration options. Setting up a Kubernetes environment is simplified through improved integration of private and public cloud storage and automatic deployment of the Kubernetes software load balancer. A new toolchain module lets customers tune the MicroOS container operating system to support custom configurations. With the new cluster reconfiguration capabilities, users can transform a start-up cluster into a scalable and highly available environment.

It allows users to manage container images more efficiently and securely with a local container registry. Customers can download a container image from an external registry once and then save a copy in their local registry to share among all the nodes in the cluster.
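As a rough illustration of that workflow, the following sketch mirrors an image into a local registry using the Docker SDK for Python. The registry address registry.local:5000 and the image name are hypothetical placeholders, not part of SUSE CaaS Platform itself, and the local registry mechanics shown here are generic rather than SUSE-specific.

```python
# Sketch: mirror an image into a local cluster registry so every node can
# pull it from inside the data centre instead of from the internet.
# Assumes the Docker SDK for Python (pip install docker) and a local
# registry reachable at registry.local:5000 -- both hypothetical names.
import docker

client = docker.from_env()

# Pull the image once from the external registry.
image = client.images.pull("registry.example.com/myapp", tag="1.0")

# Re-tag it for the local registry and push the copy there.
image.tag("registry.local:5000/myapp", tag="1.0")
client.images.push("registry.local:5000/myapp", tag="1.0")

# Cluster nodes can now pull registry.local:5000/myapp:1.0 from the local copy.
```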

Connecting to an internal proxy rather than an external registry, and downloading from a local cache rather than a remote server, improves both security and performance whenever a cluster node pulls a trusted image from the local registry. In addition, the platform includes the lightweight CRI-O container runtime, designed specifically for Kubernetes.

It also allows simple deployment and management of long-running workloads through the Kubernetes Apps Workloads API. This API facilitates orchestration (self-healing, scaling, updates, termination) of common types of workloads.
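To make the Apps Workloads API concrete, here is a minimal sketch that declares a long-running workload as an apps/v1 Deployment using the official Kubernetes Python client. The image, labels, replica count and namespace are illustrative placeholders, and this is a generic Kubernetes example rather than anything specific to SUSE CaaS Platform.

```python
# Sketch: declare a long-running workload through the Apps Workloads API
# (apps/v1 Deployment). The Deployment controller then handles self-healing,
# scaling and rolling updates for the pods it owns.
# Assumes the Kubernetes Python client (pip install kubernetes); the image,
# labels and namespace below are illustrative placeholders.
from kubernetes import client, config

config.load_kube_config()  # use the current kubeconfig context
apps = client.AppsV1Api()

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="web"),
    spec=client.V1DeploymentSpec(
        replicas=3,  # Kubernetes keeps three pods running (self-healing)
        selector=client.V1LabelSelector(match_labels={"app": "web"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "web"}),
            spec=client.V1PodSpec(
                containers=[
                    client.V1Container(
                        name="web",
                        image="nginx:1.25",
                        ports=[client.V1ContainerPort(container_port=80)],
                    )
                ]
            ),
        ),
    ),
)

apps.create_namespaced_deployment(namespace="default", body=deployment)

# Changing the replica count or image and calling
# apps.patch_namespaced_deployment(...) triggers scaling or a rolling update;
# apps.delete_namespaced_deployment(...) terminates the workload.
```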
