Kubernetes Engine is a managed, production-ready environment for deploying containerized applications. It brings our latest innovations in developer productivity, resource efficiency, automated operations, and open source flexibility to accelerate your time to market. Launched in 2015, Kubernetes Engine builds on the lessons from Google's experience of running services like Gmail and YouTube in containers for over 12 years. Kubernetes Engine allows you to get up and running with Kubernetes in no time by completely eliminating the need to install, manage, and operate your own Kubernetes clusters.
Google Kubernetes Engine
A secured and managed Kubernetes service with four-way autoscaling and multi-cluster support.
Speed up app development without sacrificing security
Develop a wide variety of apps with support for stateful, serverless, and application accelerators. Use Kubernetes-native CI/CD tooling to secure and speed up each stage of the build-and-deploy life cycle.
Streamline operations with release channels
Choose the channel that fits your business needs. The rapid, regular, and stable release channels have different cadences of node upgrades and offer support levels aligned with each channel's nature.
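As a sketch, a cluster can be enrolled in a release channel at creation time with the gcloud CLI; the cluster name and zone below are illustrative placeholders:

```shell
# Create a GKE cluster enrolled in the "regular" release channel.
# "my-cluster" and the zone are placeholders for your own values.
gcloud container clusters create my-cluster \
    --zone us-central1-a \
    --release-channel regular
```

Clusters in a channel receive automatic upgrades at that channel's cadence; switching channels later changes the upgrade cadence for the cluster.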
Manage infrastructure with Google SREs
Get back time to focus on your applications with help from Google Site Reliability Engineers (SREs). Our SREs constantly monitor your cluster and its computing, networking, and storage resources.
Get enterprise-ready containerized solutions with prebuilt deployment templates, featuring portability, simplified licensing, and consolidated billing. These are not just container images but open source, Google-built, and commercial applications that increase developer productivity, available now on Google Cloud Marketplace.
Pod and cluster autoscaling
Horizontal pod autoscaling scales based on CPU utilization or custom metrics; cluster autoscaling works on a per-node-pool basis; and vertical pod autoscaling continuously analyzes the CPU and memory usage of pods, dynamically adjusting their CPU and memory requests in response. GKE automatically scales node pools and clusters across multiple node pools based on changing workload requirements.
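To illustrate horizontal pod autoscaling, a minimal manifest might look like the following sketch; the `web` Deployment name, replica bounds, and CPU target are placeholder values:

```yaml
# Illustrative HorizontalPodAutoscaler: scales the hypothetical
# "web" Deployment between 2 and 10 replicas to keep average
# CPU utilization near 60%.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: web-hpa
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: web
  minReplicas: 2
  maxReplicas: 10
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 60
```

Custom metrics follow the same shape with a `Pods` or `External` metric type in place of the `Resource` entry.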
Workload and network security
GKE Sandbox provides a second layer of defense between containerized workloads on GKE for enhanced workload security. GKE clusters natively support Kubernetes Network Policy to restrict traffic with pod-level firewall rules. Private clusters in GKE can be restricted to a private endpoint or a public endpoint that only certain address ranges can access.
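As an example of Kubernetes Network Policy on GKE, the sketch below restricts inbound traffic with a pod-level rule; the `app=db` and `app=backend` labels are hypothetical:

```yaml
# Illustrative NetworkPolicy: allows ingress to pods labeled
# app=db only from pods labeled app=backend, denying all other
# inbound traffic to the selected pods.
apiVersion: networking.k8s.io/v1
kind: NetworkPolicy
metadata:
  name: allow-backend-to-db
spec:
  podSelector:
    matchLabels:
      app: db
  policyTypes:
    - Ingress
  ingress:
    - from:
        - podSelector:
            matchLabels:
              app: backend
```

Once a policy selects a pod for ingress, any traffic not matched by a rule is dropped, which is what makes these rules behave like a pod-level firewall.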