Google Cloud Platform (GCP)¶
Atlas supports deploying clusters onto Google Cloud Platform (GCP). This page provides reference material related to Atlas cluster deployments on GCP. You can't deploy serverless instances on GCP.
Depending on your cluster tier, Atlas supports the following GCP
regions. Not all of these regions support Free or Shared-Tier
clusters. A check mark indicates support for Free or Shared-Tier
clusters. The Atlas API uses the corresponding Atlas Region.
Cluster Configuration Options¶
Each Atlas cluster tier comes with a default set of resources. Atlas provides the following resource configuration options:
Custom Storage Size¶
The size of the server root volume. Atlas clusters deployed onto GCP use SSD persistent storage.
The actual amount of RAM available to each cluster tier might be slightly less than the stated amount, due to memory that the kernel reserves.
The following cluster tiers are available:
You can use this tier for a multi-cloud cluster.
Unavailable in the following regions:
Atlas limits R-class instances to the following regions:
For purposes of management with the Atlas API, cluster tiers whose
names are prepended with an R instead of an M (R40 instead of
M40, for example) run a low-CPU version of the cluster. When creating
or modifying a cluster with the API, be sure to specify your desired
cluster class by name.
Low-CPU cluster tiers (R40, R50, R60, and so on) are available in multi-cloud cluster configurations as long as the cluster tier is available for all the regions that the cluster uses.
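As a sketch, a cluster-creation payload for the legacy Atlas Admin API might name a low-CPU tier as follows. The exact field names (`providerSettings`, `instanceSizeName`, the region name) are assumptions here; verify them against the API reference for your API version.

```python
# Hypothetical cluster payload naming a low-CPU (R-class) tier.
# Field names are assumptions -- check the Atlas Admin API reference.
cluster_payload = {
    "name": "low-cpu-cluster",
    "providerSettings": {
        "providerName": "GCP",
        "regionName": "CENTRAL_US",      # assumed Atlas region name
        "instanceSizeName": "R40",       # R-prefixed tier = low-CPU class
    },
}
```

The key point is the `R` prefix in the tier name: requesting `R40` rather than `M40` selects the low-CPU class of the same tier.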
Workloads typically require less than
2TB of storage.
Atlas configures the following resources automatically and does not allow user modification:
The input/output operations per second (IOPS) that the system can perform. This value is fixed at 30 IOPS per GB for reads and 30 IOPS per GB for writes, for a total of 60 IOPS per GB.
For example, an M30 cluster has a default storage size of 40 GB. This
results in a maximum read speed of 1,200 IOPS and a maximum write
speed of 1,200 IOPS. Increasing the storage size to 100 GB increases
the maximum read speed to 3,000 IOPS and the maximum write speed to
3,000 IOPS.
Encrypted Storage Volumes¶
GCP storage volumes are always encrypted.
Each GCP region includes a set number of independent zones. Each zone has power, cooling, networking, and control planes that are isolated from other zones. For regions that have at least three zones (3Z), Atlas deploys clusters across three zones. For regions that only have two zones (2Z), Atlas deploys clusters across two zones.
The Atlas Add New Cluster form marks regions that support 3Z clusters as Recommended, as they provide higher availability.
The number of zones in a region has no effect on the number of MongoDB nodes Atlas can deploy. MongoDB Atlas clusters are always made of replica sets with a minimum of three MongoDB nodes.
For general information on GCP regions and zones, see the Google documentation on regions and zones.
Regions with at Least Three Zones¶
If the selected GCP region has at least three zones, Atlas clusters are split across three zones. For example, a three node replica set cluster would have one node deployed onto each zone.
3Z clusters have higher availability compared to 2Z clusters. However, not all regions support 3Z clusters.
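The zone placement described above can be sketched as a simple round-robin assignment. This is an illustrative model only, not Atlas's actual placement logic:

```python
def distribute_nodes(num_nodes: int, zones: list) -> dict:
    """Assign replica-set members to zones round-robin.

    In a 3Z region a three-node replica set lands one node per zone;
    in a 2Z region the third node shares a zone with the first.
    """
    return {f"node-{i}": zones[i % len(zones)] for i in range(num_nodes)}

# Three-node replica set in a 3Z region: one node in each zone.
print(distribute_nodes(3, ["zone-a", "zone-b", "zone-c"]))
# Three-node replica set in a 2Z region: two nodes share zone-a.
print(distribute_nodes(3, ["zone-a", "zone-b"]))
```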
For detailed documentation on Google storage options, see Storage Options.
Along with global region support, the following product integrations enable applications running on GCP, such as Google Compute Engine, Google Cloud Functions, Google Cloud Run, and Google App Engine, to use Atlas instances easily and securely.
- Google Virtual Private Cloud (VPC): Set up network peering connections with GCP
Security and Identity Services¶
- Google Identity: Sign up and log into Atlas with Google
- Google Cloud Key Management Service (KMS): Encrypt Atlas data at rest with keys you manage in Google Cloud KMS
- GCP Marketplace: Pay for Atlas usage via GCP
For more information on how to use GCP with Atlas most effectively, review the following best practices, guides, and case studies:
- Case Study: Why build apps on a cloud-native database like MongoDB Atlas?
- Google Datastream: Streamline your real-time data pipeline with Datastream and MongoDB