Professional Cloud Architect


NO.1 Your company wants to start using Google Cloud resources but wants to retain their on-
premises Active Directory domain controller for identity management. What should you do?
A. Use Compute Engine to create an Active Directory (AD) domain controller that is a replica of the
on-premises AD domain controller using Google Cloud Directory Sync.
B. Use Google Cloud Directory Sync to synchronize Active Directory usernames with cloud identities
and configure SAML SSO.
C. Use the Admin Directory API to authenticate against the Active Directory domain controller.
D. Use Cloud Identity-Aware Proxy configured to use the on-premises Active Directory domain
controller as an identity provider.
Answer: B
Explanation:
https://cloud.google.com/solutions/federating-gcp-with-active-directory-introduction#implementing_federation

NO.2 Auditors visit your teams every 12 months and ask to review all the Google Cloud Identity and
Access Management (Cloud IAM) policy changes in the previous 12 months. You want to streamline
and expedite the analysis and audit process. What should you do?
A. Enable Google Cloud Storage (GCS) log export to audit logs into a GCS bucket and delegate access
to the bucket.
B. Enable Logging export to Google BigQuery and use ACLs and views to scope the data shared with
the auditor.
C. Create custom Google Stackdriver alerts and send them to the auditor.
D. Use Cloud Functions to transfer log entries to Google Cloud SQL and use ACLs and views to limit an
auditor's view.
Answer: A
Explanation:
Export the logs to a Google Cloud Storage bucket using the Archive storage class, since the data will
not be touched for a year; Archive storage costs $0.004 per GB per month, while BigQuery long-term
storage costs $0.01 per GB per month (2.5x as much). For analysis, whenever the auditors visit (once
per year), you can use BigQuery with the GCS bucket as an external data source. BigQuery supports
querying Cloud Storage data in these storage classes: Standard, Nearline, Coldline, and Archive.
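
As a sketch of the mechanics behind this answer (bucket, sink, and dataset names are hypothetical),
the export and the audit-time federated query could look like this:

    # Archive-class bucket plus a sink that routes IAM policy changes to it
    gsutil mb -c archive -l us-central1 gs://example-iam-audit-logs
    gcloud logging sinks create iam-audit-sink \
        storage.googleapis.com/example-iam-audit-logs \
        --log-filter='protoPayload.methodName="SetIamPolicy"'
    # Grant the sink's writer identity (printed by the command above) objectCreator on the bucket

    # At audit time, expose the bucket to BigQuery as an external table
    bq mk audit_ds
    bq mkdef --autodetect --source_format=NEWLINE_DELIMITED_JSON \
        "gs://example-iam-audit-logs/*.json" > iam_def.json
    bq mk --external_table_definition=iam_def.json audit_ds.iam_changes
    bq query --use_legacy_sql=false 'SELECT * FROM audit_ds.iam_changes LIMIT 10'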

NO.3 You have deployed an application to Kubernetes Engine, and are using the Cloud SQL proxy
container to make the Cloud SQL database available to the services running on Kubernetes. You are
notified that the application is reporting database connection issues. Your company policies require a
post-mortem. What should you do?
A. In the GCP Console, navigate to Cloud SQL. Restore the latest backup. Use kubectl to restart all
pods.
B. Use gcloud sql instances restart.
C. Validate that the Service Account used by the Cloud SQL proxy container still has the Cloud Build
Editor role.
D. In the GCP Console, navigate to Stackdriver Logging. Consult logs for Kubernetes Engine and Cloud
SQL.


Answer: D
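
No explanation is given above, so as a hedged sketch of what consulting the logs might look like in
practice (deployment and container names are hypothetical):

    # Logs from the Cloud SQL proxy sidecar in the affected workload
    kubectl logs deployment/example-app -c cloudsql-proxy --tail=100

    # The same container logs as retained in Stackdriver Logging
    gcloud logging read \
        'resource.type="k8s_container" AND resource.labels.container_name="cloudsql-proxy"' \
        --limit=50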

NO.4 For this question, refer to the Dress4Win case study.


As part of their new application experience, Dress4Win allows customers to upload images of
themselves. The customer has exclusive control over who may view these images. Customers should
be able to upload images with minimal latency and also be shown their images quickly on the main
application page when they log in. Which configuration should Dress4Win use?
A. Use a distributed file system to store customers' images. As storage needs increase, add more
persistent disks and/or nodes. Use a Google Cloud SQL database to maintain metadata that maps
each customer's ID to their image files.
B. Use a distributed file system to store customers' images. As storage needs increase, add more
persistent disks and/or nodes. Assign each customer a unique ID, which sets each file's owner
attribute, ensuring privacy of images.
C. Store image files in a Google Cloud Storage bucket. Add custom metadata to the uploaded images
in Cloud Storage that contains the customer's unique ID.
D. Store image files in a Google Cloud Storage bucket. Use Google Cloud Datastore to maintain
metadata that maps each customer's ID and their image files.
Answer: D

NO.5 You are helping the QA team to roll out a new load-testing tool to test the scalability of your
primary cloud services that run on Google Compute Engine with Cloud Bigtable. Which three
requirements should they include? Choose 3 answers
A. Instrument the production services to record every transaction for replay by the load-testing tool.
B. Ensure all third-party systems your services use are capable of handling high load.
C. Ensure that the load tests validate the performance of Cloud Bigtable.
D. Schedule the load-testing tool to regularly run against the production environment.
E. Create a separate Google Cloud project to use for the load-testing environment.
F. Instrument the load-testing tool and the target services with detailed logging and metrics
collection.
Answer: C, E, F

NO.6 For this question, refer to the EHR Healthcare case study. You need to define the technical
architecture for hybrid connectivity between EHR's on-premises systems and Google Cloud. You want
to follow Google's recommended practices for production-level applications. Considering the EHR
Healthcare business and technical requirements, what should you do?
A. Configure two Partner Interconnect connections in one metro (City), and make sure the
Interconnect connections are placed in different metro zones.
B. Configure Direct Peering between EHR Healthcare and Google Cloud, and make sure you are
peering at least two Google locations.
C. Configure two VPN connections from on-premises to Google Cloud, and make sure the VPN
devices on-premises are in separate racks.
D. Configure two Dedicated Interconnect connections in one metro (City) and two connections in
another metro, and make sure the Interconnect connections are placed in different metro zones.
Answer: D


Explanation:
Based on the requirement for a secure, high-performance connection between on-premises systems
and Google Cloud:
https://cloud.google.com/network-connectivity/docs/interconnect/tutorials/partner-creating-9999-availability
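
Once the Dedicated Interconnects themselves are ordered, provisioning one of the four attachments
might look roughly like this (VPC, router, and interconnect names are hypothetical):

    gcloud compute routers create edge-router-1 \
        --network=ehr-vpc --region=us-central1 --asn=65001
    gcloud compute interconnects attachments dedicated create attach-metro1-zone1 \
        --interconnect=ehr-interconnect-1 --router=edge-router-1 --region=us-central1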

NO.7 You have been asked to select the storage system for the click-data of your company's large
portfolio of websites. This data is streamed in from a custom website analytics package at a typical
rate of 6,000 clicks per minute, with bursts of up to 8,500 clicks per second. It must be stored for
future analysis by your data science and user experience teams. Which storage infrastructure should
you choose?
A. Google Cloud Datastore
B. Google Cloud Storage
C. Google Cloud SQL
D. Google Cloud Bigtable
Answer: B
Explanation:
https://cloud.google.com/bigquery/docs/loading-data-cloud-storage

NO.8 Your company captures all web traffic data in Google Analytics 360 and stores it in BigQuery.
Each country has its own dataset. Each dataset has multiple tables. You want analysts from each
country to be able to see and query only the data for their respective countries.
How should you configure the access rights?
A. Create a group per country. Add analysts to their respective country-groups. Create a single group
'all_analysts', and add all country-groups as members. Grant the 'all_analysts' group the IAM role of
BigQuery jobUser. Share the appropriate dataset with view access with each respective analyst
country-group.
B. Create a group per country. Add analysts to their respective country-groups. Create a single group
'all_analysts', and add all country-groups as members. Grant the 'all_analysts' group the IAM role of
BigQuery dataViewer. Share the appropriate dataset with view access with each respective analyst
country-group.
C. Create a group per country. Add analysts to their respective country-groups. Create a single group
'all_analysts', and add all country-groups as members. Grant the 'all_analysts' group the IAM role of
BigQuery dataViewer. Share the appropriate table with view access with each respective analyst
country-group.
D. Create a group per country. Add analysts to their respective country-groups. Create a single group
'all_analysts', and add all country-groups as members. Grant the 'all_analysts' group the IAM role of
BigQuery jobUser. Share the appropriate tables with view access with each respective analyst
country-group.
Answer: A
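
Mechanically, the grants in option A could be wired up roughly as follows (project, dataset, and group
names are hypothetical); note that dataset-level view access is edited through the dataset's access
list rather than project IAM:

    # Project-level: let every analyst run query jobs
    gcloud projects add-iam-policy-binding example-project \
        --member=group:all-analysts@example.com --role=roles/bigquery.jobUser

    # Dataset-level: give one country group read access to its own dataset
    bq show --format=prettyjson example-project:analytics_us > ds.json
    # ...edit ds.json, adding {"role": "READER", "groupByEmail": "us-analysts@example.com"} to "access"...
    bq update --source ds.json example-project:analytics_us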

NO.9 For this question, refer to the EHR Healthcare case study. EHR has a single Dedicated
Interconnect connection between their primary data center and Google's network. This connection
satisfies EHR's network and security policies:
* On-premises servers without public IP addresses need to connect to cloud resources without
public IP addresses
* Traffic flows from production network mgmt. servers to Compute Engine virtual machines should
never traverse the public internet.
You need to upgrade the EHR connection to comply with their requirements. The new connection
design must support business critical needs and meet the same network and security policy
requirements. What should you do?
A. Add three new Cloud VPN connections
B. Add a new Carrier Peering connection
C. Add a new Dedicated Interconnect connection
D. Upgrade the bandwidth on the Dedicated Interconnect connection to 100 G
Answer: C
Explanation:
The case does not call out throughput being an issue. However, to achieve 99.99% availability you
need four connections across two metros, per Google's recommendations; among the options, only C
adds an additional Dedicated Interconnect connection.
https://cloud.google.com/network-connectivity/docs/interconnect/concepts/dedicated-overview#availability

NO.10 Your company is designing its data lake on Google Cloud and wants to develop different
ingestion pipelines to collect unstructured data from different sources. After the data is stored in
Google Cloud, it will be processed in several data pipelines to build a recommendation engine for end
users on the website. The structure of the data retrieved from the source systems can change at any
time. The data must be stored exactly as it was retrieved for reprocessing purposes in case the data
structure is incompatible with the current processing pipelines. You need to design an architecture to
support the use case after you retrieve the data. What should you do?
A. Store the data in a Cloud Storage bucket. Design the processing pipelines to retrieve the data from
the bucket
B. Send the data through the processing pipeline, and then store the processed data in a BigQuery
table for reprocessing.
C. Send the data through the processing pipeline, and then store the processed data in a Cloud
Storage bucket for reprocessing.
D. Store the data in a BigQuery table. Design the processing pipelines to retrieve the data from the
table.
Answer: A

NO.11 Your team needs to create a Google Kubernetes Engine (GKE) cluster to host a newly built
application that requires access to third-party services on the internet. Your company does not allow
any Compute Engine instance to have a public IP address on Google Cloud. You need to create a
deployment strategy that adheres to these guidelines. What should you do?
A. Create a Compute Engine instance, and install a NAT Proxy on the instance. Configure all
workloads on GKE to pass through this proxy to access third-party services on the Internet
B. Configure the GKE cluster as a private cluster, and configure Cloud NAT Gateway for the cluster
subnet
C. Configure the GKE cluster as a route-based cluster. Configure Private Google Access on the Virtual
Private Cloud (VPC)


D. Configure the GKE cluster as a private cluster. Configure Private Google Access on the Virtual
Private Cloud (VPC)
Answer: B
Explanation:
A Cloud NAT gateway can perform NAT for nodes and Pods in a private cluster, which is a type of
VPC-native cluster. The Cloud NAT gateway must be configured to apply to at least the following
subnet IP address ranges for the subnet that your cluster uses:
Subnet primary IP address range (used by nodes)
Subnet secondary IP address range used for Pods in the cluster
Subnet secondary IP address range used for Services in the cluster
The simplest way to provide NAT for an entire private cluster is to configure a Cloud NAT gateway to
apply to all of the cluster's subnet's IP address ranges.
https://cloud.google.com/nat/docs/overview
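
A minimal sketch of option B, assuming the default VPC and hypothetical names:

    # Private cluster: nodes get no public IP addresses
    gcloud container clusters create example-private-cluster \
        --region=us-central1 --enable-ip-alias --enable-private-nodes \
        --master-ipv4-cidr=172.16.0.0/28

    # Cloud NAT so nodes and Pods can still reach third-party services
    gcloud compute routers create nat-router --network=default --region=us-central1
    gcloud compute routers nats create nat-config \
        --router=nat-router --region=us-central1 \
        --auto-allocate-nat-external-ips --nat-all-subnet-ip-ranges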

NO.12 You are implementing Firestore for Mountkirk Games. Mountkirk Games wants to give a new
game programmatic access to a legacy game's Firestore database. Access should be as restricted as
possible. What should you do?
A. Create a service account (SA) in the legacy game's Google Cloud project, add a second SA in the
new game's IAM page, and then give the Organization Admin role to both SAs
B. Create a service account (SA) in the legacy game's Google Cloud project, give the SA the
Organization Admin role, and then give it the Firebase Admin role in both projects
C. Create a service account (SA) in the legacy game's Google Cloud project, add this SA in the new
game's IAM page, and then give it the Firebase Admin role in both projects
D. Create a service account (SA) in the legacy game's Google Cloud project, give it the Firebase Admin
role, and then migrate the new game to the legacy game's project.
Answer: A

NO.13 Your company recently acquired a company that has infrastructure in Google Cloud. Each
company has its own Google Cloud organization. Each company is using a Shared Virtual Private Cloud
(VPC) to provide network connectivity for its applications. Some of the subnets used by both
companies overlap. In order for both businesses to integrate, the applications need to have private
network connectivity. These applications are not on overlapping subnets. You want to provide
connectivity with minimal re-engineering. What should you do?
A. Set up VPC peering and peer each Shared VPC together
B. Configure SSH port forwarding on each application to provide connectivity between applications in
the different Shared VPCs
C. Set up a Cloud VPN gateway in each Shared VPC and peer Cloud VPNs
D. Migrate the projects from the acquired company into your company's Google Cloud organization.
Relaunch the instances in your company's Shared VPC
Answer: B

NO.14 You need to ensure reliability for your application and operations by supporting reliable task
scheduling for compute on GCP. Leveraging Google best practices, what should you do?
A. Using the Cron service provided by App Engine, publish messages to a Cloud Pub/Sub topic.
Subscribe to that topic using a message-processing utility service running on Compute Engine
instances.
B. Using the Cron service provided by GKE, publish messages to a Cloud Pub/Sub topic. Subscribe to
that topic using a message-processing utility service running on Compute Engine instances.
C. Using the Cron service provided by App Engine, publish messages directly to a message-
processing utility service running on Compute Engine instances.
D. Using the Cron service provided by Google Kubernetes Engine (GKE), publish messages directly to
a message-processing utility service running on Compute Engine instances.
Answer: A
Explanation:
https://cloud.google.com/solutions/reliable-task-scheduling-compute-engine
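
The pattern in that tutorial boils down to a cron.yaml whose endpoint publishes to Pub/Sub; a sketch
with hypothetical names (the cron.yaml contents are shown as comments):

    # cron.yaml:
    # cron:
    # - description: "enqueue scheduled work"
    #   url: /publish/tasks
    #   schedule: every 5 minutes

    gcloud pubsub topics create scheduled-tasks
    gcloud pubsub subscriptions create worker-sub --topic=scheduled-tasks
    gcloud app deploy cron.yaml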

NO.15 You have developed an application using Cloud ML Engine that recognizes famous paintings
from uploaded images. You want to test the application and allow specific people to upload images
for the next 24 hours. Not all users have a Google Account. How should you have users upload
images?
A. Create an App Engine web application where users can upload images. Configure App Engine to
disable the application after 24 hours. Authenticate users via Cloud Identity.
B. Have users upload the images to Cloud Storage. Protect the bucket with a password that expires
after 24 hours.
C. Have users upload the images to Cloud Storage using a signed URL that expires after 24 hours.
D. Create an App Engine web application where users can upload images for the next 24 hours.
Authenticate users via Cloud Identity.
Answer: C
Explanation:
https://cloud.google.com/blog/products/storage-data-transfer/uploading-images-directly-to-cloud-storage-by-using-signed-url
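
A signed upload URL can be minted with gsutil (bucket, object, and key-file paths are hypothetical;
signing requires a service account key):

    gsutil signurl -m PUT -d 24h -c image/jpeg \
        /path/to/service-account-key.json gs://example-uploads/user123/photo.jpg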

NO.16 Your company is using Google Cloud. You have two folders under the Organization: Finance
and Shopping. The members of the development team are in a Google Group. The development
team group has been assigned the Project Owner role on the Organization. You want to prevent the
development team from creating resources in projects in the Finance folder. What should you do?
A. Assign the development team group the Project Viewer role on the Finance folder, and assign the
development team group the Project Owner role on the Shopping folder.
B. Assign the development team group only the Project Owner role on the Shopping folder.
C. Assign the development team group the Project Owner role on the Shopping folder, and remove
the development team group Project Owner role from the Organization.
D. Assign the development team group only the Project Viewer role on the Finance folder.
Answer: C
Explanation:
https://cloud.google.com/resource-manager/docs/cloud-platform-resource-hierarchy
"Roles are always inherited, and there is no way to explicitly remove a permission for a lower-level
resource that is granted at a higher level in the resource hierarchy. Given the above example, even if
you were to remove the Project Editor role from Bob on the "Test GCP Project", he would still inherit
that role from the "Dept Y" folder, so he would still have the permissions for that role on "Test GCP
Project"."

NO.17 Your company has a stateless web API that performs scientific calculations. The web API runs
on a single Google Kubernetes Engine (GKE) cluster. The cluster is currently deployed in us-central1.
Your company has expanded to offer your API to customers in Asia. You want to reduce the latency
for the users in Asia. What should you do?
A. Increase the memory and CPU allocated to the application in the cluster
B. Use a global HTTP(S) load balancer with Cloud CDN enabled
C. Create a second GKE cluster in asia-southeast1, and use kubemci to create a global HTTP(S) load
balancer
D. Create a second GKE cluster in asia-southeast1, and expose both APIs using a Service of type
LoadBalancer. Add the public IPs to the Cloud DNS zone
Answer: C
Explanation:
https://cloud.google.com/kubernetes-engine/docs/concepts/multi-cluster-ingress#how_works
https://github.com/GoogleCloudPlatform/k8s-multicluster-ingress
https://cloud.google.com/blog/products/gcp/how-to-deploy-geographically-distributed-services-on-kubernetes-engine-with-kubemci
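
Following the kubemci README linked above, registering both clusters behind one global load
balancer looks roughly like this (the ingress spec, project, and kubeconfig are hypothetical):

    kubemci create example-mci \
        --ingress=ingress.yaml \
        --gcp-project=example-project \
        --kubeconfig=clusters-kubeconfig.yaml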

NO.18 You need to design a solution for global load balancing based on the URL path being
requested. You need to ensure operations reliability and end-to-end in-transit encryption based on
Google best practices.
What should you do?
A. Create an HTTPS load balancer with URL maps.
B. Create a cross-region load balancer with URL Maps.
C. Create appropriate instance groups and instances. Configure SSL proxy load balancing.
D. Create a global forwarding rule. Configure SSL proxy balancing.
Answer: A
Explanation:
Reference https://cloud.google.com/load-balancing/docs/https/url-map
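
The URL-map wiring behind option A, sketched with hypothetical backend services and certificate
names:

    gcloud compute url-maps create example-map --default-service=web-backend
    gcloud compute url-maps add-path-matcher example-map \
        --path-matcher-name=api-paths --default-service=web-backend \
        --path-rules="/api/*=api-backend"
    gcloud compute target-https-proxies create example-https-proxy \
        --url-map=example-map --ssl-certificates=example-cert
    gcloud compute forwarding-rules create example-https-rule \
        --global --target-https-proxy=example-https-proxy --ports=443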

NO.19 Your team will start developing a new application using microservices architecture on
Kubernetes Engine. As part of the development lifecycle, any code change that has been pushed to
the remote develop branch on your GitHub repository should be built and tested automatically.
When the build and test are successful, the relevant microservice will be deployed automatically in
the development environment. You want to ensure that all code deployed in the development
environment follows this process. What should you do?
A. Install a post-commit hook on the remote git repository that tests the code and builds the
container when code is pushed to the development branch. After a successful commit, have the
developer deploy the newly built container image on the development cluster.
B. Create a Cloud Build trigger based on the development branch that tests the code, builds the
container, and stores it in Container Registry. Create a deployment pipeline that watches for new
images and deploys the new image on the development cluster. Ensure only the deployment tool has
access to deploy new versions.


C. Have each developer install a pre-commit hook on their workstation that tests the code and builds
the container when committing on the development branch. After a successful commit, have the
developer deploy the newly built container image on the development cluster.
D. Create a Cloud Build trigger based on the development branch to build a new container image and
store it in Container Registry. Rely on Vulnerability Scanning to ensure the code tests succeed. As the
final step of the Cloud Build process, deploy the new container image on the development cluster.
Ensure only Cloud Build has access to deploy new versions.
Answer: B
Explanation:
https://cloud.google.com/container-registry/docs/overview
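
One way to realize option B (repo, owner, and image names are hypothetical; GitHub trigger creation
may require the beta track depending on your gcloud version):

    gcloud beta builds triggers create github \
        --repo-name=example-repo --repo-owner=example-org \
        --branch-pattern="^develop$" \
        --build-config=cloudbuild.yaml

    # cloudbuild.yaml then tests, builds, and pushes the image, e.g.:
    # steps:
    # - name: 'gcr.io/cloud-builders/docker'
    #   args: ['build', '-t', 'gcr.io/$PROJECT_ID/example-svc:$SHORT_SHA', '.']
    # images: ['gcr.io/$PROJECT_ID/example-svc:$SHORT_SHA']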

NO.20 For this question, refer to the EHR Healthcare case study. You are responsible for designing
the Google Cloud network architecture for Google Kubernetes Engine. You want to follow Google
best practices. Considering the EHR Healthcare business and technical requirements, what should
you do to reduce the attack surface?
A. Use a private cluster with a public endpoint with master authorized networks configured.
B. Use a private cluster with a private endpoint with master authorized networks configured.
C. Use a public cluster with master authorized networks enabled and firewall rules.
D. Use a public cluster with firewall rules and Virtual Private Cloud (VPC) routes.
Answer: B
Explanation:
https://cloud.google.com/kubernetes-engine/docs/concepts/private-cluster-concept#overview
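
Option B in gcloud form (the CIDR ranges and the on-premises management range are hypothetical;
with a private endpoint, authorized networks must be internal addresses):

    gcloud container clusters create ehr-cluster \
        --region=us-east1 --enable-ip-alias \
        --enable-private-nodes --enable-private-endpoint \
        --master-ipv4-cidr=172.16.0.32/28 \
        --enable-master-authorized-networks \
        --master-authorized-networks=10.0.0.0/8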

NO.21 For this question, refer to the Dress4Win case study. Which of the compute services should
be migrated as-is and would still be an optimized architecture for performance in the cloud?
A. Web applications deployed using App Engine standard environment
B. RabbitMQ deployed using an unmanaged instance group
C. Jenkins, monitoring, bastion hosts, security scanners services deployed on custom machine types
D. Hadoop/Spark deployed using Cloud Dataproc Regional in High Availability mode
Answer: D

NO.22 Your company is forecasting a sharp increase in the number and size of Apache Spark and
Hadoop jobs being run in your local datacenter. You want to utilize the cloud to help you scale this
upcoming demand with the least amount of operations work and code change. Which product should
you use?
A. Google Cloud Dataproc
B. Google Cloud Dataflow
C. Google Container Engine
D. Google Compute Engine
Answer: A


Explanation:
Google Cloud Dataproc is a fast, easy-to-use, low-cost and fully managed service that lets you run the
Apache Spark and Apache Hadoop ecosystem on Google Cloud Platform. Cloud Dataproc provisions
big or small clusters rapidly, supports many popular job types, and is integrated with other Google
Cloud Platform services, such as Google Cloud Storage and Stackdriver Logging, thus helping you
reduce TCO.
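
Migrating as-is mostly means creating a cluster and resubmitting the same jobs; a sketch with
hypothetical names:

    gcloud dataproc clusters create example-spark-cluster \
        --region=us-central1 --num-workers=4
    gcloud dataproc jobs submit spark \
        --cluster=example-spark-cluster --region=us-central1 \
        --class=org.apache.spark.examples.SparkPi \
        --jars=file:///usr/lib/spark/examples/jars/spark-examples.jar -- 1000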
