
Associate-Cloud-Engineer Exam Dumps - Google Cloud Certified Questions and Answers

Question # 74

You have a Compute Engine instance hosting an application used between 9 AM and 6 PM on weekdays. You want to back up this instance daily for disaster recovery purposes. You want to keep the backups for 30 days. You want the Google-recommended solution with the least management overhead and the least number of services. What should you do?

Options:

A.

1. Update your instances’ metadata to add the following value: snapshot-schedule: 0 1 * * *

2. Update your instances’ metadata to add the following value: snapshot-retention: 30

B.

1. In the Cloud Console, go to the Compute Engine Disks page and select your instance’s disk.

2. In the Snapshot Schedule section, select Create Schedule and configure the following parameters:

- Schedule frequency: Daily

- Start time: 1:00 AM – 2:00 AM

- Auto-delete snapshots after 30 days

C.

1. Create a Cloud Function that creates a snapshot of your instance’s disk.

2. Create a Cloud Function that deletes snapshots that are older than 30 days.

3. Use Cloud Scheduler to trigger both Cloud Functions daily at 1:00 AM.

D.

1. Create a bash script in the instance that copies the content of the disk to Cloud Storage.

2. Create a bash script in the instance that deletes data older than 30 days in the backup Cloud Storage bucket.

3. Configure the instance’s crontab to execute these scripts daily at 1:00 AM.
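For reference, the snapshot schedule described in option B can also be created from the command line. This is a minimal sketch; the schedule name, disk name, region, and zone below are placeholders, not values from the question.

# Create a daily snapshot schedule that starts at 1:00 AM UTC and keeps snapshots for 30 days
gcloud compute resource-policies create snapshot-schedule backup-schedule \
    --region=us-central1 \
    --daily-schedule \
    --start-time=01:00 \
    --max-retention-days=30

# Attach the schedule to the instance's disk
gcloud compute disks add-resource-policies my-disk \
    --zone=us-central1-a \
    --resource-policies=backup-schedule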

Question # 75

You have an object in a Cloud Storage bucket that you want to share with an external company. The object contains sensitive data. You want access to the content to be removed after four hours. The external company does not have a Google account to which you can grant specific user-based access privileges. You want to use the most secure method that requires the fewest steps. What should you do?

Options:

A.

Create a signed URL with a four-hour expiration and share the URL with the company.

B.

Set object access to ‘public’ and use object lifecycle management to remove the object after four hours.

C.

Configure the storage bucket as a static website and furnish the object’s URL to the company. Delete the object from the storage bucket after four hours.

D.

Create a new Cloud Storage bucket specifically for the external company to access. Copy the object to that bucket. Delete the bucket after four hours have passed.
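As an illustration of option A, a time-limited signed URL can be generated with gsutil. The bucket, object, and key file names below are hypothetical; the key file is assumed to belong to a service account with read access to the object.

# Generate a signed URL that expires four hours after creation
gsutil signurl -d 4h key.json gs://my-sensitive-bucket/report.pdf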

Question # 76

Your company uses a large number of Google Cloud services centralized in a single project. All teams have specific projects for testing and development. The DevOps team needs access to all of the production services in order to perform their job. You want to prevent Google Cloud product changes from broadening their permissions in the future. You want to follow Google-recommended practices. What should you do?

Options:

A.

Grant all members of the DevOps team the role of Project Editor on the organization level.

B.

Grant all members of the DevOps team the role of Project Editor on the production project.

C.

Create a custom role that combines the required permissions. Grant the DevOps team the custom role on the production project.

D.

Create a custom role that combines the required permissions. Grant the DevOps team the custom role on the organization level.
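To illustrate the custom-role approach, a role can be defined and bound with gcloud as sketched below. The role ID, project ID, group address, and permission list are placeholders for whatever the DevOps team actually requires.

# Create a custom role containing only the permissions the DevOps team needs
gcloud iam roles create devOpsProd \
    --project=prod-project-id \
    --title="DevOps Production" \
    --permissions=compute.instances.get,compute.instances.start,compute.instances.stop

# Grant the custom role to the DevOps group on the production project
gcloud projects add-iam-policy-binding prod-project-id \
    --member="group:devops-team@example.com" \
    --role="projects/prod-project-id/roles/devOpsProd"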

Question # 77

You are working with a user to set up an application in a new VPC behind a firewall. The user is concerned about data egress. You want to configure the fewest open egress ports. What should you do?

Options:

A.

Set up a low-priority (65534) rule that blocks all egress and a high-priority rule (1000) that allows only the appropriate ports.

B.

Set up a high-priority (1000) rule that pairs both ingress and egress ports.

C.

Set up a high-priority (1000) rule that blocks all egress and a low-priority (65534) rule that allows only the appropriate ports.

D.

Set up a high-priority (1000) rule to allow the appropriate ports.
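For context, the deny-by-default pattern in option A maps to two VPC firewall rules. The network name and allowed port below are placeholders.

# Low-priority rule: deny all egress traffic by default
gcloud compute firewall-rules create deny-all-egress \
    --network=app-vpc \
    --direction=EGRESS \
    --action=DENY \
    --rules=all \
    --destination-ranges=0.0.0.0/0 \
    --priority=65534

# Higher-priority rule: allow only the egress ports the application needs
gcloud compute firewall-rules create allow-app-egress \
    --network=app-vpc \
    --direction=EGRESS \
    --action=ALLOW \
    --rules=tcp:443 \
    --destination-ranges=0.0.0.0/0 \
    --priority=1000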

Question # 78

For analysis purposes, you need to send all the logs from all of your Compute Engine instances to a BigQuery dataset called platform-logs. You have already installed the Stackdriver Logging agent on all the instances. You want to minimize cost. What should you do?

Options:

A.

1. Give the BigQuery Data Editor role on the platform-logs dataset to the service accounts used by your instances.

2. Update your instances’ metadata to add the following value: logs-destination: bq://platform-logs.

B.

1. In Stackdriver Logging, create a logs export with a Cloud Pub/Sub topic called logs as a sink.

2. Create a Cloud Function that is triggered by messages in the logs topic.

3. Configure that Cloud Function to drop logs that are not from Compute Engine and to insert Compute Engine logs in the platform-logs dataset.

C.

1. In Stackdriver Logging, create a filter to view only Compute Engine logs.

2. Click Create Export.

3. Choose BigQuery as Sink Service, and the platform-logs dataset as Sink Destination.

D.

1. Create a Cloud Function that has the BigQuery User role on the platform-logs dataset.

2. Configure this Cloud Function to create a BigQuery Job that executes this query:

INSERT INTO dataset.platform-logs (timestamp, log)
SELECT timestamp, log FROM compute.logs
WHERE timestamp > DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)

3. Use Cloud Scheduler to trigger this Cloud Function once a day.
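For reference, the export described in option C corresponds to a Logging sink created on the command line. The project ID below is a placeholder; the dataset name is written with an underscore here because BigQuery dataset IDs do not allow hyphens, and the sink’s writer identity needs BigQuery Data Editor on the dataset.

# Create a sink that routes only Compute Engine logs to the BigQuery dataset
gcloud logging sinks create platform-logs-sink \
    bigquery.googleapis.com/projects/my-project/datasets/platform_logs \
    --log-filter='resource.type="gce_instance"'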

Question # 79

Your company is moving its continuous integration and delivery (CI/CD) pipeline to Compute Engine instances. The pipeline will manage the entire cloud infrastructure through code. How can you ensure that the pipeline has appropriate permissions while your system is following security best practices?

Options:

A.

• Add a step for human approval to the CI/CD pipeline before the execution of the infrastructure provisioning.

• Use the human approver’s IAM account for the provisioning.

B.

• Attach a single service account to the compute instances.

• Add minimal rights to the service account.

• Allow the service account to impersonate a Cloud Identity user with elevated permissions to create, update, or delete resources.

C.

• Attach a single service account to the compute instances.

• Add all required Identity and Access Management (IAM) permissions to this service account to create, update, or delete resources.

D.

• Create multiple service accounts, one for each pipeline, with the appropriate minimal Identity and Access Management (IAM) permissions.

• Use a secret manager service to store the key files of the service accounts.

• Allow the CI/CD pipeline to request the appropriate secrets during the execution of the pipeline.
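For reference, a dedicated service account can be created, granted a narrowly scoped role, and attached to a Compute Engine instance as sketched below. The account name, project ID, example role, zone, and instance name are placeholders, not part of the question.

# Create a dedicated service account for the pipeline
gcloud iam service-accounts create cicd-pipeline \
    --display-name="CI/CD pipeline"

# Grant it only the role(s) it actually needs (example role shown)
gcloud projects add-iam-policy-binding my-project \
    --member="serviceAccount:cicd-pipeline@my-project.iam.gserviceaccount.com" \
    --role="roles/compute.instanceAdmin.v1"

# Attach the service account to the instance that runs the pipeline
gcloud compute instances create cicd-runner \
    --zone=us-central1-a \
    --service-account=cicd-pipeline@my-project.iam.gserviceaccount.com \
    --scopes=cloud-platform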

Question # 80

The core business of your company is to rent out construction equipment at a large scale. All the equipment that is being rented out has been equipped with multiple sensors that send event information every few seconds. These signals can vary from engine status, distance traveled, fuel level, and more. Customers are billed based on the consumption monitored by these sensors. You expect high throughput – up to thousands of events per hour per device – and need to retrieve consistent data based on the time of the event. Storing and retrieving individual signals should be atomic. What should you do?

Options:

A.

Create a file in Cloud Storage per device and append new data to that file.

B.

Create a file in Cloud Filestore per device and append new data to that file.

C.

Ingest the data into Datastore. Store data in an entity group based on the device.

D.

Ingest the data into Cloud Bigtable. Create a row key based on the event timestamp.
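For reference, the Bigtable option can be sketched with the cbt CLI. The instance, table, column family, and row-key layout below are illustrative placeholders only.

# Create a table and a column family for the sensor events
cbt -instance=sensor-instance createtable device_signals
cbt -instance=sensor-instance createfamily device_signals metrics

# Write one event; the example row key combines device ID and event timestamp
cbt -instance=sensor-instance set device_signals device123#2025-04-25T10:00:00Z \
    metrics:fuel_level=72 metrics:engine_status=running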

Question # 81

You want to host your video encoding software on Compute Engine. Your user base is growing rapidly, and users need to be able to encode their videos at any time without interruption or CPU limitations. You must ensure that your encoding solution is highly available, and you want to follow Google-recommended practices to automate operations. What should you do?

Options:

A.

Deploy your solution on multiple standalone Compute Engine instances, and increase the number of existing instances when CPU utilization on Cloud Monitoring reaches a certain threshold.

B.

Deploy your solution on multiple standalone Compute Engine instances, and replace existing instances with high-CPU instances when CPU utilization on Cloud Monitoring reaches a certain threshold.

C.

Deploy your solution to an instance group, and increase the number of available instances whenever you see high CPU utilization in Cloud Monitoring.

D.

Deploy your solution to an instance group, and set the autoscaling based on CPU utilization.
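For reference, CPU-based autoscaling on a managed instance group (as in option D) can be configured as follows; the group name, zone, template, and thresholds are placeholders.

# Create a managed instance group from an existing instance template
gcloud compute instance-groups managed create encoder-group \
    --zone=us-central1-a \
    --template=encoder-template \
    --size=2

# Autoscale the group on CPU utilization
gcloud compute instance-groups managed set-autoscaling encoder-group \
    --zone=us-central1-a \
    --min-num-replicas=2 \
    --max-num-replicas=20 \
    --target-cpu-utilization=0.60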

Question # 82

You need to set up a policy so that videos stored in a specific Cloud Storage Regional bucket are moved to Coldline after 90 days, and then deleted after one year from their creation. How should you set up the policy?

Options:

A.

Use Cloud Storage Object Lifecycle Management using Age conditions with SetStorageClass and Delete actions. Set the SetStorageClass action to 90 days and the Delete action to 275 days (365 – 90)

B.

Use Cloud Storage Object Lifecycle Management using Age conditions with SetStorageClass and Delete actions. Set the SetStorageClass action to 90 days and the Delete action to 365 days.

C.

Use gsutil rewrite and set the Delete action to 275 days (365-90).

D.

Use gsutil rewrite and set the Delete action to 365 days.
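For reference, an Object Lifecycle Management policy of the kind described in options A and B is written as a JSON configuration and applied with gsutil. The bucket name and the age thresholds shown below are illustrative placeholders; lifecycle Age conditions count days since object creation.

# Write a lifecycle policy: move to Coldline at age 90 days, delete at age 365 days
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
      "condition": {"age": 90}
    },
    {
      "action": {"type": "Delete"},
      "condition": {"age": 365}
    }
  ]
}
EOF

# Apply the policy to the bucket
gsutil lifecycle set lifecycle.json gs://my-video-bucket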

Question # 83

Your coworker has helped you set up several configurations for gcloud. You've noticed that you're running commands against the wrong project. Being new to the company, you haven't yet memorized any of the projects. With the fewest steps possible, what's the fastest way to switch to the correct configuration?

Options:

A.

Run gcloud configurations list followed by gcloud configurations activate .

B.

Run gcloud config list followed by gcloud config activate.

C.

Run gcloud config configurations list followed by gcloud config configurations activate.

D.

Re-authenticate with the gcloud auth login command and select the correct configurations on login.
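For reference, named gcloud configurations are managed under the gcloud config configurations command group; the configuration name below is a placeholder.

# List all named configurations and see which one is active
gcloud config configurations list

# Switch to the configuration that points at the correct project
gcloud config configurations activate my-prod-config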

Exam Name: Google Cloud Certified - Associate Cloud Engineer
Last Update: Apr 25, 2025
Questions: 314