
Associate-Data-Practitioner Exam Dumps - Google Cloud Platform Questions and Answers

Question # 4

You have created a LookML model and dashboard that shows daily sales metrics for five regional managers to use. You want to ensure that the regional managers can only see sales metrics specific to their region. You need an easy-to-implement solution. What should you do?

Options:

A.

Create a sales_region user attribute, and assign each manager's region as the value of their user attribute. Add an access_filter Explore filter on the region_name dimension by using the sales_region user attribute.

B.

Create five different Explores with the sql_always_filter Explore filter applied on the region_name dimension. Set each region_name value to the corresponding region for each manager.

C.

Create separate Looker dashboards for each regional manager. Set the default dashboard filter to the corresponding region for each manager.

D.

Create separate Looker instances for each regional manager. Copy the LookML model and dashboard to each instance. Provision viewer access to the corresponding manager.

Question # 5

Your company is building a near real-time streaming pipeline to process JSON telemetry data from small appliances. You need to process messages arriving at a Pub/Sub topic, capitalize letters in the serial number field, and write results to BigQuery. You want to use a managed service and write a minimal amount of code for underlying transformations. What should you do?

Options:

A.

Use a Pub/Sub to BigQuery subscription, write results directly to BigQuery, and schedule a transformation query to run every five minutes.

B.

Use a Pub/Sub to Cloud Storage subscription, write a Cloud Run service that is triggered when objects arrive in the bucket, performs the transformations, and writes the results to BigQuery.

C.

Use the “Pub/Sub to BigQuery” Dataflow template with a UDF, and write the results to BigQuery.

D.

Use a Pub/Sub push subscription, write a Cloud Run service that accepts the messages, performs the transformations, and writes the results to BigQuery.
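For context on the kind of transformation this question describes, here is a minimal Apache Beam (Python) sketch that reads from Pub/Sub, upper-cases a serial number field, and writes to BigQuery. This is not the Google-provided template itself (the template takes a JavaScript UDF); the topic, table, and field names below are illustrative assumptions.

```python
# Minimal Apache Beam sketch (Python), for illustration only.
# The Google-provided "Pub/Sub to BigQuery" template performs the same
# read/transform/write steps but accepts a JavaScript UDF instead of this code.
# Topic, table, and field names are assumptions, not from the question.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def capitalize_serial(message: bytes) -> dict:
    """Parse a JSON telemetry message and upper-case its serial number field."""
    record = json.loads(message.decode("utf-8"))
    record["serial_number"] = record["serial_number"].upper()
    return record


options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
            topic="projects/my-project/topics/telemetry")
        | "CapitalizeSerial" >> beam.Map(capitalize_serial)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "my-project:telemetry.appliances",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,  # table assumed to exist
        )
    )
```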

Question # 6

You are responsible for managing Cloud Storage buckets for a research company. Your company has well-defined data tiering and retention rules. You need to optimize storage costs while achieving your data retention needs. What should you do?

Options:

A.

Configure the buckets to use the Archive storage class.

B.

Configure a lifecycle management policy on each bucket to downgrade the storage class and remove objects based on age.

C.

Configure the buckets to use the Standard storage class and enable Object Versioning.

D.

Configure the buckets to use the Autoclass feature.
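As background on what a lifecycle management policy looks like in practice, here is a minimal sketch using the google-cloud-storage Python client. The bucket name and age thresholds are illustrative assumptions, not values from the question.

```python
# Minimal sketch of a Cloud Storage lifecycle management policy, applied
# with the google-cloud-storage Python client. Bucket name and age
# thresholds are illustrative assumptions.
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("research-data-bucket")

# Downgrade objects to colder storage classes as they age, then delete them.
bucket.add_lifecycle_set_storage_class_rule("NEARLINE", age=30)
bucket.add_lifecycle_set_storage_class_rule("COLDLINE", age=90)
bucket.add_lifecycle_set_storage_class_rule("ARCHIVE", age=365)
bucket.add_lifecycle_delete_rule(age=365 * 7)

bucket.patch()  # Apply the updated lifecycle configuration to the bucket.
```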

Question # 7

You have a Cloud SQL for PostgreSQL database that stores sensitive historical financial data. You need to ensure that the data is uncorrupted and recoverable in the event that the primary region is destroyed. The data is valuable, so you need to prioritize recovery point objective (RPO) over recovery time objective (RTO). You want to recommend a solution that minimizes latency for primary read and write operations. What should you do?

Options:

A.

Configure the Cloud SQL for PostgreSQL instance for multi-region backup locations.

B.

Configure the Cloud SQL for PostgreSQL instance for regional availability (HA). Back up the Cloud SQL for PostgreSQL database hourly to a Cloud Storage bucket in a different region.

C.

Configure the Cloud SQL for PostgreSQL instance for regional availability (HA) with synchronous replication to a secondary instance in a different zone.

D.

Configure the Cloud SQL for PostgreSQL instance for regional availability (HA) with asynchronous replication to a secondary instance in a different region.

Question # 8

Your team is building several data pipelines that contain a collection of complex tasks and dependencies that you want to execute on a schedule, in a specific order. The tasks and dependencies consist of files in Cloud Storage, Apache Spark jobs, and data in BigQuery. You need to design a system that can schedule and automate these data processing tasks using a fully managed approach. What should you do?

Options:

A.

Use Cloud Scheduler to schedule the jobs to run.

B.

Use Cloud Tasks to schedule and run the jobs asynchronously.

C.

Create directed acyclic graphs (DAGs) in Cloud Composer. Use the appropriate operators to connect to Cloud Storage, Spark, and BigQuery.

D.

Create directed acyclic graphs (DAGs) in Apache Airflow deployed on Google Kubernetes Engine. Use the appropriate operators to connect to Cloud Storage, Spark, and BigQuery.
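For a sense of what orchestrating tasks like these as a DAG looks like, here is a minimal Apache Airflow sketch of the kind Cloud Composer runs. The project, bucket, cluster, and table names, and the specific operators chosen, are illustrative assumptions.

```python
# Minimal Airflow DAG sketch of the kind Cloud Composer runs.
# Project, bucket, cluster, and table names are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="daily_processing",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Run the team's Spark transformation on Dataproc.
    spark_job = DataprocSubmitJobOperator(
        task_id="run_spark_transform",
        project_id="my-project",
        region="us-central1",
        job={
            "placement": {"cluster_name": "etl-cluster"},
            "pyspark_job": {"main_python_file_uri": "gs://my-bucket/jobs/transform.py"},
        },
    )

    # Load the transformed files from Cloud Storage into BigQuery.
    load_to_bq = GCSToBigQueryOperator(
        task_id="load_to_bigquery",
        bucket="my-bucket",
        source_objects=["output/*.parquet"],
        source_format="PARQUET",
        destination_project_dataset_table="my-project.analytics.daily_metrics",
        write_disposition="WRITE_TRUNCATE",
    )

    spark_job >> load_to_bq  # Dependency: load runs only after the Spark job succeeds.
```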

Question # 9

Your organization needs to implement near real-time analytics for thousands of events arriving each second in Pub/Sub. The incoming messages require transformations. You need to configure a pipeline that processes, transforms, and loads the data into BigQuery while minimizing development time. What should you do?

Options:

A.

Use a Google-provided Dataflow template to process the Pub/Sub messages, perform transformations, and write the results to BigQuery.

B.

Create a Cloud Data Fusion instance and configure Pub/Sub as a source. Use Data Fusion to process the Pub/Sub messages, perform transformations, and write the results to BigQuery.

C.

Load the data from Pub/Sub into Cloud Storage using a Cloud Storage subscription. Create a Dataproc cluster, use PySpark to perform transformations in Cloud Storage, and write the results to BigQuery.

D.

Use Cloud Run functions to process the Pub/Sub messages, perform transformations, and write the results to BigQuery.

Question # 10

You manage a web application that stores data in a Cloud SQL database. You need to improve the read performance of the application by offloading read traffic from the primary database instance. You want to implement a solution that minimizes effort and cost. What should you do?

Options:

A.

Use Cloud CDN to cache frequently accessed data.

B.

Store frequently accessed data in a Memorystore instance.

C.

Migrate the database to a larger Cloud SQL instance.

D.

Enable automatic backups, and create a read replica of the Cloud SQL instance.

Question # 11

Your retail company wants to predict customer churn using historical purchase data stored in BigQuery. The dataset includes customer demographics, purchase history, and a label indicating whether the customer churned or not. You want to build a machine learning model to identify customers at risk of churning. You need to create and train a logistic regression model for predicting customer churn, using the customer_data table with the churned column as the target label. Which BigQuery ML query should you use?

(Options A–D are presented as BigQuery ML query screenshots in the original exam; the images are not reproduced in this text version.)

Options:

A.

Option A

B.

Option B

C.

Option C

D.

Option D
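Since the option screenshots are not available here, a generic BigQuery ML logistic regression training statement has the following shape, shown submitted with the google-cloud-bigquery Python client. The dataset and model names are illustrative assumptions, not a reproduction of any answer option.

```python
# Generic shape of a BigQuery ML logistic regression training statement,
# submitted with the google-cloud-bigquery Python client. Dataset and model
# names are illustrative assumptions; this is not one of the question's
# answer options reproduced verbatim.
from google.cloud import bigquery

client = bigquery.Client()

train_model_sql = """
CREATE OR REPLACE MODEL `my_dataset.churn_model`
OPTIONS (
  model_type = 'logistic_reg',
  input_label_cols = ['churned']
) AS
SELECT *
FROM `my_dataset.customer_data`
"""

client.query(train_model_sql).result()  # Waits for model training to finish.
```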

Question # 12

You recently inherited a task for managing Dataflow streaming pipelines in your organization and noticed that proper access had not been provisioned to you. You need to request a Google-provided IAM role so you can restart the pipelines. You need to follow the principle of least privilege. What should you do?

Options:

A.

Request the Dataflow Developer role.

B.

Request the Dataflow Viewer role.

C.

Request the Dataflow Worker role.

D.

Request the Dataflow Admin role.

Question # 13

You need to design a data pipeline to process large volumes of raw server log data stored in Cloud Storage. The data needs to be cleaned, transformed, and aggregated before being loaded into BigQuery for analysis. The transformation involves complex data manipulation using Spark scripts that your team developed. You need to implement a solution that leverages your team’s existing skillset, processes data at scale, and minimizes cost. What should you do?

Options:

A.

Use Dataflow with a custom template for the transformation logic.

B.

Use Cloud Data Fusion to visually design and manage the pipeline.

C.

Use Dataform to define the transformations in SQLX.

D.

Use Dataproc to run the transformations on a cluster.
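To illustrate the kind of Spark transformation the question refers to, here is a minimal PySpark sketch of a Dataproc job that reads raw logs from Cloud Storage, aggregates them, and writes to BigQuery through the spark-bigquery connector. The paths, table name, and parsing logic are illustrative assumptions.

```python
# Minimal PySpark sketch of a Dataproc transformation job: read raw logs
# from Cloud Storage, aggregate, and write to BigQuery via the
# spark-bigquery connector. Paths, table, and parsing logic are
# illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("server-log-aggregation").getOrCreate()

# Raw server logs stored as text files in Cloud Storage.
logs = spark.read.text("gs://my-bucket/raw-logs/*.log")

# Example cleanup/aggregation: count log lines that contain an error marker.
daily_errors = (
    logs.filter(F.col("value").contains("ERROR"))
    .withColumn("date", F.current_date())
    .groupBy("date")
    .count()
)

# Write the aggregated results to BigQuery.
(
    daily_errors.write.format("bigquery")
    .option("table", "my-project.analytics.daily_error_counts")
    .option("temporaryGcsBucket", "my-staging-bucket")
    .mode("overwrite")
    .save()
)
```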

Exam Name: Google Cloud Associate Data Practitioner (ADP Exam)
Last Update: Mar 30, 2025
Questions: 106