Exam Professional Cloud Architect topic 1 question 83 discussion

Actual exam question from Google's Professional Cloud Architect
Question #: 83
Topic #: 1

Your BigQuery project has several users. For audit purposes, you need to see how many queries each user ran in the last month. What should you do?

  • A. Connect Google Data Studio to BigQuery. Create a dimension for the users and a metric for the amount of queries per user.
  • B. In the BigQuery interface, execute a query on the JOBS table to get the required information.
  • C. Use 'bq show' to list all jobs. Per job, use 'bq ls' to list job information and get the required information.
  • D. Use Cloud Audit Logging to view Cloud Audit Logs, and create a filter on the query operation to get the required information.
Suggested Answer: D

Comments

Googler2
Highly Voted 5 years, 2 months ago
D. Reasons: 1. Cloud Audit Logs maintains audit logs for admin activity, data access, and system events, and BigQuery activity is sent to Cloud Audit Logs automatically. 2. You can filter for the relevant BigQuery audit messages, and you can express those filters as part of a log export. https://cloud.google.com/logging/docs/audit https://cloud.google.com/bigquery/docs/reference/auditlogs#ids https://cloud.google.com/bigquery/docs/reference/auditlogs#auditdata_examples
upvoted 53 times
heretolearnazure
1 year, 9 months ago
Answer is D
upvoted 2 times
...
GooglecloudArchitect
4 years, 10 months ago
D is right, as you can get a monthly view of query usage across all users and projects for audit purposes. C needs the appropriate permissions to see job-level detail, and a monthly view is hard to get directly from the bq ls or bq show commands.
upvoted 9 times
...
...
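As a concrete illustration of the filter-based approach described in the comment above, the lines below sketch a Logs Explorer filter that matches completed BigQuery query jobs in the legacy AuditData format. The field names follow the audit-log references linked in that comment; the exact paths differ if the project emits the newer BigQueryAuditMetadata entries, and the timestamp is only an example value.

  resource.type="bigquery_resource"
  protoPayload.methodName="jobservice.jobcompleted"
  protoPayload.serviceData.jobCompletedEvent.eventName="query_job_completed"
  timestamp>="2024-01-01T00:00:00Z"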
Zarmi
Highly Voted 5 years, 1 month ago
Answer is D: https://cloud.google.com/bigquery/docs/reference/auditlogs#example_query_cost_breakdown_by_identity
upvoted 27 times
BobbyFlash
3 years, 5 months ago
Nailed it
upvoted 2 times
...
ErenYeager
2 years, 7 months ago
No mention of exporting to BQ.
upvoted 1 times
...
...
rajrocks171
Most Recent 3 weeks, 2 days ago
Selected Answer: D
Option D. Cloud Audit Logs track all BigQuery jobs, including who ran each query, when, and how. You can filter logs by operation type and group by user identity. Not option B: unless you have permission to list other users' jobs, the JOBS views by default only show jobs for the current user, and the job history they retain is limited (about 180 days).
upvoted 2 times
...
lokiinaction
3 weeks, 5 days ago
Selected Answer: D
Audit logs (exported to BQ) are the source of truth for audit trails and security; they include the full SQL text and can be retained for very long periods if exported.
upvoted 2 times
...
markware
1 month, 2 weeks ago
Selected Answer: D
This is mentioned in the Cloud Engineer course from Google.
upvoted 2 times
...
francisco94
2 months, 1 week ago
Selected Answer: D
I agree B could be the solution, but the best option is D. This is the correct and scalable approach: Cloud Audit Logs capture who ran what, including queries. You can filter on methodName = "jobservice.jobcompleted" and analyze logs in Logs Explorer or export to BigQuery for querying. Supports organization-wide, project-wide, and per-user visibility.
upvoted 2 times
...
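To make the "export to BigQuery for querying" part of the comment above concrete, here is a sketch of a per-user count over audit logs that a log sink has exported to BigQuery. The project, dataset, and table names (my-project.auditlog_export.cloudaudit_googleapis_com_data_access) are placeholders that depend on how the sink was configured, and the column layout assumes the standard audit-log export schema.

  -- One row per user with the number of completed query jobs in the last 30 days
  SELECT
    protopayload_auditlog.authenticationInfo.principalEmail AS principal_email,
    COUNT(*) AS query_count
  FROM `my-project.auditlog_export.cloudaudit_googleapis_com_data_access`
  WHERE protopayload_auditlog.methodName = 'jobservice.jobcompleted'
    AND timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
  GROUP BY principal_email
  ORDER BY query_count DESC;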
Mikeliz
3 months, 2 weeks ago
Selected Answer: B
B is the better answer. You get more detail about BigQuery jobs from BigQuery itself than you get from Cloud Audit Logs.
upvoted 1 times
...
david_tay
3 months, 2 weeks ago
Selected Answer: B
B is correct. Search in Gemini for "what are the steps to execute a query on the JOBS table in BigQuery to see how many queries each user ran in the last month" to see how easy the steps are.
upvoted 1 times
...
PetarMarinkovic
3 months, 3 weeks ago
Selected Answer: D
D is the right answer
upvoted 2 times
...
david_tay
3 months, 3 weeks ago
Selected Answer: B
The answer is B; it is the fastest and most efficient method. The question only asks how many queries each user ran in the last month, which B can answer in a short time.
upvoted 2 times
...
1P5811
5 months ago
Selected Answer: B
BigQuery's INFORMATION_SCHEMA: BigQuery provides metadata about datasets, tables, and jobs through the INFORMATION_SCHEMA. The JOBS_BY_USER view within this schema is specifically designed to give you information about jobs run by each user. You can easily query this view to get the number of queries run by each user in the last month.
upvoted 3 times
...
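A sketch of the kind of INFORMATION_SCHEMA query these comments are referring to. One caveat: JOBS_BY_USER only returns jobs submitted by the caller, so a breakdown across all users normally uses the project-level JOBS (JOBS_BY_PROJECT) view, which requires permission to list other users' jobs and keeps roughly the last 180 days of history. The region qualifier below is an example and should match the project's location.

  -- Queries run per user over the last 30 days
  SELECT
    user_email,
    COUNT(*) AS query_count
  FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
  WHERE job_type = 'QUERY'
    AND creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
  GROUP BY user_email
  ORDER BY query_count DESC;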
JonathanSJ
5 months, 2 weeks ago
Selected Answer: B
I will go with B because it is more efficient and easier.
upvoted 2 times
...
alpay
6 months, 3 weeks ago
Selected Answer: D
"Audit logs versus INFORMATION_SCHEMA views Audit logs help you answer the question "Who did what, where, and when?" within your Google Cloud resources. Audit logs are the definitive source of information for system activity by user and access patterns and should be your primary source for audit or security questions." https://cloud.google.com/bigquery/docs/introduction-audit-workloads
upvoted 2 times
...
nareshthumma
7 months, 3 weeks ago
Answer is B: In the BigQuery interface, execute a query on the JOBS table to get the required information.
Explanation: BigQuery automatically logs job information, including queries, in its INFORMATION_SCHEMA JOBS views. By querying those, you can retrieve details about each job, including the user who ran it, the query text, and the timestamp.
Why the other options are less suitable:
  • Connect Google Data Studio to BigQuery: this can visualize data, but you still need to execute a query to pull the data first, so it does not directly retrieve the information you need.
  • Use 'bq show' and 'bq ls': these commands provide job metadata but do not efficiently produce a per-user query count, especially for a large number of jobs over a month.
  • Use Cloud Audit Logging: this could work, but it is more complex and less efficient for simply counting queries. The JOBS views are designed for this purpose, which makes it easier to extract the necessary data.
upvoted 2 times
...
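For comparison, the bq-based route from option C looks roughly like the commands below. They only list and describe jobs, so the per-user monthly count would still have to be scripted on top of the output, which is the impracticality the comment above points out. The job ID is a placeholder.

  # List recent jobs from all users in the project (listing other users' jobs needs extra permissions)
  bq ls --jobs=true --all=true --max_results=1000

  # Show the details of a single job
  bq show -j <job_id>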
awsgcparch
10 months, 3 weeks ago
Selected Answer: B
Using the INFORMATION_SCHEMA.JOBS_BY_USER view within BigQuery is the most efficient and straightforward method to get the required audit information about the number of queries each user ran in the last month, so option B is the best choice. D: while Cloud Audit Logs can provide detailed logs of activity, querying them directly for this purpose is less efficient than using the JOBS views in BigQuery. In addition, setting up and querying audit logs involves more steps and may require exporting logs to BigQuery for complex queries.
upvoted 6 times
...
awsgcparch
10 months, 3 weeks ago
Selected Answer: B
Why B is the best answer:
  • Direct access to job metadata: BigQuery maintains metadata about jobs (including query jobs) in the INFORMATION_SCHEMA views, specifically the INFORMATION_SCHEMA.JOBS table.
  • Detailed information: this table contains information about all jobs, including who ran them, when they were run, and the type of job, which makes it easy to filter and count queries by user.
  • Querying the JOBS table: you can write a SQL query to count the number of queries executed by each user over the specified period.
upvoted 5 times
...
eff12c1
1 year ago
Selected Answer: B
Querying the INFORMATION_SCHEMA.JOBS_BY_USER view in BigQuery is the most efficient and straightforward way to obtain the number of queries each user ran in the last month. This method leverages built-in BigQuery capabilities designed specifically for auditing and monitoring query jobs. Cloud Audit Logs provide detailed logging information but are more complex to query for specific metrics like the number of queries run by each user. BigQuery’s INFORMATION_SCHEMA.JOBS_BY_USER is designed for this purpose and is easier to use for querying job data.
upvoted 6 times
...
Community vote distribution: A (35%), C (25%), B (20%), Other