Exam Professional Data Engineer topic 1 question 186 discussion

Actual exam question from Google's Professional Data Engineer
Question #: 186
Topic #: 1

Your new customer has requested daily reports that show their net consumption of Google Cloud compute resources and who used the resources. You need to quickly and efficiently generate these daily reports. What should you do?

  • A. Do daily exports of Cloud Logging data to BigQuery. Create views filtering by project, log type, resource, and user.
  • B. Filter data in Cloud Logging by project, resource, and user; then export the data in CSV format.
  • C. Filter data in Cloud Logging by project, log type, resource, and user, then import the data into BigQuery.
  • D. Export Cloud Logging data to Cloud Storage in CSV format. Cleanse the data using Dataprep, filtering by project, resource, and user.
Suggested Answer: A

Comments

AWSandeep
Highly Voted 2 years, 4 months ago
A. Do daily exports of Cloud Logging data to BigQuery. Create views filtering by project, log type, resource, and user. You cannot import custom or filtered billing criteria into BigQuery. There are three types of Cloud Billing data tables with a fixed schema that must be further drilled down via BigQuery views. Reference: https://cloud.google.com/billing/docs/how-to/export-data-bigquery#setup
upvoted 8 times
...
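As a rough illustration of the drill-down AWSandeep mentions, here is a minimal sketch of a BigQuery view over the standard usage cost export table. It is assumption-laden: the project, dataset, and export table names are placeholders, and note that the billing export has no per-user field (the "who" dimension comes from the log data discussed in other comments).

```python
# Hypothetical sketch: project, dataset, and export table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # assumed project ID

ddl = """
CREATE OR REPLACE VIEW `my-project.reports.daily_compute_cost` AS
SELECT
  DATE(usage_start_time)  AS usage_date,
  project.id              AS project_id,
  service.description     AS service,
  SUM(cost)               AS total_cost
FROM `my-project.billing.gcp_billing_export_v1_XXXXXX`  -- placeholder export table name
WHERE service.description = 'Compute Engine'
GROUP BY usage_date, project_id, service
"""
client.query(ddl).result()  # run the DDL and wait for completion
```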
22c1725
Most Recent 1 month, 1 week ago
Selected Answer: A
Sadly, it's unclear what "Do daily exports of Cloud Logging data to BigQuery" means. Does that mean creating a scheduled job, or creating a sink with a BigQuery destination?
upvoted 1 times
...
MaxNRG
1 year ago
Selected Answer: A
For generating daily reports that show net consumption of Google Cloud compute resources and user details, the most efficient approach would be: A. Do daily exports of Cloud Logging data to BigQuery. Create views filtering by project, log type, resource, and user.
upvoted 1 times
MaxNRG
1 year ago
Here's why this option is the most effective:
  • Integration with BigQuery: BigQuery is a powerful tool for analyzing large datasets. By exporting Cloud Logging data directly to BigQuery, you can leverage its fast querying capabilities and advanced analysis features.
  • Automated daily exports: Setting up automated daily exports to BigQuery streamlines the reporting process, ensuring that data is consistently and efficiently transferred.
  • Creating views for specific filters: By creating views in BigQuery that filter data by project, log type, resource, and user, you can tailor the reports to the specific needs of your customer. Views also simplify repeated analysis by encapsulating complex SQL queries.
  • Efficiency and scalability: This method is highly efficient and scalable, handling large volumes of data without the manual intervention required for CSV exports and data cleansing.
upvoted 1 times
MaxNRG
1 year ago
Option B (exporting data in CSV format) and Option D (using Cloud Storage and Dataprep) are less efficient due to the additional steps and manual handling involved. Option C is similar to A but lacks the specificity of creating views directly in BigQuery for filtering, which is a more streamlined approach.
upvoted 1 times
...
...
...
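To make option A concrete, here is a minimal sketch of a view over Cloud Audit Logs that have already been routed to BigQuery, filtering by project, log type, resource, and user as MaxNRG describes above. The dataset and table names are placeholders, and the field paths assume the standard schema Cloud Logging uses when it exports audit log entries to BigQuery.

```python
# Hypothetical sketch: dataset/table names are placeholders; the field paths
# assume the standard BigQuery schema for exported Cloud Audit Logs.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # assumed project ID

ddl = """
CREATE OR REPLACE VIEW `my-project.reports.daily_compute_usage` AS
SELECT
  DATE(timestamp)                                          AS usage_date,
  resource.labels.project_id                               AS project_id,
  logName                                                  AS log_type,
  resource.type                                            AS resource_type,
  protopayload_auditlog.authenticationInfo.principalEmail  AS user_email
FROM `my-project.compute_logs.cloudaudit_googleapis_com_activity_*`  -- placeholder export tables
WHERE resource.type = 'gce_instance'
"""
client.query(ddl).result()  # run the DDL and wait for completion
```

A daily report then becomes a simple query against this view (or a scheduled query that materializes it each morning).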
Aman47
1 year ago
You can configure a sink to which Cloud Logging continuously sends logging data, and the sink's filter controls which entries are included.
upvoted 1 times
Aman47
1 year ago
Option C
upvoted 1 times
...
...
vaga1
1 year, 7 months ago
Selected Answer: A
B, C, and D do not produce a daily, scalable solution.
upvoted 3 times
...
Siant_137
1 year, 8 months ago
Selected Answer: C
I see A as quite inefficient, as you are exporting ALL logs (hundreds of thousands) to BigQuery and then filtering them with views. I would go for C, assuming that it does not involve doing it manually but rather creating a SINK with the correct filters and then using a BigQuery dataset as the sink destination. But a lot of assumptions are taking place here, as I believe the question does not provide much context.
upvoted 3 times
...
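A minimal sketch of the sink Siant_137 describes above, using the google-cloud-logging Python client: the project ID, dataset, sink name, and log filter are all assumptions, the destination dataset must already exist, and the sink's writer identity still needs to be granted access to it.

```python
# Hypothetical sketch: names and the log filter are assumptions.
from google.cloud import logging

client = logging.Client(project="my-project")  # assumed project ID

# Route only Compute Engine audit log entries to a BigQuery dataset.
destination = "bigquery.googleapis.com/projects/my-project/datasets/compute_logs"
log_filter = 'resource.type="gce_instance" AND logName:"cloudaudit.googleapis.com"'

sink = client.sink("compute-usage-sink", filter_=log_filter, destination=destination)
if not sink.exists():
    sink.create()  # afterwards, grant the sink's writer identity access to the dataset
```

Whether this setup counts as answer A or answer C is exactly the ambiguity raised in this thread; the sink plus BigQuery views covers both readings.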
midgoo
1 year, 9 months ago
Selected Answer: A
I almost got it wrong by choosing C. Doing C means we would manually filter the logs first, one by one. We should just export them all and filter using BigQuery.
upvoted 2 times
...
maci_f
1 year, 11 months ago
Selected Answer: A
B and D do not consider the log type field. C looks good and I would go for it. However, A looks equally good, and I've found a CloudSkillsBoost lab that describes exactly what answer A does, i.e. exporting logs to BQ and then creating a VIEW: https://www.cloudskillsboost.google/focuses/6100?parent=catalog I think the advantage of exporting complete logs (i.e. filtering them after they reach BQ) is that if we want to adjust the reporting in the future, we have the complete logs with all fields available, whereas with C we would need to take extra steps.
upvoted 4 times
...
zellck
2 years ago
Selected Answer: A
A is the answer.
upvoted 2 times
...
Atnafu
2 years, 1 month ago
A. Exporting data in CSV or JSON is bad because some data is lost, so the BigQuery export is the best practice (see 1:10 in https://www.youtube.com/watch?v=ZyMO9XabUUM).
upvoted 1 times
Atnafu
2 years ago
You need to quickly and efficiently generate these daily reports by using a materialized view or view. A materialized view is the best solution, and filtering values with a view is a good approach, so A is the answer.
upvoted 1 times
...
...
hauhau
2 years, 1 month ago
Selected Answer: A
A, because with a view you filter the data daily, not just once in Cloud Logging.
upvoted 1 times
...
devaid
2 years, 3 months ago
Selected Answer: A
A. Option D isn't filtering by log type. B and C are discarded because you need to drill down into the exported logs in BigQuery or another tool.
upvoted 3 times
devaid
2 years, 2 months ago
Second thought: definitely A. If you go to the Google documentation for billing export, you see a message that exporting to JSON or CSV is obsolete and to use BigQuery instead. Also, why A? See https://cloud.google.com/billing/docs/how-to/export-data-bigquery and https://cloud.google.com/billing/docs/how-to/bq-examples#total-costs-on-invoice. You can build a quick report template in Data Studio that reads a BigQuery view.
upvoted 5 times
NicolasN
2 years, 1 month ago
A comment regarding the links you provided (and not the correctness of the selected answer): using Cloud Billing is different from deriving compute consumption data from Cloud Logging. In fact, manual exporting to CSV (and JSON) is possible through the Logs Explorer interface (I think without a per-user breakdown): 🔗 https://cloud.google.com/logging/docs/view/logs-explorer-interface#download_logs
upvoted 1 times
...
...
...
AHUI
2 years, 3 months ago
Selected Answer: D
The Google Cloud Storage bucket where you would like your reports to be delivered. You can select any Cloud Storage bucket for which you are an owner, including buckets that are from different projects. This bucket must exist before you can start exporting reports and you must have owner access to the bucket. Google Cloud Storage charges for usage, so you should review the Cloud Storage pricesheet for information on how you might incur charges for the service. https://cloud.google.com/compute/docs/logging/usage-export
upvoted 1 times
...
TNT87
2 years, 3 months ago
The answer is C: https://cloud.google.com/logging/docs/export/aggregated_sinks D isn't correct because Cloud Storage as a sink destination stores logs in JSON format, not CSV: https://cloud.google.com/logging/docs/export/aggregated_sinks#supported-destinations
upvoted 4 times
jkhong
2 years, 1 month ago
The question explicitly mentions daily generation of reports, which is the key point: B and C seem to suggest only a one-off filtering.
upvoted 1 times
TNT87
1 year, 12 months ago
So what is your argument about daily generation of data?
upvoted 1 times
...
...
TNT87
2 years, 3 months ago
On the other hand, answer A makes sense: https://cloud.google.com/logging/docs/export/bigquery#overview
upvoted 1 times
...
...
changsu
2 years, 3 months ago
Selected Answer: D
"Quickly and efficiently" is a flag pointing to Dataprep. And importing data into BigQuery does not, by itself, produce a report.
upvoted 2 times
...
pluiedust
2 years, 3 months ago
Why not C?
upvoted 1 times
...
Wasss123
2 years, 3 months ago
Why not D?
upvoted 1 times
...
Community vote distribution: A (35%), C (25%), B (20%), Other