Exam Associate Cloud Engineer topic 1 question 114 discussion

Actual exam question from Google's Associate Cloud Engineer
Question #: 114
Topic #: 1

You are managing several Google Cloud Platform (GCP) projects and need access to all logs for the past 60 days. You want to be able to explore and quickly analyze the log contents. You want to follow Google-recommended practices to obtain the combined logs for all projects. What should you do?

  • A. Navigate to Stackdriver Logging and select resource.labels.project_id="*"
  • B. Create a Stackdriver Logging Export with a Sink destination to a BigQuery dataset. Configure the table expiration to 60 days.
  • C. Create a Stackdriver Logging Export with a Sink destination to Cloud Storage. Create a lifecycle rule to delete objects after 60 days.
  • D. Configure a Cloud Scheduler job to read from Stackdriver and store the logs in BigQuery. Configure the table expiration to 60 days.
Suggested Answer: B
Reference:
https://cloud.google.com/blog/products/gcp/best-practices-for-working-with-google-cloud-audit-logging
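
For concreteness, here is a minimal sketch of what option B can look like with the google-cloud-bigquery and google-cloud-logging Python clients. The project, dataset, and sink names are hypothetical, and this variant creates one sink per project pointing at a shared dataset (an aggregated organization-level sink is the other common route):

```python
# Minimal sketch of option B: route each project's logs into a shared
# BigQuery dataset whose tables expire after 60 days. All names below
# are hypothetical placeholders.
from google.cloud import bigquery
import google.cloud.logging

CENTRAL_PROJECT = "central-logging-project"
SOURCE_PROJECTS = ["project-a", "project-b"]

# 1. Create the destination dataset with a 60-day default table expiration,
#    so tables the sink creates in it are deleted after 60 days.
bq = bigquery.Client(project=CENTRAL_PROJECT)
dataset = bigquery.Dataset(f"{CENTRAL_PROJECT}.all_logs")
dataset.default_table_expiration_ms = 60 * 24 * 60 * 60 * 1000  # 60 days
bq.create_dataset(dataset, exists_ok=True)

# 2. Create a logging sink in each source project pointing at the dataset.
destination = f"bigquery.googleapis.com/projects/{CENTRAL_PROJECT}/datasets/all_logs"
for project in SOURCE_PROJECTS:
    log_client = google.cloud.logging.Client(project=project)
    sink = log_client.sink(f"logs-to-bq-{project}", destination=destination)
    if not sink.exists():
        sink.create()  # no filter, so the sink matches all log entries

# Note: each sink's writer identity must be granted write access (e.g. the
# BigQuery Data Editor role) on the dataset before entries start flowing.
```

The combined logs from all projects can then be explored and analyzed with standard SQL queries against the shared dataset.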

Comments

Verve
Highly Voted 3 years, 9 months ago
It's B.
upvoted 26 times
[Removed]
Highly Voted 3 years, 7 months ago
The question is about viewing logs from the past 60 days. B, C, and D talk about deleting objects or truncating table data.
upvoted 11 times
[Removed]
3 years, 7 months ago
Answer should be A
upvoted 3 times
[Removed]
3 years, 7 months ago
Also, A specifically talks about aggregation.
upvoted 4 times
[Removed]
3 years, 7 months ago
Also, by default you have a lot of flexibility when viewing logs in Stackdriver: you can filter and query.
upvoted 2 times
xtian2900
3 years, 7 months ago
What about the minimum retention of 30 days? Is that true?
upvoted 3 times
[Removed]
3 years, 7 months ago
You're correct: retention is a minimum of 30 days for Data Access logs (https://cloud.google.com/logging/quotas), so B is the way to go.
upvoted 3 times
IshwarChandra
Most Recent 1 month ago
resource.labels.project_id="*" is not a correct query because "*" returns 0 records, so option A is not a correct answer.
upvoted 1 times
Cynthia2023
3 months, 4 weeks ago
Selected Answer: B
When it comes to log data, you're typically dealing with high-volume time-series data that is partitioned by time (e.g., by day). In such cases, setting a partition expiration is often more appropriate because it ensures that you're continuously retaining a rolling window of log data (for example, the last 60 days' worth) and automatically purging older data, rather than deleting the entire table at once after a certain period.
upvoted 1 times
Cynthia2023
3 months, 4 weeks ago
In BigQuery, setting an expiration time for tables can be applied in two contexts:
Table expiration: when you set an expiration time at the table level, it applies to the entire table. The entire table is deleted once the specified expiration time has elapsed since the table's creation time.
Partition expiration: for partitioned tables, you can set a partition expiration time, which applies to individual partitions within the table. Each partition's data is deleted once the specified expiration time has elapsed since the creation of that specific partition. This is particularly useful for time-series data, like logs, where you want to keep only recent data and allow older data to be automatically purged.
upvoted 1 times
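
A minimal sketch of the partition-expiration variant described above, assuming the google-cloud-bigquery Python client and a hypothetical day-partitioned log table:

```python
# Sketch: a day-partitioned table where each partition is purged 60 days
# after it is created, keeping a rolling 60-day window of log data.
from google.cloud import bigquery

client = bigquery.Client()
table = bigquery.Table("my-project.all_logs.syslog")  # hypothetical table
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    expiration_ms=60 * 24 * 60 * 60 * 1000,  # 60 days per partition
)
client.create_table(table, exists_ok=True)
```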
Romio2023
4 months, 3 weeks ago
I don't get the options.
upvoted 2 times
kelliot
5 months ago
Selected Answer: B
I guess it's B
upvoted 2 times
BAofBK
5 months, 3 weeks ago
The correct answer is B
upvoted 1 times
scanner2
7 months, 3 weeks ago
Selected Answer: B
BigQuery provides storage of log entries in BigQuery datasets, and you can use big data analysis capabilities on the stored logs. Logging sinks stream logging data into BigQuery in small batches, which lets you query data without running a load job. You can set a default table expiration time at the dataset level, or you can set a table's expiration time when the table is created. A table's expiration time is often referred to as "time to live" or TTL. When a table expires, it is deleted along with all of the data it contains.
https://cloud.google.com/logging/docs/export/configure_export_v2#overview
https://cloud.google.com/bigquery/docs/managing-tables#updating_a_tables_expiration_time
upvoted 1 times
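
A minimal sketch of the dataset-level default TTL mentioned above, assuming the google-cloud-bigquery Python client and a hypothetical existing dataset:

```python
# Sketch: apply a 60-day default table expiration ("TTL") to an existing
# dataset. Tables created in the dataset afterwards inherit the expiration;
# it does not change tables that already exist.
from google.cloud import bigquery

client = bigquery.Client()
dataset = client.get_dataset("my-project.all_logs")  # hypothetical dataset
dataset.default_table_expiration_ms = 60 * 24 * 60 * 60 * 1000  # 60 days
client.update_dataset(dataset, ["default_table_expiration_ms"])
```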
Captain1212
7 months, 4 weeks ago
Selected Answer: B
B is the correct answer; we can use BigQuery to get 60 days of logs and analyze them.
upvoted 1 times
Neha_Pallavi
8 months, 1 week ago
B. Create a Stackdriver Logging Export with a Sink destination to a BigQuery dataset. Configure the table expiration to 60 days.
upvoted 1 times
Prat25200607
1 year, 1 month ago
Selected Answer: B
https://cloud.google.com/architecture/security-log-analytics
upvoted 1 times
sai_learner
1 year, 9 months ago
All options are wrong; they talk about deletion after 60 days, but the question asks us to analyze logs from the past 60 days.
upvoted 5 times
FeaRoX
1 year, 3 months ago
You are absolutely wrong: "past 60 days" means the same as "last 60 days" in that sentence.
upvoted 1 times
AzureDP900
1 year, 10 months ago
B is right for sure
upvoted 1 times
Tirthankar17
1 year, 10 months ago
Selected Answer: B
B is the correct answer.
upvoted 2 times
dttncl
2 years, 6 months ago
I believe B is the answer. All that matters in this scenario is the logs for the past 60 days. We can use BigQuery to analyze the contents, so C is incorrect. We need to configure BigQuery as the sink for the log export so we can query and analyze log data in the future, so D is incorrect. https://cloud.google.com/logging/docs/audit/best-practices#export-best-practices
Since we only care about logs within 60 days, we can set the expiration time to 60 days to retain only the logs within that time frame; once data is more than 60 days old, it is excluded from future analyses (see the sketch after this thread). https://cloud.google.com/bigquery/docs/managing-tables#updating_a_tables_expiration_time
upvoted 6 times
ryzior
2 years, 1 month ago
I think here we have the case described in details: https://cloud.google.com/architecture/exporting-stackdriver-logging-for-security-and-access-analytics
upvoted 1 times
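
A minimal sketch of the table-expiration update from the managing-tables link in the comment above, assuming the google-cloud-bigquery Python client and a hypothetical existing table:

```python
# Sketch: set an absolute expiration time 60 days out on an existing table,
# per the managing-tables documentation linked above.
import datetime
from google.cloud import bigquery

client = bigquery.Client()
table = client.get_table("my-project.all_logs.syslog_20240101")  # hypothetical
table.expires = datetime.datetime.now(datetime.timezone.utc) + datetime.timedelta(days=60)
client.update_table(table, ["expires"])
```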
ankatsu2010
2 years, 6 months ago
D should be the correct answer. To "quickly analyze", you need to use BQ; next, you always need access to the logs for the past 60 days. This means you have to export logs on a daily basis. You don't want to do this job manually, right?
upvoted 1 times
ankatsu2010
2 years, 6 months ago
My apologies, B is correct... a "Sink" can route logging data to BQ automatically.
upvoted 3 times
AD_0525
2 years, 10 months ago
B is the correct one, option A does not give you the flexibility to analyze.
upvoted 3 times
Community vote distribution: A (35%), C (25%), B (20%), Other