You write a Python script to connect to Google BigQuery from a Google Compute Engine virtual machine. The script is printing errors that it cannot connect to BigQuery. What should you do to fix the script?
A. Install the latest BigQuery API client library for Python
B. Run your script on a new virtual machine with the BigQuery access scope enabled
C. Create a new service account with BigQuery access and execute your script with that user
D. Install the bq component for gcloud with the command gcloud components install bq
A - If the client library were not installed, the Python script wouldn't run at all. Since the question states the script reports "cannot connect", the client library must already be installed. So it's B or C.
B - https://cloud.google.com/bigquery/docs/authorization : an access scope is how your client application retrieves an access_token with access permission via OAuth when you want to access services through an API call. In this case, it is possible that the Python script uses a raw API call instead of the client library; if so, an access scope is required. The client library requires no access scope (as it does not go through OAuth this way).
C - using a service account is Google Cloud's best practice.
So prefer C.
Access scopes are the legacy method of specifying permissions for your instance (see https://cloud.google.com/compute/docs/access/service-accounts). So I would go with C.
agree
access scope is enabled by default
https://cloud.google.com/bigquery/docs/authorization#authenticate_with_oauth_20
If you use the BigQuery client libraries, you do not need this information, as this is done for you automatically.
Sorry, B is OK. You can create a service account, add a user to it, and grant the user the Service Account User role. You still need to enable the BigQuery scope for the Python script running on the instance to access BigQuery.
Stop confusing people; B doesn't make any sense. Why would you create a whole new VM just because of a permission issue? If anything, just stop the instance, edit the scope of the default Compute Engine service account, and grant it the role through IAM. C is the most appropriate answer, since you can only set scopes on the default Compute Engine service account; if you're using any other service account, there's no scope option and its access is dictated strictly by IAM. So C is the answer: stop the VM, change the service account to one with the appropriate permissions, and done. B would still need the permission set through IAM & Admin; the scope alone isn't enough with the default Compute Engine service account.
cloud guy1, relax. tartar is the hero for google cloud and if you read his answer, he explains the service account user's role granting on this one as that is the best practice
"Configure the Python API to use a service account with relevant BigQuery access enabled" is the right answer.
It is likely that the service account this script is running under does not have the permissions to connect to BigQuery, which could be causing the errors. You can prevent this by using a service account that has the necessary roles to access BigQuery.
Ref: https://cloud.google.com/bigquery/docs/reference/libraries#cloud-console
A service account is a special kind of account used by an application or a virtual machine (VM) instance, not a person.
Ref: https://cloud.google.com/iam/docs/service-accounts
Create a new service account with BigQuery access and execute your script with that user: If you want to run the script on an existing virtual machine, you can create a new service account with the necessary permissions to access BigQuery and then execute the script using that service account. This will allow the script to connect to BigQuery and access the data it needs.
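The steps described above can be sketched with gcloud. This is a hypothetical sequence: the service account name (bq-reader), project ID (my-project), and role (roles/bigquery.user) are placeholders, and the role you grant should match what the script actually needs.

```shell
# Create a dedicated service account for the script.
gcloud iam service-accounts create bq-reader \
    --display-name="BigQuery reader"

# Grant it a BigQuery role at the project level.
gcloud projects add-iam-policy-binding my-project \
    --member="serviceAccount:bq-reader@my-project.iam.gserviceaccount.com" \
    --role="roles/bigquery.user"
```

After that, either attach the service account to the VM or download a key and point the script at it.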
You don't need to create a new VM to have different access scopes:
https://cloud.google.com/compute/docs/access/service-accounts#accesscopesiam
This weakens answer B.
When a user-managed service account is attached to the instance, the access scope defaults to cloud-platform:
https://cloud.google.com/compute/docs/access/service-accounts#scopes_best_practice
See Step 6 in: https://cloud.google.com/compute/docs/instances/change-service-account#changeserviceaccountandscopes
These facts leave C as the valid answer.
C. Create a new service account with BigQuery access and execute your script with that user.
Service accounts are used for server-to-server interactions, such as those between a virtual machine and BigQuery. You would need to create a service account that has the necessary permissions to access BigQuery, then download the service account key in JSON format. Once you have the key, you can set an environment variable (GOOGLE_APPLICATION_CREDENTIALS) to the path of the JSON key file before running your script, which will authenticate your requests to BigQuery.
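A minimal sketch of that approach. The key path is a hypothetical placeholder, and the block assumes the google-cloud-bigquery library is installed; setting GOOGLE_APPLICATION_CREDENTIALS before creating the client is what makes the client library pick up the service-account key via Application Default Credentials.

```python
import os


def use_service_account_key(key_path: str) -> str:
    """Point Application Default Credentials at a service-account key file."""
    os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = key_path
    return os.environ["GOOGLE_APPLICATION_CREDENTIALS"]


if __name__ == "__main__":
    # Hypothetical key path; download the JSON key from
    # IAM & Admin > Service Accounts in the Cloud Console.
    use_service_account_key("/path/to/sa-key.json")

    # The client library reads the env var automatically (ADC),
    # so no explicit credentials argument is needed.
    from google.cloud import bigquery

    client = bigquery.Client()
    for row in client.query("SELECT 1 AS ok").result():
        print(row.ok)
```

The same effect can be had by exporting the variable in the shell before launching the script, which keeps credentials handling out of the code entirely.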
The answer is C
https://cloud.google.com/bigquery/docs/authentication
For most services, you must attach the service account when you create the resource that will run your code; you cannot add or replace the service account later. Compute Engine is an exception—it lets you attach a service account to a VM instance at any time.
Connecting to BigQuery from a script requires proper authorization. Service accounts provide a secure way to grant access without sharing user credentials.
B is silly because there's no need to create a new VM just to change the access scope. You can edit the existing VM's access scope, although you do have to stop it first.
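That stop-and-reattach flow can be sketched with gcloud; the instance name, zone, and service-account email below are placeholders.

```shell
# The VM must be stopped before its service account or scopes can change.
gcloud compute instances stop my-vm --zone=us-central1-a

# Attach the service account that has BigQuery access; with a
# user-managed account, cloud-platform scope plus IAM roles is the
# recommended combination.
gcloud compute instances set-service-account my-vm \
    --zone=us-central1-a \
    --service-account=bq-reader@my-project.iam.gserviceaccount.com \
    --scopes=cloud-platform

gcloud compute instances start my-vm --zone=us-central1-a
```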
Closest is C. https://cloud.google.com/compute/docs/access/create-enable-service-accounts-for-instances#gcloud
The confusing part is that the option should never use the word "user" to refer to a service account.
Service accounts provide a way to authenticate your application to Google Cloud services. When you create a service account, you can assign it specific roles that dictate what resources the service account can interact with, and how it can interact with them. In this case, you would assign the BigQuery access role to the service account, which would then be used to authenticate your script to BigQuery.
You would need to create a new VM if you wanted to use the machine's default service account, as that can only be chosen when the VM is created. But using a custom service account with IAM roles is more in line with best practice, so it should be C.