I am trying to use BigQuery from Python to query a table that is backed by a Google Sheet:
from google.cloud import bigquery
# Prepare connection and query
bigquery_client = bigquery.Client(project="my_project")
query = """
select * from `table-from-sheets`
"""
df = bigquery_client.query(query).to_dataframe()
I can usually query BigQuery tables without issue, but now I am getting the following error:
Forbidden: 403 Access Denied: BigQuery BigQuery: Permission denied while getting Drive credentials.
What do I need to do to access Drive from Python?
Is there another way around this?
You are missing the scopes for the credentials. I'm pasting the code snippet from the official documentation.
In addition, do not forget to give the service account at least Viewer access to the Google Sheet.
from google.cloud import bigquery
import google.auth
# Create credentials with Drive & BigQuery API scopes.
# Both APIs must be enabled for your project before running this code.
credentials, project = google.auth.default(
    scopes=[
        "https://www.googleapis.com/auth/drive",
        "https://www.googleapis.com/auth/bigquery",
    ]
)
# Construct a BigQuery client object.
client = bigquery.Client(credentials=credentials, project=project)
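With the scoped client in place, the query from the question should then work. A minimal sketch (table name taken from the question; to_dataframe assumes pandas is installed):
# Re-run the original query with the Drive-scoped client.
query = "select * from `table-from-sheets`"
df = client.query(query).to_dataframe()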
I am trying to run a simple query on BigQuery from Python, following this document. To set up the client I generated the JSON file for my project via a service account:
import pandas as pd
from google.cloud import bigquery
import os
os.environ["GOOGLE_APPLICATION_CREDENTIALS"]=*****
client = bigquery.Client()
QUERY = (
    'SELECT name FROM `mythic-music-326213.mytestdata.trainData` '
    'LIMIT 100')
query_job = client.query(QUERY)
However, I am getting the following error:
DefaultCredentialsError: Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credentials and re-run the application. For more information, please see https://cloud.google.com/docs/authentication/getting-started
Technically, I want to be able to query my dataset from Python. Any help would be appreciated.
I've tried your code snippet with my service account JSON file and a dataset in my project, and it worked as expected, so it's not clear why it isn't working in your case.
However, you can try using the service account JSON file directly, like this:
import pandas as pd
from google.cloud import bigquery
from google.oauth2 import service_account
credentials = service_account.Credentials.from_service_account_file('<path to JSON file>')
client = bigquery.Client(credentials=credentials)
QUERY = (
    'SELECT state FROM `so-project-a.test.states` '
    'LIMIT 100')
query_job = client.query(QUERY)
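To actually see the results, wait for the job to finish and iterate the rows, or convert them to a DataFrame. A short sketch (to_dataframe assumes pandas is installed):
rows = query_job.result()  # blocks until the query finishes
for row in rows:
    print(row)
# Or, with pandas available:
df = query_job.to_dataframe()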
I want to write a Python script that uploads data from a file to a BigQuery table.
Here is the code:
from google.cloud import bigquery
client = bigquery.Client.from_service_account_json('my-key.json', project=project_id, location='US')
dataset_ref = client.dataset(dataset_id)
table_ref = dataset_ref.table(table_name)
# load_table_from_file expects a file object, not a path
with open(filename, 'rb') as source_file:
    job = client.load_table_from_file(source_file, table_ref)
job.result()  # wait for the load job to complete
I am using a GCP VM created in the same project as my BigQuery table. I am also using a service account that has the BigQuery Admin role.
I get an error saying that the user doesn't have the bigquery.jobs.create permission.
I don't know if it is useful information, but I am able to read my table.
I don't know what to do.
Thanks for your help.
While trying to query an external table's data, I'm getting the error below and cannot solve the issue. Here are the details of the situation:
google.api_core.exceptions.NotFound: 404 Not found: Files /gdrive/id/id123456id
PS: id123456id is a dummy ID.
The file with ID id123456id exists in my Google Drive, and the BigQuery table points to this ID.
bq_test.json is the service account's credentials JSON file. The service account has these roles:
BigQuery Data Editor
BigQuery Data Owner
BigQuery Data Viewer
BigQuery User
Owner
Here is my code block:
from google.cloud import bigquery
from google.oauth2.service_account import Credentials
scopes = (
    'https://www.googleapis.com/auth/bigquery',
    'https://www.googleapis.com/auth/cloud-platform',
    'https://www.googleapis.com/auth/drive'
)
credentials = Credentials.from_service_account_file('bq_test.json')
credentials = credentials.with_scopes(scopes)
client = bigquery.Client(credentials=credentials)
QUERY = (
    """SELECT * FROM
    `project_name.dataset_name.ext_table`
    LIMIT 5"""
)
query_job = client.query(QUERY)
rows = query_job.result()
for row in rows:
    print(row.name)
I solved the problem as follows:
1. Go to https://console.cloud.google.com/iam-admin/iam?project=PROJECT_ID
2. Copy the service account's email address (like bq_test@PROJECT_ID.iam.gserviceaccount.com)
3. Go to https://drive.google.com and find the related file (id = id123456)
4. Right-click it and choose Share
5. Paste the service account's email address (bq_test@PROJECT_ID.iam.gserviceaccount.com)
6. Choose read-only access, or whatever you need
These steps solved the problem in my case.
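If you prefer to grant the access programmatically, here is a sketch using the Drive v3 API. Note that this must run with credentials of an account that can already share the file (e.g. the file owner), not the service account itself, and it assumes google-api-python-client is installed:
import google.auth
from googleapiclient.discovery import build

# Application-default credentials for the file owner, with a Drive scope
# (e.g. created via `gcloud auth application-default login`).
creds, _ = google.auth.default(scopes=["https://www.googleapis.com/auth/drive"])
drive = build("drive", "v3", credentials=creds)

# Grant the service account read-only access to the Drive file.
drive.permissions().create(
    fileId="id123456id",  # the dummy file ID from the question
    body={
        "type": "user",
        "role": "reader",
        "emailAddress": "bq_test@PROJECT_ID.iam.gserviceaccount.com",
    },
).execute()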
I am trying to run a simple query on Google BigQuery via a Python script, but I am getting the error below saying that my service account is missing the bigquery.jobs.create permission.
My service Account has the following roles applied:
Owner
BigQuery Admin
BigQuery Job User
I've also tried creating a custom role with bigquery.jobs.create and applying that to the service account, but still consistently get this error. What am I doing wrong?
from google.cloud import bigquery
from google.oauth2 import service_account
project_id = "my-test-project"
credentials = service_account.Credentials.from_service_account_file("credentials.json")
client = bigquery.Client(
credentials=credentials,
project=project_id
)
print(client.project) # returns "my-test-project"
query = client.query("select 1 as test;")
Access Denied: Project my-test-project: The user my-service-account@my-test-project.iam.gserviceaccount.com does not have bigquery.jobs.create permission in project my-test-project.
Authenticating the client using client = bigquery.Client.from_service_account_json("credentials.json") is the preferred method to avoid "Access Denied" errors. For one reason or another (I'm not sure why, since BigQuery does use OAuth 2.0 access tokens to authorize requests), setting credentials through google.oauth2.service_account can lead to permission issues.
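In code, the suggested approach looks like this (a sketch using the file name from the question):
from google.cloud import bigquery

# Build the client directly from the service account JSON file.
client = bigquery.Client.from_service_account_json("credentials.json")

query_job = client.query("select 1 as test;")
print(list(query_job.result()))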
(There are a lot of similar threads here, but unfortunately I couldn't find the answer to my error anywhere here or on Google.)
I'm trying to query a federated table in BigQuery which is pointing to a spreadsheet in Drive.
I've run the following command to create default application credentials for gcloud:
$ gcloud auth application-default login
But this doesn't include Drive in the scopes, so I'm getting the following error message (which makes sense): Forbidden: 403 Access Denied: BigQuery BigQuery: No OAuth token with Google Drive scope was found.
Then I've tried to auth with explicit Drive scope:
$ gcloud auth application-default login --scopes=https://www.googleapis.com/auth/drive,https://www.googleapis.com/auth/cloud-platform,https://www.googleapis.com/auth/bigquery
After that, I'm getting the following error when I try to use the BigQuery Python API:
"Forbidden: 403 Access Denied: BigQuery BigQuery: Access Not Configured. Drive API has not been used in project 764086051850 before or it is disabled. Enable it by visiting https://console.developers.google.com/apis/api/drive.googleapis.com/overview?project=764086051850 then retry. If you enabled this API recently, wait a few minutes for the action to propagate to our systems and retry."
The project number above does not exist in our organisation and the provided link leads to a page which says:
The API "drive.googleapis.com" doesn't exist or you don't have permission to access it
The Drive API is definitely enabled for the default project, so the error message doesn't make much sense. I can also query the table from the terminal using the bq command-line tool.
I'm currently out of ideas on how to debug this further. Does anyone have suggestions?
Configuration:
Google Cloud SDK 187.0.0
Python 2.7
google-cloud 0.27.0
google-cloud-bigquery 0.29.0
There might be issues when using the default credentials. However, you can use a service account, save the credentials in a JSON file, and add the necessary scopes. I did a quick test and this code worked for me:
from google.cloud import bigquery
from google.oauth2.service_account import Credentials
scopes = (
    'https://www.googleapis.com/auth/bigquery',
    'https://www.googleapis.com/auth/cloud-platform',
    'https://www.googleapis.com/auth/drive'
)
credentials = Credentials.from_service_account_file('/path/to/credentials.json')
credentials = credentials.with_scopes(scopes)
client = bigquery.Client(credentials=credentials)
query = "SELECT * FROM dataset.federated_table LIMIT 5"
query_job = client.query(query)
rows = query_job.result()
for row in rows:
    print(row)
If you get a 404 Not Found error, it is because you need to share the spreadsheet with the service account (Viewer permission).