[Breaking Change] Don't remove destinationTable attribute in create_job (#483)
From the description of the ticket when this was added:
While the client library always removes the destination table, that might not be correct in all cases, but then again, it might be difficult to determine in every case what action to take. @tswast Do you remember the details and what the backend expects? Is it possible/feasible for the client to be in sync with the backend without duplicating the latter's logic?
The main problem is permissions. When the destination table is automatically populated, the backend creates a table in a special dataset where data automatically expires and the customer is not charged for storage. The customer does not have permission to write directly to such a table. For example, I just ran a query and it wrote the results to a temporary table in one of these hidden datasets. In every case I've seen, the temporary table lives in such an anonymous dataset. We could possibly strip the destination table only in these cases. There's still a risk of false positives, but it should be much less likely.
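The heuristic described above could be sketched as follows. This is an illustrative example only, not the library's actual code: the helper name is made up, and it assumes that anonymous (query-cache) dataset IDs begin with an underscore, as they do for BigQuery temporary result tables.

```python
def strip_anonymous_destination(job_config: dict) -> dict:
    """Return a copy of the job config with destinationTable removed
    only when it points at an anonymous dataset.

    Hypothetical sketch: assumes anonymous dataset IDs start with an
    underscore; a user-created destination table is left untouched.
    """
    query = job_config.get("query", {})
    dest = query.get("destinationTable", {})
    if dest.get("datasetId", "").startswith("_"):
        # Rebuild the query config without destinationTable so the
        # caller's original dict is never mutated.
        query = {k: v for k, v in query.items() if k != "destinationTable"}
        job_config = dict(job_config, query=query)
    return job_config
```

With this check, a config targeting a dataset like `_abc123` would have its destination stripped, while one targeting `my_dataset` would be submitted as-is.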
OK thanks, this sounds like a better heuristic. If it turns out that it doesn't work well enough in practice, it should be easy to modify/remove, if necessary.
I might be failing to fully understand why the Python API/library works like it does, because I lack the context you are talking about. What I want to achieve is the same functionality as expected and provided by the other ways I interact with BigQuery.
@tswast can you help me understand this use case with the temp table? It looks to me like client-side logic. But again, I think I'm failing to fully grasp the example. I can seamlessly migrate my code between several languages simply by moving the job configuration (dictionary/json/struct), and I don't understand why the Python library is an exception. I don't want to be rude by any means. And I respect your work, which is fantastic.
@dinigo Full context is here: googleapis/google-cloud-python#5555 and here: #14. You're probably right that the correct choice back then would have been to keep destinationTable in the request.
Thanks @tswast! I get it now. Also, you have to keep the API backwards compatible ;) No problem. We will use a fork and rebase regularly, since we need and expect this functionality. I would like, though, to have this feature integrated at some point in the future. If you could tag it as such. Thanks again.
@jimfulton I think if we get retryable queries working in #539, we should be able to remove this odd behavior of create_job. Also, perhaps if …
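One of the surprises in the old behavior is that the caller's configuration dict can end up modified. A minimal sketch of a create_job-style wrapper that leaves the caller's dict untouched and submits it unmodified, which is the behavior this issue asks for. The function name and structure here are illustrative, not the library's actual implementation:

```python
import copy

def create_job_passthrough(configuration: dict) -> dict:
    """Hypothetical helper: treat the job resource as opaque.

    Deep-copies the input so the caller's dict is never mutated, and
    deliberately does NOT strip destinationTable.
    """
    resource = copy.deepcopy(configuration)
    # ...here the resource would be sent to the BigQuery jobs.insert
    # endpoint; this sketch just returns it for inspection.
    return resource
```

The point of the deep copy is that library code should never mutate caller-owned configuration, even when it rewrites the request it actually sends.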
Some notes:
Closed by #891. Will be released in 3.0.0.
What happened:
I ran a job with a query and a destination table, and the table is not generated from the configuration file.
My code executes successfully and produces the following job in the console:

How to fix it
There is a line where the destinationTable property is removed from the job config. Removing the line that deletes the property in the job configuration will do.
python-bigquery/google/cloud/bigquery/client.py
Line 1729 in 530e1e8
Can anyone explain why the destinationTable property is removed? In the meantime I'm opening a PR.
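For reference, a job resource of the shape described in this issue might look like the dict below (the project, dataset, and table IDs are hypothetical placeholders). The same dictionary works unchanged via the REST API and the clients for other languages, which is what the report compares against:

```python
# Hypothetical job resource; all IDs are placeholders.
job_resource = {
    "configuration": {
        "query": {
            "query": "SELECT 1 AS x",
            "useLegacySql": False,
            "writeDisposition": "WRITE_TRUNCATE",
            # The property this issue is about: where results land.
            "destinationTable": {
                "projectId": "my-project",
                "datasetId": "my_dataset",
                "tableId": "results",
            },
        }
    }
}
# Before the fix, passing job_resource["configuration"] to
# Client.create_job silently dropped the destinationTable entry
# before submitting the job.
```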