Releases: googleapis/python-bigquery-pandas
Version 0.4.0
PyPI release, Conda Forge release
- Fix bug in `read_gbq` when building a dataframe with integer columns on Windows. Explicitly use 64-bit integers when converting from BigQuery types. (#119)
- Fix bug in `read_gbq` when querying for an array of floats. (#123)
- Fix bug in `read_gbq` with the configuration argument. Updates `read_gbq` to account for a breaking change in how google-cloud-python version 0.32.0+ handles the query configuration API representation. (#152)
- Fix bug in `to_gbq` where seconds were discarded in timestamp columns. (#148)
- Fix bug in `to_gbq` when supplying a user-defined schema. (#150)
- Deprecate the `verbose` parameter in `read_gbq` and `to_gbq`. Messages use the logging module instead of printing progress directly to standard output. (#12)
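With `verbose` deprecated, progress messages go through the standard `logging` module. A minimal sketch of routing them to the console, assuming the package logs under a logger named `pandas_gbq` (inferred from the package name, not confirmed here):

```python
import logging

# pandas-gbq now emits progress messages via the logging module rather
# than printing to stdout; attach a handler to see them. The "pandas_gbq"
# logger name is an assumption based on the package's module name.
logger = logging.getLogger("pandas_gbq")
logger.setLevel(logging.INFO)
logger.addHandler(logging.StreamHandler())
```

Libraries typically attach no handlers of their own, so without a snippet like this the messages are silently dropped.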
Version 0.3.1
PyPI release, Conda Forge release
- Fix an issue where Unicode couldn't be uploaded in Python 2 (issue 106)
- Add support for a passed schema in `to_gbq` instead of inferring the schema from the passed `DataFrame` with `DataFrame.dtypes`. (issue 46)
- Fix an issue where a dataframe containing both integer and floating point columns could not be uploaded with `to_gbq`. (issue 116)
- `to_gbq` now uses `to_csv` to avoid manually looping over rows in a dataframe (should result in faster table uploads). (issue 96)
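The passed-schema support above can be sketched as follows. The `table_schema` keyword, the dataset/table name, and the project id are assumptions for illustration; the actual upload call is commented out because it requires BigQuery credentials.

```python
# An explicit BigQuery schema as a list of field descriptions, supplied
# instead of letting to_gbq infer one from DataFrame.dtypes. The field
# names and types here are illustrative.
table_schema = [
    {"name": "name", "type": "STRING"},
    {"name": "score", "type": "FLOAT"},
]

# Hypothetical call; "my_dataset.my_table", "my-project", and the
# table_schema keyword are assumptions:
# df.to_gbq("my_dataset.my_table", project_id="my-project",
#           table_schema=table_schema)
```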
Version 0.3.0
PyPI release, Conda Forge release
- Use the `google-cloud-bigquery` library for API calls. The `google-cloud-bigquery` package is a new dependency, and dependencies on `google-api-python-client` and `httplib2` are removed. See the installation guide for more details. (#93)
- Structs and arrays are now named properly (#23), and BigQuery functions like `array_agg` no longer run into errors during type conversion. (#22)
- `to_gbq` now uses a load job instead of the streaming API. Remove the `StreamingInsertError` class, as it is no longer used by `to_gbq`. (#7, #75)
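The struct/array fix above means queries that aggregate into arrays now convert cleanly. A hedged sketch of such a query; the project, dataset, and table names are placeholders, and the commented-out call requires BigQuery credentials:

```python
# A standard-SQL query using ARRAY_AGG, which 0.3.0 can now convert
# without type-conversion errors. All table/project names are placeholders.
sql = """
SELECT name, ARRAY_AGG(score) AS scores
FROM `my-project.my_dataset.my_table`
GROUP BY name
"""

# Hypothetical invocation (requires credentials); the dialect keyword is
# an assumption:
# import pandas_gbq
# df = pandas_gbq.read_gbq(sql, project_id="my-project", dialect="standard")
```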