Pandas DataFrame.to_sql randomly and silently fails without error message
I am trying to write several pandas dataframes (all of the same format) to a postgres db, but some of them are, seemingly at random, not written to the db. In those cases to_sql fails silently, returning -1 instead of raising an error.
I don't use any schema, which should rule that issue out as a possible cause, and I am not using SQL Server either. What really puzzles me is that some of these dataframes are written to the db while others are not.
code:
from sqlalchemy import create_engine, DateTime
import psycopg  # psycopg 3, used by the postgresql+psycopg dialect

engine = create_engine('postgresql+psycopg://plantwatch:[email protected]/plantwatch')
df.to_sql('power', con=engine, if_exists='append', index=False, dtype={'produced_at': DateTime})
Example df (one id per dataframe is written to the db) and expected db content:
produced_at id value
2015-01-01 00:00:00 someid 1
2015-01-01 01:00:00 someid 2
2015-01-01 00:00:00 someid2 1
2015-01-01 01:00:00 someid2 2
Actual db content:
produced_at id value
2015-01-01 00:00:00 someid 1
2015-01-01 01:00:00 someid 2
A wacky workaround would be to dump all dataframes to .csv files and import each of them into postgres one by one, but there has to be another way.
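One way to at least detect the failures instead of discovering them later: check what to_sql reports and cross-check the row count in the table after each write. A minimal sketch of that pattern is below; it uses an in-memory SQLite engine so it is self-contained, but the same calls apply unchanged to the postgresql+psycopg engine from the question (the table name 'power' matches the question; note that to_sql's return value is driver-dependent and may be None on older pandas versions).

```python
import pandas as pd
from sqlalchemy import create_engine, text

# Self-contained demo engine; swap in the postgres engine from the question.
engine = create_engine("sqlite://")

df = pd.DataFrame({
    "produced_at": pd.to_datetime(["2015-01-01 00:00:00", "2015-01-01 01:00:00"]),
    "id": ["someid", "someid"],
    "value": [1, 2],
})

# pandas >= 1.4 returns the number of rows affected (driver permitting);
# older versions return None, and some drivers report -1.
written = df.to_sql("power", con=engine, if_exists="append", index=False)

# Cross-check against the table itself rather than trusting a silent return.
with engine.connect() as conn:
    count = conn.execute(text("SELECT COUNT(*) FROM power")).scalar()

if count != len(df):
    raise RuntimeError(f"expected {len(df)} rows in 'power', found {count} "
                       f"(to_sql reported {written!r})")
```

Run in a loop over the dataframes, this turns the silent failure into a loud one, which at least narrows down which writes are being dropped.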