I'm doing a bulk insert of my DataFrame into PostgreSQL like this:
import pandas as pd
import psycopg2
from io import StringIO

def df2db(conn: psycopg2.extensions.connection, df: pd.DataFrame, table: str):
    # Drop the "index" column so the buffer matches the column list below
    df = df.drop(columns=["index"])
    buf = StringIO()
    df.to_csv(buf, sep=',', index=False, header=False)
    buf.seek(0)
    columns = [f'"{col}"' for col in df.columns]
    with conn.cursor() as cur:
        # sep must match the separator used in to_csv above
        # (note: COPY's text format does not understand CSV quoting)
        cur.copy_from(buf, table, sep=',', columns=columns)
    conn.commit()
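For context, I call it roughly like this (connection parameters, the file, and the table name are just placeholders; the "index" column comes from an earlier reset_index()):

conn = psycopg2.connect(host="localhost", dbname="mydb", user="me", password="...")
df = pd.read_csv("data.csv").reset_index()  # reset_index() adds the "index" column dropped above
df2db(conn, df, "my_table")
conn.close()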
I'm getting an error that my data is too long for the database column:
<_io.StringIO object at 0x7f3ae8d10ee0>
value too long for type character varying(12)
I would like to know if there is a way to detect that the DB has thrown an error and, if so, keep inserting the remaining rows without aborting the whole copy_from. Do I really need to loop over all the data and check the sizes myself?
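The only alternative I can think of is to pre-check the lengths before the COPY. A minimal sketch, assuming the offending columns are varchar holding plain strings (truncate_to_fit is my own helper name, not a library function):

def truncate_to_fit(conn: psycopg2.extensions.connection, df: pd.DataFrame, table: str) -> pd.DataFrame:
    # Look up the varchar length limits of the target table
    with conn.cursor() as cur:
        cur.execute(
            """
            SELECT column_name, character_maximum_length
            FROM information_schema.columns
            WHERE table_name = %s
              AND character_maximum_length IS NOT NULL
            """,
            (table,),
        )
        limits = dict(cur.fetchall())
    # Truncate over-long string values so COPY does not fail on them
    for col, max_len in limits.items():
        if col in df.columns:
            df[col] = df[col].str.slice(0, max_len)  # assumes string dtype
    return df

But this silently truncates data and scans every value in Python, which is exactly the overhead I was hoping to avoid.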