
python - Continue inserting the next rows if a value is too long in PostgreSQL

I'm doing a bulk insert of my dataframe into PostgreSQL like this:

from io import StringIO

import pandas as pd
import psycopg2


def df2db(conn: psycopg2.extensions.connection, df: pd.DataFrame, table: str):
    # Copy every column except 'index', and write the CSV with exactly those columns
    columns = [col for col in df.columns if col != 'index']
    buf = StringIO()
    df[columns].to_csv(buf, sep=',', index=False, header=False)
    buf.seek(0)
    with conn.cursor() as cur:
        # copy_from splits on tabs by default, so pass sep=',' to match the CSV
        cur.copy_from(buf, table, sep=',', columns=[f'"{col}"' for col in columns])
    conn.commit()
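
For reference, I call it roughly like this (the connection parameters and table name here are just placeholders):

conn = psycopg2.connect(host='localhost', dbname='mydb', user='me', password='secret')
df2db(conn, df, 'my_table')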

I'm getting an error that my data is too long for the database column:


<_io.StringIO object at 0x7f3ae8d10ee0>
value too long for type character varying(12)

Is there a way to detect when the database rejects a row and, if so, continue inserting the remaining rows without aborting copy_from? Or do I really need to loop over all the data and check the length of every value myself?
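
One possible workaround, sketched under a few assumptions (the helper name insert_skipping_bad_rows is made up, and psycopg2.errors requires psycopg2 2.8+): COPY is a single statement, so copy_from aborts the whole load at the first rejected row and cannot skip it. Falling back to per-row INSERTs wrapped in savepoints lets a rejected row be rolled back and skipped while the rest of the batch continues:

from psycopg2 import errors, sql

def insert_skipping_bad_rows(conn, df, table):
    # Insert rows one at a time; skip any row PostgreSQL rejects as too long
    columns = [col for col in df.columns if col != 'index']
    insert = sql.SQL('INSERT INTO {} ({}) VALUES ({})').format(
        sql.Identifier(table),
        sql.SQL(', ').join(map(sql.Identifier, columns)),
        sql.SQL(', ').join(sql.Placeholder() * len(columns)),
    )
    skipped = []
    with conn.cursor() as cur:
        # .tolist() converts numpy scalars to plain Python values psycopg2 can adapt
        for row in df[columns].to_numpy().tolist():
            cur.execute('SAVEPOINT before_row')
            try:
                cur.execute(insert, row)
            except errors.StringDataRightTruncation:
                # "value too long for type character varying(n)":
                # undo only this row and carry on with the next one
                cur.execute('ROLLBACK TO SAVEPOINT before_row')
                skipped.append(row)
            cur.execute('RELEASE SAVEPOINT before_row')
    conn.commit()
    return skipped

Row-by-row inserts are much slower than COPY, though, so if the only problem is one over-long text column it may be simpler to truncate it client-side (for example df['some_col'] = df['some_col'].str.slice(0, 12), with a hypothetical column name) and keep using copy_from.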


1 Answer

Waiting for an expert to reply.
