Welcome to OStack Knowledge Sharing Community for programmer and developer-Open, Learning and Share

postgresql - ERROR: extra data after last expected column in Postgres

I have a JSON file:

[xyz@innolx20122 ~]$ cat test_cgs.json
{"technology":"AAA","vendor":"XXX","name":"RBNI","temporal_unit":"hour","regional_unit":"cell","dataset_metadata":"{"name": "RBNI", "intervals_epoch_seconds": [[1609941600, 1609945200]], "identifier_column_names": ["CELLID", "CELLNAME", "NETWORK"], "vendor": "XXX", "timestamp_column_name": "COLLECTTIME", "regional_unit": "cell"}","rk":1}

which I am trying to load into the following Postgres table:

 CREATE TABLE temp_test_table
( 
  technology       character varying(255),
  vendor           character varying(255),
  name             character varying(255),
  temporal_unit    character varying(255),
  regional_unit    character varying(255),
  dataset_metadata json,
  rk    character varying(255)
);

and here is my COPY command:

db-state=> copy temp_test_table(technology,vendor,name,temporal_unit,regional_unit,dataset_metadata,rk) FROM '/home/eksinvi/test_cgs.json' WITH CSV DELIMITER ',' quote E'' ESCAPE '';
ERROR:  extra data after last expected column
CONTEXT:  COPY temp_test_table, line 1: "{"technology":"AAA","vendor":"XXX","name":"RBNI","temporal_unit":"hour","regional_unit":"cell","data..."

I also tried loading this file into a BigQuery table, but with no luck:

bq load --autodetect --source_format=NEWLINE_DELIMITED_JSON --allow_quoted_newlines --allow_jagged_rows --ignore_unknown_values test-project:vikrant_test_dataset.cg_test_table "gs://test-bucket-01/test/test_cgs.json"

Either approach would work for me: I want to load this JSON into either a Postgres table or a BigQuery table.
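The likely root cause is that the `dataset_metadata` value embeds a JSON object between plain double quotes, so the inner quotes are unescaped and the line is neither valid JSON (which is why `bq load` with `NEWLINE_DELIMITED_JSON` rejects it) nor cleanly parseable CSV. A minimal sketch checking this with Python's stdlib `json` module, using a shortened copy of the record; the string-replace repair is an illustrative assumption, and regenerating the export with properly escaped metadata would be the robust fix:

```python
import json

# Shortened record from test_cgs.json: the dataset_metadata value wraps
# a JSON object in plain double quotes, so the inner quotes are unescaped
# and the line is not valid JSON.
line = ('{"technology":"AAA","vendor":"XXX",'
        '"dataset_metadata":"{"name": "RBNI", "regional_unit": "cell"}",'
        '"rk":1}')

try:
    json.loads(line)
except json.JSONDecodeError:
    print("invalid JSON")  # this branch is taken

# One possible repair for this shape of record: treat dataset_metadata as
# a nested object instead of a string by dropping the quotes that wrap
# the inner braces. (Naive string replacement; assumes '"{' and '}"'
# occur nowhere else in the record.)
fixed = line.replace('"{', '{').replace('}"', '}')
record = json.loads(fixed)
print(record["dataset_metadata"]["name"])  # -> RBNI
```

Once each line parses as valid JSON, the whole record can be loaded into a single `jsonb` column in Postgres, or the file can be fed to `bq load` again as newline-delimited JSON.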

question from:https://stackoverflow.com/questions/65602905/error-extra-data-after-last-expected-column-in-postgres


1 Answer


I had similar problems. In my case, they were related to NULL columns and the file's encoding. I also had to specify a custom delimiter, because my columns sometimes contained the default delimiter, which made the copy fail.

\copy mytable FROM 'filePath.dat' (DELIMITER E'\t', FORMAT CSV, NULL '', ENCODING 'UTF8');

In my case, I was exporting data to a CSV file from SQL Server and importing it into Postgres. In SQL Server, we had Unicode characters that would show up as "blanks" but would break the copy command. I had to search the SQL table for those characters with regex queries and eliminate the invalid characters. It's an edge case, but it was part of the problem in my case.
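That cleanup step can be sketched in Python before running COPY. The character class below is an assumption (control characters plus a couple of common invisible Unicode characters); adjust it to whatever your source data actually contains:

```python
import re

# Strip control characters (except tab, newline, and carriage return)
# plus zero-width space and BOM: characters that render as "blanks"
# but break COPY. This character class is an assumption; extend it
# for your data.
CONTROL_CHARS = re.compile(r'[\x00-\x08\x0b\x0c\x0e-\x1f\x7f\u200b\ufeff]')

def clean_line(line: str) -> str:
    """Remove invisible characters that would make COPY fail."""
    return CONTROL_CHARS.sub('', line)

print(clean_line('RBNI\u200b\x00cell'))  # -> RBNIcell
```

Running the exported file through `clean_line` line by line, then pointing `\copy` at the cleaned file, avoids hunting the source table for invalid characters by hand.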

