I have a JSON file:
[xyz@innolx20122 ~]$ cat test_cgs.json
{"technology":"AAA","vendor":"XXX","name":"RBNI","temporal_unit":"hour","regional_unit":"cell","dataset_metadata":"{"name": "RBNI", "intervals_epoch_seconds": [[1609941600, 1609945200]], "identifier_column_names": ["CELLID", "CELLNAME", "NETWORK"], "vendor": "XXX", "timestamp_column_name": "COLLECTTIME", "regional_unit": "cell"}","rk":1}
which I am trying to load into the following Postgres table:
CREATE TABLE temp_test_table
(
    technology character varying(255),
    vendor character varying(255),
    name character varying(255),
    temporal_unit character varying(255),
    regional_unit character varying(255),
    dataset_metadata json,
    rk character varying(255)
);
Here is my COPY command:
db-state=> copy temp_test_table(technology,vendor,name,temporal_unit,regional_unit,dataset_metadata,rk) FROM '/home/eksinvi/test_cgs.json' WITH CSV DELIMITER ',' quote E'' ESCAPE '';
ERROR: extra data after last expected column
CONTEXT: COPY temp_test_table, line 1: "{"technology":"AAA","vendor":"XXX","name":"RBNI","temporal_unit":"hour","regional_unit":"cell","data..."
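I think the "extra data" error happens because CSV mode (with quoting effectively disabled) splits the line on every comma inside the JSON, producing far more fields than the table has columns. One approach I would expect to work, once the file is valid JSON, is to COPY each whole line into a single-column staging table and cast from there, roughly like this (untested sketch; temp_raw is a name I made up):

CREATE TABLE temp_raw (line text);
COPY temp_raw FROM '/home/eksinvi/test_cgs_fixed.json';
-- cast each staged line into the real columns
INSERT INTO temp_test_table
SELECT (line::json)->>'technology',
       (line::json)->>'vendor',
       (line::json)->>'name',
       (line::json)->>'temporal_unit',
       (line::json)->>'regional_unit',
       (line::json)->'dataset_metadata',
       (line::json)->>'rk'
FROM temp_raw;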
I also tried loading this file into a BigQuery table, but with no luck:
bq load --autodetect --source_format=NEWLINE_DELIMITED_JSON --allow_quoted_newlines --allow_jagged_rows --ignore_unknown_values test-project:vikrant_test_dataset.cg_test_table "gs://test-bucket-01/test/test_cgs.json"
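For BigQuery, NEWLINE_DELIMITED_JSON needs every line to be valid JSON, so I would expect the same fixed file to load; something along these lines (same bucket and table as above):

gsutil cp test_cgs_fixed.json gs://test-bucket-01/test/
bq load --autodetect --source_format=NEWLINE_DELIMITED_JSON test-project:vikrant_test_dataset.cg_test_table "gs://test-bucket-01/test/test_cgs_fixed.json"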
Either solution would work for me: I want to load this JSON into either a Postgres table or a BigQuery table.