There is a very good tool that imports tables into Postgres from a CSV file. It is a command-line tool called pgfutter (with binaries for Windows, Linux, etc.). One of its big advantages is that it recognizes the attribute/column names as well.
Using the tool is simple. For example, if you'd like to import myCSVfile.csv:
pgfutter --db "myDatabase" --port "5432" --user "postgres" --pw "mySecretPassword" csv myCSVfile.csv
This will create a table (called myCSVfile) with the column names taken from the CSV file's header. Additionally, the data types will be inferred from the existing data.
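To check what was actually created, you can query information_schema afterwards. A minimal sketch, assuming the table landed in the public schema and that the tool may have lowercased/sanitized the name to mycsvfile (verify the exact name in your database):

select column_name, data_type
from information_schema.columns
where table_schema = 'public' and table_name = 'mycsvfile'
order by ordinal_position;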
A few notes: the command pgfutter varies depending on the binary you use, e.g. it could be pgfutter_windows_amd64.exe (rename it if you intend to use this command frequently). The above command has to be executed in a command-line window (e.g. in Windows, run cmd and make sure pgfutter is accessible). If you'd like a different table name, add --table "myTable"; to select a particular database schema, use --schema "mySchema". In case you are accessing an external database, use --host "myHostDomain".
A more elaborate example of pgfutter to import myFile.csv into myTable is this one:
pgfutter --host "localhost" --port "5432" --db "myDB" --schema "public" --table "myTable" --user "postgres" --pw "myPwd" csv myFile.csv
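If you have several CSV files to load, the same command can be wrapped in a small shell loop. A sketch for a Unix shell, using only the flags shown above; --table is omitted so each file gets a table named after it:

for f in *.csv; do
    pgfutter --host "localhost" --port "5432" --db "myDB" --schema "public" --user "postgres" --pw "myPwd" csv "$f"
done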
Most likely you will change a few data types (from text to numeric) after the import:
alter table myTable
alter column myColumn type numeric
using (trim(myColumn)::numeric);
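Note that the cast above will fail if the column contains empty strings. A common variant (plain Postgres, not specific to pgfutter) converts empty strings to NULL first:

alter table myTable
alter column myColumn type numeric
using (nullif(trim(myColumn), '')::numeric);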