
Insert JSON data into a PostgreSQL table using Python

I need to write automated Python code that creates a database table whose column names are the keys from a JSON file and whose column data are the values of those respective keys. My JSON looks like this:

{
    "Table_1": [
        {
            "Name": "B"
        },
        {
            "BGE3": [
                "Itm2",
                "Itm1",
                "Glass"
            ]
        },
        {
            "Trans": []
        },
        {
            "Art": [
                "SYS"
            ]
        }
    ]
}

My table name should be: Table_1.

So my column name should look like: Name | BGE3 | Trans | Art.

And the data should be its respective values. Creation of the table and columns has to be dynamic, because I need to run this code on multiple JSON files. So far I have managed to connect to the PostgreSQL database using Python. Please help me with a solution. Thank you.

Postgres version 13.

Existing code:

cur.execute("CREATE TABLE Table_1(Name varchar, BGE3 varchar, Trans varchar, Art varchar)") 

for d in data: cur.execute("INSERT into B_Json_3(Name, BGE3, Trans , Art) VALUES (%s, %s, %s, %s,)", d) 

Here data is a list of row tuples I built by hand, and it only works for this particular JSON. I need a function that will work for any JSON I pass in, where the value of any key may be a list with up to 100 elements; a sketch of building such a row from the parsed JSON is shown below.
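A minimal sketch of what such a function could start from, assuming (this is an assumption, not something stated above) that list values are stored as comma-separated text and empty lists as empty strings; flatten_row and the file name table_1.json are illustrative names only:

import json

def flatten_row(entries):
    # entries is the list of one-key dicts under the table name.
    row = []
    for entry in entries:
        value = list(entry.values())[0]
        if isinstance(value, list):
            # Assumption: join list values into text, e.g. ["Itm2", "Itm1"] -> "Itm2, Itm1".
            value = ", ".join(value)
        row.append(value)
    return row

with open("table_1.json") as f:   # hypothetical file holding the JSON above
    doc = json.load(f)

tbl_name = list(doc)[0]                            # 'Table_1'
col_names = [list(d)[0] for d in doc[tbl_name]]    # ['Name', 'BGE3', 'Trans', 'Art']
print(flatten_row(doc[tbl_name]))                  # ['B', 'Itm2, Itm1, Glass', '', 'SYS']

The answer below builds the CREATE TABLE statement dynamically from the same keys.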



1 Answer


The table-creation portion, using the Python json module to convert the JSON into a Python dict and the psycopg2.sql module to dynamically build the CREATE TABLE statement:

import json
import psycopg2
from psycopg2 import sql

tbl_json = """{
"Table_1": [
    {
        "Name": "B"
    }, 
    {
        "BGE3": [
            "Itm2", 
            "Itm1", 
            "Glass"
        ]
    }, 
    {
        "Trans": []
    }, 
    {
        "Art": [
            "SYS"
        ]
    }]}
"""
# Transform JSON string into Python dict. Use json.load if pulling from file.
# Pull out table name and column names from dict.
tbl_dict = json.loads(tbl_json)
tbl_name = list(tbl_dict)[0]
tbl_name 
'Table_1'

col_names = [list(col_dict)[0] for col_dict in tbl_dict[tbl_name]]
# Result of above.
col_names
['Name', 'BGE3', 'Trans', 'Art']

# Create list of types and then combine column names and column types into 
# psycopg2 sql composed object. Warning: sql.SQL() does no escaping so potential 
# injection risk.
type_list = ["varchar"] * len(col_names)   # one type per column, all varchar here
col_type = []
for ident, typ in zip(map(sql.Identifier, col_names), map(sql.SQL, type_list)):
    col_type.append(ident + typ)
# The result of above.
col_type
[Composed([Identifier('Name'), SQL('varchar')]),
 Composed([Identifier('BGE3'), SQL('varchar')]),
 Composed([Identifier('Trans'), SQL('varchar')]),
 Composed([Identifier('Art'), SQL('varchar')])]

# Build psycopg2 sql string using above.
sql_str = sql.SQL("CREATE table {} ({})").format(sql.Identifier(tbl_name), sql.SQL(',').join(col_type) )
con = psycopg2.connect("dbname=test host=localhost user=aklaver")
cur = con.cursor()
# Shows the CREATE statement that will be executed.
print(sql_str.as_string(con))
CREATE table "Table_1" ("Name"varchar,"BGE3"varchar,"Trans"varchar)

# Execute statement and commit.
cur.execute(sql_str)
con.commit()

# In psql client the result of the execute:
\d "Table_1"
                   Table "public.Table_1"
 Column |       Type        | Collation | Nullable | Default 
--------+-------------------+-----------+----------+---------
 Name   | character varying |           |          | 
 BGE3   | character varying |           |          | 
 Trans  | character varying |           |          | 
 Art    | character varying |           |          | 
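
The above covers only the table creation. A minimal sketch of the matching dynamic INSERT, reusing tbl_dict, tbl_name, col_names, con and cur from above, and again assuming (not stated in the question) that list values are stored as comma-separated text:

# Flatten the JSON values into one row, joining list values into text.
row = [", ".join(v) if isinstance(v, list) else v
       for v in (list(d.values())[0] for d in tbl_dict[tbl_name])]
# ['B', 'Itm2, Itm1, Glass', '', 'SYS']

# Build the INSERT dynamically: identifiers via psycopg2.sql, values via %s
# placeholders so they are still passed to execute() as parameters.
insert_str = sql.SQL("INSERT INTO {} ({}) VALUES ({})").format(
    sql.Identifier(tbl_name),
    sql.SQL(", ").join(map(sql.Identifier, col_names)),
    sql.SQL(", ").join(sql.Placeholder() * len(col_names)),
)
print(insert_str.as_string(con))
# INSERT INTO "Table_1" ("Name", "BGE3", "Trans", "Art") VALUES (%s, %s, %s, %s)

cur.execute(insert_str, row)
con.commit()

Wrapping the CREATE TABLE and INSERT steps in one function that takes a parsed dict would then let the same code run over multiple JSON files, as the question asks.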



