I am experimenting with different buffer sizes for inserts into a local SQLite DB and have found that it takes nearly 8 minutes to insert 10,000,000 rows of data with a buffer size of 10,000. In other words, it takes 1,000 writes to store everything.
Eight minutes to store 10,000,000 rows seems a bit too long (or is it?)
Can any of the code below be optimized to increase the speed? Note that the data being inserted is a random collection of characters.
public int flush() throws SQLException {
    String sql = "insert into datastore values (?, ?, ?, ?)";
    // try-with-resources so the statement is closed even if a batch fails
    try (PreparedStatement prep = con.prepareStatement(sql)) {
        for (DatastoreElement e : content) { // content is 10,000 elements long
            _KVPair kvp = e.getKvp();
            prep.setInt(1, e.getMetaHash());
            prep.setInt(2, kvp.hashCode());
            prep.setString(3, kvp.getKey());
            prep.setString(4, kvp.getValue());
            prep.addBatch();
        }
        int[] updateCounts = prep.executeBatch();
        con.commit(); // one transaction per 10,000-row batch
        return errorsWhileInserting(updateCounts);
    }
}
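For context, flush() is driven roughly like this (a simplified sketch; fillBufferWithRandomData is a hypothetical stand-in for my real data source, and autocommit is disabled so each flush() is one transaction):

void writeAll() throws SQLException {
    con.setAutoCommit(false);                 // flush() commits explicitly
    int total = 10_000_000;
    int bufferSize = 10_000;                  // so 1,000 flushes in total
    for (int written = 0; written < total; written += bufferSize) {
        fillBufferWithRandomData(content, bufferSize); // hypothetical helper
        flush();                              // one executeBatch + commit
        content.clear();                      // reuse the buffer
    }
}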
The table is created via
statement.executeUpdate(
    "create table datastore (" +
    "meta_hash INTEGER," +
    "kv_hash INTEGER," +
    "key TEXT," +
    "value TEXT)");
Can any of the above be further optimized?