I'm new to MongoDB and want to know if there's a way to send a batched request of size 1K with 50% read and 50% write operations in a single call, similar to the Redis pipeline concept.
I saw some online resources that send batched write requests using bulk_write, but I didn't find any example showing read and write requests together.
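For reference, a write-only batch with PyMongo's bulk_write looks roughly like this (a minimal sketch against the reviews collection I use below; as far as I can tell, bulk_write only takes write operations such as InsertOne):

from pymongo import InsertOne

# Sketch: batch 500 inserts into a single bulk_write round trip.
ops = [InsertOne({'name': 'Example LLC', 'rating': 3, 'cuisine': 'Pizza'})
       for _ in range(500)]
result = db.reviews.bulk_write(ops, ordered=False)
print(result.inserted_count)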
This is my mongo client:
from pymongo import MongoClient
from random import randint
import time
#Step 1: Connect to MongoDB - Note: Change connection string as needed
client = MongoClient("mongodb://192.168.122.50:27017/")
db=client.business
#Step 2: Create sample data
names = ['Kitchen','Animal','State', 'Tastey', 'Big','City','Fish', 'Pizza','Goat', 'Salty','Sandwich','Lazy', 'Fun']
company_type = ['LLC','Inc','Company','Corporation']
company_cuisine = ['Pizza', 'Bar Food', 'Fast Food', 'Italian', 'Mexican', 'American', 'Sushi Bar', 'Vegetarian']
for x in range(1, 501):
    business = {
        'name' : names[randint(0, (len(names)-1))] + ' ' + names[randint(0, (len(names)-1))] + ' ' + company_type[randint(0, (len(company_type)-1))],
        'rating' : randint(1, 5),
        'cuisine' : company_cuisine[randint(0, (len(company_cuisine)-1))]
    }
    #Step 3: Insert business object directly into MongoDB via insert_one
    result = db.reviews.insert_one(business)
cnt = 1
t = 0
#Step 4: Alternate between inserting a random document (odd iterations)
#and reading one back by random rating (even iterations)
while True:
    if cnt % 2 == 1:
        business = {
            'name' : names[randint(0, (len(names)-1))] + ' ' + names[randint(0, (len(names)-1))] + ' ' + company_type[randint(0, (len(company_type)-1))],
            'rating' : randint(1, 5),
            'cuisine' : company_cuisine[randint(0, (len(company_cuisine)-1))]
        }
        result = db.reviews.insert_one(business)
    else:
        rat = randint(1, 5)
        fivestar = db.reviews.find_one({'rating': rat})
        print(fivestar)
    cnt = cnt + 1
    if cnt > 1000:
        break
In the while True section, I want to send a pipelined batch of 1000 requests, 50% inserts and 50% reads. As the code shows, I'm currently inserting into and reading from the database one request at a time by generating random requests; the closest batched split I can think of is sketched below.
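As far as I can tell, PyMongo's bulk_write only accepts write operations (InsertOne, UpdateOne, DeleteOne, and so on), so the sketch below, which reuses the names/company_type/company_cuisine lists and db from above, splits the 1000-request batch into one bulk_write round trip for the 500 inserts and one find round trip covering the 500 random ratings. Is there a true single-call read+write pipeline, or is a two-call split like this the best MongoDB can do?

from pymongo import InsertOne

BATCH = 1000
write_ops = []      # 500 insert operations
read_ratings = []   # 500 random ratings to look up

for i in range(BATCH):
    if i % 2 == 0:
        write_ops.append(InsertOne({
            'name' : names[randint(0, (len(names)-1))] + ' ' + names[randint(0, (len(names)-1))] + ' ' + company_type[randint(0, (len(company_type)-1))],
            'rating' : randint(1, 5),
            'cuisine' : company_cuisine[randint(0, (len(company_cuisine)-1))]
        }))
    else:
        read_ratings.append(randint(1, 5))

# One round trip for all 500 inserts.
db.reviews.bulk_write(write_ops, ordered=False)

# One round trip for the reads: fetch documents matching any of the
# requested ratings instead of issuing 500 separate find_one calls.
for doc in db.reviews.find({'rating': {'$in': list(set(read_ratings))}}):
    print(doc)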