I'm currently load testing one of my APIs (Node.js + Express). This API makes an HTTP request to another server. Here's some example code:
const axios = require('axios')

const start = new Date()
axios.get('https://google.com')
  .then(function (response) {
    // elapsed time in seconds
    const end = (new Date() - start) / 1000
    console.info('Finished in %ds', end)
  })
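For context, this timing code runs inside an Express route handler, roughly like the sketch below (the route path and port are placeholders, not my real ones):

const express = require('express')
const axios = require('axios')
const app = express()

app.get('/proxy', function (req, res) {
  const start = new Date()
  axios.get('https://google.com').then(function (response) {
    // measure only the time spent waiting for the outbound request
    const end = (new Date() - start) / 1000
    console.info('Finished in %ds', end)
    res.send({ elapsed: end })
  })
})

app.listen(3000)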
During the test, I found that the more concurrent HTTP requests I make to the other server (in this example google.com), the slower the response becomes. I use Apache JMeter for the load test.
For example, if I do 1 request in one second:
Finished in 0.150s
But if I do 100 requests in one second:
Finished in 0.320s
...
Finished in 1.190s
Finished in 2.559s
Finished in 1.230s
Finished in 5.530s
At first I thought there must be a problem with the other server, but that is not the case: even after I changed the target to google.com (as in the example above), the same thing happened.
The more outbound HTTP requests Node.js has to make, the slower the responses become. I have tried to improve my API by using the Node.js cluster module; the workers help, but I want to improve the response time even further (my cluster setup is sketched below).
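This is roughly how I set up cluster (again a sketch; './app.js' stands in for the Express app shown above):

const cluster = require('cluster')
const os = require('os')

if (cluster.isMaster) {
  // fork one worker per CPU core
  for (let i = 0; i < os.cpus().length; i++) {
    cluster.fork()
  }
  // replace any worker that dies
  cluster.on('exit', function (worker) {
    console.info('worker %d exited, forking a new one', worker.process.pid)
    cluster.fork()
  })
} else {
  // each worker loads the Express app and listens on the same port
  require('./app.js')
}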
Is there anything I can do, or is there an explanation for why this happens? I thought that since my API makes asynchronous HTTP requests there should be no blocking, so the response time should not increase by such a significant amount.
Thanks.
question from: https://stackoverflow.com/questions/65838938/concurrent-outbound-http-request-in-node-js-makes-the-response-slower