python requests POST with header and parameters

I have a POST request that I am trying to send with requests in Python, but I get a 403 Forbidden response. The same request works fine through the browser.

POST /ajax-load-system HTTP/1.1
Host: xyz.website.com
Accept: application/json, text/javascript, */*; q=0.01
Accept-Language: en-GB,en;q=0.5
User-Agent: Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:60.0) Gecko/20100101 Firefox/60.0
Referer: http://xyz.website.com/help-me/ZYc5Yn
Content-Type: application/x-www-form-urlencoded; charset=UTF-8
X-Requested-With: XMLHttpRequest
Content-Length: 56
Cookie: csrf_cookie_name=a3f8adecbf11e29c006d9817be96e8d4; ci_session=ba92hlh6o0ns7f20t4bsgjt0uqfdmdtl; _ga=GA1.2.1535910352.1530452604; _gid=GA1.2.1416631165.1530452604; _gat_gtag_UA_21820217_30=1
Connection: close

csrf_test_name=a3f8adecbf11e29c006d9817be96e8d4&vID=9999

What I am trying in Python is:

import requests
import json

url = 'http://xyz.website.com/ajax-load-system'

payload = {
    'Host': 'xyz.website.com',
    'User-Agent': 'Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:60.0) Gecko/20100101 Firefox/60.0',
    'Accept': 'application/json, text/javascript, */*; q=0.01',
    'Accept-Language': 'en-GB,en;q=0.5',
    'Referer': 'http://xyz.website.com/help-me/ZYc5Yn',
    'Content-Type': 'application/x-www-form-urlencoded; charset=UTF-8',
    'X-Requested-With': 'XMLHttpRequest',
    'Content-Length': '56',
    'Cookie': 'csrf_cookie_name=a3f8adecbf11e29c006d9817be96e8d4; ci_session=ba92hlh6o0ns7f20t4bsgjt0uqfdmdtl; _ga=GA1.2.1535910352.1530452604; _gid=GA1.2.1416631165.1530452604; _gat_gtag_UA_21820217_30=1',
    'Connection': 'close',
    'csrf_test_name': 'a3f8adecbf11e29c006d9817be96e8d4',
    'vID': '9999',
}

headers = {}

r = requests.post(url, headers=headers, data=json.dumps(payload))
print(r.status_code)  

But this prints a 403 status code. What am I doing wrong here?

I am expecting a JSON response:

{"status_message":"Thanks for help.","help_count":"141","status":true}


1 Answer


You are confusing headers and payload, and the payload is not JSON encoded.

These are all headers:

Host: xyz.website.com
Accept: application/json, text/javascript, */*; q=0.01
Accept-Language: en-GB,en;q=0.5
User-Agent: Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:60.0) Gecko/20100101 Firefox/60.0
Referer: http://xyz.website.com/help-me/ZYc5Yn
Content-Type: application/x-www-form-urlencoded; charset=UTF-8
X-Requested-With: XMLHttpRequest
Content-Length: 56
Cookie: csrf_cookie_name=a3f8adecbf11e29c006d9817be96e8d4; ci_session=ba92hlh6o0ns7f20t4bsgjt0uqfdmdtl; _ga=GA1.2.1535910352.1530452604; _gid=GA1.2.1416631165.1530452604; _gat_gtag_UA_21820217_30=1
Connection: close

Most of these are automated and don't need to be set manually:

- Host: requests sets this for you based on the URL.
- Accept: set to an acceptable default.
- Accept-Language: rarely needed in these situations.
- Referer: unless using HTTPS, this is often not set or is filtered out for privacy reasons, so sites no longer rely on it being set.
- Content-Type: must actually reflect the contents of your POST (and it is not JSON!), so requests sets it for you depending on how you call it.
- Content-Length: must reflect the actual content length, so it is set by requests, which is in the best position to calculate it.
- Connection: should definitely be handled by the library, as you don't want to prevent it from efficiently reusing connections.
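You can see what requests sends on its own; a quick check (the exact values vary by requests version):

import requests

# Headers that requests applies to every request unless you override them.
# Host and Content-Length are not listed here because they are computed
# when each individual request is prepared.
print(requests.utils.default_headers())
# e.g. {'User-Agent': 'python-requests/2.31.0', 'Accept-Encoding': 'gzip, deflate',
#       'Accept': '*/*', 'Connection': 'keep-alive'}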

At best you could set X-Requested-With and User-Agent, but only if the server would not otherwise accept the request. The Cookie header reflects the cookies the browser holds. Your script can get its own set of cookies from the server by using a requests Session object to make an initial GET request to the URL named in the Referer header (or another suitable URL on the same site); the server then sets cookies on that response, and those are stored in the session for reuse on the POST request. Use that mechanism to get your own CSRF cookie value.
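As a minimal sketch of that mechanism (using the placeholder URL from the question):

import requests

with requests.Session() as session:
    # The initial GET prompts the server to set its cookies, including
    # the CSRF cookie; the Session object stores them automatically.
    session.get('http://xyz.website.com/help-me/ZYc5Yn')
    # Inspect what the server handed out:
    print(session.cookies.get_dict())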

Note the Content-Type header:

Content-Type: application/x-www-form-urlencoded; charset=UTF-8

When you pass a dictionary to the data keyword argument of requests.post(), the library encodes the data to exactly that content type for you.
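You can verify this without touching the network by preparing a request; a small sketch using the values from the question:

import requests

prepared = requests.Request(
    'POST',
    'http://xyz.website.com/ajax-load-system',
    data={'csrf_test_name': 'a3f8adecbf11e29c006d9817be96e8d4', 'vID': '9999'},
).prepare()

print(prepared.headers['Content-Type'])    # application/x-www-form-urlencoded
print(prepared.headers['Content-Length'])  # calculated for you
print(prepared.body)                       # csrf_test_name=...&vID=9999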

The actual payload is

csrf_test_name=a3f8adecbf11e29c006d9817be96e8d4&vID=9999

These are two fields, csrf_test_name and vID, that need to be part of your payload dictionary.

Note that the csrf_test_name value matches the csrf_cookie_name value in the cookies. This is how the site protects itself from cross-site request forgery (CSRF) attacks, where a third party might try to post to the same URL on your behalf; such a third party would not have access to the same cookies, so the request would be rejected. Your code needs to obtain its own, fresh cookie; a proper CSRF implementation limits how long any CSRF cookie can be reused.

So what would at least be needed to make it all work is:

import requests

# *optional*, the site may not care about these. If they *do* care, then
# they care about keeping out automated scripts and could in future 
# raise the stakes and require more 'browser-like' markers. Ask yourself
# if you want to anger the site owners and get into an arms race.
headers = {
    'User-Agent': 'Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:60.0) Gecko/20100101 Firefox/60.0',
    'X-Requested-With': 'XMLHttpRequest',
}

payload = {
    'vID': 9999,
}

url = 'http://xyz.website.com/ajax-load-system'
# the URL from the Referer header, but others at the site would probably
# also work
initial_url = 'http://xyz.website.com/help-me/ZYc5Yn'

with requests.Session() as session:
    # obtain CSRF cookie
    initial_response = session.get(initial_url)
    payload['csrf_test_name'] = session.cookies['csrf_cookie_name']

    # Now actually post with the correct CSRF cookie
    response = session.post(url, headers=headers, data=payload)
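From there, decoding the JSON the question expects is straightforward; a short follow-up, assuming the server replies with the body shown in the question:

    # Still inside the `with` block: fail loudly on HTTP errors, then decode.
    response.raise_for_status()
    data = response.json()
    print(data['status'], data['status_message'], data['help_count'])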

If this still causes issues, you'll need to try the two additional headers, Accept and Accept-Language. Bear in mind that if those are required, the site has already thought long and hard about how to keep automated scrapers out. Consider contacting the owners and asking if they offer an API option instead.

