
python - TypeError: '_SessionRequestContextManager' object is not iterable

From this tutorial on multiple asynchronous GET requests, I copied and ran the following code:

import asyncio  
import aiohttp

def fetch_page(url, idx):  
    url = 'https://yahoo.com'
    response = yield from aiohttp.request('GET', url)

    if response.status == 200:
        print("data fetched successfully for: %d" % idx)
    else:
        print("data fetch failed for: %d" % idx)
        print(response.content, response.status)

def main():  
    url = 'https://yahoo.com'
    urls = [url] * 100

    coros = []
    for idx, url in enumerate(urls):
        coros.append(asyncio.Task(fetch_page(url, idx)))

    yield from asyncio.gather(*coros)

if __name__ == '__main__':  
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())

However, I get the following errors:

RuntimeWarning: coroutine 'ClientSession._request' was never awaited

Unclosed client session

TypeError: '_SessionRequestContextManager' object is not iterable

question from: https://stackoverflow.com/questions/65930078/typeerror-sessionrequestcontextmanager-object-is-not-iterable


1 Answer


In the tutorial you can see "Created 3 years ago", so it is simply outdated. For example, current code uses await instead of yield from.
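
A minimal sketch of the syntax change (the generator-based style shown first is what the tutorial relied on; the @asyncio.coroutine decorator has since been deprecated and removed in recent Python versions):

import asyncio

# Old, generator-based style from the tutorial's era
# (deprecated; removed in recent Python versions):
@asyncio.coroutine
def old_fetch():
    yield from asyncio.sleep(1)

# Current style:
async def new_fetch():
    await asyncio.sleep(1)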

It is better to read the official documentation for aiohttp.

In The aiohttp Request Lifecycle you can see a similar example with fetch() and main():

import aiohttp
import asyncio

async def fetch(session, url):
    async with session.get(url) as response:
        return await response.text()

async def main():
    async with aiohttp.ClientSession() as session:
        html = await fetch(session, 'http://python.org')
        print(html)

loop = asyncio.get_event_loop()
loop.run_until_complete(main())

Adapted to your example, it could look like this:

import aiohttp
import asyncio

async def fetch(url, idx):
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:

            if response.status == 200:
                print("data fetched successfully for:", idx)
                #print(await response.text(), response.status)
            else:
                print("data fetch failed for:", idx)
                print(await response.text(), response.status)

async def main():
    url = 'https://yahoo.com'
    urls = [url] * 10

    for idx, url in enumerate(urls):
        await fetch(url, idx)

if __name__ == '__main__':  
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
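
As a side note, the aiohttp documentation recommends creating one ClientSession and reusing it for all requests rather than opening a new session inside every fetch(). A minimal sketch of that variant (passing the session into fetch() is my change, not part of the answer above):

import aiohttp
import asyncio

async def fetch(session, url, idx):
    # every call reuses the shared session's connection pool
    async with session.get(url) as response:
        if response.status == 200:
            print("data fetched successfully for:", idx)
        else:
            print("data fetch failed for:", idx)
            print(await response.text(), response.status)

async def main():
    url = 'https://yahoo.com'
    urls = [url] * 10

    # one session for the whole run
    async with aiohttp.ClientSession() as session:
        for idx, url in enumerate(urls):
            await fetch(session, url, idx)

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())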

Or, without an explicit session:

import asyncio  
import aiohttp

async def fetch_page(url, idx):  
    async with aiohttp.request('GET', url) as response:
        if response.status == 200:
            print("data fetched successfully for:", idx)
        else:
            print("data fetch failed for:", idx)
            print(response.content, response.status)

async def main():  
    url = 'https://yahoo.com'
    urls = [url] * 100

    for idx, url in enumerate(urls):
        await fetch_page(url, idx)

if __name__ == '__main__':  
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
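
Finally, both loops above await the requests one at a time. If you want them to actually run concurrently, as the original tutorial intended, you could schedule them with asyncio.gather(). A sketch under that assumption, reusing the fetch_page() from the last example (with 10 URLs to keep it small):

import asyncio
import aiohttp

async def fetch_page(url, idx):
    async with aiohttp.request('GET', url) as response:
        if response.status == 200:
            print("data fetched successfully for:", idx)
        else:
            print("data fetch failed for:", idx)

async def main():
    url = 'https://yahoo.com'
    urls = [url] * 10

    # schedule all coroutines at once instead of awaiting them one by one
    await asyncio.gather(*(fetch_page(url, idx) for idx, url in enumerate(urls)))

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())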
