aiohttp package

All public names from the submodules client, connector, errors, parsers, protocol, server, utils, websocket, worker and wsgi are exported into the aiohttp namespace.

aiohttp.client module

HTTP Client for asyncio.

aiohttp.client.request(method, url, *, params=None, data=None, headers=None, cookies=None, files=None, auth=None, allow_redirects=True, max_redirects=10, encoding='utf-8', version=HttpVersion(major=1, minor=1), compress=None, chunked=None, expect100=False, connector=None, loop=None, read_until_eof=True, request_class=None, response_class=None)[source]

Constructs and sends a request. Returns response object.

Parameters:
  • method (str) – http method
  • url (str) – request url
  • params – (optional) Dictionary or bytes to be sent in the query string of the new request
  • data – (optional) Dictionary, bytes, or file-like object to send in the body of the request
  • headers (dict) – (optional) Dictionary of HTTP Headers to send with the request
  • cookies (dict) – (optional) Dict of cookies to send with the request
  • auth (aiohttp.helpers.BasicAuth) – (optional) BasicAuth named tuple that represents HTTP Basic Auth
  • allow_redirects (bool) – (optional) Set to True if POST/PUT/DELETE redirect following is allowed.
  • version (aiohttp.protocol.HttpVersion) – Request http version.
  • compress (bool) – Set to True if request has to be compressed with deflate encoding.
  • chunked (bool or int) – Set to chunk size for chunked transfer encoding.
  • expect100 (bool) – Expect 100-continue response from server.
  • connector (aiohttp.connector.BaseConnector) – BaseConnector sub-class instance to support connection pooling and session cookies.
  • read_until_eof (bool) – Read response until eof if response does not have Content-Length header.
  • request_class – (optional) Custom Request class implementation.
  • response_class – (optional) Custom Response class implementation.
  • loop – Optional event loop.

Usage:

>>> import aiohttp
>>> resp = yield from aiohttp.request('GET', 'http://python.org/')
>>> resp
<ClientResponse(python.org/) [200]>
>>> data = yield from resp.read()
class aiohttp.client.HttpClient(hosts, *, method=None, path=None, ssl=False, conn_pool=True, conn_timeout=None, failed_timeout=5.0, resolve=True, resolve_timeout=360.0, keepalive_timeout=30, verify_ssl=True, loop=None)[source]

Bases: builtins.object

Allows using multiple hosts with the same path, and automatically marks failed hosts.

request(method=None, path=None, *, params=None, data=None, headers=None, cookies=None, files=None, auth=None, allow_redirects=True, max_redirects=10, encoding='utf-8', version=HttpVersion(major=1, minor=1), compress=None, chunked=None, expect100=False, read_until_eof=True)[source]

aiohttp.connector module

class aiohttp.connector.BaseConnector(*, conn_timeout=None, keepalive_timeout=30, share_cookies=False, force_close=False, loop=None)[source]

Bases: builtins.object

Base connector class.

Parameters:
  • conn_timeout – (optional) Connect timeout.
  • keepalive_timeout – (optional) Keep-alive timeout.
  • share_cookies (bool) – Set to True to keep cookies between requests.
  • force_close (bool) – Set to True to force close and reconnect after each request (and between redirects).
  • loop – Optional event loop.
close()[source]

Close all opened transports.

connect(req)[source]

Get from pool or create new connection.

update_cookies(cookies)[source]

Update shared cookies.

class aiohttp.connector.TCPConnector(*args, verify_ssl=True, resolve=False, family=<AddressFamily.AF_INET: 2>, **kwargs)[source]

Bases: aiohttp.connector.BaseConnector

TCP connector.

Parameters:
  • verify_ssl (bool) – Set to True to check ssl certifications.
  • resolve (bool) – Set to True to do DNS lookup for host name.
  • family – socket address family
  • args – see BaseConnector
  • kwargs – see BaseConnector
clear_resolved_hosts(host=None, port=None)[source]

Remove specified host/port or clear all resolve cache.

family[source]

Socket family like AF_INET.

resolve[source]

Do DNS lookup for host name?

resolved_hosts[source]

The dict of (host, port) -> (ipaddr, port) pairs.

verify_ssl[source]

Do check for ssl certifications?

class aiohttp.connector.ProxyConnector(proxy, *args, proxy_auth=None, **kwargs)[source]

Bases: aiohttp.connector.TCPConnector

Http Proxy connector.

Parameters:
  • proxy (str) – Proxy URL.
  • proxy_auth – (optional) Proxy authentication.

Usage:

>>> conn = ProxyConnector(proxy="http://some.proxy.com")
>>> resp = yield from request('GET', 'http://python.org', connector=conn)
proxy[source]

Proxy URL.

class aiohttp.connector.UnixConnector(path, *args, **kw)[source]

Bases: aiohttp.connector.BaseConnector

Unix socket connector.

Parameters:
  • path (str) – Path to unix socket.

Usage:

>>> conn = UnixConnector(path='/path/to/socket')
>>> resp = yield from request('GET', 'http://python.org', connector=conn)
path[source]

Path to unix socket.

aiohttp.connector.SocketConnector

Alias of TCPConnector.

Note

Kept for backward compatibility; may be deprecated in the future.


aiohttp.connector.UnixSocketConnector

Alias of UnixConnector.

Note

Kept for backward compatibility; may be deprecated in the future.


aiohttp.errors module

http related errors.

exception aiohttp.errors.HttpException[source]

Bases: builtins.Exception

Base http exception class.

code = None
headers = ()
message = ''
exception aiohttp.errors.HttpErrorException(code, message='', headers=None)[source]

Bases: aiohttp.errors.HttpException

Http error.

Shortcut for raising http errors with custom code, message and headers.

Parameters:
  • code (int) – HTTP Error code.
  • message (str) – (optional) Error message.
  • headers (list of tuple) – (optional) Headers to be sent in response.
exception aiohttp.errors.HttpBadRequest[source]

Bases: aiohttp.errors.HttpException

code = 400
message = 'Bad Request'
exception aiohttp.errors.HttpMethodNotAllowed[source]

Bases: aiohttp.errors.HttpException

code = 405
message = 'Method Not Allowed'
exception aiohttp.errors.IncompleteRead(partial, expected=None)[source]

Bases: aiohttp.errors.ConnectionError

exception aiohttp.errors.BadStatusLine(line='')[source]

Bases: aiohttp.errors.HttpBadRequest

exception aiohttp.errors.LineTooLong(line, limit='Unknown')[source]

Bases: aiohttp.errors.HttpBadRequest

exception aiohttp.errors.InvalidHeader(hdr)[source]

Bases: aiohttp.errors.HttpBadRequest

exception aiohttp.errors.ConnectionError[source]

Bases: builtins.Exception

http connection error.

exception aiohttp.errors.OsConnectionError[source]

Bases: aiohttp.errors.ConnectionError

OSError error.

exception aiohttp.errors.ClientConnectionError[source]

Bases: aiohttp.errors.ConnectionError

Client connection error.

exception aiohttp.errors.TimeoutError

Bases: concurrent.futures._base.Error

The operation exceeded the given deadline.

exception aiohttp.errors.ProxyConnectionError[source]

Bases: aiohttp.errors.ClientConnectionError

Proxy connection error.

Raised in aiohttp.connector.ProxyConnector if connection to proxy can not be established.

exception aiohttp.errors.HttpProxyError(code, message='', headers=None)[source]

Bases: aiohttp.errors.HttpErrorException

Http proxy error.

Raised in aiohttp.connector.ProxyConnector if proxy responds with status other than 200 OK on CONNECT request.

aiohttp.helpers module

Various helper functions

class aiohttp.helpers.BasicAuth[source]

Bases: aiohttp.helpers.BasicAuth

Http basic authentication helper.

Parameters:
  • login (str) – Login
  • password (str) – Password
  • encoding (str) – (optional) encoding (‘latin1’ by default)
encode()[source]

Encode credentials.
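
The value produced by encode() can be sketched with the standard library; encode_basic_auth below is an illustrative helper (not aiohttp API) showing the standard Basic auth construction:

```python
import base64

# Illustrative sketch (not aiohttp API): Basic auth credentials are the
# base64 encoding of "login:password", prefixed with "Basic " (RFC 2617).
def encode_basic_auth(login, password, encoding='latin1'):
    creds = ('%s:%s' % (login, password)).encode(encoding)
    return 'Basic %s' % base64.b64encode(creds).decode(encoding)

print(encode_basic_auth('user', 'pass'))  # Basic dXNlcjpwYXNz
```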

class aiohttp.helpers.FormData(fields)[source]

Bases: builtins.object

Helper class for multipart/form-data and application/x-www-form-urlencoded body generation.

add_field(name, value, contenttype=None, filename=None)[source]
add_fields(*fields)[source]
contenttype[source]
gen_form_data(encoding='utf-8', chunk_size=8196)[source]

Encode a list of fields using the multipart/form-data MIME format

gen_form_urlencoded(encoding)[source]
is_form_data()[source]
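
The urlencoded body format can be sketched with the standard library; this illustrates the output that gen_form_urlencoded() produces, not the FormData implementation itself (field names and values here are made up):

```python
from urllib.parse import urlencode

# Sketch of an application/x-www-form-urlencoded body, the format produced
# for plain fields; file uploads use multipart/form-data instead.
fields = [('name', 'John Doe'), ('lang', 'python')]
body = urlencode(fields).encode('utf-8')
print(body)  # b'name=John+Doe&lang=python'
```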
aiohttp.helpers.parse_mimetype(mimetype)[source]

Parses a MIME type into its components.

Parameters: mimetype (str) – MIME type
Returns: 4 element tuple for MIME type, subtype, suffix and parameters
Return type: tuple

Example:

>>> parse_mimetype('text/html; charset=utf-8')
('text', 'html', '', {'charset': 'utf-8'})
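
A simplified reimplementation of the parsing logic, matching the documented 4-tuple result (a sketch for illustration, not the aiohttp source):

```python
# Simplified sketch of parse_mimetype(): split off parameters, then the
# type/subtype pair, then an optional "+suffix" (e.g. application/hal+json).
def parse_mimetype(mimetype):
    if not mimetype:
        return '', '', '', {}
    parts = mimetype.split(';')
    params = {}
    for item in parts[1:]:
        if '=' in item:
            key, value = item.split('=', 1)
            params[key.strip().lower()] = value.strip()
    fulltype = parts[0].strip().lower()
    mtype, _, stype = fulltype.partition('/')
    stype, _, suffix = stype.partition('+')
    return mtype, stype, suffix, params

print(parse_mimetype('text/html; charset=utf-8'))
# ('text', 'html', '', {'charset': 'utf-8'})
```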

aiohttp.parsers module

Parser is a generator function (NOT coroutine).

Parser receives data with the generator's send() method and passes parsed data to a destination DataQueue. The parser call receives ParserBuffer and DataQueue objects as parameters; all subsequent send() calls should send bytes objects. The parser sends each parsed term to the destination buffer with the DataQueue.feed_data() method. The DataQueue object should implement two methods: feed_data(), which the parser uses to send parsed protocol data, and feed_eof(), which the parser uses to indicate the end of the parsing stream. To indicate the end of the incoming data stream, an EofStream exception should be thrown into the parser. The parser can raise exceptions.

There are three stages:

  • Data flow chain:

    1. Application creates StreamParser object for storing incoming data.

    2. StreamParser creates ParserBuffer as internal data buffer.

    3. Application creates a parser and sets it into the stream buffer:

       parser = HttpRequestParser()
       data_queue = stream.set_parser(parser)

    4. At this stage StreamParser creates a DataQueue object and passes it and the internal buffer into the parser as arguments:

       def set_parser(self, parser):
           output = DataQueue()
           self.p = parser(output, self._input)
           return output

    5. Application waits for data on output.read():

       while True:
           msg = yield from output.read()
           ...

  • Data flow:

    1. asyncio’s transport reads data from socket and sends data to protocol with data_received() call.
    2. Protocol sends data to StreamParser with feed_data() call.
    3. StreamParser sends data into parser with generator’s send() method.
    4. Parser processes incoming data and sends parsed data to DataQueue with feed_data().
    5. Application receives parsed data from DataQueue.read().
  • Eof:

    1. StreamParser receives eof with feed_eof() call.
    2. StreamParser throws EofStream exception into parser.
    3. Then it unsets parser.
socket -> transport -> "protocol" -> StreamParser -> "parser" -> DataQueue <- "application"
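
The contract described above (a plain generator primed and fed bytes via send(), writing parsed terms to a destination that implements feed_data()/feed_eof()) can be demonstrated with a self-contained lines parser. OutQueue below is a minimal stand-in for DataQueue, not the aiohttp class:

```python
# Self-contained sketch of the parser contract: OutQueue stands in for
# aiohttp's DataQueue; EofStream mirrors the exception thrown into the
# parser to signal the end of the incoming stream.
class EofStream(Exception):
    pass

class OutQueue:
    def __init__(self):
        self.items = []
        self.eof = False
    def feed_data(self, data):      # parser delivers parsed terms here
        self.items.append(data)
    def feed_eof(self):             # parser signals end of parsing stream
        self.eof = True

def lines_parser(out, buf):
    """Generator parser: split incoming bytes into newline-terminated lines."""
    try:
        while True:
            buf.extend((yield))     # suspend until the next chunk is sent in
            while b'\n' in buf:
                line, _, rest = bytes(buf).partition(b'\n')
                out.feed_data(line + b'\n')
                buf[:] = rest
    except EofStream:
        out.feed_eof()

out, buf = OutQueue(), bytearray()
parser = lines_parser(out, buf)
next(parser)                        # prime the generator
parser.send(b'first li')
parser.send(b'ne\nsecond line\n')
try:
    parser.throw(EofStream)         # signal eof; the generator then finishes
except StopIteration:
    pass
print(out.items)                    # [b'first line\n', b'second line\n']
```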
exception aiohttp.parsers.EofStream

Bases: builtins.Exception

eof stream indication.

class aiohttp.parsers.StreamParser(*, loop=None, buf=None, paused=True, limit=65536)[source]

Bases: asyncio.streams.StreamReader

StreamParser manages incoming bytes stream and protocol parsers.

StreamParser uses ParserBuffer as internal buffer.

set_parser() sets current parser, it creates DataQueue object and sends ParserBuffer and DataQueue into parser generator.

unset_parser() sends EofStream into parser and then removes it.

at_eof()[source]
exception()[source]
feed_data(data)[source]

send data to current parser or store in buffer.

feed_eof()[source]

send eof to all parsers, recursively.

output[source]
pause_stream()[source]
resume_stream()[source]
set_exception(exc)[source]
set_parser(parser, output=None)[source]

set parser to stream. return parser’s DataQueue.

set_transport(transport)[source]
unset_parser()[source]

unset parser, send eof to the parser and then remove it.

class aiohttp.parsers.StreamProtocol(*, loop=None, **kwargs)[source]

Bases: asyncio.streams.FlowControlMixin, asyncio.protocols.Protocol

Helper class to adapt between Protocol and StreamReader.

connection_lost(exc)[source]
connection_made(transport)[source]
data_received(data)[source]
eof_received()[source]
is_connected()[source]
class aiohttp.parsers.ParserBuffer(*args, limit=16384)[source]

Bases: builtins.bytearray

ParserBuffer is a bytearray extension.

ParserBuffer provides helper methods for parsers.

exception()[source]
feed_data(data)[source]
read(size)[source]

read() reads specified amount of bytes.

readsome(size=None)[source]

Reads at most size bytes.

readuntil(stop, limit=None)[source]
set_exception(exc)[source]
skip(size)[source]

skip() skips specified amount of bytes.

skipuntil(stop)[source]

skipuntil() skips bytes until the stop bytes sequence is found.

wait(size)[source]

wait() waits for specified amount of bytes then returns data without changing internal buffer.

waituntil(stop, limit=None)[source]

waituntil() waits until the stop bytes sequence is found, then returns data without changing the internal buffer.

class aiohttp.parsers.LinesParser(limit=65536)[source]

Bases: builtins.object

Lines parser.

Lines parser splits a bytes stream into chunks of data, each chunk ending with a \n symbol.

class aiohttp.parsers.ChunksParser(size=8196)[source]

Bases: builtins.object

Chunks parser.

Chunks parser splits a bytes stream into chunks of the specified size.

aiohttp.protocol module

Http related parsers and protocol.

class aiohttp.protocol.HttpMessage(transport, version, close)[source]

Bases: builtins.object

HttpMessage allows writing headers and payload to a stream.

For example, let's say we want to read a file, compress it with deflate compression, and then send it with chunked transfer encoding; the code may look like this:

>>> response = aiohttp.Response(transport, 200)

We have to use deflate compression first:

>>> response.add_compression_filter('deflate')

Then we want to split output stream into chunks of 1024 bytes size:

>>> response.add_chunking_filter(1024)

We can add headers to response with add_headers() method. add_headers() does not send data to transport, send_headers() sends request/response line and then sends headers:

>>> response.add_headers(
...     ('Content-Disposition', 'attachment; filename="..."'))
>>> response.send_headers()

Now we can use the chunked writer to write the stream to a network stream. The first call to the write() method sends the response status line and headers; the add_header() and add_headers() methods are unavailable at this stage:

>>> with open('...', 'rb') as f:
...     chunk = f.read(8196)
...     while chunk:
...         response.write(chunk)
...         chunk = f.read(8196)
>>> response.write_eof()
HOP_HEADERS = None
SERVER_SOFTWARE = 'Python/3.4 aiohttp/0.9.2'
add_chunking_filter(chunk_size=16384)[source]

Split incoming stream into chunks.

add_compression_filter(encoding='deflate')[source]

Compress incoming stream with deflate or gzip encoding.
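
The deflate case can be illustrated with the standard library's zlib module. This is a sketch of the compression being applied, not the filter implementation; whether the stream is zlib-wrapped or raw deflate is an implementation detail of the filter:

```python
import zlib

# Sketch of deflate compression on a payload: a repetitive payload
# compresses well and round-trips losslessly through decompression.
payload = b'x' * 1024
compressed = zlib.compress(payload)
assert zlib.decompress(compressed) == payload
print(len(payload), '->', len(compressed))
```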

add_header(name, value)[source]

Analyzes headers: calculates content length, removes hop-by-hop headers, etc.

add_headers(*headers)[source]

Adds headers to a http message.

filter = None
force_chunked()[source]
force_close()[source]
is_headers_sent()[source]
keep_alive()[source]
send_headers()[source]

Writes headers to a stream. Constructs payload writer.

status = None
status_line = b''
upgrade = False
websocket = False
write(chunk)[source]

Writes chunk of data to a stream by using different writers.

The writer uses a filter to modify each chunk of data. write_eof() indicates the end of the stream; the writer can't be used after write_eof() has been called. write() returns a drain future.

write_eof()[source]
writer = None
class aiohttp.protocol.Request(transport, method, path, http_version=HttpVersion(major=1, minor=1), close=False)[source]

Bases: aiohttp.protocol.HttpMessage

HOP_HEADERS = ()
class aiohttp.protocol.Response(transport, status, http_version=HttpVersion(major=1, minor=1), close=False)[source]

Bases: aiohttp.protocol.HttpMessage

Create http response message.

Transport is a socket stream transport. status is a response status code and has to be an integer value. http_version is a tuple that represents the http version: (1, 0) stands for HTTP/1.0 and (1, 1) for HTTP/1.1.

HOP_HEADERS = {'TRANSFER-ENCODING', 'TRAILERS', 'PROXY-AUTHENTICATE', 'DATE', 'TE', 'SERVER', 'PROXY-AUTHORIZATION', 'CONNECTION', 'UPGRADE', 'KEEP-ALIVE'}
class aiohttp.protocol.HttpVersion

Bases: builtins.tuple

HttpVersion(major, minor)

major

Alias for field number 0

minor

Alias for field number 1
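
Since HttpVersion is a plain namedtuple, it supports attribute access and ordinary tuple comparison; a stand-in built with collections.namedtuple behaves the same way:

```python
from collections import namedtuple

# Stand-in for aiohttp.protocol.HttpVersion (a plain namedtuple).
HttpVersion = namedtuple('HttpVersion', ['major', 'minor'])

v11 = HttpVersion(major=1, minor=1)
print(v11.major, v11.minor)      # 1 1
print(v11 == (1, 1))             # True: compares as a tuple
print(HttpVersion(1, 0) < v11)   # True: HTTP/1.0 sorts before HTTP/1.1
```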

class aiohttp.protocol.RawRequestMessage

Bases: builtins.tuple

RawRequestMessage(method, path, version, headers, should_close, compression)

compression

Alias for field number 5

headers

Alias for field number 3

method

Alias for field number 0

path

Alias for field number 1

should_close

Alias for field number 4

version

Alias for field number 2

class aiohttp.protocol.RawResponseMessage

Bases: builtins.tuple

RawResponseMessage(version, code, reason, headers, should_close, compression)

code

Alias for field number 1

compression

Alias for field number 5

headers

Alias for field number 3

reason

Alias for field number 2

should_close

Alias for field number 4

version

Alias for field number 0

class aiohttp.protocol.HttpPrefixParser(allowed_methods=())[source]

Bases: builtins.object

Waits for ‘HTTP’ prefix (non-destructive).

class aiohttp.protocol.HttpRequestParser(max_line_size=8190, max_headers=32768, max_field_size=8190)[source]

Bases: aiohttp.protocol.HttpParser

Read request status line. Exception errors.BadStatusLine could be raised in case of any errors in status line. Returns RawRequestMessage.

class aiohttp.protocol.HttpResponseParser(max_line_size=8190, max_headers=32768, max_field_size=8190)[source]

Bases: aiohttp.protocol.HttpParser

Read response status line and headers.

BadStatusLine could be raised in case of any errors in status line. Returns RawResponseMessage

class aiohttp.protocol.HttpPayloadParser(message, length=None, compression=True, readall=False, response_with_body=True)[source]

Bases: builtins.object

parse_chunked_payload(out, buf)[source]

Chunked transfer encoding parser.

parse_eof_payload(out, buf)[source]

Read all bytes until eof.

parse_length_payload(out, buf, length=0)[source]

Read specified amount of bytes.

aiohttp.server module

simple http server.

class aiohttp.server.ServerHttpProtocol(*, loop=None, keep_alive=None, timeout=15, tcp_keepalive=True, allowed_methods=(), debug=False, log=<logging.Logger object at 0x7ffe03a24c88>, access_log=<logging.Logger object at 0x7ffe03cdd748>, access_log_format='%(h)s %(l)s %(u)s %(t)s "%(r)s" %(s)s %(b)s "%(f)s" "%(a)s"', **kwargs)[source]

Bases: aiohttp.parsers.StreamProtocol

Simple http protocol implementation.

ServerHttpProtocol handles incoming http requests. It reads the request line, request headers and request payload, then calls the handle_request() method. By default it always returns a 404 response.

ServerHttpProtocol handles errors in the incoming request, like a bad status line, bad headers or incomplete payload. If any error occurs, the connection gets closed.

Parameters:
  • keep_alive (int or None) – number of seconds before closing keep-alive connection
  • timeout (int) – slow request timeout
  • tcp_keepalive (bool) – TCP socket keep-alive flag
  • allowed_methods (tuple) – (optional) List of allowed request methods. Set to empty list to allow all methods.
  • debug (bool) – enable debug mode
  • log (aiohttp.log.server_log) – custom logging object
  • access_log (aiohttp.log.server_log) – custom logging object
  • access_log_format (str) – access log format string
  • loop – Optional event loop
cancel_slow_request()[source]
closing()[source]

Worker process is about to exit; we need to clean up everything and stop accepting requests. It is especially important for keep-alive connections.

connection_lost(exc)[source]
connection_made(transport)[source]
handle_error(status=500, message=None, payload=None, exc=None, headers=None)[source]

Handle errors.

Returns http response with specific status code. Logs additional information. It always closes current connection.

handle_request(message, payload)[source]

Handle a single http request.

Subclass should override this method. By default it always returns 404 response.

keep_alive(val)[source]

Set keep-alive connection mode.

Parameters: val (bool) – new state.
log_access(message, environ, response, time)[source]
log_debug(*args, **kw)[source]
log_exception(*args, **kw)[source]
start()[source]

Start processing of incoming requests.

It reads the request line, request headers and request payload, then calls the handle_request() method. Subclasses have to override handle_request(). start() handles various exceptions in request or response handling. The connection is always closed unless keep_alive(True) is specified.

aiohttp.streams module

exception aiohttp.streams.EofStream[source]

Bases: builtins.Exception

eof stream indication.

class aiohttp.streams.StreamReader(limit=65536, loop=None)[source]

Bases: asyncio.streams.StreamReader

at_eof()[source]

Return True if the buffer is empty and ‘feed_eof’ was called.

exception()[source]
feed_data(data)[source]
feed_eof()[source]
is_eof()[source]

Return True if ‘feed_eof’ was called.

read(n=-1)[source]
read_nowait()[source]
readany()[source]
readexactly(n)[source]
readline()[source]
set_exception(exc)[source]
total_bytes = 0
wait_eof()[source]
class aiohttp.streams.DataQueue(*, loop=None)[source]

Bases: builtins.object

DataQueue is a general-purpose blocking queue with one reader.

at_eof()[source]
exception()[source]
feed_data(data)[source]
feed_eof()[source]
is_eof()[source]
read()[source]
set_exception(exc)[source]
class aiohttp.streams.ChunksQueue(*, loop=None)[source]

Bases: aiohttp.streams.DataQueue

Like a DataQueue, but for binary chunked data transfer.

read()[source]
readany()
class aiohttp.streams.FlowControlStreamReader(stream, *args, **kwargs)[source]

Bases: aiohttp.streams.StreamReader

read(n=-1)[source]
readany()[source]
readexactly(n)[source]
readline()[source]
class aiohttp.streams.FlowControlDataQueue(stream, *, loop=None)[source]

Bases: aiohttp.streams.DataQueue

FlowControlDataQueue resumes and pauses an underlying stream.

It is a destination for parsed data.

read()[source]
class aiohttp.streams.FlowControlChunksQueue(stream, *, loop=None)[source]

Bases: aiohttp.streams.FlowControlDataQueue, aiohttp.streams.ChunksQueue

FlowControlChunksQueue resumes and pauses an underlying stream.

readany()

aiohttp.websocket module

WebSocket protocol versions 13 and 8.

aiohttp.websocket.WebSocketParser(out, buf)[source]
class aiohttp.websocket.WebSocketWriter(writer)[source]

Bases: builtins.object

close(code=1000, message=b'')[source]

Close the websocket, sending the specified code and message.

ping()[source]

Send ping message.

pong()[source]

Send pong message.

send(message, binary=False)[source]

Send a frame over the websocket with message as its payload.

aiohttp.websocket.do_handshake(method, headers, transport, protocols=())[source]

Prepare WebSocket handshake. It returns the http response code, response headers, websocket parser and websocket writer. It does not perform any IO.

protocols is a sequence of known protocols. On successful handshake, the returned response headers contain the first protocol in this list which the server also knows.
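
The core of the handshake that do_handshake() prepares is the Sec-WebSocket-Accept computation defined by RFC 6455: append a fixed GUID to the client's Sec-WebSocket-Key, hash with SHA-1 and base64-encode. A sketch (accept_key is an illustrative helper, not aiohttp API):

```python
import base64
import hashlib

# Sec-WebSocket-Accept derivation from RFC 6455; the GUID is fixed by the
# specification, and the sample key/value pair below comes from the RFC.
WS_GUID = '258EAFA5-E914-47DA-95CA-C5AB0DC85B11'

def accept_key(sec_websocket_key):
    digest = hashlib.sha1((sec_websocket_key + WS_GUID).encode('ascii')).digest()
    return base64.b64encode(digest).decode('ascii')

print(accept_key('dGhlIHNhbXBsZSBub25jZQ=='))  # s3pPLMBiTxaQ9kYGzzhZRbK+xOo=
```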

class aiohttp.websocket.Message

Bases: builtins.tuple

Message(tp, data, extra)

data

Alias for field number 1

extra

Alias for field number 2

tp

Alias for field number 0

exception aiohttp.websocket.WebSocketError[source]

Bases: builtins.Exception

WebSocket protocol parser error.

aiohttp.worker module

aiohttp.wsgi module

wsgi server.

TODO:
  • proxy protocol
  • x-forward security
  • wsgi file support (os.sendfile)
class aiohttp.wsgi.WSGIServerHttpProtocol(app, readpayload=False, is_ssl=False, *args, **kw)[source]

Bases: aiohttp.server.ServerHttpProtocol

HTTP Server that implements the Python WSGI protocol.

It sets ‘wsgi.async’ to ‘True’. ‘wsgi.input’ behaves differently depending on the ‘readpayload’ constructor parameter. If readpayload is set to True, the wsgi server reads all incoming data into a BytesIO object and passes it as the ‘wsgi.input’ environ variable. If readpayload is set to False, ‘wsgi.input’ is a StreamReader and the application should read incoming data with “yield from environ[‘wsgi.input’].read()”. readpayload defaults to False.
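
The application side is plain WSGI; a minimal app of the kind this server can serve, invoked here with a hand-built environ to show the calling convention (in production the server constructs environ itself, via create_wsgi_environ()):

```python
# Minimal WSGI application; the environ and start_response below are
# hand-built stand-ins for what a WSGI server would normally provide.
def app(environ, start_response):
    body = ('Hello, %s!' % environ.get('PATH_INFO', '/')).encode('utf-8')
    start_response('200 OK', [('Content-Type', 'text/plain'),
                              ('Content-Length', str(len(body)))])
    return [body]

captured = {}
def start_response(status, headers):
    captured['status'] = status
    captured['headers'] = headers

result = app({'PATH_INFO': '/world'}, start_response)
print(captured['status'], b''.join(result))  # 200 OK b'Hello, /world!'
```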

SCRIPT_NAME = ''
create_wsgi_environ(message, payload)[source]
create_wsgi_response(message)[source]
handle_request(message, payload)[source]

Handle a single HTTP request