Onion Python Bindings: Fastest Python HTTP server ever

TL;DR

Onion python bindings are blazing fast!

Introduction

One of the reasons for writing onion in C is to make it easy to create bindings for other programming languages. The onion main branch already provides C++ bindings, which greatly improve the onion experience. There is a proof of concept for PHP bindings at https://github.com/krakjoe/ponion, and I also made a quick proof of concept of Python bindings. They are available at https://github.com/davidmoreno/onion/tree/python.

This version uses ctypes to provide the bindings, in the hope of making it usable with PyPy (although that does not work right now).

A simple web server


#!/usr/bin/python
import sys
# Make the bindings importable until they are installed as a regular module
sys.path.append('../../src/bindings/python/')
sys.path.append('../../../src/bindings/python/')

from onion import Onion, O_POOL
import json

def hello(request, response):
    # Dump all request headers as a JSON object
    d = request.header().copy()
    response.write(json.dumps(d))

def bye(request, response):
    response.write_html_safe("Bye!")
    response.write_html_safe('path is <%s>' % request.fullpath())
    #response.write_html_safe(json.dumps(request.query().copy()))

def main():
    o = Onion(O_POOL)

    urls = o.root_url()

    urls.add("", hello)
    urls.add_static("static", "This is static text")
    urls.add(r"^.*", bye)

    o.listen()

if __name__ == '__main__':
    main()

The first two lines add the library to the module search path; in the future it should be installed like any other Python module. Then the module is imported, loading just the symbols we need. All of them mirror the C and C++ versions as closely as possible.

Then we define a couple of handlers. Each handler receives both the request and the response: from the request, information about the incoming request can be gathered, and the response is where the output is written. For this proof of concept I only implemented bindings for write and write_html_safe; check the GitHub source to see more options. Adding bindings for more methods is really easy.

The first handler converts all the headers to a JSON object and dumps it. The second one just writes some safe HTML (properly quoting HTML symbols) to the response.

In the main function we create the onion object, using a pool of threads, and add URL handlers for several addresses. Normally this is just a regex-to-handler mapping, but as in onion itself there are special handlers already written in C that can give a huge performance boost in certain situations, such as serving simple static data.
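The regex-to-handler mapping can be sketched in plain Python. This is only an illustration of the dispatch idea, not onion's actual C implementation; the UrlMap class and its methods are hypothetical.

```python
import re

# Patterns are tried in registration order and the first match wins,
# mirroring how handlers were added to root_url() above.
class UrlMap:
    def __init__(self):
        self.routes = []  # list of (compiled regex, handler) pairs

    def add(self, pattern, handler):
        # An empty pattern matches only the root path, like urls.add("", hello)
        self.routes.append((re.compile(pattern or r"^$"), handler))

    def dispatch(self, path):
        for regex, handler in self.routes:
            if regex.search(path):
                return handler
        return None

urls = UrlMap()
urls.add("", lambda: "hello")
urls.add(r"^.*", lambda: "bye")

print(urls.dispatch("")())        # the empty path hits the first route
print(urls.dispatch("/other")())  # everything else falls through to bye
```

The C dispatcher does the same walk, but without crossing the Python/C boundary per route, which is part of where the speed comes from.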

Finally we call listen to start accepting connections.

Performance

This is not a full performance benchmark, just a comparison with other technologies. A quick Google search yields this blog post, which found gevent to be one of the fastest, so I will compare only against that. The test it gives just writes back "Pong!":

#!/usr/bin/python
from gevent import wsgi

def application(environ, start_response):
    status = '200 OK'
    output = 'Pong!'

    response_headers = [('Content-type', 'text/plain'),
                        ('Content-Length', str(len(output)))]
    start_response(status, response_headers)
    return [output]

wsgi.WSGIServer(('', 8088), application, spawn=None).serve_forever()
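The WSGI callable can be sanity-checked without starting a server by invoking it directly with a stub start_response. The stub below is mine for illustration, not part of gevent.

```python
# Same WSGI application as in the benchmark
def application(environ, start_response):
    status = '200 OK'
    output = 'Pong!'
    response_headers = [('Content-type', 'text/plain'),
                        ('Content-Length', str(len(output)))]
    start_response(status, response_headers)
    return [output]

captured = {}

def fake_start_response(status, headers):
    # Record what the application reports instead of sending it to a client
    captured['status'] = status
    captured['headers'] = dict(headers)

body = application({}, fake_start_response)
print(captured['status'])  # 200 OK
print(''.join(body))       # Pong!
```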

Running ./wrk http://localhost:8088/ I get:

Running 10s test @ http://localhost:8088/
  2 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.39ms  373.51us   8.30ms   85.61%
    Req/Sec     3.49k   507.99     5.33k    75.98%
  66243 requests in 10.00s, 9.92MB read
Requests/sec:   6624.38
Transfer/sec:      0.99MB

A similar program for onion-python would be:

#!/usr/bin/python
import sys
sys.path.append('../../src/bindings/python/')
sys.path.append('../../../src/bindings/python/')
from onion import Onion, O_POOL

def application(request, response):
    response.write("Pong!")

o = Onion(O_POOL)
urls = o.root_url()
urls.add("", application)
o.listen()

And it yields these amazing results:

Running 10s test @ http://localhost:8080/
  2 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   344.00us  544.02us  15.02ms   91.09%
    Req/Sec    15.42k     3.17k   25.44k    69.41%
  292468 requests in 10.00s, 40.16MB read
Requests/sec:  29246.77
Transfer/sec:      4.02MB

But for this simple example maybe we can cheat a little bit, just telling onion that "Pong!" is static content:

urls.add_static("", "Pong!")
Running 10s test @ http://localhost:8080/
  2 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   206.52us  351.44us   7.19ms   91.60%
    Req/Sec    23.80k     3.29k   54.33k    74.12%
  449232 requests in 10.00s, 61.69MB read
Requests/sec:  44923.53
Transfer/sec:      6.17MB

Drawbacks

Onion is quite stable and used in production in many places. These bindings are not, and many bugs may appear. For example, Ctrl-C does not stop the server (I use Ctrl-Z and kill -9 %1).

Onion-Python does not follow the WSGI protocol. WSGI allows any Python application written against that interface to run on any backend. Maybe this should be explored in the future, but from my limited understanding of WSGI, it may mean that in the end only the connection handling and parsing could be reused, giving up the path dispatcher, for example.
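For illustration, such an adapter might look roughly like the sketch below. It is entirely hypothetical: wsgi_to_onion and the environ keys chosen are mine, and a real adapter would have to build a complete environ from the request and set status and headers on the response.

```python
def wsgi_to_onion(wsgi_app):
    """Wrap a WSGI application in an onion-style (request, response) handler."""
    def handler(request, response):
        # A real adapter would fill in the full WSGI environ from the request;
        # only a bare minimum is sketched here.
        environ = {
            'REQUEST_METHOD': 'GET',
            'PATH_INFO': request.fullpath(),
            'wsgi.url_scheme': 'http',
        }

        def start_response(status, headers):
            pass  # a real adapter would set these on the onion response

        # Stream whatever the WSGI app returns into the onion response
        for chunk in wsgi_app(environ, start_response):
            response.write(chunk)
    return handler
```

Note that this reuses only onion's connection handling: the WSGI application brings its own routing, so onion's fast C path dispatcher goes unused, which is exactly the trade-off mentioned above.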

Anyway, it really looks like with some more work to expose all of onion's functionality we have a huge winner here.
