Setting up your Python open source project

In the past few months I've been able to find the time and energy to open-source 5 Python projects on Github. I think being able to work remotely from home on my own schedule is the reason something like this could have happened, but that's for a different blog post.

In this post, I would like to go over a few tools you can (and probably should!) set up on your own projects so that they stay in good shape. Some of them are Github integrations and some are tools you run locally.

Travis CI

Code without tests is broken by design.

Having a well-maintained test-suite in your project has a lot of advantages.

  1. Tests provide a fairly automated way to repeatedly make sure that your code is doing what it's supposed to do. Without tests, all you have is some source code which you believe works, but you can't prove it.
  2. It helps you make changes to your codebase confidently. Let's say tomorrow morning you decide to change a few things. A good test-suite would help you assert that the behavior of your code post-surgery is the same as it was before.
  3. Tests also act as a form of documentation. Often, when I'm jumping into a new project and I don't know where to start, I open up the test-suite and start reading through the tests to try to get a sense of what exactly is going on. The tests point me to which functions or classes or modules are important, which in turn gives me a few entry points to the source code that I could use to dive in.
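As a sketch of what such a test might look like, here's a minimal pytest-style test for a hypothetical `slugify` helper (both the helper and the test are illustrative, not from any real project):

```python
# A hypothetical helper from your project...
def slugify(title):
    """Turn a post title into a URL-friendly slug."""
    return "-".join(title.lower().split())


# ...and a test that pins down its behavior. pytest discovers
# functions named test_* and reports a failure if an assert fails.
def test_slugify_lowercases_and_joins_words():
    assert slugify("Setting Up Travis CI") == "setting-up-travis-ci"
```

Each test like this documents one small piece of expected behavior, which is exactly what makes a suite of them useful both as a safety net and as documentation.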

So now that we've established the importance of having tests, it also makes sense to run them on every change you make. This is what continuous integration is, and this is where Travis CI comes in.

Travis is a free and very popular continuous integration system. Once you enable it on a project of yours, it'll run your test suite on every single commit you make and tell you whether or not something broke.

Setting Travis up is quite straightforward. When you sign up (it's free for open-source projects), it shows you a list of projects it found in your Github account. You can then decide to enable selected projects from that list.

Once you've done that, it's just a matter of committing a .travis.yml config file to tell Travis more about what kind of a project you're trying to build.

Here's an example config file for a simple Python project.

language: python

python:
    - "3.5"
    - "3.6"

install:
    - pip install -r requirements.txt

script:
    - python -m pytest

This tells Travis that you're building a Python project, and you'd like to test it with Python versions 3.5 and 3.6. The install step lets you install any project dependencies that your test suite might require. And finally the script step is how you let Travis know how to run your test suite.

From this point on, running all your tests for every single commit you make is the responsibility of Travis. Pretty neat.


PyUp

More often than not, your project depends on some other software. As newer versions of these dependencies get released, the versions your project relies on become outdated. This leaves you depending on older releases which could contain known security vulnerabilities.

Ideally you would like to be notified when a dependency has a new release. You could either keep an eye on the project pages for new releases, or you could sign up for email notifications.

The problem is that these solutions don't scale all that well. There's a limited number of changelogs you can stay on top of. And let's not talk about email notifications.
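The check itself is simple in principle. Here's a naive sketch of what "out of date" means for a dependency; it deliberately ignores pre-releases and the other subtleties of PEP 440 version handling that real tools deal with:

```python
def is_outdated(installed, latest):
    """Naively compare two dotted version strings numerically.

    Real dependency trackers handle pre-releases, epochs, and so on
    (see PEP 440); this is only an illustration of the basic idea.
    """
    def as_tuple(version):
        return tuple(int(part) for part in version.split("."))

    # Tuple comparison is element-wise, so "1.10.0" correctly
    # sorts after "1.9.2" (unlike a plain string comparison).
    return as_tuple(installed) < as_tuple(latest)
```

The hard part isn't the comparison; it's continuously knowing what the latest release of every dependency is, which is the job you want to hand off to a service.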

This is where PyUp comes in.

PyUp is a Python security and dependency tracker that's free for open-source projects.

Once you sign up and enable it on a few projects, it'll start watching your project dependencies. If it notices an out-of-date dependency, it will open a pull request with an update. All you have to do is review and approve such PRs.

What works even better is enabling both Travis and PyUp. In such cases, Travis will test every PR submitted by PyUp for potential breakages. This makes your decision to merge (or reject) even easier.


isort

isort is a Python library to sort imports alphabetically. If your source files have a ton of imports, it can be really helpful to organize them into sections and then sort them alphabetically so they're easy to visually parse.

For example, if you have a Python file that looks like the following.

from my_lib import Object


import os

from my_lib import Object3

from my_lib import Object2

import sys

from third_party import lib15, lib1, lib2, lib3, lib4, lib5, lib6, lib7, lib8, lib9, lib10, lib11, lib12, lib13, lib14

import sys

from __future__ import absolute_import

from third_party import lib3


Running isort on it produces the following.

from __future__ import absolute_import

import os
import sys

from third_party import (lib1, lib2, lib3, lib4, lib5, lib6, lib7, lib8,
                         lib9, lib10, lib11, lib12, lib13, lib14, lib15)

from my_lib import Object, Object2, Object3


By default, isort sorts your imports into roughly three sections - the standard library, third party modules, and then the modules from the current project.

This, and a few other settings, can be controlled using a configuration file. The standard way is to add a .isort.cfg file in your project root, but you can also add an [isort] section to your project's tox.ini or setup.cfg file and it'll all work the same way. Django, for example, specifies its isort settings inside setup.cfg.
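For instance, a minimal .isort.cfg might look like the following. The option names here are real isort settings, but the values are purely illustrative (`my_lib` is a placeholder for your own package):

```ini
[settings]
line_length = 79
known_first_party = my_lib
```

`known_first_party` is what tells isort which imports belong in the "current project" section rather than the third-party one.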

A very useful feature of isort is checking your source files for incorrectly formatted imports. If you run isort **/*.py -c -vb, isort will display a report of correctly/incorrectly formatted files on stderr, and exit with an appropriate exit code.

This makes it very suitable as a build step in Travis. If you add isort **/*.py -c -vb as another line in the script section of your Travis config, Travis will mark the build red if your source code contains incorrectly formatted imports.
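Putting it together, the relevant sections of the Travis config might then look something like this (assuming pytest runs your test suite and isort is installed during the install step):

```yaml
install:
    - pip install -r requirements.txt
    - pip install isort

script:
    - python -m pytest
    - isort **/*.py -c -vb
```

With this in place, a pull request that breaks either the tests or the import formatting shows up as a failed build.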

One might think it's not worth the effort to sort the imports a certain way. But I personally find it nice when a codebase looks like it was written by one person even though multiple people may have worked on it. It makes you feel that things are consistent. Module imports may be a rather small section in your source code, but it's the first thing you see when opening a file, so having a certain order there certainly goes a long way in making things look consistent.

In this post I talked about 3 tools you can enable on your Python projects to make them nicer to work with. This is really only the tip of the iceberg; there are hundreds, if not thousands, of other such tools out there aimed at improving your code quality.

The Python community places a strong emphasis on code quality, so that's definitely a metric I, as a software developer, would advise optimizing for (apart from other reasons).

It can sometimes feel like extra effort to set these external tools/integrations up. In the long term, however, this effort is completely worth it. And enabling integrations like the ones I talked about in this post will go a long way in making your code look Pythonic (for some definition of that word).