Although pip makes it easy to install and update dependencies, it does not make it easy to switch cleanly from one project to another. Instead, you get an ugly game of tug of war in which your projects' dependencies fight for priority. Eventually, with enough projects, your machine's global environment is littered with packages you have lost track of.
This is where virtual environments come into play. A virtual environment isolates your dependencies and even lets you choose which version of Python to use. Virtual environments also make it easier to reproduce results from machine to machine, since the same environment can be recreated from a known list of dependencies. Finally, because packages are installed into a directory you own, they let you sidestep permission issues or OS restrictions you might otherwise not be able to get around.
If you are using Python 3.3 or newer, the venv module is the preferred way to create and manage virtual environments. If you are running an earlier version of Python, check out virtualenv, which has a very similar interface.
#!/bin/bash
# Initialize your virtual environment
python -m venv venv
# Activate your virtual environment
# Windows
./venv/Scripts/activate
# Mac + Linux
source ./venv/bin/activate
# Install a dependency
python -m pip install numpy
# Save your dependencies
python -m pip freeze > requirements.txt
# Restore your dependencies
python -m pip install -r requirements.txt
# Deactivate a virtual environment
deactivate
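After running pip freeze, requirements.txt contains a pinned list of every package installed in the environment, one per line. As a rough illustration (the exact packages and versions depend on what you have installed), it might look like this:
# contents of requirements.txt (illustrative)
numpy==1.23.2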
Tox is a generic virtualenv management and test command-line tool. It lets you quickly build virtualenvs and automate additional build steps such as running unit tests, generating documentation, and linting. After installing it, just run tox-quickstart to generate a basic tox.ini, then tox to get started.
# content of: tox.ini, put in same dir as setup.py
[tox]
envlist = py27,py36
[testenv]
# install pytest in the virtualenv where commands will be executed
deps = pytest
commands =
    # NOTE: you can run any command line tool here - not just tests
    pytest
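With that configuration, running tox builds each environment in envlist, installs your package and pytest into it, and runs the listed commands. For those commands to have something to do, your project needs at least one test pytest can discover; a minimal, hypothetical example:
# content of: test_example.py (hypothetical test for pytest to discover)
def test_addition():
    assert 1 + 1 == 2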
Conda is an open-source package and environment management system.
#!/bin/bash
# Create a new environment
conda create -n [environment-name] python=[python-version]
# Or create one from a configuration
conda env create -f environment.yml
# Activate an environment
conda activate [environment-name]
# Install packages
conda install -n [environment-name] [package]
# Generate an environment.yml
conda env export > environment.yml
# Deactivate an environment
conda deactivate
# Remove an environment
conda remove -n [environment-name] --all
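For reference, the environment.yml that conda env export writes (and that conda env create -f consumes) records the environment's name, channels, and pinned dependencies. A trimmed, illustrative example (the name and versions are placeholders):
# illustrative environment.yml; name and versions are placeholders
name: my-project
channels:
  - defaults
dependencies:
  - python=3.9
  - numpy=1.23.2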
Docker uses OS-level virtualization to deliver software in packages called containers.
# Start from an official Python base image
FROM python:3.9
# Install pinned dependencies
RUN python -m pip install numpy==1.23.2
# Copy the project code into the image
WORKDIR /usr/python
COPY ./ ./
# Run the application when the container starts
CMD python main.py
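The CMD line assumes your project has a main.py at its root. A trivial, hypothetical placeholder that exercises the numpy dependency could look like this:
# content of: main.py (hypothetical entry point run by the container)
import numpy as np

print(np.arange(5).sum())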
Then you need to build and run your image.
#!/bin/bash
docker build . -t [image-name]
docker run [image-name]
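If you would rather keep your dependencies in requirements.txt than pin them in the Dockerfile, a common variation (a sketch, not tied to any particular project layout) is to copy and install the requirements file first so that layer is cached between builds:
# Sketch: install dependencies from requirements.txt instead of pinning them in the Dockerfile
FROM python:3.9
WORKDIR /usr/python
# Copy only the dependency list first so this layer is cached between builds
COPY requirements.txt ./
RUN python -m pip install -r requirements.txt
# Then copy the rest of the project
COPY ./ ./
CMD python main.py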
There are a lot of valid ways to manage your Python dependencies in isolated environments. My personal preference is Docker, mainly because of how cleanly it lets you add dependencies and how flexibly it lets your code interact with other resources. Docker in general will be most helpful if you ever decide to extend beyond Python.