Python virtual environment usage recommendations

  • venv
  • pipenv
  • virtualenv
  • virtualenvwrapper
  • virtualenv-burrito
  • etc.

So many tools for virtual environments, but which to choose?

Could some experienced (python) users chime in with what they use/recommend, and why?

Also pointing out common problems and pitfalls would be great!

I use venv, as it’s the official one and built directly into CPython. It supersedes virtualenv for most use cases. This is what the virtualenv docs have to say:

Since Python 3.3, a subset of it has been integrated into the standard library under the venv module. Note though, that the venv module does not offer all features of this library (e.g. cannot create bootstrap scripts, cannot create virtual environments for other python versions than the host python, not relocatable, etc.). Tools in general as such still may prefer using virtualenv for its ease of upgrading (via pip), unified handling of different Python versions and some more advanced features.

Basically, if you don’t need any of those specific features go with venv.
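A minimal sketch of that workflow, assuming a Unix shell (on Windows the activate script lives under Scripts\ instead of bin/):

    # create a virtual environment in ./venv using the current interpreter
    python3 -m venv venv

    # activate it; pip now installs into ./venv instead of the system Python
    source venv/bin/activate

    # install whatever the project needs, isolated from everything else
    pip install requests

    # drop back to the system environment when done
    deactivate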

3 Likes

pipenv is (iirc) one of the newest and is my preferred environment. You can install it with pip, and it makes projects portable with a simpler format than setup.py. I find the commands reasonably intuitive, and it handles dependencies with little effort beyond formatting a Pipfile.
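To give a feel for it, a typical session looks something like this (requests, pytest, and main.py are just placeholders):

    # create the Pipfile and a managed virtualenv, and install a dependency
    pipenv install requests

    # install a development-only dependency (recorded separately in the Pipfile)
    pipenv install --dev pytest

    # run a command inside the environment, or open a shell in it
    pipenv run python main.py
    pipenv shell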

1 Like

Yes, built-in vs third-party is one big consideration.

It’s difficult to compare each of their pros/cons without having used them, and the variety of options makes it confusing. E.g. pipenv apparently uses pip and virtualenv under the hood, but now the built-in venv is recommended?

Which options are industry standard vs obscure fads? There’s also the matter of adapting existing instructions and resources to use new tools.

The advantage to this is you can export a requirements.txt from pipenv if you need to. I guess it’s really just venv for humans.
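The export itself is a one-liner, though the exact command depends on your pipenv version:

    # older pipenv releases
    pipenv lock -r > requirements.txt

    # newer releases, where the -r flag moved to its own subcommand
    pipenv requirements > requirements.txt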

This is true even when going from global packages to any env; it’s inescapable to a degree.

Probably a good idea if you plan to maintain projects with other devs, though I’d still recommend pipenv because it’s easy mode.

My impression so far is that venv is the baseline for virtual environments, being a standard python library, while pipenv additionally focuses on packaging ease for application development and deployment.

Also, there are system packages on several OSes for installing pipenv. The description from dnf info pipenv says “The officially recommended Python packaging tool that aims to […]”. I wish they mentioned who officially recommends it!

I think I’ll start with venv to get familiar, and keep an eye on pipenv because it looks pretty cool even though I’m not planning on application development right now.

I don’t suppose either of these tools have virtualenvwrapper's feature of listing virtual environments so they don’t get scattered and forgotten?
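For reference, this is the virtualenvwrapper behaviour I mean; everything lives under one directory, so listing is trivial (myproject is a placeholder name):

    # all environments are kept under $WORKON_HOME (default: ~/.virtualenvs)
    mkvirtualenv myproject    # create an environment
    lsvirtualenv -b           # list every environment (brief output)
    workon myproject          # activate one by name, from any directory
    deactivate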

pipenv is now the standard in new projects, at least in the last 2 startups I worked at.

1 Like

More reading has expanded the list:

  • hatch
  • poetry
  • conda

This is getting ridiculous.

Apparently many love pipenv but there’s also plenty of hate for it and similar tools. An interesting criticism:

Conclusion

  • Pipenv, contrary to popular belief and (now removed) propaganda, is not an officially recommended tool of Python.org. It merely has a tutorial written about it on packaging.python.org (page run by the PyPA).
  • Pipenv solves one use case reasonably well, but fails at many others, because it forces a particular workflow on its users.
  • Pipenv does not handle any parts of packaging (cannot produce sdists and wheels). Users who want to upload to PyPI need to manage a setup.py file manually, alongside and independently of Pipenv.
  • Pipenv produces lockfiles, which are useful for reproducibility, at the cost of installation speed. The speed is a noticeable issue with the tool. pip freeze is good enough for this, even if there are no dependency classes (production vs development) and no hashes (which have minor benefits) [2]
  • Hatch attempts to replace many packaging tools, but some of its practices and ideas can be questionable.
  • Poetry supports the same niche Pipenv does, while also adding the ability to create packages and improving over many gripes of Pipenv. A notable issue is the use of a custom all-encompassing file format, which makes switching tools more difficult (vendor lock-in).
  • Pip, setup.py, and virtualenv — the traditional, tried-and-true tools — are still available, undergoing constant development. Using them can lead to a simpler, better experience. Also of note, tools like virtualenvwrapper can manage virtualenvs better than the aforementioned new Python tools, because it is based on shell scripts (which can modify the environment).
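For comparison, the pip freeze workflow that conclusion refers to is just:

    # inside an activated environment: record the exact installed versions
    pip freeze > requirements.txt

    # later, or on another machine: recreate the same set of packages
    pip install -r requirements.txt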

conda tends to be the go-to in the sciences. My guess is that’s because conda comes along for the ride with Anaconda which has a strong data science and ML focus out of the box.

In my case it allowed for tracking of non-Python packages. Being able to lock a certain version of LAPACK or BLAS to an environment can be really helpful. Other options might be able to do that too, but it’s fairly easy to set up with conda.

Beyond that I don’t need a whole lot of control over environments. So depending on your needs something else might serve you better.
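A rough sketch of what that looks like for me; the environment name, versions, and packages here are only illustrative:

    # create an environment whose native libraries conda manages too
    conda create -n sci python=3.9 numpy scipy

    # activate it and check exactly which BLAS/LAPACK builds were pulled in
    conda activate sci
    conda list | grep -iE "blas|lapack"

    # export everything, native libraries included, for reproducibility
    conda env export > environment.yml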

Which is great, since that fits my use case. conda is actually available in Fedora’s repo, but I plan to install Anaconda anyway. (Confusingly, there’s also an unrelated anaconda package in Fedora’s repo.) conda being language-agnostic is really convenient.

Any disadvantages to conda or better alternatives?

I’ll probably settle for conda for my general use and maybe pip + venv for isolated python projects. The other tools seem convenient for very specific workflows.

pip with virtualenv is pretty much standard in everything open source (just take a look at the mainstream repos for Python on GitHub).

Most newer projects are using pipenv.

I would just recommend you use what’s mainstream, mostly because it’s more useful to know: it will be easier to work later on with a team that uses it, and it will be easier to share and involve more people when you’re using mainstream software.

I personally just use virtualenv venv or virtualenv -p python3 venv. virtualenv will install pip by default in your new env, and then I just do source venv/bin/activate. I have an alias for that last command that I can run in any folder where I have a Python project. This is simple and very portable (works on Linux and OSX).

I also add venv/ to my .gitignore file.
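Put together, the whole setup is only a few lines (the alias name "activate" is just my choice; pick whatever you like):

    # create the environment; -p pins which interpreter it uses
    virtualenv venv
    # or: virtualenv -p python3 venv

    # in ~/.bashrc or ~/.zshrc: activate whatever ./venv exists in the current dir
    alias activate='source venv/bin/activate'

    # keep the environment out of version control
    echo 'venv/' >> .gitignore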

1 Like

The only potential downside I’ve run into is that conda doesn’t use symlinks for environments. Each environment is explicitly isolated on its own path.

Which is why it can manage non-python elements and even different versions of the python executable in each environment. That’s beneficial for what I do, but full isolation can also pose challenges depending on the project.
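You can see that isolation directly; each environment is its own directory tree rather than a set of links (the paths in the sample output are illustrative):

    # list every conda environment and the directory it lives in
    conda env list
    #
    # sample output:
    #   base      /home/user/miniconda3
    #   sci       /home/user/miniconda3/envs/sci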

1 Like

I’m finding references to virtualenv confusing. Correct me if I’m wrong: virtualenv started as a third-party library, parts of which then got incorporated into the standard Python libraries, apparently named virtualenv for Python 2 and venv for Python 3. Additionally, many articles refer to virtualenv/venv as though they’re synonymous, maybe due to updating old articles with new terminology. Not to mention the confusion of examples that name virtual environments or directories with the same name as the modules! It becomes totally ambiguous whether the third-party tool or built-in library is being referenced!

Since you have experience with this, what kinds of startups and projects are using pipenv?

You mention you’re doing DevOps in the AI/ML space (cool system build thread btw). What standard have you noticed for AI/ML projects?

virtualenv and venv are two different projects.

venv is in the Python 3 standard library and it’s defined here: https://www.python.org/dev/peps/pep-0405/

While similar, I tend not to use this kind of tool from the standard library, mostly because it takes ages to update and fix things in the std lib.

pipenv uses virtualenv in the background; actually, most virtual env tools use virtualenv in the background and just add some convenient functionality on top.

I have found over the years that using virtualenv is the easiest, because it adapts to most workflows instead of relying on higher-level projects built on top of virtualenv.

For the past 2-3 years I have used nothing but pipenv on internal projects, and the occasional pip (requirements.txt file) when needed or for legacy code.

I currently work for a company called Labelbox, and in the past a company called DroneDeploy.

Nowadays it’s hard to say what’s standard because I work with a lot of different languages, but Python-wise I don’t remember the last time a project was not using pipenv :wink:

OK. I can’t recall where I got confused, maybe from mixing together a bunch of what I read.

e.g.

venv (for Python 3) and virtualenv (for Python 2) allow you to manage separate package installations for different projects.

If you are using Python 2, replace venv with virtualenv in the below commands.

https://packaging.python.org/guides/installing-using-pip-and-virtual-environments/
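In other words, the guide’s commands boil down to (env is just the directory name the guide uses):

    # Python 3: the built-in venv module
    python3 -m venv env

    # Python 2: the third-party virtualenv package, invoked the same way
    python2 -m virtualenv env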

But now it’s making more sense. virtualenv is ubiquitous, even after the introduction of venv, even with virtualenv's documentation recommending venv.

Interesting point about preferring a more rapidly maintained tool vs more slowly updated standard libraries. Though I doubt that’s much of an issue for my use case.

For different languages, what do you do? Use each one’s own tool? Use a multi-language tool? conda says it works with Python, R, Ruby, Lua, Scala, Java, JavaScript, C/C++, FORTRAN, which is a pretty nice list for the sciences.

  • pip/pipenv for Python
  • go mod for Golang
  • yarn/npm for Node
  • for C/C++ we keep a vendor folder with all external dependencies (but we have very little C code)

1 Like