So, I have a python script I’d like to run from time to time from the CLI (on Linux) that resides inside a venv. What’s the recommended/intended way to do this?
Write a wrapper shell script and put it inside a $PATH-accessible directory that activates the virtual environment, runs the python script and deactivates the venv again? This seems a bit convoluted, but I can’t think of a better way.
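For reference, such a wrapper might look like the sketch below (the venv and script paths are hypothetical; adjust to your layout):

```shell
#!/usr/bin/env bash
# Hypothetical locations -- adjust to your own layout.
VENV="$HOME/venvs/myproject"
SCRIPT="$HOME/projects/myproject/main.py"

# Sourcing bin/activate puts the venv's bin/ first on PATH...
source "$VENV/bin/activate"
python "$SCRIPT" "$@"
deactivate

# ...though activation is optional for a one-shot run: the venv's
# interpreter can be called directly, so this single line is equivalent:
# exec "$VENV/bin/python" "$SCRIPT" "$@"
```

Save it somewhere on `$PATH` (e.g. `~/.local/bin/myproject`) and `chmod +x` it.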
That works nicely. Thanks 👍
I use my own Zsh project (zpy) to manage venvs stored like `~/.local/share/venvs/HASH-OF-PROJECT-PATH/venv`, so I use zpy's `vpy` function to launch a script with its associated Python executable ad hoc, or add a full-path shebang to the script with zpy's `vpyshebang` function. See `vpy` and `vpyshebang` in the docs.
If anyone else is a Zsh fan and has any questions, I’m more than happy to answer or demo.
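Outside of zpy, the full-path-shebang approach it automates can be sketched like this (the throwaway venv here stands in for a real one):

```shell
# Create a demo venv (any directory works; this one is temporary).
VENV="$(mktemp -d)/venv"
python3 -m venv "$VENV"

# Write a script whose shebang is the absolute path of the venv's interpreter:
printf '#!%s\nimport sys\nprint(sys.executable)\n' "$VENV/bin/python" > myscript.py
chmod +x myscript.py

./myscript.py   # runs under the venv's Python -- no activation needed
```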
@Andy The convention is to place the venv in a `.venv/` subfolder. Follow the convention!
This is shell-agnostic.
Learn pyenv and minimize shell scripts (they should live only within a Makefile).
Shell scripts within Python packages are deprecated.
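Concretely, the `.venv/` convention amounts to something like this (project path hypothetical):

```shell
cd ~/projects/myapp              # hypothetical project root
python3 -m venv .venv            # the conventional in-project location
.venv/bin/pip install -r requirements.txt
.venv/bin/python main.py         # no activation required when you call the
                                 # venv's interpreter by its path
```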
That’s one convention. I don’t like it; I prefer to keep my venvs elsewhere. One reason is that it makes it simpler to maintain multiple venvs for a single project, using a different Python version for each, if I ever want to. It shouldn’t matter to anyone else, as it’s my environment, not some aspect of the shared repo. If I ever needed it there for some reason, I could always `ln -s $VIRTUAL_ENV .venv`.

I have used pyenv. It’s fine. These days I use mise instead, which I prefer. But neither of them dictates how I create and store venvs.
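That out-of-tree layout might look like the following sketch (paths are hypothetical, and it assumes both interpreters are installed):

```shell
PROJ="$HOME/projects/myapp"              # hypothetical project root
VENVS="$HOME/.local/share/venvs/myapp"   # hypothetical external venv store

# One venv per interpreter version, all belonging to the same project:
python3.12 -m venv "$VENVS/py312"
python3.13 -m venv "$VENVS/py313"

# If something ever expects the conventional name, point it at one of them;
# -n treats an existing .venv symlink as a file so it is replaced, not nested.
ln -sfn "$VENVS/py312" "$PROJ/.venv"
```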
I don’t understand if what you’re referencing relates to my comment.
The multiple venvs for different Python versions sound exactly like what tox does.
Then set up a GitHub Action that does nightly builds, which will catch issues caused by changes that were only tested against one Python version or on one platform.
Python 3.13 is a good version to test against because many modules were removed or deprecated, and some APIs changed.
Good luck. Hope some of my advice is helpful.
Thanks, yes, I use nox and GitHub Actions for automated environments and testing in my own projects, and tox instead of nox when it’s someone else’s project. But for ad hoc, local, interactive work with multiple environments, I don’t.
Are you using GitHub Actions locally? I feel silly making GH actions and workflows when only GitHub runs them.
No, I don’t use GHA locally, but the actions are defined to run the same things that I run locally (e.g. invoking `nox`). I try to keep the GHA-exclusive boilerplate to a minimum, so the steps mostly just invoke those same commands.

Sometimes, if I want a higher-level interface to tasks that run
`nox` or other things locally, I use taskipy to define them in my `pyproject.toml`.

Thanks for the introduction to taskipy. I think if I need macros, a Makefile is the way to go. It supports running targets in parallel, and I like performing a check so a command won’t run unless the virtual environment is activated.
```makefile
.ONESHELL:
.DEFAULT_GOAL := help
SHELL := /bin/bash
APP_NAME := logging_strict

# virtual environment. If 0 issue warning
# Not activated: 0
# activated: 1
ifeq ($(VIRTUAL_ENV),)
$(warning virtualenv not activated)
is_venv =
else
is_venv = 1
VENV_BIN := $(VIRTUAL_ENV)/bin
VENV_BIN_PYTHON := python3
PY_X_Y := $(shell $(VENV_BIN_PYTHON) -c 'import platform; t_ver = platform.python_version_tuple(); print(".".join(t_ver[:2]));')
endif

.PHONY: mypy
mypy:				## Static type checker (in strict mode)
ifeq ($(is_venv),1)
	@$(VENV_BIN_PYTHON) -m mypy -p $(APP_NAME)
endif
```
`make mypy` without the virtualenv active will print a warning message explaining why it’s not working!

Thanks for the heads-up on nox. The syntax seems like tox meets pytest.