I hate this hand-holding. Certainly use venvs for dev projects, but allow system-wide installations for those who want them. OSS has always been about giving you enough rope to hang yourself.
Python
Welcome to the Python community on the programming.dev Lemmy instance!
Events
Past
November 2023
- PyCon Ireland 2023, 11-12th
- PyData Tel Aviv 2023, 14th
October 2023
- PyConES Canarias 2023, 6-8th
- DjangoCon US 2023, 16-20th (!django)
July 2023
- PyDelhi Meetup, 2nd
- PyCon Israel, 4-5th
- DFW Pythoneers, 6th
- Django Girls Abraka, 6-7th
- SciPy 2023, 10-16th, Austin
- IndyPy, 11th
- Leipzig Python User Group, 11th
- Austin Python, 12th
- EuroPython 2023, 17-23rd
- Austin Python: Evening of Coding, 18th
- PyHEP.dev 2023 - "Python in HEP" Developer's Workshop, 25th
August 2023
- PyLadies Dublin, 15th
- EuroSciPy 2023, 14-18th
September 2023
- PyData Amsterdam, 14-16th
- PyCon UK, 22-25th
Python project:
- Python
- Documentation
- News & Blog
- Python Planet blog aggregator
Python Community:
- #python IRC for general questions
- #python-dev IRC for CPython developers
- PySlackers Slack channel
- Python Discord server
- Python Weekly newsletters
- Mailing lists
- Forum
Python Ecosystem:
Fediverse
Communities
- #python on Mastodon
- c/django on programming.dev
- c/pythorhead on lemmy.dbzer0.com
Projects
- Pythörhead: a Python library for interacting with Lemmy
- Plemmy: a Python package for accessing the Lemmy API
- pylemmy: enables simple access to Lemmy's API with Python
- mastodon.py, a Python wrapper for the Mastodon API
Feeds
Then they come after our guns, but spoons are always magically safe.
To all the fat slob system-wide installation cock-blocking PR submitters, I say:
Ban spoons!
Shooting ourselves in the foot is a God-given right! /nosarc
Couldn't have said it better.
Which you can still do. That said, the "correct" and less problematic way of installing packages should be easier than the alternative.
What really annoys me is that they purposely broke per-user and local installation. Fine, system-wide installation isn't a good idea when it's already managed by another package manager, but user installation is my domain.
The reason they did this is because a package installed by the user can be active when a system tool is called and break the system tool. The distro developers went "Oh, we should force all user code into venvs so that our code is safe".
Completely and utterly backwards. The protected code needs to be inside the defensive wall. The user should be allowed to do anything, in the knowledge that they can't inadvertently change the OS. When a system tool is called, it should only have system libraries on its Python path.
You still have the option to choose not to use a venv and risk breaking your user space.
The changes make it harder to do by accident by encouraging use of a venv. Part of the problem is that `pip install --user` is not exactly in the user space and may in fact break system packages, and as you wrote, the user shouldn't be able to inadvertently change the OS.
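For anyone following along, a minimal sketch of the workflow the changes are nudging you toward (the paths and the package here are just examples):

```bash
# create a venv once, somewhere out of the way
python3 -m venv ~/.venvs/scratch

# activate it for the current shell
source ~/.venvs/scratch/bin/activate

# installs now land inside ~/.venvs/scratch,
# not in ~/.local or the system site-packages
pip install requests

# back to the system interpreter
deactivate
```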
Makes more sense and I agree, especially with the apparent ease of `pip install --user`. But there should be no barriers when root explicitly asks for a system-wide install (a hypothetical `pip install --system`).
So the problem here is that you can inject code into a system Python process, because system tools run with the user's Python install location on their path.
They've fixed the wrong "root cause".
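To be fair, CPython does ship the bricks for that defensive wall; whether distros use them is another matter. A quick sketch of the relevant interpreter flags:

```bash
# -s: ignore the per-user site-packages (~/.local/lib/pythonX.Y/site-packages)
# -I: full isolation - implies -s and also ignores PYTHONPATH
python3 -I -c 'import sys; print(sys.flags.isolated)'   # prints 1

# a system tool could bake this into its shebang:
#   #!/usr/bin/python3 -I
# so a user's `pip install --user` can't shadow its imports
```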
@norambna Good points! 👍🏻 Especially since conflict resolution in pip sucks and it'll happily install incompatible packages.
pip is great! It lets ya know when there are dependency conflicts.
It's up to us to learn how to resolve dependency conflicts.
There are those who attend the whining parade down main street.
There are the very very few who write a package to resolve these issues.
How will this affect command-line tools like azure-cli that are installed with pip in a container image? Will we be forced to append the venv to $PATH?
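The workaround I can picture (a sketch, untested with azure-cli itself; the base image and venv path are assumptions) is to bake the venv into the image and prepend its `bin/` to `PATH`, so nothing ever needs activating:

```dockerfile
FROM python:3.12-slim

# install the CLI into its own venv instead of the system interpreter
RUN python -m venv /opt/azure-cli \
    && /opt/azure-cli/bin/pip install azure-cli

# prepending the venv's bin/ makes `az` resolve without activation
ENV PATH="/opt/azure-cli/bin:${PATH}"
```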
We need AA meetings
Hello!
My name is Billy Joe Jim Bob
Hello Billy!
I haven't had a dependency conflict for the past 3 hours. The sleeping problems haven't gone away. As I feel my eyelids droop, I keep thinking about each of my packages and imagining where the next unresolvable dependency conflict will emerge.
Then I wake up covered in sweat.
Can't keep going on like this. Thank you for listening.
Thank you for sharing Billy!
If you have full CI/CD then it's unnecessary.
System-wide installation as it was implemented should stay in the past. I like the approach of pixi (a Conda alternative) here, where each system dependency lives in its own virtual bubble, so recreating and porting the software is a breeze.
But if all you use can stay in a venv, just use one.
Same thing said another way, be open to using more than one venv
I was about to go man systemd-wide
You're already effectively forced to use a venv, but I fucking hate pip, and some projects don't work in a venv. I don't know why - it just doesn't work, and it sucks.
Don't wanna be that guy who gaslights you.
If you're having issues, you should point us at a repo.
That's the thing, if everybody is forced to use a venv, those projects will either fix their shit or lose all of their userbase.
So these package maintainers are harboring magical charms and voodoo dolls which us lowly plebs just don't know about?
If these guys are so awesome, shouldn't we be plying them with coke and blow and keep 'em working resolving our dependency resolution issues?
They do have the secret sauce and are just holding it back from the rest of us.
I'm sorry but... what?!?
Would the coke and blow happen for that guy?
Makes sense if i'm that guy
This question is about Python package funding. If world+dog no longer stresses over pip dependency resolution, isn't that extremely valuable? So how to go about getting that package permanently funded? No BS dangling a tiny carrot and insisting on strict justice (reporting milestones ...). Then funding only happens for large projects.
The question on package funding is very legitimate. Have a list of packages that are no longer maintained cuz funding never happened.
Can subsist on crocodile tears. It's a guilty pleasure.
Meaning, if package funding never ever happens, and all that ever happens is never-ending articles/threads on Python devs whining about dependency resolution, I'm going to feed that.
Personally not suffering from dependency resolution stress. Everyone else is.
If the available solutions were sufficient there would be no more articles with comment sections filled with war stories, tales of carnage, and loss.
... always comes down to that one guy.
Solve the Python author maintainer funding issue!
Then and only then will I market the package specifically targeted at resolving pip dependency resolution issues for package (and app) maintainers.
Soon, you won't have a choice, because major distros are adopting PEP 668. This will make `pip install` fail in the default system Python and show an error telling you to use a virtual environment.
Well, if this is true then why bother convincing people ;)
Even with PEP 668, you can still use `pip install --break-system-packages`.
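For completeness, the escape hatches PEP 668 leaves open (at your own risk, obviously):

```bash
# per-invocation override
pip install --break-system-packages some-package

# or set it once via pip's environment-variable convention
export PIP_BREAK_SYSTEM_PACKAGES=1
```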
So ... if I want to use a Python module like, for example, mcstatus in a live shell for convenience, I first need to create a venv, activate it, install the package and then use it? And then either have dozens of venvs somewhere or remake them every time?
Use pipx or `uv tool run`.
Distrobox?
Yes
What's the alternative you are advocating for?!
The old way, I guess.
The old way I am fine with.
Never ever made a mistake and installed anything system-wide.
We don't need white knights or a nanny state to keep us safe.
I am not sure what you mean. Once you've created a venv, you can always reuse it.
Yes, but it has to be somewhere. I don't want dozens of venv dirs in my homedir.
Just to add to the other answers - no need to have them in your home dir (that sounds like it would suck). Use a tool like `uv tool` or `pipx`, or just manually create any venv you need under a path you choose, say `$HOME/.cache/venvs/`.
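Roughly what that looks like in practice, using mcstatus from the question above (and assuming the package ships a console script, which mcstatus does):

```bash
# per-tool managed venvs, exposed on PATH automatically
pipx install mcstatus
uv tool install mcstatus      # uv's equivalent

# or the manual route: one venv per tool under a directory you choose
python3 -m venv ~/.cache/venvs/mcstatus
~/.cache/venvs/mcstatus/bin/pip install mcstatus
~/.cache/venvs/mcstatus/bin/mcstatus --help
```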
It doesn't really take up space if you use deduplication
Then create one venv for everything
The one venv to rule them all is not a viable solution.
Some packages cause problems, one tactic is to isolate the problem package into a separate venv.
For example:
- `.venv/` - main venv for the dev setup
- `.doc/.venv/` - Sphinx docs (now minimum py310)
- `.wth/` - for e.g. the package restview, which has out-of-date dependencies
Each venv has its respective requirements files. Some are associated with a dependency or optional dependency; the ones that aren't are pin files.
Let's say there are easily a total of 20 requirements and constraints (pin) files.
This mess is in a freakin' nasty multi-level hierarchy.
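To make one branch of that hierarchy concrete: pip's mechanism for pin files is `-c`/`--constraint`, which caps versions without adding packages. A minimal sketch (the file names are just examples):

```bash
# docs venv: its own requirements, pinned by a shared constraints file
python3 -m venv .doc/.venv
.doc/.venv/bin/pip install -r docs/requirements.txt -c pins.txt
```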
Now imagine you are the author-maintainer of 10 packages. Can you be confident that one package's requirements won't conflict with the other packages' requirements?
Almost forgot - these packages are no longer maintained:
- pip-tools
- pip-requirements-parser
... scary
Can you create venvs inside venvs? That sounds like stuff is going to break tbh.
Why would you want a venv "inside" a venv? What would that mean?
Well, if you want to have pip-installed tools available generally (e.g. until distros started screwing it up, `pip` was the best way to install CMake), the suggestion was to have a venv for the user that would be activated in your `.bashrc` or whatever.
I think that would work, but then what happens if you want to use a project-level venv, which is really what they're designed for? If you create and activate a venv when you already have one activated does it all work sensibly? My guess would be that it doesn't.
Oh! Hmm. That's a good question and I really don't know. So in other words (this is just how I'm organizing the thoughts in my own head - it probably includes some misunderstandings, so feel free to correct any you notice): your "system Python" is really an activated venv specified in your user config in some way, and the question is what happens when you deliberately activate a distinct project venv on top of it - which Python executable and collection of installed libraries is invoked when doing stuff with it active?
On the one hand, I've never considered that, and it's probably a mistake to make too many assumptions about how Python (and its instrumentation, `pip` etc. included) interacts with the OS. Because I know fuck all about that, when I really think about it lol. On the other hand, one of the things I find pleasant about Python is that usually much more informed and thoughtful people than myself have chosen among several ways of dealing with whatever situation I'm thinking about, and have decided on a sensible default. But yep, idk. I originally just thought you misunderstood the idea of a venv lol - to my happy surprise, nope!
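For what it's worth, the stock `activate` script seems to settle this: it starts by calling `deactivate nondestructive`, so activating venv B while venv A is active swaps A out rather than nesting - only the most recently activated venv's `bin/` ends up prepended to `PATH`. A quick way to check (the paths are just examples):

```bash
python3 -m venv /tmp/venv-a
python3 -m venv /tmp/venv-b

source /tmp/venv-a/bin/activate
which python             # -> /tmp/venv-a/bin/python

source /tmp/venv-b/bin/activate    # implicitly deactivates venv-a first
which python             # -> /tmp/venv-b/bin/python
echo $VIRTUAL_ENV        # -> /tmp/venv-b
```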
This article is about Python venvs using Docker. It should be clear that I wouldn't want to pollute the base installation on my local machine.
But you can just create a venv and install everything in there, no need to create dozens of venvs if that's what you want.