top 21 comments
[-] Andy@programming.dev 4 points 1 year ago

To add an alternative to what's already been suggested: you can keep a requirements.in with your explicit dependencies, and use pip-tools's pip-compile to generate requirements.txt with the full tree, version locked. Or you can generate this from a pyproject.toml. Then you can use pip-tools's pip-sync to install and uninstall packages to make your actual environment match the .txt lockfile.
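
A minimal sketch of that workflow (the package name is just a placeholder):

```
# requirements.in holds only your direct dependencies
echo "requests" > requirements.in

pip install pip-tools

# Resolve and pin the full dependency tree into requirements.txt
pip-compile requirements.in

# Install/uninstall packages so the environment matches the lockfile exactly
pip-sync requirements.txt
```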

[-] tuto193@lemmy.world 3 points 1 year ago

As already mentioned, poetry (python-poetry) is the best thing I know of for project package management. It's quite easy to set up and use, and I use it together with pyenv.
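
For anyone curious, a rough sketch of what that setup can look like (project and package names are just examples):

```
pyenv install 3.11.9          # pick an interpreter with pyenv
pipx install poetry           # or: pip install poetry

poetry new myproject          # "myproject" is an example name
cd myproject
poetry add requests           # records the dependency in pyproject.toml and poetry.lock
poetry install                # creates/updates the project's virtual environment
poetry run python -V          # run commands inside that environment
```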

If your problem lies with your locally installed packages, then I think you're basically on your own, manually searching for packages you don't use. There's not really a way for pip/Python to know which packages are still relevant (at least not that I know of). If your problem is just about a single environment, then just delete the environment and start anew.

[-] G0FuckThyself@lemmy.world 2 points 1 year ago

Thanks, I will look into poetry. Yeah, the global Python packages are the main problem. Should've used venv from the start. 😅

[-] CodeBlooded@programming.dev 3 points 1 year ago

As an alternative to having to clean your Python environments, I’d like to suggest putting those efforts into mastering Docker. If you can master using Docker containers as your Python environment, you can cut through a lot of the pains regarding dealing with virtual environments, multiple Python installations, and the quasi-confusing PYTHONPATH environment variable.

I don’t even install Python on my machines anymore. I’m 100% “dockerized” and I’ll never go back.
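
For illustration, a minimal sketch of that kind of workflow (image tag and file names are just examples):

```
# Start a throwaway container from the official Python image,
# mounting the current project directory as the working directory
docker run --rm -it -v "$PWD":/app -w /app python:3.11 bash

# Inside the container, work as usual:
pip install -r requirements.txt
python main.py
```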

[-] gnus_migrate@programming.dev 4 points 1 year ago

This works great, assuming you're using Linux. On Windows this kind of setup is a nightmare to maintain, since if something goes wrong you have to troubleshoot through several layers of virtualization and weird OS integrations to understand what's happening. Venv is a much better solution for that.

[-] qwop@programming.dev 2 points 1 year ago

Yeah, my experience with Docker on Windows has been pretty bad: it uses high CPU and RAM at the best of times, and at worst it completely hangs my computer at 100% CPU usage, forcing a restart as the only fix.

I really don't understand why people are overcomplicating this. You can install multiple Python versions at once on Windows and it just works fine (you can use the py command to select the one you want).

Virtual environments are designed exactly for this use case. They've got integrations for pretty much everything, they're easy to delete/recreate, they're really simple to use, they're fast, and they just work.
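
For example (the version number is just whichever interpreter you happen to have installed):

```
# Pick a specific interpreter with the Windows py launcher
py -3.11 -m venv .venv

# Activate it (PowerShell; use .venv\Scripts\activate.bat from cmd)
.venv\Scripts\Activate.ps1

pip install -r requirements.txt

# Broken or stale? Just delete the .venv folder and recreate it.
```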

If virtual environments alone aren't quite enough you can use something like poetry or pipenv or the many other package management options, but in many cases even that is overkill.

[-] gnus_migrate@programming.dev 1 points 1 year ago* (last edited 1 year ago)

That matches my experience as well. Nothing kills a developer's motivation like spending time dealing with and debugging weird failures in their dev environment instead of actually working on the things they care about.

[-] CodeBlooded@programming.dev 1 points 1 year ago

I disagree, big time.

Now, of course, Linux knowledge is necessary to be productive with Docker in most cases. So, if Windows is all you know, Docker will have a learning curve for you regardless of where you run it.

In my experience, this kind of setup is ideal for Windows. Working on a Python project with a team of developers spread across macOS, Linux flavors, and Windows, without using Docker, would quickly illuminate Docker’s benefits here.

Can you speak more to how you had more than one level of virtualization when using Docker on Windows? 🤔

The only issue I ever ran into with Docker on Windows was the Docker VM’s clock getting out of sync with the host system’s clock (which was solved by restarting the VM 🤷‍♂️).

[-] gnus_migrate@programming.dev 3 points 1 year ago* (last edited 1 year ago)

The fact that you have to operate a VM on your machine in order to use it is a no go for me. Either use Linux as your OS in that case, or spend the time to make a dev environment that actually works. Even just mounting a directory is a painful exercise, and something that comes with performance limitations. I've had to deal with networking problems, drive problems, all sorts of issues that you really don't want to waste your time fixing when you have a deadline coming up.

Multiple levels of virtualization, meaning you have the containerization layer, which virtualizes the userspace of the VM, plus the virtualization of the VM itself. Your development environment ends up consisting of multiple layers of complex, not to mention fragile, technologies, which is an incredibly bad idea for something you rely on heavily for your day-to-day work.

The people I know who have to use it run a fully fledged Linux VM and do all their work on that. That is how bad the developer experience is on Windows.

Also, I'm a bit insulted that you immediately assumed that I'm speaking from a lack of experience, as opposed to years of experience supporting Docker dev environments on Windows and failing to find a solution that doesn't come with severe drawbacks.

[-] CodeBlooded@programming.dev 1 points 1 year ago* (last edited 1 year ago)

I didn’t mean to insult you. I apologize for that! I work with Docker a lot and have years of first-hand experience developing with it on Windows, macOS, and Linux. I have never experienced the pains you’re describing with network problems and drive problems; the things I’m doing with Docker touch all of these areas, and I feel that Docker actually eases the pain. With that, hearing someone say this stuff makes me feel like it’s not Docker, but rather the developer. I hope that makes sense from my point of view.

I can’t imagine developing Python applications on Windows without Docker, especially if I’m deploying this code to run on servers or “in the cloud,” or if I’m working with multiple developers on a project. It’s made development so much more “sane.”

I accept that perhaps you’re working on some very performance-focused projects that don’t deal well with running in a VM and it’s beyond my understanding. But saying stuff like “mounting directories is a painful exercise”? My experience has been the complete opposite. What you’re calling a “nightmare to maintain” has made Python projects a “dream to maintain” for me.

[-] gnus_migrate@programming.dev 1 points 1 year ago

I write Java mainly, not Python, but I understand why Python specifically might be easier with Docker even though venv exists and works fine on Windows. To be clear, Docker is a fantastic tool and it has its use cases, but using it for a local dev environment outside of Linux is a recipe for pain.

Every time I try to install it, I have to spend a bunch of time figuring out how to make it work, not to mention VPNs breaking it and the tooling to run it changing every six months. The VM that it runs in on Docker Desktop is largely undocumented, so you don't know how you can even log into it and troubleshoot, much less fix the issues that arise. This isn't a robust tool; it's a hodgepodge of technologies duct-taped together into something that can work, but it is extremely difficult to fix when it breaks.

I don't know why your experience is different, but don't assume that people don't know what they're doing when they say stuff like this. Just because it works on your machine doesn't mean it works on others. You can find lots of developers with similar experiences; it's not just me.

If you don't want to insult me, then believe what I'm saying rather than speculating about my abilities.

[-] CodeBlooded@programming.dev 1 points 1 year ago

My experience is working across many machines, all the major OSes, and assisting many developers, flexing many features of Docker, usually operating behind VPNs, for several years. This isn’t a “my machine” situation. Listening to a stranger on the internet describe how awful this technology is, I can only assume from my experiences that this is a “you” problem. I think that even if you disagree with me, that assessment is fair given the circumstances of us being strangers discussing this over the web with limited ability to share specifics regarding your issues.

I believe that you’re experiencing issues, I’m just unable to understand how it’s Docker’s fault from what you’ve shared. Again, this is based on a lengthy history working directly with, and enjoying the ease of, the parts of Docker that you’re calling problematic.

I don’t want to argue. I do believe you’re experiencing issues and I understand that deadlines don’t care about them! Instead of continuing here, what do you think about ending on this: the next time you run into one of these issues, would you mind sharing it in a Docker-focused Lemmy community and tagging me? I’d love the opportunity to see if I could assist with some of the issues you’re having. Maybe I could help, or at least learn why these problems you’re describing are not impacting me.

[-] gnus_migrate@programming.dev 1 points 1 year ago

It's not Docker's fault, it's the fault of the stack of crap that is needed to run it (WSL and co.). My point is that it isn't worth the trouble. I could figure it out myself, but dev containers don't bring enough value to my team to justify the investment, and I really don't want to spend a bunch of time troubleshooting issues related to it, not just for myself but for everyone on my team. I played that role before, and it is exhausting to have to do that on top of the other things I need to do.

Docker is really great for CI, for deployments, etc. I really like it, and I have spent a significant chunk of my career developing expertise in it. It's not something I would recommend locally unless you have no other choice, or you're running Linux and are able to use it natively.

[-] G0FuckThyself@lemmy.world 2 points 1 year ago

I have started to learn Docker recently and I might do this in the future.

[-] CodeBlooded@programming.dev 1 points 1 year ago

You won’t regret it; Docker is great for development and deployment. It’s definitely a desirable skill to have on your resume.

[-] EngineerDaryl@fosstodon.org 2 points 1 year ago

@CodeBlooded @G0FuckThyself yeah I really want to do this as well at my job so I can develop and test against the correct Python version (my laptop runs a newer version than our production environment), but the laptop is way too slow to do that. But I will get a new laptop soon, so who knows.

[-] uthredii@programming.dev 2 points 1 year ago

You should be able to use pip uninstall. See this link for details: https://pip.pypa.io/en/stable/cli/pip_uninstall/

Using it in a venv will affect the venv. Using it outside the venv will affect global packages.
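
For example (package names are placeholders):

```
# Remove a single package from the active environment
pip uninstall requests

# Remove several at once without confirmation prompts
pip uninstall -y requests flask

# Remove everything listed in a requirements file
pip uninstall -y -r requirements.txt
```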

[-] G0FuckThyself@lemmy.world 3 points 1 year ago

I know I can use pip to uninstall packages; the problem is I don't know which packages are needed and which are not. I wish pip were like pacman or apt, which can list orphan packages.

[-] coffeewithalex@lemmy.world 4 points 1 year ago

Sounds like you need poetry.

Or at least to track your dependencies, so that you can recreate your virtual environment.
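
Even just a snapshot of what the environment currently has goes a long way, e.g.:

```
# Record the exact versions of everything currently installed
pip freeze > requirements.txt
```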

[-] G0FuckThyself@lemmy.world 1 points 1 year ago

Thanks, I will look into poetry. Yeah, the global Python packages are the main problem. Should've used venv from the start. 😅

[-] const_void@lemmy.world 1 points 1 year ago

The ez way: delete site-packages & any rando venvs, then start anew with a fresh venv + pip install from requirements.txt.
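
Roughly, assuming the environment lives in a folder like .venv:

```
# Blow away the old environment
rm -rf .venv

# Start fresh
python -m venv .venv
source .venv/bin/activate        # .venv\Scripts\activate on Windows
pip install -r requirements.txt
```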

this post was submitted on 08 Jul 2023
19 points (100.0% liked)
