Python deployment with CUDA support: examples?

Hi, please forgive a naive question after only searching for “cuda” in the search field and finding nothing. Does anybody have working examples of using Flox in a Python+CUDA environment?

While I’m no expert, I use Nix on a daily basis (macOS/Ubuntu) and try to encourage its adoption in projects I’m involved in, but historically the learning curve is enough of a turnoff for the vast majority of people who are just trying to get work done.

After going to great lengths to get CUDA working in a Python environment defined with a flake, I couldn’t recommend using Nix for a PyTorch project when a PyTorch Docker image already worked for other projects. As a result we have messy mixtures of Poetry, pip, requirements.txt, Conda, and what have you. FWIW I have only used things like poetry2nix in the most trivial examples; I also wouldn’t recommend it generally. In the CUDA setup I did get working in a flake, Nix only helps lock the dev- and build-time libraries and binaries; the Python environment itself is all dirty. I assume Flox must also support such an interaction model.
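For reference, the pattern I’m describing looks roughly like the sketch below: Nix pins the CUDA toolkit and the Python interpreter in a dev shell, while pip/Poetry manage the actual Python packages inside it. This is only illustrative; the attribute names (e.g. `cudatoolkit`, `python311`) and the driver library path depend on your nixpkgs revision and distro.

```nix
# Illustrative flake sketch, not a drop-in config:
# Nix provides the toolchain; pip-installed wheels find the
# CUDA libraries via LD_LIBRARY_PATH set in the shellHook.
{
  inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";

  outputs = { self, nixpkgs }:
    let
      system = "x86_64-linux";
      pkgs = import nixpkgs {
        inherit system;
        config.allowUnfree = true; # CUDA packages are unfree
      };
    in {
      devShells.${system}.default = pkgs.mkShell {
        packages = with pkgs; [
          python311      # interpreter pinned by nix
          cudatoolkit    # dev/build-time CUDA libs and binaries
        ];
        # Expose toolkit + driver libs to wheels installed with pip
        shellHook = ''
          export LD_LIBRARY_PATH=${pkgs.cudatoolkit}/lib:/run/opengl-driver/lib:$LD_LIBRARY_PATH
        '';
      };
    };
}
```

Inside `nix develop` you would then create a venv and `pip install torch` as usual, which is exactly the “dirty” Python environment I mean.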

Anyone have experience and/or guidance to share?

Thanks all

We haven’t yet exposed a flake integration in the Flox manifest, but we have plans to. For CUDA, we are working on an example environment, but it’s early and we still have to sort out a number of driver-related issues. We do have an accelerated PyTorch example, though it is currently only accelerated on macOS. It’s early, but you can check it out with `flox pull flox/sdxl`.
