raptorista2
October 16th, 2017, 11:24 AM
Hi everybody. I'm running Ubuntu 16.04 and I recently ran into problems with my Python packages. While searching for a solution I came across multiple sources advising to prefer installing packages with pip over apt, and even more sources advising not to mix the two installation methods, so I decided to clean up my system, which currently has python packages installed both ways. Easier said than done: I came up with several questions on best practices for good system management.
There is at least one python package that I need and cannot find in Ubuntu's repos, so I decided to try to uninstall all the python packages that I had installed from apt and install the corresponding pip packages, but this turns out to break other packages' dependencies, so I arrived at the following stalemate:
I want package meshio, which is available in pip but not in apt. I need this package only for limited use in a specific project.
Package numpy, currently installed from apt and also available in pip, is a dependency of another application, e.g. Paraview, that I can't get from pip, so uninstalling apt::numpy in favor of pip::numpy would remove Paraview, which I do need.
Package petsc4py is only available in pip and depends on petsc at a version available in pip but not in apt. I already have apt::petsc and need it for other things, so installing pip::petsc4py would pull in pip::petsc alongside apt::petsc [at a different version].
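For reference, this is roughly how I've been checking which copy of a package the interpreter actually picks up; the shadowing between apt's dist-packages directory and pip's install location is exactly what worries me (the second command uses the stdlib json module as a safe illustration; I run the same check with numpy):

```shell
# Print the interpreter's import search path: apt packages live in
# /usr/lib/python3/dist-packages, pip-installed ones usually end up in
# /usr/local/lib/python3.X/dist-packages, and path order decides which wins.
python3 -c "import sys; print('\n'.join(sys.path))"

# Ask a module where it was actually imported from (json shown here
# as an always-available example; substitute numpy, petsc4py, etc.).
python3 -c "import json; print(json.__file__)"
```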
These are my biggest problems right now. As far as solutions are concerned, these are my ideas:
Keep in apt all python packages that are dependencies of non-python packages [e.g. paraview]; remove all apt packages that I installed manually; install the pip version of all packages that I need, including those that I keep in apt. This solution implies duplicating packages.
Keep in apt all python packages that are dependencies of non-python packages [e.g. paraview]; remove all apt packages that I installed manually and install the pip version of only those packages that I need. This still duplicates the packages that are dependencies of both apt packages and pip packages.
Remove all python packages from apt, manually compile the applications that get removed as a consequence, and link them against the python packages installed by pip. This defeats the purpose of having a repository and I would really like to avoid it.
Virtual environments. I have only a rough understanding of what these are, but the idea would be to remove all pip packages, keep all apt python packages that are dependencies of non-python packages, and then create a virtualenv for every python project of mine that needs special dependencies not provided by apt. My understanding is that a virtualenv could still see the packages installed from apt, with pip used inside it only to supply the packages that are missing or at the wrong version. This looks like the cleanest solution, although I have no experience with virtual environments and would have to learn to use them.
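If I understand it correctly, option 4 would look roughly like this (just a sketch, assuming the python3-venv apt package is installed; the project path is made up, and the pip install line is commented out since it is the part I'd only run per project):

```shell
# One environment per project; --system-site-packages lets the venv
# see apt-installed packages such as numpy, so pip would only have
# to provide what apt lacks (e.g. meshio).
python3 -m venv --system-site-packages ~/venvs/meshproject
source ~/venvs/meshproject/bin/activate

# sys.prefix now points inside the venv instead of /usr
python3 -c "import sys; print(sys.prefix)"

# pip install meshio   # would install into the venv only, not system-wide
deactivate             # the system python is untouched outside the venv
```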
What is the correct way to deal with this situation? Any other solution is welcome.
Thanks in advance for any help.