bump dependencies #22

Open
CJ-Wright opened this issue Mar 2, 2018 · 17 comments
Labels
GSOC Google Summer of Code

Comments

@CJ-Wright
Member

CJ-Wright commented Feb 28, 2018

It would be good to have a way to bump dependencies along with the versions.
See: https://github.com/conda/conda-build/blob/master/conda_build/skeletons/pypi.py#L869
@isuruf
@sodre

@jakirkham
Contributor

An easier strategy to consider, since we only want to parse the requirements, might be to run python setup.py egg_info in the source directory. This generates metadata in a *.egg-info directory (where * is the project name), including a requires.txt file. That file lists all of the package's requirements, including extras, in standard requirements.txt format, so it could be loaded and parsed with something like requirements-parser (which we don't seem to have packaged yet: conda-forge/staged-recipes#5303).
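
Roughly, a sketch of that flow (assuming a setuptools-based source tree and requirements-parser; the "src" path is illustrative):

```python
# Hedged sketch: generate egg-info metadata and parse requires.txt.
import glob
import subprocess

import requirements  # requirements-parser

# Writes *.egg-info/requires.txt into the source directory.
subprocess.run(["python", "setup.py", "egg_info"], cwd="src", check=True)

# The egg-info directory is named after the project, so glob for it.
requires_txt = glob.glob("src/*.egg-info/requires.txt")[0]

# requires.txt groups extras under [section] headers, which are not
# valid requirements.txt syntax, so keep only the unconditional deps here.
core_lines = []
with open(requires_txt) as f:
    for line in f:
        if line.startswith("["):
            break
        core_lines.append(line)

for req in requirements.parse("".join(core_lines)):
    print(req.name, req.specs)  # e.g. numpy [('>=', '1.13')]
```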

Also might look at pipreqs for some useful ideas, though it looks like it mainly scans for imports. Still, it could be handy for detecting a setuptools requirement at runtime (e.g. code that uses pkg_resources), which is something people seldom state explicitly. Then again, maybe it is easier to just find that one case by grepping.

@CJ-Wright
Member Author

How do we decide between run and build requirements?

CJ-Wright added the GSOC (Google Summer of Code) label Mar 3, 2018
@jakirkham
Contributor

They are run dependencies. It doesn't export build dependencies (e.g. setup_requires). Just checked.

More generally we have discouraged anyone from adding stuff to build beyond python and setuptools/pip unless it is actually used for the build (e.g. cython). There are historic reasons for it, but it is also just cleaner and a bit nicer on the CIs. It will also make this task easier if we stick to that guideline.

IOW if someone actually has a build dependency, it is reasonable to assume that they are linking to it or maybe it's some other build tool. Alternatively it could be an old recipe not following best practices.

IMHO it is fair to say we don't touch build dependencies. It's also probably best, as the maintainer may have pinned the build requirements due to build issues that would not be obvious from inspecting the source (e.g. breaks on some version of cython).

Sound reasonable? Other thoughts/concerns?

@CJ-Wright
Member Author

CJ-Wright commented Mar 3, 2018

That sounds reasonable to me. I have seen a few PRs with massive build requirements (as python packages), but those usually turn out not to be needed.

I usually default to "the bot makes a reasonable, non-aggressive decision, and maintainers/reviewers can be more aggressive", so having the bot defer to the recipe for build requirements fits well with that sentiment.

@jakirkham
Contributor

Sure. Along those lines, it would also be reasonable to parse out the optional dependencies that maintainers have already added to the recipe and refresh those as well. Am hopeful this ends up being mostly straightforward.

@jakirkham
Contributor

Would it be possible to add a note to the PRs reminding maintainers to check dependencies themselves? After talking to a few maintainers, I don't think they are aware the bot isn't doing this for them.

@jakirkham
Contributor

Linking issue (pypi/warehouse#474), as this would provide an API for querying dependencies from Warehouse.

@jdblischak
Contributor

@jakirkham @CJ-Wright Here are some thoughts about handling dependencies for R packages.

conda skeleton cran does a good job of determining the R package dependencies. Since the bot script already depends on conda-smithy, using conda-build does not add an extra dependency. One possible strategy would be:

  1. Download the current meta.yaml from the feedstock repo. Extract the current dependencies using conda_build.metadata.MetaData. Keep only the non-R dependencies, i.e. those that do not start with r-.
  2. Run conda skeleton cran (or conda_build.api.skeletonize). Extract the updated dependencies.
  3. Merge the R and non-R dependencies.
  4. Insert the merged dependencies back into the original meta.yaml (along with the updated version number and sha256), as sketched below.
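
A rough sketch of those four steps (the feedstock path and package name are illustrative, and a real implementation would also need to preserve the recipe's Jinja2 templating, version, and sha256 on write-back):

```python
# Hedged sketch of the merge strategy above; paths and names are illustrative.
from conda_build import api
from conda_build.metadata import MetaData

# 1. Parse the feedstock's current recipe and keep the non-R run deps.
current = MetaData("r-somepkg-feedstock/recipe")
non_r = [d for d in current.get_value("requirements/run")
         if not d.startswith("r-")]

# 2. Regenerate the recipe from CRAN to get fresh R dependencies.
api.skeletonize(packages=["somepkg"], repo="cran", output_dir="skeleton")
fresh = MetaData("skeleton/r-somepkg")
r_deps = [d for d in fresh.get_value("requirements/run")
          if d.startswith("r-")]

# 3. Merge; 4. write the result back into the original meta.yaml.
merged = sorted(set(non_r) | set(r_deps))
print(merged)
```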

If you're interested in obtaining the dependencies of many packages (e.g. to build a dependency graph), one option would be to call available.packages() via rpy2.
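
For reference, a small sketch of that rpy2 call (how the returned R matrix is indexed here is an assumption):

```python
# Hedged sketch: query CRAN's package index via rpy2.
from rpy2.robjects.packages import importr

utils = importr("utils")
pkgs = utils.available_packages()  # R matrix, one row per CRAN package

# Columns include "Package", "Version", "Depends", "Imports", ...
print(list(pkgs.colnames))
```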

Hopefully some of that was useful. Please let me know if you have specific questions.

@jakirkham
Contributor

This might be more doable now with Python 3.8's importlib.metadata.
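
For example (the distribution must already be installed in the running environment; "requests" is just an illustration):

```python
# Hedged sketch: importlib.metadata (Python 3.8+) reports the declared
# dependencies of an installed distribution.
from importlib.metadata import requires

for req in requires("requests") or []:
    print(req)  # e.g. "urllib3<1.27,>=1.21.1", including "; extra == ..." markers
```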

@jakirkham
Contributor

cc @beckermr (as you asked about this in the core meeting earlier 😉)

@beckermr
Contributor

beckermr commented Feb 5, 2020

Thank you!

@jakirkham
Contributor

Yeah was about to say. I don't think this is a blocker for the auto-merge, but it is highly desirable functionality. 😉

For things like C/C++ this mostly falls out of conda-forge-pinning, run_exports, migrations, etc. When it doesn't, the build already fails, much like what pip check would do for us 🙂

@jakirkham
Contributor

Should add that there is some other work along these lines. So hopefully the bot can just leverage it once it's ready.

Please feel free to add more and/or correct me as needed 🙂

cc @ocefpaf @marcelotrevisani

@marcelotrevisani
Member

I hope to speed up development on my side to support this as well; it is one of my goals for the "new skeleton".
Let's see if I can present something next week :)

@CJ-Wright
Member Author

@medb what do you mean? This issue is about keeping requirements stated in the meta.yaml properly pinned. Is your question related to that? If not please open a new issue!

@jakirkham
Contributor

One option would be to use Grayskull (#1471).

@jakirkham
Contributor

Looks like PEP 658 is now deployed on PyPI! 🎉

Maybe this is another option for pulling this metadata?
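
A hedged sketch of what that could look like against PyPI's Simple JSON API (the "core-metadata" key follows PEP 714/691 and the ".metadata" URL suffix follows PEP 658; treat the details as assumptions, and the project name is illustrative):

```python
# Hedged sketch: fetch a wheel's METADATA via PEP 658 without
# downloading the wheel itself.
import json
import urllib.request
from email.parser import Parser

def requires_dist(project):
    req = urllib.request.Request(
        f"https://pypi.org/simple/{project}/",
        headers={"Accept": "application/vnd.pypi.simple.v1+json"},
    )
    with urllib.request.urlopen(req) as resp:
        files = json.load(resp)["files"]
    # Pick the newest wheel advertising PEP 658 metadata ("core-metadata"
    # per PEP 714; older responses used "dist-info-metadata").
    wheel = next(
        f for f in reversed(files)
        if f["filename"].endswith(".whl")
        and (f.get("core-metadata") or f.get("dist-info-metadata"))
    )
    with urllib.request.urlopen(wheel["url"] + ".metadata") as resp:
        meta = Parser().parsestr(resp.read().decode())
    return meta.get_all("Requires-Dist") or []

print(requires_dist("requests"))
```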
