tl;dr Don't just update your psi4 source; do a fresh environment and build. There aren't comprehensive build helpers yet for v1.8, but there are some environment specs to help. What form do you want your future build helpers to take? For reference, here's how to get psi4 binary packages for all architectures.
Recent Upheaval
You may have noticed that the "source" option at https://psicode.org/installs/v18/ says "check back later". There are minor upheavals in the build procedure:
Libint2 switched source again last week (akin to the several L2 switches in spring 2022), so you'll want a new conda environment.
We're switching our primary binary distribution channel from `conda install psi4 -c psi4` to `conda install psi4 -c conda-forge` (the latter cmd is a simplification -- don't use it as-is), so while most of the `-c psi4` packages are perfectly compatible with master, they're not the freshest and will continue to decay.
New release means new tag, so you'll need to pull tags anyway (`git fetch upstream 'refs/tags/*:refs/tags/*'`) for versioning, distributed driver, and dispersion addons to work.
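The tag-fetch pattern can be illustrated entirely locally; a minimal sketch, assuming only that `git` is on your path (the `demo-*` directory names and the `v1.8` tag here are stand-ins for illustration, not the real repos):

```shell
# Stand in for the psi4 upstream: a repo with one commit and a release tag.
git init -q demo-upstream
git -C demo-upstream -c user.email=dev@example.com -c user.name=dev \
    commit -q --allow-empty -m "init"
git -C demo-upstream tag v1.8

# Stand in for your clone: add the remote, then pull all tags explicitly
# with the same refspec as in the announcement above.
git init -q demo-clone
git -C demo-clone remote add upstream "$PWD/demo-upstream"
git -C demo-clone fetch -q upstream 'refs/tags/*:refs/tags/*'
git -C demo-clone tag --list
```

The explicit `refs/tags/*:refs/tags/*` refspec matters because a plain `git fetch upstream` only brings in tags reachable from fetched branch heads.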
Interim Build Directions
Altogether, now is a good time to do a fresh clone, environment, and compile ... if only there were directions. The interim build guidance is:
Grab the appropriate env spec from `devtools/conda-envs/`, then issue something like `conda env create -f Linux-buildrun-maxeco.yaml` and activate the environment.
Clone psi4, then `cmake -S. -Bobjdir -DBUILD_SHARED_LIBS=ON ...`, and it usually configures fine.
For Apple Silicon, there isn't an env yaml file yet, but replacing `libblas=*=*mkl` with `libblas=*=*accelerate` or `libblas=*=*openblas`, getting rid of other mkl lines, and aggressively pruning qc addons (not many ported to arm64) should do the trick. See also the `target-sdk` stuff at https://github.com/psi4/psi4/blob/master/.github/workflows/ecosystem.yml
For Linux, if you want a high-AM L2, replace the line `conda-forge/label/libint_dev::libint=2.7.3dev1` (https://github.com/psi4/psi4/blob/master/devtools/conda-envs/Linux-buildrun-maxeco.yaml#L22) with `psi4/label/testing::libint2=2.7.2=h2fe1556_1`. Despite the different package names, versions, and channels, they're built from the same source, just different configuration. (I can't build a big AM w/i the 6h c-f time window.)
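Both env-spec tweaks above are plain text edits; a hedged sketch using `sed` (the two-line excerpt below is a fabricated stand-in for the real yaml, and the filename is illustrative -- edit your copy of the actual spec):

```shell
# Fake excerpt standing in for devtools/conda-envs/Linux-buildrun-maxeco.yaml.
cat > buildrun-excerpt.yaml <<'EOF'
  - conda-forge/label/libint_dev::libint=2.7.3dev1
  - libblas=*=*mkl
EOF

# Linux, high-AM Libint: swap in the psi4-channel build of the same source.
sed -i.bak \
  's|conda-forge/label/libint_dev::libint=2.7.3dev1|psi4/label/testing::libint2=2.7.2=h2fe1556_1|' \
  buildrun-excerpt.yaml

# Apple Silicon: point the BLAS pin at Accelerate instead of MKL.
sed -i.bak 's/libblas=\*=\*mkl/libblas=*=*accelerate/' buildrun-excerpt.yaml

cat buildrun-excerpt.yaml
# then: conda env create -f buildrun-excerpt.yaml
```

(`sed -i.bak` is the spelling that works on both GNU and BSD sed; you wouldn't combine the Linux and Mac edits in one file, of course -- this just shows both substitutions.)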
Future Build Tools
Previously, the main build helper was the `psi4-dev` conda package, which combines a maximal set of build tools (compilers, cmake, etc.) and psi4 buildtime ecosystem packages along with a little script, `psi4-path-advisor`, to firmly configure cmake with conda env locations. As far as I know, this is pretty convenient and durable, especially for build-once-use-forever workflows. Complications I've heard of are (1) on Mac, `CONDA_BUILD_SYSROOT` is sometimes required to be present or absent, and (2) `psi4-dev` is monolithic, awkward to update, and sometimes out-of-date with psi4 master's requirements. I, for one, don't often use `psi4-dev` because I want more flexibility in swapping out packages.
So, I'd be glad to hear thoughts on what form and contents the build helpers should take. A few plans and questions below, but any feedback appreciated.
Do you want Docker images of the stuff that was in `psi4-dev`? I think from PsiCon2022 the answer is yes.
Do you want minimal (build tools & req'd deps), maximal (+ optional addons), or micro (no build tools; just L2, g2g, libxc, qcng, scipy, pytest) Docker images -- or several of those choices? (I have one vote for minimal.)
Docker images are hefty (~600MB for the built psi4). Can I just push them to a `latest` tag and overwrite old ones, or does there need to be a history?
For Docker of psi4 itself, I'm only doing linux-64 (https://hub.docker.com/r/psi4/psi4/tags). Since docker is usually run VM-like, is that all developers want, or should I be building containers for other arch? Certainly I have the other-arch conda pkgs available -- it's just a matter of figuring out docker from them.
I can build singularity/apptainer images from the Docker images, but I haven't figured out how to upload them yet. For devs, are apptainers wanted, or will docker alone do?
I'll probably revive the `psi4-dev` pkg, just because it is simple and durable. (And it's the easiest way to convey the Intel-atop-GNU flags.) It'll be accessible via `conda install psi4/label/dev::psi4-dev -c conda-forge`. Thoughts on improving it?
Those `devtools/*/*buildrun*.yaml` env specs are semi-auto-generated from the ecosystem GHA. Are they handy enough to maintain and promote, even though they don't come with `cmake` lines?
Any other build tools to support other workflows wanted? It's easier to set them up all at once.
Psi4 Conda Packages at Present with Channels
Built psi4 packages are slightly off-topic, but I thought this could be a handy reference for the more exotic ones. Note that these don't have as many addons as previous `-c psi4` packages did. See https://github.com/orgs/psi4/projects/2 for the conda-forge progress tracker.
linux-64

- `conda install psi4 -c conda-forge/label/libint_dev -c conda-forge` (GNU compilers, AM5 L2, MKL, all pythons)
- `conda install psi4/label/dev::psi4 -c psi4/label/testing -c conda-forge` (Intel compilers, AM7 L2, MKL, even pythons)
- `conda install psi4/label/testing::psi4=1.8a2=py310hfdeccc3_2 libblas=*=*<mkl|openblas|etc> -c conda-forge/label/libint_dev -c conda-forge` (GNU compilers, AM5 L2, choose-your-own BLAS/LAPACK per https://conda-forge.org/docs/maintainer/knowledge_base.html#switching-blas-implementation (add `libopenblas=*=*openmp*` with openblas), py310, not quite v1.8)

win-64

- `conda install psi4 -c conda-forge/label/libint_dev -c conda-forge` (clang-cl compilers, AM5 L2 (w/o Hessian integrals, so it'll fall back to findif and some tests will fail unless you exclude the `d2ints` label), MKL, all pythons)

osx-64

- `conda install psi4 -c conda-forge/label/libint_dev -c conda-forge` (Clang compilers, AM5 L2, MKL, all pythons)

osx-arm64

- `conda install psi4 -c conda-forge/label/libint_dev -c conda-forge` (Clang compilers, AM5 L2, OpenBLAS, all pythons)
- `conda install psi4 libblas=*=*accelerate -c conda-forge/label/libint_dev -c conda-forge` (Clang compilers, AM5 L2, Accelerate BLAS, all pythons)