Discussion:
[gentoo-science] Supporting multiple MPI stacks with empi
Justin Bronder
2016-08-11 13:29:52 UTC
Basically, is anyone still using it over a self-built system using modules or
some similar project? There's some interest in the main Gentoo tree in porting to
multilib.eclass, which I think may ignore use cases important to HPC site
administrators; in particular, the ability to use emerge and empi to manage
multiple mpi/lapack/blas stacks. However, I have no proof aside from positive
feedback from back when empi was in development.

I definitely don't want to stand in the way of other motivated developers if
everyone using Gentoo for HPC is no longer using empi and instead using systems
outside of Gentoo.

So, you have a few weeks to speak up. If I don't hear anything back, I'll drop
empi and mpi.eclass from the science overlay.
--
Justin Bronder
Andrew Savchenko
2016-08-18 09:11:16 UTC
Post by Justin Bronder
Basically, is anyone still using it over a self-built system using modules or
some similar project?
There's some interest in the main Gentoo tree to port to
multilib.eclass which I think may be ignoring use cases important to HPC site
administrators. In particular, the ability to use emerge and empi to manage
multiple mpi/lapack/blas stacks. However, I have no proof aside from positive
feedback from back when empi was in development.
Yes, we are using it in production (two HPC setups) to allow
our users to switch between openmpi and mpich2.

We also manage multiple lapack/blas/cblas/etc implementations using
eselect tools from the sci overlay.
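
For readers unfamiliar with this setup: the eselect modules in the sci overlay
follow the standard eselect verbs (list/show/set), so switching the active
implementation is a one-line admin operation. A session might look roughly like
the following (module and implementation names are illustrative and depend on
what is actually installed on the system):

$ eselect blas list          # show installed BLAS implementations
$ eselect blas set 2         # make implementation 2 the system default
$ eselect mpi list           # likewise for MPI stacks managed via empi
$ eselect mpi set mpi-openmpi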
Post by Justin Bronder
I definitely don't want to stand in the way of other motivated developers if
everyone using Gentoo for HPC is no longer using empi and instead using systems
outside of Gentoo.
So, you have a few weeks to speak up. If I don't hear anything back, I'll drop
empi and mpi.eclass from the science overlay.
Best regards,
Andrew Savchenko
