[gpfsug-discuss] Singularity + GPFS

valleru at cbio.mskcc.org valleru at cbio.mskcc.org
Thu Apr 26 15:40:52 BST 2018


We run Singularity + GPFS on our production HPC clusters, and most of the time things work without any issues.

However, I do see a significant performance loss when running some applications in Singularity containers on GPFS.

As of now, the applications that have severe performance issues with Singularity on GPFS (deep learning applications) seem to be affected because of mmap I/O.
When I run the same applications on bare metal, the GPFS I/O is dramatically better than when running in Singularity containers.
I have yet to raise a PMR with IBM about this.
I have not seen performance degradation for any other kind of I/O, but I am not certain.
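For anyone who wants to check whether their own setup shows the same mmap-specific gap, a minimal sketch along these lines can be run both on bare metal and inside a Singularity container and the numbers compared. This is not the actual workload from the deep learning applications above; the file path and sizes are assumptions to adjust for your GPFS mount.

```python
# Minimal sketch to compare buffered read() throughput against mmap page
# access on the same file. Run it on bare metal and inside a Singularity
# container; a gap that appears only in the mmap numbers would point at
# mmap I/O, consistent with the behaviour described above.
# Assumptions: FILE should be placed on the GPFS filesystem; SIZE is
# deliberately small here and should be scaled up for a real test.
import mmap
import os
import time

FILE = "testfile.bin"        # hypothetical path; put this on GPFS
SIZE = 64 * 1024 * 1024      # 64 MiB test file (increase for real runs)

# Create the test file if it does not already exist at the right size.
if not os.path.exists(FILE) or os.path.getsize(FILE) != SIZE:
    with open(FILE, "wb") as f:
        f.write(os.urandom(SIZE))

def read_throughput():
    """Sequential buffered read()s in 1 MiB chunks; returns MB/s."""
    start = time.perf_counter()
    with open(FILE, "rb") as f:
        while f.read(1024 * 1024):
            pass
    return SIZE / (time.perf_counter() - start) / 1e6

def mmap_throughput():
    """Touch one byte per 4 KiB page through an mmap; returns MB/s."""
    start = time.perf_counter()
    with open(FILE, "rb") as f:
        with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as m:
            total = 0
            for off in range(0, SIZE, 4096):
                total += m[off]
    return SIZE / (time.perf_counter() - start) / 1e6

print(f"read(): {read_throughput():8.1f} MB/s")
print(f"mmap(): {mmap_throughput():8.1f} MB/s")
```

Remember to drop the page cache (or use a file larger than RAM) between runs, otherwise both paths will just measure cached reads.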

Regards,
Lohit

On Apr 26, 2018, 10:35 AM -0400, Nathan Harper <nathan.harper at cfms.org.uk> wrote:
> We are running on a test system at the moment, and haven't run into any issues yet, but so far it's only been 'hello world' and running FIO.
>
> I'm interested to hear about experience with MPI-IO within Singularity.
>
> > On 26 April 2018 at 15:20, Oesterlin, Robert <Robert.Oesterlin at nuance.com> wrote:
> > > Anyone (including IBM) doing any work in this area? I would appreciate hearing from you.
> > >
> > > Bob Oesterlin
> > > Sr Principal Storage Engineer, Nuance
> > >
> > >
> > > _______________________________________________
> > > gpfsug-discuss mailing list
> > > gpfsug-discuss at spectrumscale.org
> > > http://gpfsug.org/mailman/listinfo/gpfsug-discuss
> > >
>
>
>
> --
> Nathan Harper // IT Systems Lead
>
>
> e: nathan.harper at cfms.org.uk   t: 0117 906 1104  m:  0787 551 0891  w: www.cfms.org.uk
> CFMS Services Ltd // Bristol & Bath Science Park // Dirac Crescent // Emersons Green // Bristol // BS16 7FR
>
> CFMS Services Ltd is registered in England and Wales No 05742022 - a subsidiary of CFMS Ltd
> CFMS Services Ltd registered office // 43 Queens Square // Bristol // BS1 4QP