18 January 2018

I am trying to run an ABAQUS analysis with MPI-based parallelization, using the ABAQUS iterative solver, on a model of about 200,000 3D stress/pore-pressure elements, on a shared-memory computer with a maximum of 40 CPU cores. I am using my own UMAT and USDFLD subroutines and am fairly confident that there is nothing wrong with my code, as the analysis progresses correctly up to some random point, at which an error message is issued (e.g. the attached file). This happens even though only a small fraction of the total available memory is in use during the solution. Sometimes the memory remains occupied after the error, even though there is no CPU usage.
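For context, the kind of invocation I mean is roughly the following (only a sketch: the job and subroutine file names are placeholders, not my actual files):

    abaqus job=myJob input=myJob.inp user=my_umat_usdfld.f cpus=40 mp_mode=mpi interactive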

I would really appreciate it if you could share your thoughts on the possible source of this error and on how I might resolve the issue.
