Description
Describe the bug
GEOS crashes when thermal effects are enabled in the elastic mechanical model: the simulation fails with a negative BHP value followed by a segmentation fault.
The isothermal poroelastic model runs fine.
To Reproduce
Steps to reproduce the behavior:
- Unpack the mesh files and adjust the paths to the VTU files in the XML input files
- Run the attached model (`geosx ThermoPoroElastic.xml`); a minimal sketch of the relevant solver block is shown after this list
- See the error attached below
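For orientation, here is a minimal sketch of the kind of coupled solver block involved, assuming the GEOS input schema; the solver names and the `isThermal` attribute are my guesses from the types in the stack trace, and the attached `ThermoPoroElastic_base.xml` is authoritative:

```xml
<Solvers>
  <!-- Coupled single-phase poromechanics solver, matching the solver types
       visible in the stack trace below. isThermal="1" is assumed to be the
       switch that separates the failing thermal run from the working
       isothermal one. -->
  <SinglePhasePoromechanics
    name="poromechanicsSolver"
    solidSolverName="solidMechSolver"
    flowSolverName="flowSolver"
    isThermal="1"
    targetRegions="{ reservoir }"/>
</Solvers>
```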
Expected behavior
The thermoporoelastic simulation should run to completion, just as the isothermal poroelastic model does.
Screenshots
Rank 9: wellControls2: surface density computed with P_surface = 101325 Pa
Rank 9: wellControls2: total fluid density at surface conditions = 995.6540540734746 kg/sm3, total rate = 0 kg/s, total surface volumetric rate = 0 sm3/s
Rank 9: wellControls2: The BHP (at the specified reference elevation) = -488346.5410838615 Pa
Attempt: 0, ConfigurationIter: 0, NewtonIter: 0
( Rflow ) = ( 5.59e-08 ) ( Renergy ) = ( 0.00e+00 ) ( Rsolid ) = ( 4.44e-12 ) ( Rwell ) = ( 0.00e+00 ) ( Renergy ) = ( 0.00e+00 ) ( R ) = ( 5.59e-08 )
***** ERROR
***** SIGNAL: 11
***** LOCATION: (external error, captured by signal handler)
Signal no. 11 encountered: Segmentation fault
StackTrace of 24 frames
Frame 0: /lib64/libc.so.6
Frame 1: /lib64/libc.so.6
Frame 2: .....mpi/2021.9.0/lib/release/libmpi.so.12
....
Frame 8: PMPI_Allgatherv
Frame 9: hypre_MPI_Allgatherv
Frame 10: hypre_GaussElimSetup
Frame 11: hypre_MGRSetup
Frame 12: geos::HyprePreconditioner::setup(geos::HypreMatrix const&)
Frame 13: geos::HypreSolver::setup(geos::HypreMatrix const&)
Frame 14: geos::PhysicsSolverBase::solveLinearSystem(geos::DofManager const&, geos::HypreMatrix&, geos::HypreVector&, geos::HypreVector&)
Frame 15: geos::PhysicsSolverBase::solveNonlinearSystem(double const&, double const&, int, geos::DomainPartition&)
Frame 16: geos::PhysicsSolverBase::nonlinearImplicitStep(double const&, double const&, int, geos::DomainPartition&)
Frame 17: geos::PhysicsSolverBase::solverStep(double const&, double const&, int, geos::DomainPartition&)
Frame 18: geos::CoupledSolver<geos::SinglePhasePoromechanics<geos::SinglePhaseBase, geos::SolidMechanicsLagrangianFEM>, geos::SinglePhaseWell>::solverStep(double const&, double const&, int, geos::DomainPartition&)
Frame 19: geos::PhysicsSolverBase::execute(double, double, int, int, double, geos::DomainPartition&)
Frame 20: geos::EventBase::execute(double, double, int, int, double, geos::DomainPartition&)
Frame 21: geos::EventManager::run(geos::DomainPartition&)
Frame 22: geos::GeosxState::run()
Frame 23: main
Frame 24: __libc_start_main
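The trace shows the segfault inside hypre's MGR preconditioner setup (`hypre_GaussElimSetup` called from `hypre_MGRSetup`), i.e. during `HyprePreconditioner::setup` and before the Krylov solve starts. As a hedged debugging aid, this is the kind of linear solver block one could vary to isolate the problem; the attribute values are assumptions based on the GEOS linear solver documentation, not taken from the attached files:

```xml
<LinearSolverParameters
  solverType="gmres"
  preconditionerType="mgr"
  logLevel="1"/>
```

If the crash disappears with a different `preconditionerType`, that would point at the MGR coarse-grid (Gaussian elimination) setup for the thermal system.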
Platform:
- Machine: RHEL 8.10
- Compiler: gcc 13.1.0
- GEOS Version: 1.1.0
- GEOS Version sha: develop 1e617be, TPLs: 9687680c61fa3378f7b2ce7eea8327736809fb63
Additional context
Log files were made with 10 MPI processes, but the bug reproduces with 1 MPI process too.
Mesh files
The meshes are in .vtu format. I archived mesh_34_34_57, with which the bug can be reproduced; please remove the .txt extension, which was added to get the files through the upload filter.
mesh_34_34_57.7z.txt
Just in case, I am also attaching a smaller mesh that might help with debugging; however, I did not try to reproduce the issue with it.
mesh_4_4_6.vtu.txt
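For the first reproduction step, the mesh is referenced from the XML along these lines (a sketch; `VTKMesh` is the GEOS mesh import element, and the file path is whatever the archive unpacks to):

```xml
<Mesh>
  <!-- Adjust the file attribute to the location of the unpacked .vtu file. -->
  <VTKMesh
    name="mesh"
    file="mesh_34_34_57.vtu"/>
</Mesh>
```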
Poroelastic model (which runs)
PoroElastic.xml
PoroElastic_base.xml
run_isothermal.log
ThermoPoroelastic model (which fails)
run_thermal.log
ThermoPoroElastic.xml
ThermoPoroElastic_base.xml