geodynamics / axisem

AxiSEM is a parallel spectral-element method to solve 3D wave propagation in a sphere with axisymmetric or spherically symmetric visco-elastic, acoustic, anisotropic structures.

TeX 13.21% MATLAB 1.17% Python 2.60% Fortran 79.58% Perl 0.67% Shell 2.48% Gnuplot 0.02% C 0.16% CMake 0.11%
fortran high-performance-computing mpi seismology spectral-elements

axisem's People

Contributors: andri221, dmiller423, eheien, martinvandriel, sstaehler

axisem's Issues

Mesher crashes when using OpenMP with Intel Fortran

Crashes with

forrtl: severe (408): fort: (2): Subscript #1 of the array IND1 has value 65 which is greater than the upper bound of 64

Image              PC                Routine            Line        Source             
xmesh              0000000000872BBD  sorting_mp_pmerge         167  sorting.f90

SOLVER fails on ppc64le: Depends on SSE intrinsics (xmmintrin.h)

Following the README's instructions to run axisem on a ppc64le (IBM POWER8) machine, I encounter the following error when compiling the SOLVER:

[u0017592@sys-82824 SOLVER]$ ./submit.csh testrun1
Using mesh  MESHES/testmesh
copying mesh_params.h from  MESHES/testmesh
mpif90 -O3 -fopenmp         -Dsolver -c  global_parameters.f90
mpif90 -O3 -fopenmp         -Dsolver -c  data_proc.f90
mpif90 -O3 -fopenmp         -Dsolver -c  clocks.f90
mpif90 -O3 -fopenmp         -Dsolver -c  kdtree2.f90
gcc -O3                   -c -o ftz.o ftz.c
gcc -O3                   -c -o pthread.o pthread.c
cd UTILS; make
ftz.c:23:23: fatal error: xmmintrin.h: No such file or directory
#include <xmmintrin.h>
                    ^
compilation terminated.
mpif90 -O3 -fopenmp         -Dsolver -c  data_io.f90
mpif90 -O3 -fopenmp         -Dsolver -c  data_spec.f90
make[1]: Entering directory `/home/u0017592/projects/axisem/SOLVER/UTILS'
mpif90 -O3 -fopenmp          -c nc_postroutines.F90
mpif90 -O3 -fopenmp          -c field_transform.F90
mpif90 -O3 -fopenmp         -Dsolver -c  interpolation.f90
mpif90 -O3 -fopenmp         -Dsolver -c  list.f90
mpif90 -O3 -fopenmp         -Dsolver -c  data_time.f90
make: *** [ftz.o] Error 1
make: *** Waiting for unfinished jobs....
mpif90 -O3 -fopenmp          -c post_processing.F90
mpif90 -O3 -fopenmp       field_transform.o -o xfield_transform
mpif90 -O3 -fopenmp       post_processing.o nc_postroutines.o -o xpost_processing
make[1]: Leaving directory `/home/u0017592/projects/axisem/SOLVER/UTILS'
ERROR: Compilation failed, please check the errors.

ftz.c will have to be changed to optionally use AltiVec in order to support ppc64le machines, or some other alternative will have to be provided.

Note: I also had to manually remove -march=native, as it isn't recognized by my platform's GCC (4.8.3 20140911 (Red Hat 4.8.3-9)), but that was a fairly obvious fix.
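
For reference, a portable way to request flush-to-zero behaviour without SSE intrinsics is Fortran's own IEEE intrinsic modules; this is only a sketch of an alternative, not the fix adopted in AxiSEM, and it works only on targets whose compiler and hardware support underflow control:

program set_ftz
  ! Minimal sketch: request abrupt underflow (flush denormals to zero)
  ! through the Fortran 2003 IEEE intrinsic modules instead of xmmintrin.h.
  use, intrinsic :: ieee_arithmetic
  implicit none
  if (ieee_support_underflow_control()) then
     call ieee_set_underflow_mode(gradual=.false.)  ! abrupt underflow = flush-to-zero
     print *, 'abrupt underflow (flush-to-zero) enabled'
  else
     print *, 'underflow control not supported on this target'
  end if
end program set_ftz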

v1.4 in README

README.md references v1.4 but the latest tagged release is v1.3.

Can you clear up the discrepancy?

CMake does not search for NetCDF correctly.

I have installed the packaged version of NetCDF for Fortran, but CMake does not correctly find it.

$ cmake .
...
-- Failed to find NetCDF interface for F90
-- Could NOT find NetCDF (missing:  NetCDF_has_interfaces) 
-- Configuring done
-- Generating done

Note that there are two sets of libraries, a non-MPI one:

$ rpm -q netcdf-fortran-devel
netcdf-fortran-devel-4.2-11.fc20.x86_64
$ rpm -ql netcdf-fortran-devel | grep netcdf.inc
/usr/include/netcdf.inc
$ rpm -ql netcdf-fortran-devel | grep netcdf.mod
/usr/lib64/gfortran/modules/netcdf.mod
$ rpm -ql netcdf-fortran-devel | grep netcdff
/usr/lib64/libnetcdff.so

and an MPI version,

$ rpm -q netcdf-fortran-openmpi-devel
netcdf-fortran-openmpi-devel-4.2-11.fc20.x86_64
$ rpm -ql netcdf-fortran-openmpi-devel | grep netcdf.inc
/usr/include/openmpi-x86_64/netcdf.inc
$ rpm -ql netcdf-fortran-openmpi-devel | grep netcdf.mod
/usr/include/openmpi-x86_64/netcdf.mod
$ rpm -ql netcdf-fortran-openmpi-devel | grep netcdff
/usr/lib64/openmpi/lib/libnetcdff.so

I don't think FindNetCDF.cmake correctly understands this split.
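
As a quick sanity check independent of CMake, one can verify which NetCDF Fortran interface is actually usable by compiling a tiny test against the intended variant; this is just a diagnostic sketch, using the module path reported by rpm above:

! Compile e.g. against the non-MPI variant:
!   gfortran -I/usr/lib64/gfortran/modules nctest.f90 -lnetcdff -o nctest
program nctest
  use netcdf
  implicit none
  ! nf90_inq_libvers returns the version string of the linked netcdf-fortran library
  print *, 'netcdf-fortran version: ', trim(nf90_inq_libvers())
end program nctest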

Should there be a top-level CMakeLists.txt?

Should there be a top-level CMakeLists.txt, or should MESHER and SOLVER be compiled separately?

Looking back at this fix:
e9f73ba

it seems to indicate that there should be a top-level build. Generally, the build instructions in the documentation are pretty vague. If you can clarify some of the finer points, I can create a fork and help with that.

Possibly answering my own question: it looks like the answer is yes, and this disappearing CMakeLists.txt file has been a problem (probably promoted by git) for a while. It looks like the most recent commit with a top-level CMakeLists.txt is: 4e5c7b1a3d5df6fd664e65ad9ab7bef9f2f1f1bd

Is that correct?

... but then I see a later commit where the cmake directory appears to be intentionally removed. I restored both of these from their most recent commits and appear to have compiled the code (still needs a test), but I would certainly appreciate any guidance from the developers. Thanks in advance!

recursive I/O error

For very coarse (and therefore fast, >100s) meshes, I sometimes get solver crashes with "recursive I/O error" when compiling with ifort.

forrtl: severe(40): recursive I/O operation, unit 6, file unknown

It happens after line 444 in nc_routines.F90 and is very elusive (it happens at random in about 1 out of 5 runs). A web search did not produce conclusive results.

Workaround: Set verbosity to 0 in inparam_advanced, which solved the problem for me.
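
For context, a minimal pattern that triggers this class of error with ifort is an I/O statement on unit 6 that is still in progress while another write to unit 6 starts, e.g. a function in an output list that itself prints; the sketch below only illustrates the error class, it is not the AxiSEM code, and the verbose output around nc_routines.F90 presumably hits the same runtime check:

program recursive_io_demo
  ! Illustration only: the WRITE on unit 6 is still active when noisy()
  ! performs its own WRITE to unit 6, which ifort reports as
  ! "forrtl: severe (40): recursive I/O operation, unit 6".
  implicit none
  write(6,*) 'result = ', noisy()
contains
  integer function noisy()
    write(6,*) 'diagnostic printed from inside noisy()'   ! nested I/O on the same unit
    noisy = 42
  end function noisy
end program recursive_io_demo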

External anisotropic models being read incorrectly

I am trying to run simulations using anisotropic models. I first run AxiSEM with the default anisotropic model prem_ani, which appears to work. I then read this model back in, but AxiSEM converts it into an isotropic model that merely has the format of an anisotropic one (vph = vpv, vsh = vsv, eta = 1). This means that the anisotropic simulation is not performed. My interest here is to read in my own (radially) anisotropic models, so first I want to make sure that AxiSEM can read in its own anisotropic models.

Reads in this (anisotropic excerpt; note columns 3-7):

6346600.  3380.75  8022.06  4396.02  8190.32  4611.80  0.90039 0.57827000E+05 0.60000000E+03
6335480.  3379.54  8009.46  4398.58  8182.26  4601.82  0.90471 0.57827000E+05 0.60000000E+03
6324360.  3378.33  7996.86  4401.15  8174.20  4591.84  0.90904 0.57827000E+05 0.60000000E+03
6313240.  3377.12  7984.26  4403.71  8166.14  4581.86  0.91336 0.57827000E+05 0.60000000E+03
6302120.  3375.91  7971.66  4406.27  8158.08  4571.88  0.91769 0.57827000E+05 0.60000000E+03
6291000.  3374.71  7959.06  4408.83  8150.02  4561.90  0.92201 0.57827000E+05 0.60000000E+03

Changes to this (isotropic excerpt; note columns 3-7):

6346600.  3380.75  8022.06  4396.02  8022.06  4396.02  1.00000 0.57827000E+05 0.60000000E+03
6335480.  3379.54  8009.46  4398.58  8009.46  4398.58  1.00000 0.57827000E+05 0.60000000E+03
6324360.  3378.33  7996.86  4401.15  7996.86  4401.15  1.00000 0.57827000E+05 0.60000000E+03
6313240.  3377.12  7984.26  4403.71  7984.26  4403.71  1.00000 0.57827000E+05 0.60000000E+03
6302120.  3375.91  7971.66  4406.27  7971.66  4406.27  1.00000 0.57827000E+05 0.60000000E+03
6291000.  3374.71  7959.06  4408.83  7959.06  4408.83  1.00000 0.57827000E+05 0.60000000E+03

Please see attached files for complete details of the external model and the model after reading.

external_model.txt
1dmodel_axisem.txt

TODO unification of "identical" files in mesher and solver

  • analytic_spheroid_mapping.f90 should do the same thing, but is quite different and needs testing
  • splib.f90 has changed some of the interfaces. It is probably best to compute everything GLL/GLJ-related in the mesher and store it in the mesh file; then the whole splib can be removed from the solver
  • clocks.f90 should also be unified

CMake Error at FindOpenMP_Fortran.cmake

Running RHEL 7.1 on a ppc64le machine, I get the following build error:

[u0017592@sys-82824 axisem]$ sudo cmake .
-- Found MPI_C: /usr/local/lib/libmpi.so
-- Found MPI_CXX: /usr/local/lib/libmpi.so
-- Found MPI_Fortran: /usr/local/lib/libmpi_usempif08.so;/usr/local/lib/libmpi_usempi_ignore_tkr.so;/usr/local/lib/libmpi_mpifh.so;/usr/local/lib/libmpi.so
-- Try OpenMP Fortran flag = [/openmp]
-- Try OpenMP Fortran flag = [/Qopenmp]
-- Try OpenMP Fortran flag = [-openmp]
CMake Error at cmake/FindOpenMP_Fortran.cmake:83 (MESSAGE):
  OpenMP found, but test code did not run
Call Stack (most recent call first):
  CMakeLists.txt:21 (FIND_PACKAGE)


-- Configuring incomplete, errors occurred!

Just to make sure OpenMP was kosher, I manually created the testFortranOpenMP.f90 file that cmake generates and compiled it with gfortran -o testFotranOpenMP -fopenmp testFortranOpenMP.f90. It compiles successfully and its output is 2 (I do have two processors).

It's not clear to me why CMake reports this as failing if it works when I do it manually.
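
For reference, the manual check described above amounts to a program along these lines (a sketch; the source that CMake actually generates may differ), compiled with gfortran -fopenmp:

program test_openmp
  ! If this compiles with -fopenmp and prints the processor count, the
  ! compiler's OpenMP support is fine and the failure lies in how
  ! FindOpenMP_Fortran.cmake builds or runs its test program.
  use omp_lib
  implicit none
  print '(i3)', omp_get_num_procs()
end program test_openmp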

Possible SOLVER issues

I have just started trying to use AxiSEM and I’m running into a few issues.

  1. I can generate a mesh but once running the SOLVER I can’t seem to visualise the result in Paraview properly. For a full 2D half-disc only the crust layer shows up. And if I generate a mesh for a slab model with say, 100km depth and 1 degree of colatitude, the SOLVER doesn’t output any .xdmf files at all.

Whenever I generate a mesh I get these messages (not sure if this is relevant, but I thought I'd include it just in case):

rm: No match.
rm: No match.
rm: No match.
make: Nothing to be done for `all'.
[1] 71684
xmesh submitted, output in "OUTPUT"
After the run, move the mesh to a new directory via:
./movemesh
This will be located in ../SOLVER/MESHES/

  2. Changing the timestep or sampling rate in the inparam_source file doesn't appear to do anything. When using Instaseis with the databases I generate, changing the dt argument also does nothing (I mention this because I don't know if it's an issue with the databases I'm generating with AxiSEM).

  3. I'm trying to validate the installation by comparing the PREM_20s_ANI_FORCES database (downloadable from the Instaseis site) with my own 20 s database using the same background model. So far I'm seeing amplitude discrepancies, and possibly frequency discrepancies as well, although I don't know whether this is also a sampling issue.

[figure: 20s_db_comp_V]

Thanks in advance.

missing xpost_processing and xfield_transform

I have installed AxiSEM 1.3 and am trying to run the quick-start steps in the manual. I got to step 10:

  10. ./submit.csh PREM_mrr_50s_gauss_1800s ⇒ compiles and runs the code

and I get an error saying that xpost_processing and xfield_transform cannot be copied because they don't exist.

Here's the output of step 10:

fatal: Not a git repository (or any of the parent directories): .git
Using mesh MESHES/PREM_50s
copying mesh_params.h from MESHES/PREM_50s
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include global_parameters.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include data_proc.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include clocks.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include kdtree2.f90
gcc -O3 -march=native -c -o ftz.o ftz.c
gcc -O3 -march=native -c -o pthread.o pthread.c
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include data_io.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include data_spec.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include interpolation.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include list.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include data_time.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include data_source.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include analytic_semi_mapping.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include subpar_mapping.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include apply_masks.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include data_pointwise.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include data_heterogeneous.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include nc_helpers.F90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include data_matr.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include clocks_wrapper_solver.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include data_mesh.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include data_comm.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include background_models.F90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include commpi.F90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include analytic_spheroid_mapping.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include nc_snapshots.F90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include unrolled_loops.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include rotations.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include analytic_mapping.f90
pthread.c:39:4: warning: implicit declaration of function 'nc_dump_strain_to_disk' is invalid in C99
[-Wimplicit-function-declaration]
nc_dump_strain_to_disk();
^
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include pointwise_derivatives.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include stiffness_fluid.f90
1 warning generated.
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include commun.F90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include utlity.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include nc_routines.F90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include get_mesh.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include lateral_heterogeneities.F90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include def_grid.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include attenuation.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include meshes_io.F90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include seismograms.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include source.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include wavefields_io.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include get_model.F90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include parameters.F90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include def_precomp_terms.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include stiffness_di.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include stiffness_mono.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include stiffness_quad.f90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include time_evol_wave.F90
mpif90 -O3 -march=native -fopenmp -Denable_netcdf -Dsolver -c -I /opt/netcdf-4.4.1.1/include main.f90
mpif90 -O3 -march=native -fopenmp -o axisem background_models.o commpi.o commun.o get_model.o lateral_heterogeneities.o meshes_io.o nc_helpers.o nc_routines.o nc_snapshots.o parameters.o time_evol_wave.o analytic_mapping.o analytic_semi_mapping.o analytic_spheroid_mapping.o apply_masks.o attenuation.o clocks.o clocks_wrapper_solver.o data_comm.o data_heterogeneous.o data_io.o data_matr.o data_mesh.o data_pointwise.o data_proc.o data_source.o data_spec.o data_time.o def_grid.o def_precomp_terms.o get_mesh.o global_parameters.o interpolation.o kdtree2.o list.o main.o pointwise_derivatives.o rotations.o seismograms.o source.o stiffness_di.o stiffness_fluid.o stiffness_mono.o stiffness_quad.o subpar_mapping.o unrolled_loops.o utlity.o wavefields_io.o ftz.o pthread.o -L /opt/netcdf-4.4.1.1/lib -lnetcdff -Wl,-rpath,/opt/netcdf-4.4.1.1/lib
Receiver file type: stations
Source file: inparam_source, Receiver file: STATIONS
source names: ./
source components: mrr
Create the run directory PREM_mrr_50s_gauss_1800s
copying make_axisem.macros from ../

Setting up simulation ./
creating ./Info
copying crucial files for the simulation...
preparing job on 2 nodes...
[1] 18157
Job running in directory ./
cp: /Users/wardah/Software/axisem-9f0be2f/SOLVER/UTILS/xpost_processing: No such file or directory
cp: /Users/wardah/Software/axisem-9f0be2f/SOLVER/UTILS/xfield_transform: No such file or directory
To convolve and sum seismograms, run ./post_processing.csh after the simulations in:
/Users/wardah/Software/axisem-9f0be2f/SOLVER/PREM_mrr_50s_gauss_1800s
.... the post-processing input file param_post_processing is generated in the solver
.... based on guesses. Edit please.
~ ~ ~ ~ ~ ~ ~ h a n g o n & l o o s e ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~

problem with horizontal components for stations exactly at the poles

Obviously, coordinate transforms are not well defined for stations exactly at the north or south pole when using NEZ components. But E and N should then at least be some kind of rotated version of R and T, and they are not: amplitudes are an order of magnitude too small, while the waveforms are OK. This is the case both in comparison to stations close to the pole for sources at arbitrary places and in comparison to the new rdbm mode.

I used receivers.dat in these tests.

I would guess this is somewhere in the coordinate transforms in postprocessing.F90.

Program received signal SIGSEGV: Segmentation fault - invalid memory reference.

Please help me figure out this problem (Ubuntu 18.04 system). AxiSEM gets to about 49% of the calculation and then a segmentation fault appears (every time). Thanks in advance.

.......
time step: 5500; t= 787.69 s ( 43.8%)
time step: 5600; t= 802.01 s ( 44.6%)
time step: 5700; t= 816.33 s ( 45.3%)
time step: 5800; t= 830.65 s ( 46.1%)
time step: 5900; t= 844.97 s ( 46.9%)
time step: 6000; t= 859.29 s ( 47.7%)
time step: 6100; t= 873.62 s ( 48.5%)
time step: 6200; t= 887.94 s ( 49.3%)

Program received signal SIGSEGV: Segmentation fault - invalid memory reference.

Backtrace for this error:
#0 0x7f58064312da in ???
#1 0x7f5806430503 in ???
#2 0x7f580588ef1f in ???
#3 0x7f5805c49bd8 in ???
#4 0x563eaaad94e1 in c_wait_for_io
at /home/dell/software/axisem-master/SOLVER/pthread.c:55
#5 0x563eaa775cf5 in __nc_routines_MOD_nc_rec_checkpoint
at /home/dell/software/axisem-master/SOLVER/nc_routines.F90:599
#6 0x563eaa7bc448 in dump_stuff
at /home/dell/software/axisem-master/SOLVER/time_evol_wave.F90:1144
#7 0x563eaa808070 in sf_time_loop_newmark
at /home/dell/software/axisem-master/SOLVER/time_evol_wave.F90:498
#8 0x563eaa8081d6 in __time_evol_wave_MOD_time_loop
at /home/dell/software/axisem-master/SOLVER/time_evol_wave.F90:238
#9 0x563eaa92d991 in axisem
at /home/dell/software/axisem-master/SOLVER/main.f90:92
#10 0x563eaa92dbcc in main
at /home/dell/software/axisem-master/SOLVER/main.f90:25

mpirun noticed that process rank 1 with PID 0 on node dell-Precision-7920-Tower exited on signal 11 (Segmentation fault).

ld: library not found for -lnetcdf collect2: error: ld returned 1 exit status

I am very new to programming. I am trying to install AxiSEM, and its makefile throws the following error:

ld: library not found for -lnetcdf
collect2: error: ld returned 1 exit status

The part of the makefile where the error originates is the following:
ifeq ($(strip $(USE_NETCDF)),true)
    FFLAGS += -Dunc
    ifdef NETCDF_PATH
        LIBS = -L $(strip $(NETCDF_PATH))/lib -lnetcdff -Wl,-rpath,$(strip $(NETCDF_PATH))/lib
        INCLUDE = -I $(strip $(NETCDF_PATH))/include
    else
        LIBS = -lnetcdff
        INCLUDE = -I /usr/include
    endif
else
    LIBS =
    INCLUDE =
endif

Thanks in advance if anyone can help me.

velocity seismograms output broken?

Reported by Wangxin:

When we use post-processing to generate velocity output, we change the parameter "SEISTYPE disp" to "SEISTYPE velo" in param_post_processing. With "SEISTYPE velo" something goes wrong, for example: "forrtl: severe (24): end-of-file during read, unit 60, ... file .../Data/ABU_FNet_velo.dat". I checked the source code post_processing.F90: at line 338 it loads seismograms from the run directories, and with "SEISTYPE velo" it tries to read "/Data/*_velo.dat". However, after running "./submit.csh IASP91_exp20s", only the *_disp.dat files are generated in ./IASP91_exp20s/Data.
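
Until the *_velo.dat output is fixed, the usual workaround is to differentiate the displacement traces numerically; the sketch below is a generic finite-difference example, not the AxiSEM post-processing code:

program disp_to_velo
  ! Differentiate a displacement trace u(t) with central differences to
  ! obtain velocity, as one would do with the *_disp.dat output.
  implicit none
  integer, parameter :: n = 1000
  real(kind=8) :: u(n), v(n), dt
  integer :: i
  dt = 0.05d0
  u = [(sin(0.2d0 * dt * i), i = 1, n)]        ! dummy displacement trace
  v(1) = (u(2) - u(1)) / dt                    ! one-sided at the ends
  v(n) = (u(n) - u(n-1)) / dt
  do i = 2, n - 1
     v(i) = (u(i+1) - u(i-1)) / (2.0d0 * dt)   ! central difference
  end do
  print *, 'max velocity:', maxval(abs(v))
end program disp_to_velo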

parallel IO broken on CSCS machines

I assume this is related to the "fix" in a8331a0, which made it run on SuperMUC, but causes freezes on CSCS.

I will start working on this on a separate branch now; this issue is meant to keep track of and communicate the changes, and to ensure the code works on both machines.

create a circular heterogeneity with inparam_hetero?

I would like to run AxiSEM simulations with a circular heterogeneity (Vp, Vs and rho) below the surface and at some distance from the source. Can this be done with the inparam_hetero file? The documentation is currently limited to some inline comments which I am not sure I fully understand.

Fix problem or raise error when storing wavefields across processor boundaries

This relates to issue #29. If this is not fixed before the next release, can you please raise an error if someone attempts to calculate a database where this is problematic? I think this can be done at the very beginning of the meshing stage, after parsing the input files.

As the usage of Instaseis will (I hope) increase, more and more people will want to calculate their own databases, and we really don't want them to calculate problematic ones.

orte_create_session_dirs is set to false.

When I run 'tail -f iasp91_mrr_50s_gauss_1800s_SC_201103131137258_6.1' in step 12, the problem below occurred.

orte_create_session_dirs is set to false. In this case, the run-time cannot
detect that the abort call was an abnormal termination. Hence, the only
error message you will receive is this one.

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).

You can avoid this message by specifying -quiet on the mpirun command

After adding '-quiet' to mpirun and rebuilding the project, this problem is still there. How do I fix it? Any suggestions? Thanks!

Parallel builds don't work.

Building in parallel does not seem to work:

$ cmake .
-- The C compiler identification is GNU 4.8.2
-- The CXX compiler identification is GNU 4.8.2
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- The Fortran compiler identification is GNU
-- Check for working Fortran compiler: /usr/bin/gfortran
-- Check for working Fortran compiler: /usr/bin/gfortran  -- works
-- Detecting Fortran compiler ABI info
-- Detecting Fortran compiler ABI info - done
-- Checking whether /usr/bin/gfortran supports Fortran 90
-- Checking whether /usr/bin/gfortran supports Fortran 90 -- yes
-- Found MPI_C: /usr/lib64/openmpi/lib/libmpi.so  
-- Found MPI_CXX: /usr/lib64/openmpi/lib/libmpi_cxx.so;/usr/lib64/openmpi/lib/libmpi.so  
-- Found MPI_Fortran: /usr/lib64/openmpi/lib/libmpi_usempi.so;/usr/lib64/openmpi/lib/libmpi_mpifh.so;/usr/lib64/openmpi/lib/libmpi.so  
-- Try OpenMP Fortran flag = [/openmp]
-- Try OpenMP Fortran flag = [/Qopenmp]
-- Try OpenMP Fortran flag = [-openmp]
-- Try OpenMP Fortran flag = [-fopenmp]
-- Found OpenMP_Fortran: -fopenmp  
-- Failed to find NetCDF interface for F90
-- Could NOT find NetCDF (missing:  NetCDF_has_interfaces) 
-- Configuring done
-- Generating done
-- Build files have been written to: /home/elliott/code/axisem
$ make -j
Scanning dependencies of target xmesh
Scanning dependencies of target axisem
[  1%] [  2%] [  3%] Building Fortran object CMakeFiles/xmesh.dir/MESHER/data_diag.f90.o
Building Fortran object CMakeFiles/xmesh.dir/MESHER/global_parameters.f90.o
[  5%] [  6%] Building Fortran object CMakeFiles/xmesh.dir/MESHER/clocks.f90.o
Building Fortran object CMakeFiles/xmesh.dir/MESHER/data_numbering.f90.o
Building Fortran object CMakeFiles/xmesh.dir/MESHER/data_coarse.f90.o
[  7%] [  8%] [ 10%] [ 11%] Building Fortran object CMakeFiles/axisem.dir/SOLVER/data_proc.f90.o
Building Fortran object CMakeFiles/axisem.dir/SOLVER/clocks.f90.o
Building Fortran object CMakeFiles/axisem.dir/SOLVER/global_parameters.f90.o
[ 14%] [ 15%] [ 12%] Building Fortran object CMakeFiles/axisem.dir/SOLVER/kdtree2.f90.o
[ 16%] Building Fortran object CMakeFiles/xmesh.dir/MESHER/data_grid.f90.o
Building Fortran object CMakeFiles/xmesh.dir/MESHER/data_gllmesh.f90.o
[ 17%] [ 19%] [ 20%] Building Fortran object CMakeFiles/xmesh.dir/MESHER/data_bkgrdmodel.f90.o
Building Fortran object CMakeFiles/xmesh.dir/MESHER/interpolation.f90.o
Building Fortran object CMakeFiles/xmesh.dir/MESHER/data_pdb.f90.o
Building Fortran object CMakeFiles/xmesh.dir/MESHER/data_spec.f90.o
[ 21%] [ 23%] Building Fortran object CMakeFiles/xmesh.dir/MESHER/data_mesh.f90.o
[ 25%] [ 24%] Building Fortran object CMakeFiles/xmesh.dir/MESHER/splib.f90.o
[ 26%] Building Fortran object CMakeFiles/xmesh.dir/MESHER/data_time.f90.o
Building Fortran object CMakeFiles/xmesh.dir/MESHER/analytic_semi_mapping.f90.o
Building Fortran object CMakeFiles/xmesh.dir/MESHER/analytic_spheroid_mapping.f90.o
Building Fortran object CMakeFiles/xmesh.dir/MESHER/subpar_mapping.f90.o
[ 28%] [ 29%] [ 30%] Building Fortran object CMakeFiles/axisem.dir/SOLVER/data_io.f90.o
Building Fortran object CMakeFiles/axisem.dir/SOLVER/apply_masks.f90.o
[ 32%] [ 33%] [ 34%] [ 35%] Building Fortran object CMakeFiles/axisem.dir/SOLVER/list.f90.o
[ 37%] Building Fortran object CMakeFiles/xmesh.dir/MESHER/mesh_info.f90.o
Building Fortran object CMakeFiles/axisem.dir/SOLVER/data_pointwise.f90.o
[ 39%] [ 38%] Building Fortran object CMakeFiles/axisem.dir/SOLVER/data_time.f90.o
Building Fortran object CMakeFiles/axisem.dir/SOLVER/analytic_semi_mapping.f90.o
Building Fortran object CMakeFiles/axisem.dir/SOLVER/data_spec.f90.o
[ 41%] Building Fortran object CMakeFiles/xmesh.dir/MESHER/sorting.f90.o
[ 43%] Building Fortran object CMakeFiles/axisem.dir/SOLVER/subpar_mapping.f90.o
[ 44%] [ 43%] [ 46%] Building Fortran object CMakeFiles/axisem.dir/SOLVER/interpolation.f90.o
[ 47%] Building Fortran object CMakeFiles/axisem.dir/SOLVER/splib.f90.o
Building Fortran object CMakeFiles/xmesh.dir/MESHER/background_models.F90.o
Building Fortran object CMakeFiles/axisem.dir/SOLVER/data_matr.f90.o
Building Fortran object CMakeFiles/axisem.dir/SOLVER/data_source.f90.o
Building Fortran object CMakeFiles/axisem.dir/SOLVER/data_heterogeneous.f90.o
/home/elliott/code/axisem/MESHER/sorting.f90:284.10:

  iclock01 = tick()
          1
Error: Symbol 'iclock01' at (1) has no IMPLICIT type
/home/elliott/code/axisem/MESHER/sorting.f90:288.10:

  iclock05 = tick()
          1
Error: Symbol 'iclock05' at (1) has no IMPLICIT type
/home/elliott/code/axisem/MESHER/sorting.f90:286.28:

  iclock01 = tick(id=idold01, since=iclock01)
                            1
Error: Symbol 'idold01' at (1) has no IMPLICIT type
/home/elliott/code/axisem/MESHER/sorting.f90:333.28:

  iclock05 = tick(id=idold05, since=iclock05)
                            1
Error: Symbol 'idold05' at (1) has no IMPLICIT type
make[3]: *** [CMakeFiles/xmesh.dir/MESHER/sorting.f90.o] Error 1
make[2]: *** [CMakeFiles/xmesh.dir/MESHER/sorting.f90.o.provides] Error 2
make[2]: *** Waiting for unfinished jobs....

What I don't understand is that there are no complaints about missing headers or modules, so I don't know why those variables are undefined...
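
The error itself just means that sorting.f90 is being compiled with IMPLICIT NONE while the timer variables used in the tick() calls are never declared in that build; a sketch of the pattern (variable names taken from the log, the stub module is invented here and the real clocks module differs):

module clocks_stub
  implicit none
contains
  integer function tick(id, since)
    ! Stand-in for the real clocks module; only the call pattern matters here.
    integer, intent(in), optional :: id, since
    call system_clock(tick)
  end function tick
end module clocks_stub

program timing_sketch
  use clocks_stub
  implicit none
  integer :: iclock01, idold01   ! these are the declarations the compiler says are missing
  idold01 = 0
  iclock01 = tick()
  iclock01 = tick(id=idold01, since=iclock01)
  print *, 'ticks:', iclock01
end program timing_sketch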

memory in fieldtransform

The optional argument for memory only works for the displacement fields, but the surface fields can still be much larger.

rename KERNEL_* fields in inparam_advanced

Given that most AxiSEM users outside the developer group use AxiSEM to compute Instaseis databases and have not heard about kernels, we might want to rename these fields. What about INSTASEIS_* instead, as all kernel users know Instaseis, but not the other way around?

Problem with heterogeneities in inparam_hetero

Hello,

I would like to add a cylinder-shaped heterogeneity in the outer core using the 'inparam_hetero' file, but I am not sure if I am doing things properly. My 'inparam_hetero' file looks like this:

1
.true.
discr
model_TangCyl_Full_v10.0-10.0_grad02.00
iso # anisotropy: iso, radial, hex
rel # relative perturbations or absolute velocities: rel, abs
101 # p value for inverse distance weighting (if > 100, only nearest neighbour interpolation: FAST)
0 # maximum radius of inverse distance weighting (0 = infinite) [km]

Where the file 'model_TangCyl_Full_v10.0-10.0_grad02.00' starts with:
3439
1217.5 -90.00 10.00 10.00 10.00
1217.5 -89.00 10.00 10.00 10.00
1217.5 -88.00 10.00 10.00 10.00
1217.5 -87.00 10.00 10.00 10.00
1217.5 -86.00 10.00 10.00 10.00
1217.5 -85.00 10.00 10.00 10.00
1217.5 -84.00 10.00 10.00 10.00
1217.5 -83.00 10.00 10.00 10.00
1217.5 -82.00 10.00 10.00 10.00
...

If I make a simple figure with the latitude-radius points of this file, it looks like this (and it is exactly what I want):

[figure: latitude-radius points of model_TangCyl_Full_v10.0-10.0_grad02.00]

However, when I run AxiSEM (with a full moment tensor as the source, the source at latitude = 90 degrees and longitude = 0 degrees, and a grid with 2 s dominant period decomposed into 32x5 slices in latitude and radius), the output VTK file model_vpv_*.vtk is:

[screenshot: model_vpv_*.vtk]

and model_vpv_gll_het*.vtk:

[screenshot: model_vpv_gll_het*.vtk]

which does not look at all like my cylinder... I really do not know what I did wrong. Would you have any clue?

Thanks in advance for your time.
Regards,
Joanne Adam

Remove Numerical Recipes code

There are at least four instances with explicit references:

$ grep -IRi 'numerical recipe'
SOLVER/unrolled_loops.f90:  ! outer product (dyadic) from numerical recipes
SOLVER/unrolled_loops.f90:  ! outer product (dyadic) from numerical recipes
SOLVER/source.f90:!> Calculates the error function, coefficients taken from Numerical Recipes
MESHER/numbering.f90:  ! Use Heap Sort (Numerical Recipes)

The license of Numerical Recipes does not allow for redistribution in source code form and is incompatible with the GPL license used by AxiSEM.
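
At least the source.f90 instance looks easy to replace, since Fortran 2008 provides ERF as an intrinsic (supported by current gfortran and ifort); a minimal sketch, not the actual fix chosen by the developers:

program erf_demo
  ! The intrinsic error function removes any need for the Numerical
  ! Recipes coefficients in the source time function code.
  implicit none
  real(kind=8), parameter :: x = 0.75d0
  print *, 'erf(', x, ') = ', erf(x)
end program erf_demo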

netcdf runtime and postprocessing errors

Hi all,

Attached is a benchmark of AxiSEM synthetics against FK synthetics, using the "scak" model from the University of Alaska, Fairbanks. The fits look pretty good!

Getting this result took a lot of debugging. Since AxiSEM is such a useful package for us, if anyone has any advice regarding the following, it'd be really helpful to know.

Thanks,
Ryan

  • on our system, instaseis was unable to read netcdf files processed using the field_transform.sh script

  • instaseis successfully read netcdf files processed by the field_transform.py script. However, despite trying a variety of block_sizes and cache_sizes, the processing was extremely slow (<< 1 MB/s), which led to processing times of several days for a 2 s dominant period simulation

  • in short simulations, the AxiSEM solver and mesher ran reliably, but in longer runs or runs with nproc > 4, fatal netcdf runtime errors were a frequent occurrence

  • when LOCAL_MAX_COLAT in MESHER/inparam_mesh was uncommented, instaseis was unable to read the resulting netcdf output files

  • to compile AxiSEM in debugging mode on our cluster, we had to add -pthread to the gfortran compiler flags in make_axisem.macros

AxiSEM_vs_FK_3s.pdf

Problem compiling

Hi,
I'm having a problem compiling the solver.

Running:
make SOLVER/axisem
gives me the following error:

mpif90 -g -fbacktrace -fbounds-check -frange-check -pedantic -Dinclude_mpi -Dsolver -c commpi.F90
mpif-sizeof.h:66.41:
Included at mpif.h:63:
Included at commpi.F90:48:

  CHARACTER, DIMENSION(1,1,1,1,1,1,1,*)::x
                                     1

Error: Array specification at (1) has more than 7 dimensions
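
For what it's worth, a likely cause (an assumption, not confirmed in this thread) is the standard/pedantic setting in the debug flags: Open MPI's mpif-sizeof.h declares assumed-size dummies with up to 15 dimensions, which Fortran 2008 allows but Fortran 95/2003 limit to 7. A minimal reproducer:

! Compiling this with "gfortran -std=f95 -c high_rank.f90" produces the same
! "Array specification at (1) has more than 7 dimensions" error; relaxing the
! standard (or dropping -pedantic in make_axisem.macros) is a plausible workaround.
subroutine takes_high_rank(x)
  implicit none
  character, dimension(1,1,1,1,1,1,1,*) :: x   ! rank 8: legal in Fortran 2008, rejected under F95
end subroutine takes_high_rank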

I'm using these versions:
mpiexec --version
mpiexec (OpenRTE) 4.1.1

/usr/local/bin/mpirun -> ../Cellar/open-mpi/4.1.1_2/bin/mpirun

gcc --version
gcc (Homebrew GCC 11.2.0_1) 11.2.0
Copyright (C) 2021 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

netcdf library version 4.8.1 of Oct 19 2021 02:57:31
nc-config --version
netCDF 4.8.1

gcc, netcdf and open-mpi installed with Homebrew.

I'm on Mac Catalina 10.15.7.

Thanks in advance for any help,
Ceri

input file for force sources needed

Current status:

Single sources are read from inparam_source, moment tensors from CMTSOLUTION. In the case of a moment tensor, submit.csh generates inparam_source for each run using the parameters from CMTSOLUTION.

Force sources are for now hardcoded to the north pole and the surface, which is fine for adjoint wavefields, but not for general force sources.

Problem:

There is no equivalent of CMTSOLUTION for forces. If we generalize inparam_source to include general forces, the parameter SIMULATION_TYPE in inparam_basic could conflict with the definition of the source. On the other hand, SIMULATION_TYPE controls which source file is read at all, so it would become somewhat obsolete and we could put this functionality into inparam_source. This, however, would not be backward compatible.

Large transformation times for databases

I am trying to transform the AxiSEM-generated databases for faster access in iterative inversion processes, as suggested by the developers. My AxiSEM jobs are on our cluster, and I will need the database on the cluster for rapid access for millions of different sources (hypocenter + moment tensor). I have run both global-scale domains and localized (in depth and laterally) domains. See the table below for details of some test run times. For me, the transformation of the database is the bottleneck for our science applications.

The issue I face is that the transformation of the generated database takes many days, whereas the job to generate the database takes hours (depending on the number of cores). I believe the transformation of the database is a single-core job and hence slow (and will depend on the machine). Please let me know if you have advice. Partly I also wanted to share these tests with others.

[table: run times of the test cases]

Feature Request: Store 1D model in netCDF Files

We already talked about this and it would be very useful to store a couple more things in the netCDF files:

  • 1D_MODEL: The 1D model used for the calculation of the netcdf files for all parameters and at all GLL points in depth.
  • IS_1D: Flag to indicate whether or not the database did actually use a 1D model or an arbitrary axisymmetric one.
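
A rough sketch of how such metadata could be written with the netcdf-fortran API (variable and attribute names here are illustrative only, not the ones eventually used; error checking omitted for brevity):

program store_1d_model
  use netcdf
  implicit none
  integer :: ncid, dimid, varid, status
  real(kind=8) :: vp_1d(3)
  vp_1d = [5800.0d0, 6800.0d0, 8110.0d0]                       ! dummy 1D model values
  status = nf90_create('db_meta.nc', NF90_CLOBBER, ncid)
  status = nf90_def_dim(ncid, 'depth_gll', size(vp_1d), dimid) ! one value per GLL depth
  status = nf90_def_var(ncid, 'vp_1D_MODEL', NF90_DOUBLE, dimid, varid)
  status = nf90_put_att(ncid, NF90_GLOBAL, 'IS_1D', 1)         ! flag: database built from a 1D model
  status = nf90_enddef(ncid)
  status = nf90_put_var(ncid, varid, vp_1d)
  status = nf90_close(ncid)
end program store_1d_model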

Fails to simulate an acoustic model

I replaced all the vs values with zeros in the <model>_axisem.bm file output by AxiSEM in MESHER/Diags/ to make an acoustic Earth model, and then assigned this file as the external model in inparam_mesh.

However, the SOLVER exited due to errors related to the source location. Below is the OUTPUT_<RUNNAME> file on my PC:



    Initialized run for nproc =    2

 MAIN: Welcome to AxiSEM!

     Simulation started on 04/01/2017 at 20h 42min

 MAIN: Reading parameters..................................

        ///////////////////////////////////////////////////////////////
        //                                                           //
        //                  A   x   i   S   E   M                    //
        //                                                           //
        //                                                           //
        //         Parallel spectral-element solution to             //
        //                                                           //
        //           3-D seismic wave propagation for                //
        //                                                           //
        //          axially symmetric background models              //
        //                                                           //
        //               in a spherical 2-D domain                   //
        //                                                           //
        //                                                           //
        //  Authors : Tarje Nissen-Meyer (Oxford University)         //
        //              Martin van Driel (ETH Zurich)                //
        //                 Simon Stähler (LMU Munich)                //
        //                Kasra Hosseini (LMU Munich)                //
        //               Stefanie Hempel (University of Muenster)    //
        //            Alexandre Fournier (IPG Paris)                 //
        //                   Tony Dahlen (Princeton University)      //
        //                                                           //
        //   Contact:     [email protected]                           //
        //   Information: www.axisem.info                            //
        //                                                           //

        //                                                           //
        //     If you are publishing results obtained with this      //
        //          code, please cite this paper:                    //
        //                                                           //
        // (1) T. Nissen-Meyer, M. van Driel, S. C. Staehler,        //
        //     K. Hosseini, S. Hempel, L. Auer, A. Colombi           //
        //     and A. Fournier:                                      //
        //     "AxiSEM: broadband 3-D seismic wavefields in          //
        //              axisymmetric media"                          //
        //     Solid Earth, 5, 425-445, 2014                         //
        //     doi:10.5194/se-5-425-2014                             //
        //                                                           //
        //       Comprehensive description of the underlying         //
        //           numerical analysis can be found in:             //
        //                                                           //
        // (2) Tarje Nissen-Meyer, F. A. Dahlen, A Fournier (2007)   //
        //     "Spherical-earth Frechet sensitivity kernels"         //
        //     Geophysical Journal International 168(3),1051-1066.   //
        //     doi:10.1111/j.1365-246X.2006.03123.x                  //
        //                                                           //
        // (3) Tarje Nissen-Meyer, A Fournier, F. A. Dahlen (2007)   //
        //     "A two-dimensional spectral-element method for        //
        //        computing spherical-earth seismograms -            //
        //        I. Moment-tensor source"                           //
        //     Geophysical Journal International 168(3), 1067-1092.  //
        //     doi:10.1111/j.1365-246X.2006.03121.x                  //
        //                                                           //
        // (4) Tarje Nissen-Meyer, A Fournier, F. A. Dahlen (2007)   //
        //     "A two-dimensional spectral-element method for        //
        //        computing spherical-earth seismograms -            //
        //        II.  Waves in solid-fluid media"                   //
        //     Geophysical Journal International 174(3), 873-888.    //
        //     doi:10.1111/j.1365-246X.2008.03813.x                  //
        //                                                           //
        // (5) Martin van Driel and Tarje Nissen-Meyer (2014)        //
        //     "Seismic wave propagation in fully anisotropic        //
        //        axisymmetric media"                                //
        //      Geophysical Journal International 199 (2): 880-893.  //
        //      doi: 10.1093/gji/ggu269                              //
        //                                                           //
        // (6) Martin van Driel and Tarje Nissen-Meyer (2014)        //
        //     "Optimized visco-elastic wave propagation for         //
        //        weakly dissipative media"                          //
        //      Geophysical Journal International 199 (2): 1078-1093.//
        //      doi: 10.1093/gji/ggu314                              //
        //                                                           //
        //                                                           //
        //  April 2016: version 1.3                                  //
        //                                                           //
        ///////////////////////////////////////////////////////////////

        =============  I N P U T    P A R A M E T E R S ===============
            Data I/O path:                      ./Data              
            Info I/O path:                      ./Info              
            Simulation length [s]:               3600.000
            Enforced time step [s]:               0.000
            Enforced source period [s]:           0.000
            Simulation type:                    single
            Receiver file type:                 colatlon
            Sum seismograms?                     F
            Sum wavefields?                      F
            Time extrapolation scheme:          symplec4
            Seismogram sampling rate [s]:         0.000
            Dump kin./pot. energy?               F
            Dump global snaps?                   F
            Dump strain?                         F
            Wavefield dumping type:             displ_only  
            First GLL to save in strains:        1
            Last GLL to save in strains:         3
            First GLL to save in strains:        1
            Last GLL to save in strains:         3
            Samples per period for strains:       4.000
            Source dumping type:                igno
            Add heterogeneous region?            F
            Perform extensive mesh tests?        F
            Output format (seism., wavefields): netcdf
        ===============================================================
 MAIN: Reading mesh database...............................

   General numerical input/output parameters================
     grid pts/wavelngth =   1.5000000000000000     
     source period [s]  =   50.000000000000000     
     courant number     =  0.60000002384185791     
     time step [s]      =  0.14321715989718831     
 Checking value order in external model
  Value ' rad'  is in column  1
  Value ' vpv'  is in column  3
  Value ' vsv'  is in column  4
  Value ' rho'  is in column  2
  Value ' qka'  is in column  5
  Value ' qmu'  is in column  6
 Model in file ./external_model.bm has   160 layers and is isotropic and anelastic...
 Depths/radii are assumed to be defined in meters
 Layers in file ./external_model.bm start at surface
 Checking for discontinuities in the external velocity model
  1st order disc. at radius   6356000.0, layer:     2
  1st order disc. at radius   6346600.0, layer:     4
  1st order disc. at radius   6291000.0, layer:    10
  1st order disc. at radius   6151000.0, layer:    16
  1st order disc. at radius   5971000.0, layer:    22
  1st order disc. at radius   5771000.0, layer:    28
  1st order disc. at radius   5701000.0, layer:    34
  1st order disc. at radius   5600000.0, layer:    40
  1st order disc. at radius   3630000.0, layer:    81
  1st order disc. at radius   3480000.0, layer:    87
  1st order disc. at radius   1221500.0, layer:   134
 External model has  12 discontinuities
 
 Creating interpolation objects
    idom, upper_layer, lower_layer,      r(ul),      r(ll)
       1            1            2   6371000.0   6356000.0
       2            3            4   6356000.0   6346600.0
       3            5           10   6346600.0   6291000.0
       4           11           16   6291000.0   6151000.0
       5           17           22   6151000.0   5971000.0
       6           23           28   5971000.0   5771000.0
       7           29           34   5771000.0   5701000.0
       8           35           40   5701000.0   5600000.0
       9           41           81   5600000.0   3630000.0
      10           82           87   3630000.0   3480000.0
      11           88          134   3480000.0   1221500.0
      12          135          160   1221500.0         0.0
 

   Background model=========================================
     bkgrdmodel = external
     radius [m] =    6371000.0000000000     
     have_fluid =  T

   Min/max grid spacing=====================================
     hmin (global) [m]   :    1623.1277476708324     
     hmax (global) [m]   :    92925.020380043366     
     min_distance_dim [m]:    162.31277476708325     

   Axialogy=================================================

     Global total axial elements:          84
     Global solid axial elements:           0
     Global fluid axial elements:          84


 MAIN: Initializing grid...................................
 MAIN: Starting wave preparation...........................
 
 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
 ++++++++    SEISMIC WAVE PROPAGATION: SOLID-FLUID CASE  ++++++++
 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
 
   *****************GIVEN SOURCE PARAMETERS*****************
      Magnitude [Nm]:             1.000E+20
      Excitation type:        monopole    mrr       
      Depth [km]:                 0.000E+00
      Colat. [deg]:               0.000E+00
      Long. [deg]:                0.000E+00
      Source time function:       gauss_0
      Dom. period mesh [s]:         50.0000
   *********************************************************
 
    construction of mapping for kwf output...
    ...solid part...
    ...fluid part...
 local point number:               38400
 after removing duplicates:        24937
 compression:                 0.649401069    
    .... constructing midpoint grid for kwf output
    .... constructing finite element grid for kwf output
    .... constructing spectral element grid for kwf output
    .... finished construction of mapping for kwf output

     Using period of the mesh:   50.000000000000000     

     desired simulation length  :  3600.00   seconds
     offered simulation length  :  3600.05   seconds
     number time loop iterations:   16758

     desired seismogram sampling:     0.00   seconds
     offered seismogram sampling:     0.21   seconds
     ...that is, every          :       1 timesteps
     number of samples          :   16759

     Number of snapshots        :      61
     ...approximately every     :   60.00   seconds
     ...that is, every          :     279 timesteps
 
   *-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-
   SOURCE TIME FUNCTION: gauss_0
    coarsest dump every           1 th time step, dt:  0.21482573984578246     
 
   SHIFT FACTOR of source time function [s]:   75.1890106    
    # SEM, seis, coarse points per shift factor:   350.000000       350.000000       350.000000    
    # simul. half widths per shift factor:   1.50378025    
   *-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-
 

  ::::::::: BACKGROUND MODEL & PRECOMPUTED MATRICES:::::::
 WARNING: boundary term not all that precise!
  Term should equal four for 2 boundaries (i.e. int (sin) )
  Actual numerical value:   0.0000000000000000     
  :::::::DONE BACKGROUND MODEL & PRECOMPUTED MATRICES:::::
  *****************************************
     Welcome to the source term calculator 
  *****************************************
  locating the source....
ERROR: Source should be on the axis, hence theta = 0, but:
ERROR: Source should be on the axis, hence theta = 0, but:
At line 494 of file source.f90 (unit = 6, file = 'stdout')
Fortran runtime error: Expected INTEGER for item 2 in formatted transfer, got CHARACTER
(a,/,i7,2i2,/,a,3e10.2)
     ^
At line 494 of file source.f90 (unit = 6, file = 'stdout')
Fortran runtime error: Expected INTEGER for item 2 in formatted transfer, got CHARACTER
(a,/,i7,2i2,/,a,3e10.2)
     ^
-------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code.. Per user-direction, the job has been aborted.
-------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:

  Process name: [[6601,1],0]
  Exit code:    2
--------------------------------------------------------------------------

On the cluster, the SOLVER also raises errors related to the source (the OUTPUT file reports that multiple processes contain the source).
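
Incidentally, the Fortran runtime error in the log is a secondary problem: the error print at source.f90 line 494 passes a CHARACTER value where its format expects an INTEGER, which masks the intended "source should be on the axis" message. A minimal illustration of that error class (not the actual source.f90 code):

program format_mismatch
  implicit none
  character(len=64) :: fmt
  fmt = '(a,/,i7,2i2,/,a,3e10.2)'
  ! The second output item must be an INTEGER according to the format, so the
  ! gfortran runtime aborts with
  ! "Expected INTEGER for item 2 in formatted transfer, got CHARACTER".
  write(6, fmt) 'ERROR: source not on axis', 'north', 0, 0, 'coords:', 1.0, 2.0, 3.0
end program format_mismatch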

heterogeneities from 'inparam_hetero' not taken into account for all latitudes

Reported by Joanne Adam via mail to axisem.info

I am using a mesh grid with dominant period 1 second that is decomposed into 32x5 slices in latitude and radius. I would like to add topography at the inner-core boundary and see how this affects the core phases. To add the topography around the inner-core boundary, I have created an 'inparam_hetero' file (see attached with other input files). The OUTPUT_MZZ file is too large to be sent via e-mail ; you can download it here if you wish: https://dl.ipgp.fr/ahp9d5

I am running this using the full moment tensor and have set the 'DIAGNOSTIC_FILE_OUTPUT' option to True to obtain the VTK files and check the topography. I am happy with the shape of the topography; however, I see in the VTK files that the topography is not present along all colatitudes. I have put heterogeneities between colatitude = 0 and 180 degrees and I expect to see topography at all latitudes, but the output VTK files only show topography at colatitudes greater than ~45 degrees. You can find a few VTK files of the output Earth model here: https://dl.ipgp.fr/pg2o
You can see the topography in the south, but it disappears in the north. I did not include all the VTK files because they are large. In the rest of the VTK files, you would see that the topography continues all the way to the South Pole and that there is no more topography at higher latitudes (see figure 'Inner-Core_Topography.png').

Why are the heterogeneities not taken into account everywhere? Or are they taken into account in the computation but not shown in the VTK files?

[figure: Inner-Core_Topography.png]
