%matplotlib inline

# This notebook will use this variable to determine which
# remote site to run on.
import os

SALVUS_FLOW_SITE_NAME = os.environ.get("SITE_NAME", "local")
PROJECT_DIR = "project"

import matplotlib.pyplot as plt
import numpy as np
import pathlib

import salvus.namespace as sn
from salvus.project.tools.processing import block_processing
from salvus.toolbox.helpers.wavefield_output import (
    WavefieldOutput,
    wavefield_output_to_xarray,
)
print("Opening existing project.")
p = sn.Project(path=PROJECT_DIR)Opening existing project.
We are now going to re-run simulation_3, but with volumetric output. In addition to displacement, let's also output the spatial gradient of the displacement field, so we can compute derived quantities like the divergence or curl afterwards.

p.simulations.launch(
    ranks_per_job=2,
    site_name=SALVUS_FLOW_SITE_NAME,
    events=p.events.list(),
    simulation_configuration="simulation_3",
    extra_output_configuration={
        "volume_data": {
            "sampling_interval_in_time_steps": 10,
            "fields": ["displacement", "gradient-of-displacement"],
        },
    },
    # We have previously simulated the same event but without
    # extra output. We thus have to overwrite the existing
    # simulation.
    delete_conflicting_previous_results=True,
)

[2025-04-15 19:27:02,836] INFO: delete_conflicting_previous_results is set to True. Deleting results for simulation configuration simulation_3 and event event_0.
[2025-04-15 19:27:02,841] INFO: Removing contents of `project/EVENTS/event_0/WAVEFORM_DATA/INTERNAL/55/11/abcfd69ba132`.
[2025-04-15 19:27:02,870] INFO: Submitting job ...
Uploading 1 files...
🚀 Submitted job_2504151927041980_410992afad@local
1
The simulation runs asynchronously. Before accessing the output, we wait until it has finished by querying the simulations as below.

p.simulations.query(block=True)

True
p.simulations.get_simulation_output_directory(
    simulation_configuration="simulation_3", event=p.events.list()[0]
)

PosixPath('project/EVENTS/event_0/WAVEFORM_DATA/INTERNAL/55/11/abcfd69ba132')

# Uncomment the next line and paste the path from above
# !ls _paste_the_path_above_here

volume_data_output_elastic_volume.xdmf is what we are after. It is this .xdmf file that should be opened in Paraview. When you open such a file, you should see a dialogue box to select the correct reader. Make sure to select the XDMF Reader and not one of the Xdmf3 readers, or Paraview will crash immediately :-/
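If you prefer to stay within Python instead of using the shell, the same directory can also be listed with pathlib. This is just a convenience sketch built from calls already shown above; it is not part of the original workflow.

# List the contents of the output directory to locate the .xdmf file.
output_dir = p.simulations.get_simulation_output_directory(
    simulation_configuration="simulation_3", event=p.events.list()[0]
)
for f in sorted(output_dir.iterdir()):
    print(f.name)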
Unless a time step is explicitly specified in the WaveformSimulationConfiguration, Salvus will automatically compute a suitable time step for the simulation based on the CFL condition. This will typically heavily oversample the signal. For post-processing, however, it might be convenient to work with resampled data.

ed = p.waveforms.get("simulation_3", p.events.list())

print(
    ed[0].get_receiver_data(
        receiver_name="XX.REC1.", receiver_field="displacement"
    )
)

2 Trace(s) in Stream:
XX.REC1..X | 1969-12-31T23:59:59.920000Z - 1970-01-01T00:00:00.600000Z | 1038.2 Hz, 707 samples
XX.REC1..Y | 1969-12-31T23:59:59.920000Z - 1970-01-01T00:00:00.600000Z | 1038.2 Hz, 707 samples
After calling set_temporal_interpolation, the EventData object will work with re-sampled data for all subsequent queries. For example, we can extract 250 samples at a sampling rate of 500 Hz starting from time zero as follows:

ed[0].set_temporal_interpolation(
    start_time_in_seconds=0.0, sampling_rate_in_hertz=500, npts=250
)

print(
    ed[0].get_receiver_data(
        receiver_name="XX.REC1.", receiver_field="displacement"
    )
)

ed[0].get_receiver_data(
    receiver_name="XX.REC1.", receiver_field="displacement"
).plot()

2 Trace(s) in Stream:
XX.REC1..X | 1970-01-01T00:00:00.000000Z - 1970-01-01T00:00:00.498000Z | 500.0 Hz, 250 samples
XX.REC1..Y | 1970-01-01T00:00:00.000000Z - 1970-01-01T00:00:00.498000Z | 500.0 Hz, 250 samples
<Figure size 800x500 with 2 Axes>
get_receiver_data returns an obspy stream object containing a list of traces. You can access individual traces by index, and the raw data through the member data, which is a numpy array. Here, we query the Y component at index 1.

trace_data = (
    ed[0]
    .get_receiver_data(
        receiver_name="XX.REC1.", receiver_field="displacement"
    )[1]
    .data
)
print(type(trace_data), trace_data.shape)

<class 'numpy.ndarray'> (250,)
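Because this is a standard ObsPy Stream, the usual ObsPy accessors apply. The following short sketch (plain ObsPy API, not specific to Salvus) loops over the traces and prints their identifiers and metadata.

st = ed[0].get_receiver_data(
    receiver_name="XX.REC1.", receiver_field="displacement"
)
for tr in st:
    # Each trace carries its metadata in `tr.stats` alongside the samples.
    print(tr.id, tr.stats.sampling_rate, tr.stats.npts, tr.data.dtype)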
Alternatively, you can use get_data_cube to obtain the data of all receivers at once. The function returns a tuple of the time axis as a 1D array and the receiver data as a 2D array (nrec x nt). There is also the option to sort the receivers in case a certain order is desired.

times, values = ed[0].get_data_cube(
    receiver_field="displacement", component="Y"
)

plt.plot(times, values[0, :], label="get_data_cube")
plt.plot(times, trace_data, "--", label="get_receiver_data")
plt.legend()
plt.show()
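As a quick sanity check of the shapes described above (an inspection snippet added here, not part of the original notebook):

# `times` holds one entry per sample, `values` one row per receiver;
# both reflect the 250-sample interpolation configured earlier.
print(times.shape, values.shape)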
To post-process the volumetric output, we first create a WavefieldOutput object.

# First, retrieve the volumetric output file
output_file = pathlib.Path(
    p.simulations.get_simulation_output_directory("simulation_3", "event_0"),
    "volume_data_output.h5",
)

# Create a WavefieldOutput object for resampling
volume_output = WavefieldOutput.from_file(
    output_file, "displacement", "volume"
)
wavefield_output_to_xarray returns an xarray.DataArray with dimensions for time, components, and the array of points provided. The second argument can be an arbitrary array of points. Here, we just use the coordinates of the points in the mesh.

sampled_output_1 = wavefield_output_to_xarray(
    volume_output, p.simulations.get_mesh("simulation_3").points
).T
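To see these dimensions explicitly, we can inspect the resulting DataArray with standard xarray attributes (a small added check, not part of the original notebook):

# Dimension names and sizes of the sampled wavefield output.
print(sampled_output_1.dims, sampled_output_1.shape)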
Alternatively, we can sample the wavefield on a regular grid in x and y with 10 m spacing.

sampled_output_2 = wavefield_output_to_xarray(
    volume_output,
    [
        np.linspace(0.0, 2000.0, 200),
        np.linspace(0.0, 1000.0, 100),
    ],
).T
Such a DataArray can be conveniently visualized with the plotting capabilities of xarray.

ax = sampled_output_2.sel(c=1, t=sampled_output_2.t[20]).plot(
    shading="gouraud", infer_intervals=False
)
ax.axes.set_aspect("equal")
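The same selection syntax works for any component or time step. As a small added variation on the cell above (only the c and t dimensions used there are assumed), this plots the other displacement component at the final stored time step:

ax = sampled_output_2.sel(c=0, t=sampled_output_2.t[-1]).plot(
    shading="gouraud", infer_intervals=False
)
ax.axes.set_aspect("equal")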
Finally, the time axis of the sampled output can be resampled as well, here onto 251 samples between 0.0 and 0.5 s:

resampled_output = block_processing.resample(
    sampled_output_1.data, sampled_output_1.t.data, np.linspace(0.0, 0.5, 251)
)

As a next step, you could add a new event event_1 with newly defined source(s) and receivers. To run on a different machine or with more parallelism, simply change the site_name and the ranks_per_job in the launch function and Salvus will take care of the rest.
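A minimal sketch of that exercise is shown below. It is an illustration under assumptions: the source and receiver classes follow the usual salvus.namespace simple_config API, the coordinates are made up for a roughly 2000 m x 1000 m domain, and the exact call to register the event (add_to_project) may differ between Salvus versions.

# Hypothetical new event with one vector source and one additional receiver.
src = sn.simple_config.source.cartesian.VectorPoint2D(
    x=1500.0, y=800.0, fx=0.0, fy=-1e3  # made-up location and force
)
rec = sn.simple_config.receiver.cartesian.Point2D(
    x=1800.0,
    y=900.0,
    network_code="XX",
    station_code="REC2",
    fields=["displacement"],
)
p.add_to_project(sn.Event(event_name="event_1", sources=[src], receivers=[rec]))

# Launch as before; only the site name and the number of ranks change when
# moving to a bigger machine.
p.simulations.launch(
    ranks_per_job=4,
    site_name=SALVUS_FLOW_SITE_NAME,
    events=p.events.list(),
    simulation_configuration="simulation_3",
)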