Data and Checkpoints#

There is no use in running a simulation if it doesn’t export anything. On the other hand, if everything produced were exported, the amount of data would be overwhelming; it probably wouldn’t even fit on a common hard drive.

The data field describes specifications for data export and some checks to perform on this data. Some of the available features are:

  • Divergence checking

  • Instantaneous field export

  • Statistics export

  • Probes for time series export

  • IBM nodes export

Divergence#

One common problem in numerical simulations is field divergence. Here the simulation is considered diverged when any value in the macroscopics fields is NaN (not a number). This check is configured through the data.divergence field.

simulations:
  - name: example
    data:
      # Interval and frequency to check for simulation divergence
      divergence: { end_step: 0, frequency: 50, start_step: 0 }
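
Conceptually, the check amounts to scanning the macroscopics fields for NaN values at the configured frequency. A minimal sketch (not Nassu’s actual implementation):

```python
import numpy as np

def has_diverged(macr_field: np.ndarray) -> bool:
    """Return True if any value in the macroscopic field is NaN."""
    return bool(np.isnan(macr_field).any())

rho = np.ones((64, 64, 64))      # a healthy density field
healthy = has_diverged(rho)      # False: all values are finite
rho[3, 7, 1] = np.nan            # one diverged cell is enough
diverged = has_diverged(rho)     # True
```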

Macroscopics#

The data available is mostly based on the simulation’s macroscopics fields. The list below describes the names of the fields and what they represent:

  • rho: Density

  • u: Velocity (ux, uy, uz)

  • S: Stress tensor (Sxx, Sxy, Sxz, Syy, Syz, Szz)

  • theta: Relative temperature (only for simulations with thermal model)

  • omega_LES: Value of total omega (only for simulations that use LES)

  • omega_mask: Mask value to use for omega (only for simulations that use wall model)

  • f_IBM: IBM force (only for simulations that use IBM) (f_IBMx, f_IBMy, f_IBMz)

  • sigma: Sigma value for HRRBGK (only for simulations that use HRRBGK)

All fields that require one or a list of macroscopics must use these names to refer to them. If a macroscopic doesn’t exist in the simulation, it won’t be exported or used.

Instantaneous#

It’s possible to configure the export of instantaneous macroscopics fields using the data.instantaneous field.

simulations:
  - name: example
    data:
      instantaneous:
        # Name of instantaneous export
        default:
          # Interval to export
          interval:
            start_step: 78
            end_step: 0
            frequency: 10
          # Volume to export.
          # Any block fully outside this volume is not exported
          volume_export:
            start: [32, 0, 0]
            end: [448, 128, 64]
            # is_abs defaults to true
            is_abs: true
          # Macroscopics to export
          macrs: ["rho", "Sxx", "Sxy"]
          # Rescale macroscopics for exports
          # result = macr * mul + cte
          macrs_rescale:
            # This must be the macroscopic name (ux, not u)
            ux:
              mul: 40
              cte: 0
          # Time step multiplier for rescaling time
          # Defaults to 1
          time_rescale: 1
          # Max level resolution to export
          # Defaults to -1, which exports up to the max level
          max_lvl: 3
        # Another export with only the required fields
        export_minimum:
          interval:
            frequency: 10
          macrs: ["u", "rho"]

Each instantaneous configuration saves a .vtm file that can be viewed using ParaView.
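
The macrs_rescale entry applies the linear transform stated in the comments above, result = macr * mul + cte, per macroscopic component at export time. A minimal sketch of that transform (the numbers mirror the ux example above):

```python
def rescale(value: float, mul: float, cte: float) -> float:
    """Linear rescale applied to an exported macroscopic value."""
    return value * mul + cte

# With ux: {mul: 40, cte: 0}, a lattice velocity of 0.05
# is exported as a physical velocity of 2.0
ux_exported = rescale(0.05, mul=40.0, cte=0.0)
```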

Note

Some arrays, such as f_IBM, are not present in all blocks. In the blocks where they don’t exist, the values are filled with NaN (not a number).

Statistics#

Nassu supports the runtime calculation of statistics over the domain, namely the mean (1st order) and the mean of the squared value (2nd order). The data.statistics field is used to configure it.

simulations:
  - name: example
    data:
      statistics:
        # Interval to calculate the statistics
        interval:
          start_step: 78
          end_step: 0
          frequency: 10
        # Specify the volume in which to calculate the statistics
        # Only the blocks inside this volume will have the statistics
        # Blocks outside will be exported as NaN values
        volume_export:
          start: [32, 0, 0]
          end: [448, 128, 64]
          # is_abs defaults to true
          is_abs: true
        # Macroscopics for first order statistics
        macrs_1st_order: [rho, u]
        # Macroscopics for second order statistics
        macrs_2nd_order: [rho, u]
        exports:
          default:
            interval:
              frequency: 5000
          export_area2:
            # Rescale macroscopics for statistics
            # Keep in mind that the transformation is linear and only for export
            # APPLY RESCALE IN STATISTICS ONLY IF YOU KNOW WHAT YOU'RE DOING
            # because 2nd order statistics get messed up
            macrs_rescale:
              # This must be the macroscopic name (e.g. ux, not u)
              rho:
                mul: 142.2
                cte: 742
            # Time step multiplier for rescaling time
            # Defaults to 1
            time_rescale: 42.24
            volume_export:
              start: [32, 0, 0]
              end: [448, 128, 64]
              is_abs: true
            interval:
              frequency: 50

Each statistics export configuration saves a .vtm file that can be viewed using ParaView.
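
Since the 1st-order statistic is the mean and the 2nd-order one is the mean of the squared value, the variance can be recovered from the two exports as E[x²] − E[x]². A sketch with synthetic data standing in for the exported fields:

```python
import numpy as np

rng = np.random.default_rng(0)
ux = rng.normal(loc=0.05, scale=0.01, size=100_000)  # synthetic velocity samples

mean_1st = float(ux.mean())          # what a 1st-order statistics export holds
mean_2nd = float((ux ** 2).mean())   # what a 2nd-order statistics export holds

# Variance and RMS fluctuation recovered from the two exports
variance = mean_2nd - mean_1st ** 2
rms_fluct = float(np.sqrt(variance))

# This is also why rescaling statistics is dangerous: applying
# result = macr * mul + cte to E[x^2] does not yield E[(x * mul + cte)^2]
```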

Monitors#

Sometimes it’s important to keep track of global statistics over time, to know whether the simulation is stabilizing or diverging, whether mass is being gained, whether velocity is out of control, or for multiple other use cases. For this, Nassu implements monitors to keep track of macroscopics statistics during the simulation.

simulations:
  - name: example
    data:
      monitors:
        fields:
          # Monitor name
          rho_max:
            # Macroscopics to monitor
            macrs: [rho, ux]
            # Rescale macroscopics for monitors
            macrs_rescale:
              # This must be the macroscopic name (ux, not u)
              ux:
                mul: 41
                cte: 0
            # Time step multiplier for rescaling time
            # Defaults to 1
            time_rescale: 1
            # Statistics to check, available ones are: min, max, mean, pos
            # pos exports the positions of the maximum and minimum values (if they are selected)
            stats: [max, pos]
            # Interval in which to monitor statistics
            interval: {start_step: 500, end_step: 10000, frequency: 50}
          # Another example of macroscopics statistics
          macrs_stats:
            macrs: [rho, uy, Sxy, Sxz]
            stats: [min, max, mean]
            interval: {start_step: 0, end_step: 0, frequency: 500}
            # It's possible to define a series of volumes to monitor only blocks in it.
            # If none is specified, it checks the full domain.
            # The checking is done based on blocks, so the volume may be a bit larger than specified
            volumes_monitor:
              - start: [10, 20, 0]
                end: [100, 100, 100]
                is_abs: true
              - start: [0.1, 0.1, 0.1]
                end: [0.9, 0.9, 0.9]
                is_abs: false
            # In case any volume should be ignored, it can be specified here. All blocks that are 
            # fully inside a volume to ignore are not used for monitoring.
            volumes_ignore:
              - start: [50, 60, 5]
                end: [70, 120, 10]
                is_abs: true

Each monitor exports a .csv file and plots the statistics over time for each macroscopic.
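
The statistics themselves are the usual reductions. A sketch of what min, max, mean and pos amount to, assuming NaN marks cells outside the monitored volumes (not Nassu’s actual implementation):

```python
import numpy as np

rho = np.random.default_rng(1).normal(1.0, 0.01, size=(32, 32, 32))
rho[0, 0, 0] = np.nan  # e.g. a cell in an ignored block

stats = {
    "min": float(np.nanmin(rho)),
    "max": float(np.nanmax(rho)),
    "mean": float(np.nanmean(rho)),
    # "pos": grid positions of the extreme values
    "pos_max": np.unravel_index(np.nanargmax(rho), rho.shape),
    "pos_min": np.unravel_index(np.nanargmin(rho), rho.shape),
}
```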

IBM Nodes#

IBM nodes have values attributed to them, such as the interpolated velocity, the position and the spread force. The data.export_IBM_nodes field is used to configure their export.

simulations:
  - name: example
    data:
      export_IBM_nodes:
        start_step: 500
        end_step: 10000
        frequency: 100

This exports one .csv for each body with all of the body’s node information in it.

Probes#

It’s very common to be interested in exporting a time series over a set of points. These points may form a line, a single point, the values over a body or any other arrangement of positions.

The data.probes.historic_series field is used when this type of export is required.

simulations:
  - name: example
    data:
      probes:
        historic_series:
          # Series to export
          series1:
            # Formats to export data
            # Defaults to: ["hdf"]
            formats_export: [hdf, csv]
            # Macroscopics to export
            macrs: [rho, u]
            # Rescale macroscopics for historic series output
            macrs_rescale:
              # This must be the macroscopic name (ux, not u)
              ux:
                mul: 41
                cte: 0
            # Time step multiplier for rescaling time
            # Defaults to 1
            time_rescale: 42.24
            # Interval of sampling 
            interval:
              start_step: 78
              end_step: 0
              frequency: 10
              # Level to use for time step frequency
              lvl: 4
            # The exported HDF file is divided into tables. This is the interval
            # used to group time steps into a table (every 500 steps a new table is generated)
            # Defaults to 1000
            interval_group: 500
            # Lines to use
            lines:
              # Line name
              line1:
                # Specification, with start, end and distance between points.
                start_pos: [200.46875, 79.4285, 2.905]
                end_pos: [200.46875, 80.5715, 2.905]
                dist: 0.28575
            # Single points to use
            points:
              # Point name
              point1:
                pos: [200.46875, 79.4285, 2.905]
            # Bodies to use
            bodies:
              # Name of export
              my_CAARC:
                # Body to use
                body_name: "CAARC"
                # Normal offset value
                normal_offset: 0.03125
                # Cell uses the triangles from the geometry
                # Vertex uses the vertices from the geometry
                element_type: "cell" # or "vertex"
              # Another export name
              my_surface:
                # It's also possible to export only from a given surface
                body_name: "CAARC.surface_name"
                normal_offset: -0.03125
                element_type: "cell" # or "vertex"
            # CSVs to use
            csvs:
              # Name of csv
              my_csv:
                # File has a header x, y, z and is separated by ,
                filename: "my_filename.csv"
              # Another csv name
              my_other_csv:
                filename: "another csv"
          # Another series configuration for export
          another_series:
            macrs: [u, S]
            interval:
              frequency: 10
              lvl: 4
            lines:
              line1:
                start_pos: [200.46875, 79.4285, 2.905]
                end_pos: [200.46875, 80.5715, 2.905]
                dist: 0.28575

Each points entity (such as a line or a body, under its name) exports a .csv with the actual points being exported and a .hdf with the macroscopics time series indexed by the points.

Important

The time series values are indexed in reference to the original points.

For example, if the original list of points is [(1, -1, 1), (1, 1, 1), (2, 2, 2), (2, -4, 2)] and only the points at indices 1 and 2 are inside the domain, the exported time series will only have point_idx values of 1 and 2, referencing the index in the original list of points.

This is particularly useful when exporting from a body, to relate the index with the .lnas list of vertices or triangles.
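
The indexing rule can be sketched as follows; the domain bounds here are hypothetical:

```python
points = [(1, -1, 1), (1, 1, 1), (2, 2, 2), (2, -4, 2)]

def inside_domain(p, lo=(0, 0, 0), hi=(10, 10, 10)):
    """Hypothetical axis-aligned domain test; Nassu uses the real domain."""
    return all(lo[i] <= p[i] <= hi[i] for i in range(3))

# Points outside the domain are dropped, but the exported point_idx keeps
# the index into the ORIGINAL list of points
exported = {idx: p for idx, p in enumerate(points) if inside_domain(p)}
# exported keys: 1 and 2
```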

Spectrum analysis#

There is one special case of probes that requires particular attention. For spectrum analysis of a time series, we are interested in analyzing the frequency content of the signal using a transform such as the Fourier transform. In this case it’s very important to obtain the values at the same time resolution as the numerical one, that is, to export the value at every time step evolution.

It’s also undesirable to perform any kind of spatial interpolation, because this may act as a low-pass filter that disturbs the numerical signal.

For this special use case, the data.probes.spectrum_analysis field is used.

simulations:
  - name: example
    data:
      probes:
        spectrum_analysis:
          # Macroscopics to export for all cases
          macrs: [rho, u]
          # Rescale macroscopics for spectrum analysis
          macrs_rescale:
            # This must be the macroscopic name (ux, not u)
            ux:
              mul: 41
              cte: 0
          # Time step multiplier for rescaling time
          # Defaults to 1
          time_rescale: 1
          # Points in which to export the full scale time series
          points:
            # If the point doesn't exactly match a node position in the domain
            # the nearest node is used
            upstream:
              pos: [200.46875, 80.0, 4.81]
            downstream:
              pos: [201.48375, 80.0, 4.81]

Each point exports a .csv with the actual position used and a .hdf with the time series data.
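
A typical post-processing step on this export (not part of Nassu itself) is estimating the dominant frequency with a discrete Fourier transform. A sketch with a synthetic signal standing in for an exported probe column:

```python
import numpy as np

dt = 1e-3                               # physical time step (after time_rescale)
t = np.arange(0, 1.0, dt)
signal = np.sin(2 * np.pi * 50.0 * t)   # synthetic 50 Hz probe signal

# One-sided amplitude spectrum of the fluctuating part of the signal
fluct = signal - signal.mean()
amplitude = np.abs(np.fft.rfft(fluct))
freqs = np.fft.rfftfreq(len(fluct), d=dt)

dominant = float(freqs[np.argmax(amplitude)])  # dominant frequency in Hz
```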

Checkpoint#

One key feature is the capability of restarting a simulation, maintaining the fields and state from a previous time step. This is provided through the checkpoint field, which provides the capacity to restart a simulation or to run a new one using the state of another simulation.

simulations:
  - name: example
    checkpoint:
      export:
        # Interval to export checkpoint. It defaults to no export.
        interval: {end_step: 30000, frequency: 5000, start_step: 10000}
        # True to save a checkpoint after the simulation finishes.
        # Defaults to false.
        finish_save: false
        # Keep only the last checkpoint on disk, removing other saved ones in the `checkpoint` folder
        # Defaults to false
        keep_only_last_checkpoint: true
      load:
        # Start simulation from checkpoint. Defaults to false
        checkpoint_start: true
        # Reset simulation time step, start at 0 instead of checkpoint time step.
        # Defaults to false.
        reset_time_step: false
        # Path to folder to load checkpoint from. If not specified, tries to load
        # the last checkpoint saved in the simulation output folder
        # Defaults to null
        folderpath: null

There are some known limitations to the checkpoint capabilities. Some notable ones are:

  • Multiblock communications that require temporal interpolation lose their previous state, starting with a constant-in-time field.

  • Statistical field continuity is lost and not backed up

  • For probes and spectrum_analysis the simulation overwrites the files

    • Despite this, the original ones are saved in the checkpoint as well, so it’s possible to join them afterwards.