WAMIT Tutorials: From Beginner to Advanced Analysis

Comparing WAMIT Outputs: Best Practices and Validation Tips

WAMIT is a widely used boundary-element method (BEM) software for computing hydrodynamic interactions between waves and offshore structures. Engineers and researchers rely on WAMIT to produce frequency-domain and time-domain hydrodynamic coefficients (added mass, damping, excitation forces, and wave diffraction/radiation potentials) that feed into design, motion prediction, and control system studies. Because WAMIT outputs can be sensitive to geometry representation, numerical settings, and post-processing choices, careful comparison and validation are essential to ensure trustworthy results. This article outlines best practices for comparing WAMIT outputs and offers practical validation tips to reduce errors and increase confidence in your simulations.


Overview of key WAMIT outputs

Before comparing results, know which outputs matter for your application:

  • Added mass (A∞ and frequency-dependent A(ω)) — inertial contribution of the fluid.
  • Radiation damping (B(ω)) — dissipative effects due to wave radiation.
  • Wave excitation forces (Fe(ω)) — diffraction and Froude–Krylov forces from incident waves.
  • Wave potentials and Green’s function integrals — underlying potentials used to derive coefficients.
  • Transfer functions and RAOs (response amplitude operators) — vehicle/structure motion response after coupling hydrodynamics with mass, stiffness, and damping.
  • Hydrodynamic kernels and memory functions used for time-domain convolution.

When comparing outputs, keep in mind whether you examine frequency-domain quantities or time-domain reconstructions derived from them (e.g., convolution with retardation functions).
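
For reference, here is a minimal sketch of that frequency-to-time-domain link: the Cummins retardation (memory) kernel can be approximated as K(t) = (2/π) ∫ B(ω) cos(ωt) dω by trapezoidal quadrature. The function below is illustrative only; it assumes the radiation damping is already available as NumPy arrays, and the test curve is synthetic, not WAMIT output.

    import numpy as np

    def retardation_kernel(omega, damping, t):
        """Approximate K(t) = (2/pi) * integral B(w) * cos(w*t) dw
        by trapezoidal quadrature on the sampled frequency grid."""
        omega = np.asarray(omega, dtype=float)
        damping = np.asarray(damping, dtype=float)
        # Trapezoidal weights for a (possibly non-uniform) frequency grid
        dw = np.diff(omega)
        weights = np.zeros_like(omega)
        weights[:-1] += 0.5 * dw
        weights[1:] += 0.5 * dw
        # One row of the integrand per time instant: B(w) * cos(w * t_i)
        integrand = damping[None, :] * np.cos(np.outer(t, omega))
        return (2.0 / np.pi) * integrand @ weights

    if __name__ == "__main__":
        # Synthetic damping curve (NOT WAMIT data): decays at low and high frequency
        omega = np.linspace(0.01, 6.0, 600)             # rad/s
        damping = 1.0e5 * omega**2 * np.exp(-omega**2)  # N*s/m, illustrative
        t = np.linspace(0.0, 30.0, 301)                 # s
        K = retardation_kernel(omega, damping, t)
        print("K(0) =", K[0], "  K(t_end) =", K[-1])    # kernel should decay toward zero

If the computed kernel fails to decay toward zero, the frequency range or sampling of B(ω) usually needs to be extended before time-domain results built on it can be trusted.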


Pre-comparison checklist: ensure apples-to-apples

  1. Geometry and mesh
    • Use the identical surface geometry (same facetization and node distribution) across runs you compare. Small differences in mesh resolution or node placement change potential integrals and singularity treatments.
    • Check for duplicated or overlapping panels, closed watertight surfaces, and consistent normals pointing outward from the fluid domain (a quick mesh-check sketch follows this checklist).
  2. Physical and simulation parameters
    • Match water depth, gravity, seawater density, and coordinate system origin.
    • Ensure the same wave heading(s), frequency ranges, and discretization (number of frequencies, spacing).
    • Confirm identical boundary conditions (e.g., radiation condition, body constraints).
  3. Numerical settings
    • Panel integration tolerances, singular integration options, and near-field treatment must be the same.
    • Same truncation distances (for Sommerfeld radiation integrals or domain extents).
    • Consistent use of symmetry or multiple bodies (e.g., single-body vs. multi-body runs).
  4. Output conventions and units
    • Check units (SI vs. non-SI) and sign conventions for excitation and radiation coefficients.
    • Use the same normalization (per unit length, per unit beam, etc.) when comparing results with literature or other codes.

Best practices for comparing outputs

  1. Start with low-complexity test cases
    • Validate against canonical problems: submerged or floating sphere, vertical circular cylinder, and rectangular barge. These have analytical or well-established numerical solutions.
  2. Use convergence studies
    • Vary mesh refinement and frequency discretization systematically. Plot added mass and damping vs. mesh density to determine when results converge.
    • Track the relative change in key quantities (e.g., A(ω) at low and high ω) between refinements rather than absolute values only.
  3. Compare frequency-by-frequency and integrated metrics
    • Visualize frequency-domain curves for A(ω), B(ω), and excitation amplitude/phase. Differences localized to narrow frequency bands often point to numerical integration or near-field errors.
    • Compute integrated metrics such as mean-squared difference across a frequency band or an L2 norm to quantify discrepancies (see the discrepancy-metric sketch after this list).
  4. Inspect complex components and phases
    • For excitation forces and transfer functions, compare both amplitude and phase (or real/imag parts). Phase mismatches can indicate differences in reference point, sign conventions, or omitted radiated wave contributions.
  5. Use non-dimensional parameters
    • Present results using non-dimensional frequency (ka), Froude number, or mass ratios to generalize comparisons and reduce metric sensitivity to scale.
  6. Cross-validate with alternative methods
    • Compare WAMIT outputs with other BEM solvers (e.g., Nemoh, AQWA, HydroSTAR) and, where applicable, analytical solutions. Agreement across independent codes increases confidence.
  7. Validate time-domain reconstructions
    • When converting frequency-domain WAMIT outputs to time-domain memory kernels or convolution integrals, ensure the chosen windowing and transform methods (e.g., inverse FFT or a cosine transform of the radiation damping) preserve causality and stability.
  8. Document and automate
    • Keep scripts for geometry generation, mesh export, and post-processing. Automating mesh convergence and batch comparisons reduces human error and makes results reproducible.
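
To make items 3 and 4 concrete, the sketch below interpolates two runs onto a common frequency grid, computes a relative L2 discrepancy over a chosen band, and extracts amplitude and unwrapped phase of a complex quantity. The arrays are placeholders for data already extracted from WAMIT output files; the synthetic curves are illustrative only.

    import numpy as np

    def band_l2_discrepancy(w_ref, y_ref, w_test, y_test, band):
        """Relative L2 difference between two frequency-domain curves over a band.
        Works for real (A, B) or complex (excitation, RAO) quantities."""
        w_lo, w_hi = band
        w = w_ref[(w_ref >= w_lo) & (w_ref <= w_hi)]
        # Interpolate real and imaginary parts separately onto the common grid
        y1 = np.interp(w, w_ref, np.real(y_ref)) + 1j * np.interp(w, w_ref, np.imag(y_ref))
        y2 = np.interp(w, w_test, np.real(y_test)) + 1j * np.interp(w, w_test, np.imag(y_test))
        return np.linalg.norm(y1 - y2) / np.linalg.norm(y1)

    def amp_phase(x):
        """Amplitude and unwrapped phase (degrees) of a complex transfer function."""
        return np.abs(x), np.degrees(np.unwrap(np.angle(x)))

    if __name__ == "__main__":
        # Synthetic placeholders standing in for two WAMIT runs
        w1 = np.linspace(0.1, 3.0, 60)
        w2 = np.linspace(0.1, 3.0, 120)
        X1 = 1.0 / (1.0 + 1j * w1)            # "reference" excitation (illustrative)
        X2 = 1.02 / (1.0 + 1j * w2)           # "test" run, 2% high in magnitude
        err = band_l2_discrepancy(w1, X1, w2, X2, band=(0.3, 2.0))
        print(f"relative L2 discrepancy over operating band: {err:.2%}")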

Common sources of discrepancy and how to diagnose them

  • Mesh differences: check panel counts, aspect ratios, and visual inspections for skewed panels.
  • Singular/integral treatment: small panels near sharp edges produce strong integrand variations. Use higher-order integration or local refinements.
  • Truncation and far-field approximations: verify Sommerfeld integral cutoffs and far-field Green’s function approximations.
  • Coordinate/origin mismatch: excitation phase or radiation transfer functions shift with changes in body reference point; ensure same center of rotation and inertia references.
  • Hydrodynamic sign conventions: different software may use different signs for excitation or radiation coefficients; compare real and imaginary parts carefully.
  • Numerical noise at high frequencies: employ smoothing, increase frequency sampling, or refine the mesh to reduce oscillatory artifacts (a minimal smoothing sketch follows this list).
  • Symmetry assumptions: enabling symmetry reduces computational cost but changes mode coupling—use only when physically justified.
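
For the high-frequency noise item above, a light smoothing pass plus a crude oscillation measure can help separate numerical artifacts from genuine physics. The window length below is arbitrary; choose it short enough not to flatten real resonance peaks, and treat this as a diagnostic aid rather than a fix.

    import numpy as np

    def moving_average(y, window=5):
        """Centered moving average; window should be odd and much shorter than
        any physical feature (e.g., a resonance peak) you need to preserve.
        Edge values are affected by zero-padding and should be ignored."""
        kernel = np.ones(window) / window
        return np.convolve(y, kernel, mode="same")

    def oscillation_index(y):
        """Crude noise measure: RMS of second differences relative to signal scale.
        A value that grows toward high frequency suggests discretization artifacts."""
        return np.sqrt(np.mean(np.diff(y, n=2) ** 2)) / (np.max(np.abs(y)) + 1e-30)

    if __name__ == "__main__":
        w = np.linspace(0.1, 6.0, 300)
        rng = np.random.default_rng(0)
        B_noisy = 4.0e5 * w**2 * np.exp(-w**2) + 2.0e3 * rng.standard_normal(w.size)
        print("oscillation index before/after smoothing:",
              oscillation_index(B_noisy), oscillation_index(moving_average(B_noisy)))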

Practical validation workflow (step-by-step)

  1. Prepare canonical test: run a sphere or vertical cylinder case with analytical/benchmark reference.
  2. Mesh convergence: run 3–5 meshes from coarse to fine; plot A(ω) and B(ω) and pick a mesh near asymptotic behavior (a convergence-driver sketch follows this workflow).
  3. Frequency-sampling test: run low-resolution and high-resolution frequency sets; check interpolation sensitivity.
  4. Cross-code check: reproduce the same case in another solver or compare to published datasets.
  5. Time-domain check (if applicable): transform to memory kernel and run a time-domain simulation of an impulse or prescribed motion; compare RAOs or transient decay against expected results.
  6. Sensitivity experiments: perturb physical parameters (depth, density), make small mesh adjustments, and note which outputs change and how.
  7. Final acceptance: define quantitative tolerances for your application (e.g., ±5% on added mass at operating band) and accept mesh/settings when within limits.
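
A small driver along the lines below automates steps 2 and 7: collect one key coefficient per mesh, track the relative change between successive refinements, and apply an acceptance tolerance. The numbers, the 5% tolerance, and the variable names are illustrative; in practice the values would come from your own parser of the relevant WAMIT output files.

    import numpy as np

    def relative_changes(values):
        """Relative change of a scalar metric between successive mesh refinements."""
        v = np.asarray(values, dtype=float)
        return np.abs(np.diff(v)) / np.abs(v[1:])

    def accept_convergence(values, tol=0.05):
        """Accept the finest mesh if the last refinement changed the metric by < tol."""
        rc = relative_changes(values)
        return bool(rc[-1] < tol), rc

    if __name__ == "__main__":
        # Illustrative: heave added mass at one operating frequency from
        # four runs of increasing mesh density (values are made up)
        a33_by_mesh = [1.92e6, 2.05e6, 2.09e6, 2.10e6]   # kg, coarse -> fine
        ok, rc = accept_convergence(a33_by_mesh, tol=0.05)
        print("relative changes:", np.round(rc, 4), " accepted:", ok)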

Tips for post-processing and plotting

  • Always plot both magnitude and phase (or real/imag) for complex quantities (a plotting sketch follows this list).
  • Use log-scale for frequency when behavior spans orders of magnitude.
  • Overplot baseline/reference solutions and highlight the operating band relevant to the design.
  • Annotate resonance peaks, zero crossings, and irregular frequencies (spurious resonances of the boundary-integral formulation; these should be removed or flagged rather than interpreted physically).
  • For time-domain memory kernels, show kernel decay to verify causality and numerical damping.
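
A minimal matplotlib sketch covering the first three tips: magnitude and unwrapped phase on stacked axes, a logarithmic frequency axis, an overplotted reference curve, and a shaded operating band. The arrays and the synthetic resonant response are placeholders for quantities read from WAMIT output.

    import numpy as np
    import matplotlib.pyplot as plt

    def plot_complex_tf(omega, X, X_ref=None, band=None, label="run", ref_label="reference"):
        """Magnitude and unwrapped phase of a complex transfer function on a log
        frequency axis, with optional reference curve and highlighted operating band."""
        fig, (ax_mag, ax_ph) = plt.subplots(2, 1, sharex=True, figsize=(7, 6))
        ax_mag.loglog(omega, np.abs(X), label=label)
        ax_ph.semilogx(omega, np.degrees(np.unwrap(np.angle(X))), label=label)
        if X_ref is not None:
            ax_mag.loglog(omega, np.abs(X_ref), "--", label=ref_label)
            ax_ph.semilogx(omega, np.degrees(np.unwrap(np.angle(X_ref))), "--", label=ref_label)
        if band is not None:
            for ax in (ax_mag, ax_ph):
                ax.axvspan(band[0], band[1], alpha=0.15)   # operating band
        ax_mag.set_ylabel("|X|")
        ax_ph.set_ylabel("phase [deg]")
        ax_ph.set_xlabel("omega [rad/s]")
        ax_mag.legend()
        return fig

    if __name__ == "__main__":
        w = np.linspace(0.05, 5.0, 400)
        X = 1.0 / (1.0 - (w / 1.2) ** 2 + 0.1j * w)      # synthetic resonant response
        plot_complex_tf(w, X, X_ref=1.02 * X, band=(0.4, 1.5))
        plt.show()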

Example checks for a floating platform

  • Heave added mass A33(ω): check that the curve flattens toward finite limits at both ends of the frequency range and that those limits are of the expected order for your geometry; it is the radiation damping, not the added mass, that should decay toward zero at high frequency (see the sanity-check sketch after this list).
  • Surge and sway coupling: verify symmetry-related zeros in cross-coupling terms for symmetric geometries.
  • Radiation damping B(ω): ensure positive definiteness for passive damping regions; negative values usually indicate numerical issues.
  • Excitation force phase at zero frequency: ensure phase aligns with expected Froude–Krylov sign.
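
A hedged sketch of the asymptote and positivity checks above, operating on arrays of ω, A33(ω), and B33(ω) already extracted from a run; the tolerances are arbitrary starting points, and the demonstration curves are synthetic, not WAMIT output.

    import numpy as np

    def check_heave_coefficients(omega, A33, B33, rel_tol=0.05):
        """Basic sanity checks on heave added mass and radiation damping curves."""
        issues = []
        # Damping should be non-negative (within noise) and decay at high frequency
        if np.min(B33) < -rel_tol * np.max(np.abs(B33)):
            issues.append("significantly negative B33 values (possible numerical issue)")
        if B33[-1] > 0.2 * np.max(B33):
            issues.append("B33 has not decayed at the highest frequency; extend the range")
        # Added mass should flatten toward finite limits at both ends of the range
        lo_slope = abs(A33[1] - A33[0]) / (abs(A33[0]) + 1e-30)
        hi_slope = abs(A33[-1] - A33[-2]) / (abs(A33[-1]) + 1e-30)
        if lo_slope > rel_tol or hi_slope > rel_tol:
            issues.append("A33 not yet asymptotic at the ends of the frequency range")
        return issues

    if __name__ == "__main__":
        # Illustrative curves only (NOT WAMIT output)
        w = np.linspace(0.05, 6.0, 200)
        A33 = 2.0e6 + 0.5e6 * np.exp(-w)            # flattens toward a finite limit
        B33 = 4.0e5 * w**2 * np.exp(-w**2)          # hump that decays at both ends
        for msg in check_heave_coefficients(w, A33, B33):
            print("WARNING:", msg)
        print("checks complete")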

When to involve experiments or field data

Numerical validation should be complemented by experimental data when possible:

  • Model tests in wave tanks provide diffraction/radiation measurements and enable validation in highly nonlinear or viscous-dominated regimes where BEM assumptions weaken.
  • Use controlled experiments for complex appendages, perforated structures, or mooring interactions that may not be captured accurately by potential flow alone.

Summary checklist

  • Match geometry, units, and numerical settings before comparing.
  • Perform mesh and frequency convergence studies.
  • Compare amplitude and phase; use non-dimensional parameters.
  • Cross-validate with other codes, analytic solutions, and experiments when possible.
  • Automate tests and document tolerances for acceptance.

This set of best practices and validation tips should help you compare WAMIT outputs more reliably, diagnose common problems, and build confidence in hydrodynamic coefficients used for offshore design and analysis.
