This repository documents a validated workflow for radar cross section (RCS) simulation of airborne platforms using Ansys HFSS (Hybrid IE, SBR, and SBR+). The primary public benchmark in this repo is a DJI Phantom 3 quadcopter model, where simulated RCS patterns and distributions are compared against published measurement/simulation results. The goal is reproducibility: clear solver settings, geometry preparation steps, and runtime/accuracy trade-offs.
In addition to the public Phantom 3 benchmark, similar characterization was performed for other airframes (e.g., a low-observable bomber-class platform and a MALE ISR/UCAV-class platform). Those additional results are not published here but are available upon request.
Highlights • Scope • DJI Phantom 3 • HFSS Setup • Results • Literature Comparison • Frequency Sweep • Notes
## Highlights

- End-to-end HFSS RCS pipeline: CAD cleanup → material assignment → illumination/scan setup → solver selection → sweep → post-processing.
- Two solver configurations: Basic Hybrid IE + SBR vs Full SBR+ (Advanced) with explicit runtime implications.
- Validation mindset: sanity-check geometry/pose conventions and compare patterns against published results.
- Deliverable-ready outputs: polar cuts, sweep visuals (2D/3D), and summary figures.
## Scope

This work started as a hands-on exploration of how different physical and numerical choices influence RCS outcomes. There is no single build to inspect, so the goal is generality: develop a repeatable workflow and validate it against a public benchmark before moving to more complex airframes.
## DJI Phantom 3

CAD model from:
https://grabcad.com/library/dji-phantom3-1
The Phantom 3 model is imported into HFSS and used as the first RCS case. A short rotation/overview demo is included to sanity-check geometry orientation and pose.
Demo GIF:
media/phantom3_model_overview.gif
This particular platform is used as a starting point for ease of comparison: a peer-reviewed reference already analyzes the same target and provides measured data (anechoic chamber) along with a disclosed simulation configuration. That enables two validations: (1) verifying that the simulation workflow produces the expected behavior, and (2) cross-checking the simulated values against the measured ones.
## HFSS Setup

For this benchmark, the target is modeled as PEC as a deliberate simplification to keep the first public validation focused on solver behavior, geometry/orientation sanity, and reproducibility. For higher-fidelity work on real platforms, material assignment is matched to reality as far as practical (plastics, composites, metals, radomes, etc.).
All Phantom 3 simulations here are monostatic HH RCS using two hybrid configurations:
- Basic: Hybrid IE + SBR (no creeping waves / no wedge correction)
- Advanced: Full SBR+ with scattering features enabled, using a balanced speed–accuracy tradeoff
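For reference, "monostatic RCS" here is the standard far-field scattering definition (a textbook relation, not anything specific to this repo), with HH meaning horizontal polarization on both transmit and receive:

$$
\sigma = \lim_{R \to \infty} 4\pi R^{2} \, \frac{\lvert \mathbf{E}_s \rvert^{2}}{\lvert \mathbf{E}_i \rvert^{2}}
$$

where $\mathbf{E}_i$ is the incident field at the target and $\mathbf{E}_s$ is the scattered field evaluated at range $R$.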
## Results

The advanced SBR+ configuration is dramatically slower in this benchmark (≈72.55× in the recorded run), which matters when you scale to frequency sweeps, finer angular sampling, or more complex airframes.
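To make that trade-off concrete, a back-of-envelope estimate helps. The sketch below assumes total cost scales linearly with the number of (frequency, angle) solve points, which is only a rough model for SBR-type solvers; the per-point time and sweep sizes are hypothetical placeholders, and only the 72.55× factor comes from the recorded run.

```python
# Rough sweep-cost estimator. Only the 72.55x slowdown is taken from the
# recorded benchmark run; seconds_per_point, n_freqs, and n_angles below
# are hypothetical placeholders, not measured values.

def sweep_runtime_hours(seconds_per_point, n_freqs, n_angles, slowdown=1.0):
    """Total wall-clock hours, assuming cost scales linearly with the
    number of (frequency, angle) monostatic solve points."""
    return seconds_per_point * slowdown * n_freqs * n_angles / 3600.0

# Example: an 11-frequency sweep over a 1-degree azimuth cut (361 points).
basic = sweep_runtime_hours(2.0, n_freqs=11, n_angles=361)
advanced = sweep_runtime_hours(2.0, n_freqs=11, n_angles=361, slowdown=72.55)
```

Under this linear model, any per-point slowdown multiplies the whole sweep, which is why the Basic configuration is used for the dense frequency sweep below.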
The intent is not to claim perfect agreement from a simplified PEC benchmark, but to validate that: (1) coordinate conventions are correct, (2) aspect-dependent behavior looks physically plausible, and (3) solver choices move results in expected directions.
## Frequency Sweep

A frequency sweep is shown for the Phantom 3 using the Basic SBR configuration, including both 3D and 2D views.
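When post-processing exported sweep data, RCS values in square meters are conventionally converted to dBsm (decibels relative to 1 m²). A minimal helper, assuming linear-scale exported data; the null-floor value is an arbitrary choice, not something from this repo:

```python
import math

def to_dbsm(sigma_m2, floor_m2=1e-12):
    """Convert RCS in square meters to dBsm: 10*log10(sigma / 1 m^2).
    The floor avoids log(0) at deep pattern nulls (floor value is arbitrary)."""
    return 10.0 * math.log10(max(sigma_m2, floor_m2))

# to_dbsm(1.0) -> 0.0 dBsm; a 0.01 m^2 return is -20 dBsm.
```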
## Notes

- The public Phantom 3 benchmark intentionally uses simplifying assumptions (e.g., PEC) to validate workflow mechanics and solver behavior.
- For real platforms, material assignment, small details, and configuration fidelity matter and should be matched to the target and the measurement setup. This was done for the other airframes; those results are available upon request.
## Literature Comparison

Benchmark reference (measurements + simulation on the DJI Phantom family):
P. J. Speirs, A. Murk, M. Renker, P. Wellig, and U. Aulenbacher,
“High-detail simulations of consumer-grade UAV RCS signatures, and comparisons against measurements,”
in Drone Detectability: Modelling the Relevant Signature (NATO STO Meeting Proceedings, STO-MP-MSG-SET-183),
Virtual (WebEx), 27 April 2021, Paper MP-MSG-SET-183-03, pp. 1–26.
PDF: https://publications.sto.nato.int/publications/STO%20Meeting%20Proceedings/STO-MP-MSG-SET-183/MP-MSG-SET-183-03.pdf
Simulation files are available upon request.
Email: rawashdeh758@gmail.com.










