12–14 Oct 2021
on-line
Europe/Amsterdam timezone

Application of stochastic approach for satellite data processing unit (DPU) thermal analysis

12 Oct 2021, 13:45
30m
on-line

Thermal analysis and software tools / Thermal Analysis

Speaker

Artur Jurkowski

Description

In the early years of CubeSat technology, mission reliability was about 30-50%. Over time, the success rate increased to about 74% by 2018 (based on Thyrso Villela et al., "Towards the Thousandth CubeSat: A Statistical Overview"). Nevertheless, there is still room for improvement, given the high investment cost. Thermal control is one of the spacecraft systems that can be a source of critical failure. In this field, mission success relies on numerical analysis, ground testing and, ultimately, in-space thermal behavior. Uncertainty in the simulations and inaccuracy of the laboratory tests force engineers to apply safety margins to temperature results and to make the design process more conservative. In this presentation we propose the use of Sandia's DAKOTA software, which enables statistical analysis of the numerical model and which we combined with ESATAN-TMS to obtain a broader view of the results, for the benefit of our nanosatellite mission Intuition-1.
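As a rough illustration of the kind of workflow DAKOTA automates, the Python sketch below samples two uncertain thermal parameters with a Latin Hypercube design and propagates them through a toy two-node steady-state model standing in for the ESATAN-TMS TMM of the DPU. The parameter names, ranges and dissipation values are illustrative assumptions, not Intuition-1 data.

```python
# Minimal sketch (not the authors' actual tool chain): Latin Hypercube sampling
# of uncertain thermal parameters driving a toy two-node steady-state model.
import numpy as np
from scipy.stats import qmc

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant [W m^-2 K^-4]
Q_DPU = 6.0             # assumed DPU dissipation [W]
AREA  = 0.02            # assumed radiating area [m^2]
T_ENV = 250.0           # assumed effective sink temperature [K]

def dpu_model(emissivity, gl_board_to_structure):
    """Toy TMM: DPU board conductively tied to a radiating structure node."""
    t_struct = (Q_DPU / (emissivity * SIGMA * AREA) + T_ENV**4) ** 0.25
    t_board = t_struct + Q_DPU / gl_board_to_structure
    return t_board

# Assumed uncertainty ranges: radiator emissivity [-] and
# board-to-structure conductive coupling GL [W/K].
lower = np.array([0.75, 0.15])
upper = np.array([0.90, 0.40])

sampler = qmc.LatinHypercube(d=2, seed=42)
samples = qmc.scale(sampler.random(n=200), lower, upper)

t_board = np.array([dpu_model(eps, gl) for eps, gl in samples])
print(f"DPU board temperature: mean {t_board.mean():.1f} K, "
      f"95th percentile {np.percentile(t_board, 95):.1f} K")
```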

Despite the small size of a nanosatellite, there are many variables with a large overall impact on the mission's thermal behavior, such as emissivities and thermal conductivities, which engineers must consider during development. Each of these parameters has its own statistics, which directly affect the confidence level of the final results. Unfortunately, handling this extended statistical data manually is time-consuming and inefficient. To automate the process, a software tool implementing a stochastic approach was employed. A thermal mathematical model (TMM) of the computational subsystem (DPU) of the Intuition-1 nanosatellite was created, and the input physical parameters were assessed together with their uncertainty margins. The experimental data used for validation was obtained during thermal balance tests in a thermal vacuum chamber (TVAC), in accordance with the relevant ECSS standard. After preliminary results were obtained, the TMM was correlated with the experimental data, and the correlated model was used for the subsequent stochastic analysis.
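The correlation step can be pictured with the same toy model: a minimal sketch, assuming a single uncertain conductive coupling and entirely hypothetical thermal-balance measurements, that tunes the coupling until the predicted steady-state temperatures match the "measured" ones in a least-squares sense.

```python
# Correlation sketch, not the actual Intuition-1 workflow: tune one coupling of
# the toy two-node model so predicted balance temperatures match hypothetical
# TVAC measurements, before reusing the correlated model for stochastic studies.
import numpy as np
from scipy.optimize import minimize_scalar

SIGMA = 5.670374419e-8
AREA, T_ENV = 0.02, 250.0
EMISSIVITY = 0.82  # assumed as-measured value

def dpu_model(q_dpu, gl):
    t_struct = (q_dpu / (EMISSIVITY * SIGMA * AREA) + T_ENV**4) ** 0.25
    return t_struct + q_dpu / gl, t_struct

# Hypothetical thermal-balance cases: dissipation [W] and measured
# board/structure temperatures [K] at steady state in the TVAC.
cases = [(3.0, 301.0, 291.0), (6.0, 339.0, 319.0)]

def rms_error(gl):
    residuals = []
    for q, t_board_meas, t_struct_meas in cases:
        t_board, t_struct = dpu_model(q, gl)
        residuals += [t_board - t_board_meas, t_struct - t_struct_meas]
    return float(np.sqrt(np.mean(np.square(residuals))))

fit = minimize_scalar(rms_error, bounds=(0.1, 1.0), method="bounded")
print(f"correlated GL = {fit.x:.3f} W/K, residual RMS = {fit.fun:.2f} K")
```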

First, an automated Sensitivity Analysis (SA) was performed to identify the parameters with the greatest impact on the chosen model outputs. Knowing these significant parameters, the next step was to prepare more precise statistical input data for an Uncertainty Quantification (UQ) study, which provided probability information for the output functions. This data was used to estimate new thermal margins. Moreover, input parameters with known upper and lower bounds but unknown statistics (epistemic uncertainty) were also analyzed, using a mixed UQ approach in which aleatory and epistemic inputs were considered together.
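A minimal sketch of both stochastic steps on the toy model (not the real TMM): a sampling-based sensitivity ranking via rank correlation, followed by a nested loop mixing aleatory inputs (assumed distributions) with an epistemic contact resistance known only by its bounds, so the output percentile becomes an interval rather than a single number.

```python
# Hedged sketch of the two stochastic steps: (1) rank-correlation sensitivity
# ranking, (2) mixed aleatory/epistemic UQ with a nested sampling loop.
import numpy as np

rng = np.random.default_rng(0)
SIGMA, AREA, T_ENV, Q_DPU = 5.670374419e-8, 0.02, 250.0, 6.0

def dpu_board_temp(emissivity, gl):
    t_struct = (Q_DPU / (emissivity * SIGMA * AREA) + T_ENV**4) ** 0.25
    return t_struct + Q_DPU / gl

def rank_corr(x, y):
    """Spearman-style rank correlation (no ties expected for continuous samples)."""
    rx, ry = np.argsort(np.argsort(x)), np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

# --- 1. Sensitivity analysis: rank-correlate sampled inputs with the output.
eps = rng.uniform(0.75, 0.90, 2000)   # aleatory: assumed uniform
gl = rng.normal(0.30, 0.03, 2000)     # aleatory: assumed normal [W/K]
t = dpu_board_temp(eps, gl)
for name, x in (("emissivity", eps), ("GL", gl)):
    print(f"rank correlation with T_board: {name:10s} {rank_corr(x, t):+.2f}")

# --- 2. Mixed UQ: epistemic contact resistance known only within bounds,
#        so the aleatory 95th percentile becomes an interval, not a number.
p95 = []
for r_contact in np.linspace(0.5, 2.0, 20):   # epistemic outer loop [K/W]
    eps = rng.uniform(0.75, 0.90, 1000)       # aleatory inner loop
    gl = rng.normal(0.30, 0.03, 1000)
    t = dpu_board_temp(eps, gl) + Q_DPU * r_contact
    p95.append(np.percentile(t, 95))
print(f"95th-percentile T_board lies in [{min(p95):.1f}, {max(p95):.1f}] K")
```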

The stochastic approach made the validation process more insightful. It also provided more information about the most relevant heat paths and allowed tighter error margins to be derived for the temperature measurement points. These advantages give engineers confidence in virtual prototyping and, in turn, decrease the probability of mission failure.
