(C) PLOS One [1]. This unaltered content originally appeared in journals.plosone.org.
Licensed under Creative Commons Attribution (CC BY) license.
url:
https://journals.plos.org/plosone/s/licenses-and-copyright
------------
SISPO: Space Imaging Simulator for Proximity Operations
Authors: Mihkel Pajusalu (Space Technology Department, Tartu Observatory, University of Tartu, Tõravere); Iaroslav Iakubivskyi; Gabriel Jörg Schwarzkopf (Department of Electronics and Nanoengineering, School of Electrical Engineering)
Date: 2022-04
This paper describes the architecture and demonstrates the capabilities of a newly developed, physically based imaging simulator environment called SISPO, designed for small solar system body fly-by and terrestrial planet surface mission simulations. The image simulator utilises the open-source 3-D visualisation system Blender and its Cycles rendering engine, which supports physically based rendering capabilities and procedural micropolygon displacement texture generation. The simulator concentrates on realistic surface rendering and includes supplementary models to produce realistic dust- and gas-environment optical models for comets and active asteroids. The framework also includes tools to simulate the most common image aberrations, such as tangential and sagittal astigmatism, internal and external comatic aberration, and simple geometric distortions. The framework's primary objective is to support small-body space mission design by allowing better simulations for characterisation of imaging-instrument performance, assisting mission planning, and developing computer-vision algorithms. SISPO allows the simulation of trajectories, light parameters and the camera's intrinsic parameters.
Funding: This work was funded by ESA Contract No. 4000131003/20/NL/IB/ig with the University of Tartu ("Comet Interceptor (EE-1): OPIC Engineering Model Development"; PI: MP; salaries were paid to MP, II, GLB and HT, and hardware was acquired), the Archimedes Foundation (https://archimedes.ee, UT ASTRA project 2014–2020.4.01.16-0029 KOMEET "Benefits for Estonian Society from Space Research and Application", in the form of travel support), the Eesti Teadusagentuur (EE) (MOBTP151 and PUTJD601, awarded to MP), and the base funding of Tartu Observatory (registration number 74001073). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Copyright: © 2022 Pajusalu et al. This is an open access article distributed under the terms of the Creative Commons Attribution License , which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Introduction
A versatile image-simulation environment is required to design advanced deep-space missions, to simulate large sets of mission scenarios in parallel, and to develop and validate algorithms for semi-autonomous operations, visual navigation, localisation and image processing. This is especially true for Small Solar System Body (SSSB) mission scenarios, where the mission has to be designed either with very limited information about the target (i.e., its precise size, shape, exact composition and activity) or for targets that remain a near-complete mystery before their close encounter (as in the case of interstellar objects [1, 2]). Some publicly known cosmic-synthetic-image generators for space missions are available; they are briefly described in the next paragraph.
Airbus Defence and Space has developed SurRender, which renders realistic images with a high level of representativeness for space scenes [3]. It uses ray tracing to simulate views of scenes composed of planets, satellites, asteroids and stars, taking into account the illumination conditions and the characteristics of the imaging camera through a user-defined Point Spread Function (PSF). Textures are accessed in a large virtual file, or procedural texture generation can be used. SurRender supports different Bidirectional Reflectance Distribution Function (BRDF) models, for example Lambertian [4], models for the Moon and asteroids [5–7], and a model for the Jovian moons [8]. The University of Dundee, UK has developed the Planet and Asteroid Natural Scene Generation Utility (PANGU), which generates realistic, high-quality, synthetic images of planets and asteroids using a custom graphics-processing-unit-based renderer that includes a parameterisable camera model [9]; it also has a graphical user interface, which makes it more intuitive to use. PANGU implements a Spacecraft Planet Instrument Camera-matrix Events (SPICE) interface, which provides historical and future ephemerides of the Solar System and selected spacecraft. The standard Lambertian diffuse reflectivity model is included, as well as the Hapke, Oren–Nayar, Blinn–Phong and Cook–Torrance BRDFs. NASA's Navigation and Ancillary Information Facility developed the internationally recognised SPICE tool, which provides the fundamental observation geometry needed to perform photogrammetry, map making and other kinds of planetary-science data analysis [10].
It is a numerical tool that provides position and orientation ephemerides of spacecraft and target bodies (including their size and shape), instrument-mounting alignment and field-of-view geometry, reference-frame specifications, and underlying time-system conversions; however, it does not have surface-rendering capabilities and is limited to shape rendering through the digital shape kernel system (tessellated plate data and digital elevation models). SPICE also offers the three-dimensional (3-D) visualisation application Cosmographia [11], and it has recently been integrated with Blender to provide rendering capabilities for spacecraft-orbit visualisation and depictions of observations by probe instruments [12]. The Hapke model has also been used with Blender previously [13].
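To make the role of these reflectance models concrete, the sketch below evaluates two of the simplest ones: the Lambertian law mentioned above and the Lommel–Seeliger law, which is commonly applied to dark asteroid regolith. This is an illustrative implementation only; the function names are invented for this example, and the code is not taken from any of the simulators discussed.

```python
import math

def lambert_radiance(albedo, irradiance, inc_angle):
    """Radiance from a Lambertian surface: L = (albedo / pi) * E * cos(i),
    where i is the incidence angle in radians and E the solar irradiance."""
    mu0 = math.cos(inc_angle)
    if mu0 <= 0.0:
        return 0.0  # surface element faces away from the light source
    return (albedo / math.pi) * irradiance * mu0

def lommel_seeliger_radiance(albedo, irradiance, inc_angle, emi_angle):
    """Lommel-Seeliger law, often preferred for dark regolith surfaces:
    L = (albedo / (4 * pi)) * E * mu0 / (mu0 + mu),
    with mu0 = cos(incidence angle) and mu = cos(emission angle)."""
    mu0, mu = math.cos(inc_angle), math.cos(emi_angle)
    if mu0 <= 0.0:
        return 0.0
    return (albedo / (4.0 * math.pi)) * irradiance * mu0 / (mu0 + mu)
```

Note how the Lommel–Seeliger term flattens the brightness fall-off towards the limb compared with the Lambertian cosine law, which is why it fits low-albedo asteroid surfaces better.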
The comparison between various simulators capable of SSSB synthetic-image generation is summarised in Fig 1 (camera, orientation and exact light parameters may differ between simulator set-ups). The asteroid 25143 Itokawa model from [14] was used.
Fig 1. Comparison of available simulators for space-scene image rendering. The rendered example for each simulator is asteroid 25143 Itokawa. (A) SurRender image generator by Airbus; it uses backward ray tracing or image generation with the Open Graphics Library (OpenGL). (B) PANGU image generator by the University of Dundee, UK and ESA; it uses fractal terrain generation with OpenGL. (C) SISPO by the University of Tartu, Estonia and Aalto University, Finland; it uses the Blender Cycles physically based path tracer (the same model as above with a simple diffuse shader). (D) SISPO with the Blender Cycles physically based path tracer (with procedural displacement adding new surface features and reflectance textures for extra detail).
https://doi.org/10.1371/journal.pone.0263882.g001
The Space Imaging Simulator for Proximity Operations (SISPO) has been developed to provide a full pipeline from simulated imagery to final data products (e.g., 3-D models); to include photorealistic, physically based rendering; to support automatic surface generation with procedural displacement textures; to allow the implementation of spacecraft, instrument and environmental models (e.g., imaging distortions, gas and dust environment, attitude dynamics); and to avoid legacy software. SISPO uses the Cycles renderer within the Blender software package, which allows procedures to be programmed in Python [15]. Preliminary results of 3-D reconstruction and localisation using SISPO, without details of the simulated imagery and near environment (i.e., gas and dust), were published in [16]. SISPO works on large scales and can simulate a variety of objects at the Solar System scale. It can also be used to generate realistic videos from individual frames, either for visualisation purposes or for public outreach.
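As an illustration of the kind of simple geometric distortion model such a pipeline can apply to rendered frames, the sketch below implements a basic Brown–Conrady radial polynomial on normalised image coordinates. This is a generic textbook formulation under assumed parameter names (k1, k2), not SISPO's actual distortion code.

```python
def radial_distort(x, y, k1, k2=0.0):
    """Apply Brown-Conrady radial distortion to normalised image
    coordinates (x, y), measured from the principal point.

    The distorted point is (x, y) scaled by 1 + k1*r^2 + k2*r^4,
    where r is the distance from the principal point. Negative k1
    gives barrel distortion (points pulled inward), positive k1
    gives pincushion distortion (points pushed outward)."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale
```

For example, with k1 = -0.1 a point at unit radius on the x-axis moves inward to x = 0.9, while the principal point itself is left unchanged, which matches the barrel-distortion behaviour the model is meant to reproduce.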
This article demonstrates the synthetic-image-simulation capabilities of SISPO, with 3-D reconstruction as a case study showing how such images can be utilised. It also discusses the application of SISPO to actual space missions, its architecture, its rendering system and its supplementary models.
[END]
[1] Url:
https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0263882