IMAGE Seminar: Physically constrained generative networks for cloud and texture synthesis (Pierrick Chatillon)
June 20 / 14:00 - 15:00
We are pleased to welcome Pierrick Chatillon, who will present the work of his PhD thesis, carried out jointly between Télécom Paris and ONERA.
He will give an IMAGE seminar on Thursday, June 20, 2024, at 14:00, in seminar room F-200.
Title: Physically constrained generative networks for cloud and texture synthesis
Abstract:
Evaluating the performance of optical sensors requires large-scale databases of cloud backgrounds, for example to predict optical link availability between ground stations and satellites or to detect small objects such as drones against cloudy skies. Deep learning algorithms can be applied to these tasks, but they in turn require large training databases. Building such databases is challenging: passive systems used for atmospheric observation provide only partial views, and databases constructed through physical modeling are costly. We have therefore developed deep learning methods to meet this need for large quantities of cloud images while preserving their spectral and radiometric properties.
Physical simulations are limited in spatial resolution when the area to be covered is large, so we explored two super-resolution approaches to enhance image definition. Both are internal methods, exploiting the redundancy of information within a single image across locations and scales. They leverage the fractal properties of cloud backgrounds and use a generative network as a common model across resolutions. These methods generate images exhibiting a power-law decay of the spectral density, an essential descriptor of cloud textures.
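As a rough illustration of the spectral descriptor mentioned above (not taken from the talk itself), the sketch below estimates the power-law exponent of an image's radially averaged power spectrum with numpy; the bin count and the nearest-neighbour binning are placeholder choices.

```python
import numpy as np

def spectral_slope(image):
    """Estimate beta such that the radially averaged power spectrum
    decays roughly as P(f) ~ f**(-beta). `image` is a 2D float array,
    e.g. a grayscale cloud image."""
    image = image - image.mean()          # drop the DC component
    h, w = image.shape
    psd = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    # Radial frequency of each pixel, measured from the spectrum centre
    fy = np.fft.fftshift(np.fft.fftfreq(h))
    fx = np.fft.fftshift(np.fft.fftfreq(w))
    radius = np.sqrt(fx[None, :] ** 2 + fy[:, None] ** 2)
    # Average the PSD over annuli of increasing radius
    bins = np.linspace(radius.min(), radius.max(), 50)
    which = np.digitize(radius.ravel(), bins)
    radial_psd = np.array([psd.ravel()[which == i].mean()
                           for i in range(1, len(bins)) if np.any(which == i)])
    freqs = np.array([bins[i - 1:i + 1].mean()
                      for i in range(1, len(bins)) if np.any(which == i)])
    # Fit log P(f) = -beta * log f + c
    slope, _ = np.polyfit(np.log(freqs), np.log(radial_psd), 1)
    return -slope

# White noise has a flat spectrum, so the estimate is close to 0
print(spectral_slope(np.random.randn(256, 256)))
```

For a texture whose spectral density decays as a power law, the returned value approximates the decay exponent; cloud backgrounds typically yield a clearly positive slope, unlike white noise.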
A second direction of our research explores texture synthesis methods. We introduce a generative model for cloud images driven by physical parameters: given a set of physical descriptors, it controls the spectral behavior and histogram characteristics of the generated images, using an appropriate multi-scale noise weighting to govern the spectral slope. Finally, we studied texture synthesis from a more general perspective, proposing an auto-encoder structure adapted to textures and enriched to handle textures with periodic patterns.
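The following is not the generative model described in the abstract, only a minimal numpy sketch of the underlying principle that weighting noise across dyadic scales shapes the spectral slope of the result; the `hurst` roughness parameter and the nearest-neighbour upsampling are illustrative assumptions.

```python
import numpy as np

def multiscale_noise(size=256, n_octaves=6, hurst=0.5, seed=0):
    """Sum white-noise octaves with scale-dependent weights.

    Octave k is generated at resolution size / 2**k, upsampled back to
    size x size (nearest neighbour) and weighted by (2**k)**hurst, so
    coarser octaves carry more energy and the power spectrum of the sum
    decays roughly as a power law; larger `hurst` means a steeper decay.
    """
    rng = np.random.default_rng(seed)
    out = np.zeros((size, size))
    for k in range(n_octaves):
        res = size // 2 ** k                          # octave resolution
        layer = rng.standard_normal((res, res))
        layer = np.repeat(np.repeat(layer, 2 ** k, axis=0), 2 ** k, axis=1)
        out += (2 ** k) ** hurst * layer              # coarser -> heavier weight
    return out

field = multiscale_noise(hurst=0.8)   # tune hurst to change the roughness
```

Passing the resulting field to the spectral-slope estimate sketched earlier shows how increasing `hurst` steepens the spectral decay, which is the kind of control over the spectral slope that the multi-scale noise weighting provides.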
Overall, our work contributes to generating realistic cloud images from limited data, preserving spectral and radiometric properties, thanks to multi-scale approaches that leverage the fractal characteristics of clouds.