
IMAGE Seminar: "Navigating the Evaluation Landscape: From Scalar Metrics to Parametrized Curves in Generative Model Assessment" (Benjamin Sykes)

30 May / 14:00 - 15:30

We will have the pleasure of hearing Benjamin Sykes, a PhD student in the IMAGE team of the GREYC.
He will give an IMAGE seminar on Thursday 30 May 2024, at 14:00, in seminar room F-200.

Title: "Navigating the Evaluation Landscape: From Scalar Metrics to Parametrized Curves in Generative Model Assessment"

Abstract:
While it seems fairly easy to evaluate the "quality" of a generative model's output, finding a metric that intrinsically quantifies the performance of the model remains a difficult task, let alone defining what a "performant generative model" in fact means.
Whereas most generative model outputs are still evaluated with scalar values such as the Fréchet Inception Distance (FID) or the Inception Score (IS), in recent years Sajjadi et al. (2018) proposed a definition of precision and recall for distributions (PRD) with an interpretation as a precision-recall curve, thereby transposing a concept already used and accepted in binary classification. Since then, several alternative approaches to precision and recall have emerged (Kynkäänniemi et al., 2019; Park & Kim, 2023; Naeem et al., 2020). While these implementations of precision and recall tend to simplify the computations, give an intuition of the algorithms, and aim at reducing the effect of outliers, they have noticeable shortcomings: they are based solely on intuition, and they overlook the initial, mathematically grounded definition of precision and recall as a trade-off curve.
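For readers unfamiliar with the scalar variants mentioned above, the following is a minimal sketch (not the speaker's code) of the improved precision and recall of Kynkäänniemi et al. (2019), in which each sample set defines a manifold as the union of k-nearest-neighbour balls; the feature space, the value of k and the toy data are assumptions for illustration only.

    import numpy as np

    def knn_radii(feats, k=3):
        # Distance from each point to its k-th nearest neighbour (column 0 is the zero self-distance).
        dists = np.linalg.norm(feats[:, None, :] - feats[None, :, :], axis=-1)
        return np.sort(dists, axis=1)[:, k]

    def manifold_coverage(query, support, radii):
        # Fraction of query points lying inside at least one k-NN ball of the support set.
        dists = np.linalg.norm(query[:, None, :] - support[None, :, :], axis=-1)
        return float((dists <= radii[None, :]).any(axis=1).mean())

    def improved_precision_recall(real_feats, fake_feats, k=3):
        # Precision: generated samples covered by the real manifold.
        # Recall: real samples covered by the generated manifold.
        precision = manifold_coverage(fake_feats, real_feats, knn_radii(real_feats, k))
        recall = manifold_coverage(real_feats, fake_feats, knn_radii(fake_feats, k))
        return precision, recall

    # Toy usage with random "features"; in practice these would be deep embeddings (e.g. Inception features).
    rng = np.random.default_rng(0)
    real = rng.normal(size=(200, 16))
    fake = rng.normal(loc=0.5, size=(200, 16))
    print(improved_precision_recall(real, fake))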
We will first present the generative model evaluation landscape through the existing evaluation metrics (FID, IS, PRD, Improved Precision and Recall, Density and Coverage). We will then motivate the use of precision and recall as parametrised curves, as opposed to two scalar metrics. We will finally present our unifying setting, in which we extend common scalar PR metrics to PRD curves using the binary classification approach of Simon et al. (2019).
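As a pointer to the curve viewpoint, here is a small illustrative sketch (not the speaker's method) of the PRD curve of Sajjadi et al. (2018) for two discrete distributions P and Q: for each trade-off parameter lambda > 0, one curve point is given by the precision alpha(lambda) = sum_w min(lambda P(w), Q(w)) and the recall beta(lambda) = alpha(lambda) / lambda; the toy histograms below are assumptions.

    import numpy as np

    def prd_curve(p, q, num_angles=201):
        # PRD curve of Sajjadi et al. (2018) between a reference distribution p and a model distribution q.
        # For each lambda > 0: alpha(lambda) = sum_w min(lambda * p(w), q(w)), beta(lambda) = alpha(lambda) / lambda.
        # Lambdas are taken as tan(angle) so that the whole range (0, infinity) is swept.
        p = np.asarray(p, dtype=float) / np.sum(p)
        q = np.asarray(q, dtype=float) / np.sum(q)
        lambdas = np.tan(np.linspace(1e-6, np.pi / 2 - 1e-6, num_angles))
        precision = np.array([np.minimum(lam * p, q).sum() for lam in lambdas])
        recall = precision / lambdas
        return precision, recall

    # Toy usage: two histograms over the same bins (in practice obtained e.g. by clustering features).
    p = [0.10, 0.40, 0.40, 0.10]   # "real" distribution
    q = [0.30, 0.30, 0.20, 0.20]   # "generated" distribution
    precision, recall = prd_curve(p, q)
    print(precision.max(), recall.max())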


References:
– Kynkäänniemi, T., Karras, T., Laine, S., Lehtinen, J., & Aila, T. (2019). Improved Precision and Recall Metric for Assessing Generative Models. Advances in Neural Information Processing Systems, 32. https://proceedings.neurips.cc/paper_files/paper/2019/hash/0234c510bc6d908b28c70ff313743079-Abstract.html
– Naeem, M. F., Oh, S. J., Uh, Y., Choi, Y., & Yoo, J. (2020). Reliable Fidelity and Diversity Metrics for Generative Models. Proceedings of the 37th International Conference on Machine Learning, 7176–7185. https://proceedings.mlr.press/v119/naeem20a.html
– Park, D., & Kim, S. (2023). Probabilistic Precision and Recall Towards Reliable Evaluation of Generative Models. Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 20099–20109. https://openaccess.thecvf.com/content/ICCV2023/html/Park_Probabilistic_Precision_and_Recall_Towards_Reliable_Evaluation_of_Generative_Models_ICCV_2023_paper.html
– Sajjadi, M. S. M., Bachem, O., Lucic, M., Bousquet, O., & Gelly, S. (2018). Assessing Generative Models via Precision and Recall. Advances in Neural Information Processing Systems, 31. https://proceedings.neurips.cc/paper_files/paper/2018/hash/f7696a9b362ac5a51c3dc8f098b73923-Abstract.html
– Simon, L., Webster, R., & Rabin, J. (2019). Revisiting Precision Recall Definition for Generative Modeling. Proceedings of the 36th International Conference on Machine Learning, 5799–5808. https://proceedings.mlr.press/v97/simon19a.html


We look forward to seeing many of you there!

Details

Date:
30 May
Time:
14:00 - 15:30

Organiser

IMAGE

Venue

ENSICAEN – Building F – Room F-200
6 Bd Maréchal Juin
Caen, 14050 France