Learning class-specific features with Random Augmentation Prediction and Multi-stage Representation for Anomaly Detection

PERENCIN, FRANCESCO
2024/2025

Abstract

Anomaly Detection plays a pivotal role across multiple areas of research such as Cybersecurity, Manufacturing Inspection, and Medical Imaging. Traditional Supervised Learning methods struggle to discriminate between subtle intra-class variations and actual anomalies due to the scarcity of labeled anomalous data, making Unsupervised and Self-Supervised approaches often more efficient and reliable. Contrastive Learning has emerged as a powerful technique to separate similar (nominal) and dissimilar (augmented or surrogate) samples without explicit labels, thanks to pretext tasks that help models learn robust feature representations. Moreover, recent studies have shown that the representation of the inherent nominal data distribution can be further enhanced by combining multiple levels of knowledge, often exceeding the performance of conventional pretrained models. While many researchers aim to develop solutions that generalize across diverse domains, very few consider class-specific adaptations to capture their unique properties. This thesis explores the synergy between Contrastive Learning and Multi-stage Representation through a novel pretext task that leverages multiple domain-specific transformations.
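
To make the pretext task named in the title more concrete, below is a minimal PyTorch sketch of a "Random Augmentation Prediction" objective: each nominal image is transformed by one of K randomly selected transformations, and a classifier is trained to predict which one was applied. This is an illustrative sketch only, not the thesis implementation; the augmentation pool, class names, and function names are assumptions, and the thesis selects domain-specific transformations per class rather than the generic ones used here.

import random
import torch
import torch.nn as nn
import torchvision.transforms.functional as TF

# Hypothetical pool of transformations (assumption: the thesis chooses
# these per object class; generic geometric ones are used here).
AUGMENTATIONS = [
    lambda x: x,                  # identity
    lambda x: TF.rotate(x, 90),   # 90-degree rotation
    lambda x: TF.rotate(x, 180),  # 180-degree rotation
    lambda x: TF.hflip(x),        # horizontal flip
]
K = len(AUGMENTATIONS)

class AugmentationPredictor(nn.Module):
    """Encoder plus a linear head that predicts the applied augmentation."""
    def __init__(self, encoder: nn.Module, feat_dim: int):
        super().__init__()
        self.encoder = encoder
        self.head = nn.Linear(feat_dim, K)

    def forward(self, x):
        return self.head(self.encoder(x))

def pretext_step(model, batch, optimizer):
    """One training step: augment, predict, cross-entropy on the index."""
    labels = torch.randint(0, K, (batch.size(0),))
    augmented = torch.stack(
        [AUGMENTATIONS[l](img) for img, l in zip(batch, labels.tolist())]
    )
    logits = model(augmented)
    loss = nn.functional.cross_entropy(logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

At test time, the intuition is that a model trained this way only predicts augmentations reliably for in-distribution (nominal) samples, so low prediction confidence can serve as an anomaly signal.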
Files in this record:
Tesi_Francesco_Perencin.pdf — 7.67 MB, Adobe PDF (not available)

Documents in UNITESI are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14247/25188