Retrieval with Deep Learning: A Ranking Loss Survey, Part 1

A deep network trained with a ranking loss to enable searching and indexing.
Contrastive Loss formulation.
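The contrastive loss operates on pairs of embeddings: similar pairs are pulled together, and dissimilar pairs are pushed at least a margin m apart. A minimal NumPy sketch of the standard pairwise formulation (function and variable names are my own, not from the original figure):

```python
import numpy as np

def contrastive_loss(f_a, f_b, is_similar, margin=1.0):
    """Contrastive loss sketch for one pair of embeddings.

    Similar pairs (is_similar=True) pay the squared distance between
    their embeddings; dissimilar pairs pay a squared hinge that is
    zero once they are at least `margin` apart.
    """
    d = np.linalg.norm(f_a - f_b)        # Euclidean distance in embedding space
    if is_similar:
        return d ** 2                    # pull similar pairs together
    return max(0.0, margin - d) ** 2     # push dissimilar pairs beyond the margin
```

Note the asymmetry: the similar-pair term grows without bound as the pair drifts apart, while the dissimilar-pair term saturates at zero once the margin is respected.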
Triplet Loss formulation.
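The triplet loss replaces pairs with (anchor, positive, negative) tuples: it requires the anchor-positive distance to be smaller than the anchor-negative distance by at least the margin m. A minimal sketch of the standard formulation (names are my own):

```python
import numpy as np

def triplet_loss(f_a, f_p, f_n, margin=0.2):
    """Triplet loss sketch for one (anchor, positive, negative) tuple.

    Zero when the negative is already farther from the anchor than the
    positive by at least `margin`; positive otherwise.
    """
    d_ap = np.linalg.norm(f_a - f_p)     # anchor-positive distance
    d_an = np.linalg.norm(f_a - f_n)     # anchor-negative distance
    return max(0.0, d_ap - d_an + margin)
```

A triplet that already satisfies the margin contributes no gradient, which is why the choice of negatives (hard, semi-hard, easy) matters so much in practice.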
Triplet loss tuple (anchor, positive, negative) with margin m. Hard, semi-hard, and easy negatives are highlighted in red, cyan, and orange, respectively.
Hard sampling promotes a unimodal embedding by picking the farthest positive and the nearest negative (a, p1, n). Semi-hard sampling picks (a, p2, n) instead, avoiding any tuple (a, p, n) where n lies between a and p.
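Following the FaceNet-style definition described above, a semi-hard negative is one that is farther from the anchor than the positive but still inside the margin, i.e. d(a, p) < d(a, n) < d(a, p) + m. A sketch of this selection rule over a batch of candidate negatives (helper names and the input convention, precomputed anchor-negative distances, are my own assumptions):

```python
import numpy as np

def semi_hard_negative(d_an_all, d_ap, margin=0.2):
    """Pick the index of a semi-hard negative from candidate distances.

    d_an_all: 1-D array of anchor-negative distances for candidate negatives.
    d_ap:     anchor-positive distance for this anchor.

    A semi-hard negative satisfies d_ap < d_an < d_ap + margin:
    farther than the positive, but still violating the margin.
    Returns the index of the closest such negative, or None if none exists.
    """
    mask = (d_an_all > d_ap) & (d_an_all < d_ap + margin)
    candidates = np.where(mask)[0]
    if candidates.size == 0:
        return None                               # fall back (e.g. to an easy negative)
    return candidates[np.argmin(d_an_all[candidates])]
```

This deliberately skips hard negatives (d(a, n) < d(a, p)), which the caption above notes can destabilize training, and easy negatives, which contribute no loss.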


I write reviews on computer vision papers. Writing tips are welcome.

Ahmed Taha
