Why I Started Writing
2026 is shaping up to be another exciting year for AI and technology. One of my New Year’s resolutions is to start a personal website to document my learning journey and the “aha” moments I encounter along the way, both in technology and in life. So why write, and why now? Some might argue that large language models already explain technical concepts so well that writing blog posts no longer adds much value. I see it differently. Here are my reasons. ...
Introduction to Contrastive Loss
Contrastive Loss is a widely used objective in metric learning and contrastive learning. Its goal is to learn an embedding space where similar samples are close together, while dissimilar samples are far apart. The loss operates on pairs of samples:

- Positive pairs: two samples that should be considered similar
- Negative pairs: two samples that should be considered different

Given a pair of embeddings and a binary label, contrastive loss:

- penalizes large distances between positive pairs
- penalizes small distances between negative pairs (up to a margin)

This encourages the model to learn representations that are discriminative and geometry-aware. ...
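To make the two penalties concrete, here is a minimal NumPy sketch of the classic pairwise formulation (loss = y·d² + (1−y)·max(0, margin − d)², with y = 1 for positive pairs). The function name, signature, and default margin are illustrative choices, not from the post.

```python
import numpy as np

def contrastive_loss(z1, z2, y, margin=1.0):
    """Pairwise contrastive loss (illustrative sketch).

    z1, z2 : (N, D) arrays of embeddings, one row per pair member
    y      : (N,) binary labels; 1 = positive (similar) pair, 0 = negative
    margin : negatives are pushed apart only until they are `margin` away
    """
    d = np.linalg.norm(z1 - z2, axis=1)                # Euclidean distance per pair
    pos = y * d ** 2                                   # pulls positive pairs together
    neg = (1 - y) * np.maximum(0.0, margin - d) ** 2   # pushes negatives apart, up to margin
    return float(np.mean(pos + neg))
```

Note the asymmetry the margin creates: a positive pair is penalized for any nonzero distance, while a negative pair already farther apart than the margin contributes zero loss, so the model is not rewarded for pushing dissimilar points infinitely far apart.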