Final answer:
In the t-SNE objective function, the Kullback-Leibler (KL) divergence, a measure of relative entropy, is used to measure the "distance" between distributions and to minimize the discrepancy between the pairwise-similarity distribution of the high-dimensional data and that of the low-dimensional embedding.
Step-by-step explanation:
The mathematical concept used in the t-SNE objective function to measure the "distance" between distributions is the Kullback-Leibler divergence (relative entropy). t-SNE minimizes KL(P || Q), where P is a joint probability distribution over pairs of high-dimensional points and Q is the corresponding joint distribution over pairs of their low-dimensional counterparts.
This measure compares p_ij with q_ij, where p_ij is the symmetrized pairwise similarity between high-dimensional points x_i and x_j (built from Gaussian-kernel conditional probabilities) and q_ij is the pairwise similarity between their low-dimensional embeddings y_i and y_j (built from a heavy-tailed Student-t kernel).
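For reference, the standard formulation (as in van der Maaten and Hinton's t-SNE paper) is roughly the following, where the x_i are the high-dimensional points, the y_i their low-dimensional embeddings, sigma_i is the per-point Gaussian bandwidth chosen via the perplexity, and n is the number of points:

```latex
p_{j\mid i} = \frac{\exp\!\left(-\lVert x_i - x_j\rVert^2 / 2\sigma_i^2\right)}
                   {\sum_{k \neq i} \exp\!\left(-\lVert x_i - x_k\rVert^2 / 2\sigma_i^2\right)},
\qquad
p_{ij} = \frac{p_{j\mid i} + p_{i\mid j}}{2n},
\qquad
q_{ij} = \frac{\left(1 + \lVert y_i - y_j\rVert^2\right)^{-1}}
              {\sum_{k \neq l} \left(1 + \lVert y_k - y_l\rVert^2\right)^{-1}},
\qquad
C = \mathrm{KL}(P \,\|\, Q) = \sum_{i \neq j} p_{ij}\,\log\frac{p_{ij}}{q_{ij}}.
```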
The KL divergence quantifies how much one probability distribution diverges from another: it is zero when the two distributions are identical and grows as they differ. It is not symmetric and is not a proper metric; in particular, KL(P || Q) penalizes modeling a large p_ij with a small q_ij far more heavily than the reverse, which is why t-SNE emphasizes preserving local neighborhood structure.
By minimizing KL(P || Q) with gradient descent, t-SNE finds a low-dimensional embedding whose pairwise similarities q_ij match the high-dimensional similarities p_ij as closely as possible, so that points that are similar in the original space stay close together in the map.
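As a rough illustration, here is a minimal NumPy sketch of that objective, assuming P is a precomputed n x n joint-similarity matrix with zero diagonal that sums to 1; the helper names joint_q and kl_cost are made up for this example and are not part of any particular library:

```python
import numpy as np

def joint_q(Y):
    # Pairwise squared Euclidean distances between embedding points (n x d array Y).
    d2 = np.sum((Y[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    # Heavy-tailed Student-t kernel used by t-SNE in the low-dimensional space.
    num = 1.0 / (1.0 + d2)
    np.fill_diagonal(num, 0.0)   # q_ii is defined to be 0
    return num / num.sum()       # normalise so Q sums to 1 over all pairs

def kl_cost(P, Q, eps=1e-12):
    # t-SNE objective: KL(P || Q) = sum over i != j of p_ij * log(p_ij / q_ij).
    mask = P > 0                 # terms with p_ij = 0 contribute nothing
    return float(np.sum(P[mask] * np.log(P[mask] / np.maximum(Q[mask], eps))))
```

In an actual t-SNE run this cost is differentiated with respect to the embedding coordinates Y and minimized by gradient descent; the sketch above only evaluates the objective for given P and Q.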