Euclidean versus EMD error function

We investigated the relation between the number of bins and the error function and observed an interesting behaviour (see Figure 4.6). With the Euclidean distance error function the best classification rate, $89.6\%$, was reached at 9 bins per box. With EMD the peak occurred at 30 bins, with a classification rate of $85.42\%$. With many bins the individual bins lose discriminative power when compared one to one, and EMD compensates for this by matching mass across neighbouring bins; with few bins, on the other hand, the bins are discriminative by themselves and the plain Euclidean distance performs better.
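The distance computation itself is not shown here; the following is a minimal sketch, assuming each box is described by a normalised 1-D histogram over its bins (NumPy and the helper names euclidean_distance and emd_1d are assumptions for illustration). It uses the fact that, for normalised 1-D histograms on evenly spaced bins, EMD equals the L1 distance between their cumulative sums.

\begin{verbatim}
import numpy as np

def euclidean_distance(h1, h2):
    """L2 distance between two normalised histograms, compared bin by bin."""
    return np.sqrt(np.sum((h1 - h2) ** 2))

def emd_1d(h1, h2):
    """1-D Earth Mover's Distance for normalised histograms on a shared,
    evenly spaced bin grid: L1 distance between the cumulative sums."""
    return np.sum(np.abs(np.cumsum(h1) - np.cumsum(h2)))

# Toy example: two 9-bin histograms whose mass is shifted by one bin.
# The Euclidean distance only registers a per-bin mismatch, while EMD
# measures how far the mass has to move, which is why EMD degrades more
# gracefully as the number of bins grows.
h1 = np.zeros(9); h1[3] = 1.0
h2 = np.zeros(9); h2[4] = 1.0
print(euclidean_distance(h1, h2))   # ~1.414
print(emd_1d(h1, h2))               # 1.0
\end{verbatim}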


Table 4.3: Classification rate for the Euclidean distance and EMD error functions as a function of the number of SIFT bins (see Figure 4.6)
# of bins   Euclidean distance (success [%])   EMD (success [%])
 5          85.42                              72.93
 7          87.50                              79.17
 9          89.60                              81.25
11          89.60                              77.08
15          85.42                              79.17
20          77.08                              79.17
30          75.00                              85.42
45          75.00                              75.00


Figure 4.6: EMD vs. Euclidean Distance Function
\includegraphics[width=85mm,height=70mm]{EMDvsEu.eps}
