DCFA

Distributed Class-Dependent Feature Analysis - A Big Data Approach

K. Luu, C. Zhu, and M. Savvides

IEEE International Conference on Big Data, Washington, DC, 2014.

 

@inproceedings{luu2014distributed,

  title={Distributed class dependent feature analysis—A big data approach},

  author={Luu, Khoa and Zhu, Chenchen and Savvides, Marios},

  booktitle={2014 IEEE International Conference on Big Data (Big Data)},

  pages={201--206},

  year={2014},

  organization={IEEE}

}

 

We propose a new machine learning approach named Distributed Class-Dependent Feature Analysis (DCFA) that exploits the advantages of sparse representation in an over-complete dictionary. The classifier is based on estimating class-specific optimal filters by solving an L1-norm optimization problem. We show how this problem is solved using the Alternating Direction Method of Multipliers (ADMM) and discuss the relevant convergence details. More importantly, our proposed framework can be implemented efficiently on a robust distributed framework, improving both accuracy and computational time on large-scale databases.

 

Our approach optimizes the following L1-norm objective function:
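As a hedged illustration (the dictionary D, signal y, sparse code x, and sparsity weight λ below are assumed symbols for a standard sparse-representation objective, not necessarily the paper's exact notation), such an L1-norm objective typically reads:

```latex
\min_{\mathbf{x}} \; \tfrac{1}{2}\,\lVert \mathbf{D}\mathbf{x} - \mathbf{y} \rVert_2^2 \;+\; \lambda\,\lVert \mathbf{x} \rVert_1
```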


We solve it in the ADMM form:
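Under the same hedged notation (an L1-regularized least-squares problem with penalty parameter ρ and scaled dual variable u, assumed here for illustration), the scaled-form ADMM iterations are:

```latex
\mathbf{x}^{k+1} = \left(\mathbf{D}^{\top}\mathbf{D} + \rho\,\mathbf{I}\right)^{-1}
                   \left(\mathbf{D}^{\top}\mathbf{y} + \rho\,(\mathbf{z}^{k} - \mathbf{u}^{k})\right), \quad
\mathbf{z}^{k+1} = S_{\lambda/\rho}\!\left(\mathbf{x}^{k+1} + \mathbf{u}^{k}\right), \quad
\mathbf{u}^{k+1} = \mathbf{u}^{k} + \mathbf{x}^{k+1} - \mathbf{z}^{k+1}
```

where S_κ(v) = sign(v) · max(|v| − κ, 0) is elementwise soft-thresholding. Convergence is typically monitored through the primal residual ||x − z|| and the dual residual ρ||z^{k+1} − z^k||.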

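These updates can be sketched in a few lines of NumPy (a minimal, single-machine illustration of ADMM for an L1-regularized least-squares problem; the function and variable names are ours, not from the paper):

```python
import numpy as np

def soft_threshold(v, k):
    # Elementwise soft-thresholding: the proximal operator of the L1 norm.
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def admm_lasso(A, b, lam=0.1, rho=1.0, n_iter=300):
    """Solve min_x 0.5*||Ax - b||^2 + lam*||x||_1 in scaled-form ADMM."""
    n = A.shape[1]
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
    # The matrix in the x-update is fixed, so build it once.
    G = A.T @ A + rho * np.eye(n)
    Atb = A.T @ b
    for _ in range(n_iter):
        x = np.linalg.solve(G, Atb + rho * (z - u))  # x-update (ridge solve)
        z = soft_threshold(x + u, lam / rho)         # z-update (shrinkage)
        u = u + x - z                                # scaled dual update
    return z
```

In practice one would factor G once (e.g. a Cholesky factorization) and monitor the primal/dual residuals for a stopping rule instead of a fixed iteration count.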

The algorithm is set up in a distributed framework, using model parallelism to handle large-scale datasets:

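Since each class-specific filter is the solution of an independent L1 problem, the model-parallel step can be sketched by mapping classes across workers (a hedged, single-machine simulation in which a thread pool stands in for machine nodes; all names here are ours, not from the paper):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def _soft(v, k):
    # Elementwise soft-thresholding, the proximal operator of the L1 norm.
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def solve_class_filter(problem, lam=0.05, rho=1.0, n_iter=300):
    """ADMM solve of one class-specific problem: min 0.5*||Ax - b||^2 + lam*||x||_1."""
    A, b = problem
    n = A.shape[1]
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
    G = A.T @ A + rho * np.eye(n)   # fixed per class; built once
    Atb = A.T @ b
    for _ in range(n_iter):
        x = np.linalg.solve(G, Atb + rho * (z - u))
        z = _soft(x + u, lam / rho)
        u = u + x - z
    return z

def train_dcfa_parallel(class_problems, n_workers=4):
    # Model parallelism: the class-specific filters are independent, so the
    # classes are simply mapped across workers (machine nodes in the real
    # distributed setting).
    with ThreadPoolExecutor(max_workers=n_workers) as ex:
        return list(ex.map(solve_class_filter, class_problems))
```

In a real deployment the per-class solves would run on separate nodes (e.g. via a distributed map), with only the learned filters gathered back; nothing in the per-class update requires communication during the iterations.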

Our method achieves very high classification accuracy for face recognition in the presence of occlusions on the AR database. It also outperforms state-of-the-art methods in object recognition on two challenging large-scale object databases, Caltech-101 and Caltech-256, demonstrating its applicability to general computer vision and pattern recognition problems. In addition, timing experiments show that our distributed method achieves a high speedup of 7.85x on the Caltech-256 database with just 10 machine nodes compared to the non-distributed version, and can gain even more with additional computing resources.