ISSN (Online) : 2456 - 0774

Email : ijasret@gmail.com


Reducing the Dimensional Dependence Using Rank-Based Similarity Search

Abstract

K-NN is a technique in which objects are classified based on the nearest training examples present in the feature space of the query. K-NN is among the simplest classification methods in data mining, and it can be applied even when no information about the distribution of the data objects is available. The performance of K-NN classification depends on the choice of K, which determines the neighborhood size, as well as on the distance metric used for the query; selecting a suitable K is a key issue for classification. This paper proposes a data structure for K-NN search, called the Rank Cover Tree (RCT), to reduce the computational cost of K-NN search. In the RCT, the pruning test involves only the comparison of objects' similarity values relative to the query: each object is assigned a rank, and objects relevant to the query are selected according to that rank order. This controls the overall query execution cost and yields a non-metric pruning method for similarity search that remains effective when high-dimensional data is processed. The RCT returns correct query results in time that depends on the intrinsic dimensionality of the data set, and it can exceed the performance of methods involving metric pruning and other selection tests that place numerical constraints on distance values.
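The K-NN classification described above (rank all training examples by distance to the query, then take a majority vote among the K closest) can be sketched as follows. This is a minimal illustrative implementation, not code from the paper; the function and variable names are my own, and Euclidean distance is assumed for the metric.

```python
from collections import Counter
import math

def knn_classify(query, training_data, k=3):
    """Classify `query` by majority vote among its k nearest
    training examples under Euclidean distance.
    `training_data` is a list of (feature_vector, label) pairs."""
    # Rank all training examples by their distance to the query point.
    ranked = sorted(
        training_data,
        key=lambda pair: math.dist(query, pair[0]),
    )
    # Majority vote among the k closest neighbors.
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]

# Toy example: two well-separated classes.
train = [((0.0, 0.0), "A"), ((0.1, 0.2), "A"),
         ((5.0, 5.0), "B"), ((5.2, 4.9), "B")]
print(knn_classify((0.2, 0.1), train, k=3))  # -> A
```

Note that this brute-force version scans every training example per query; the point of index structures such as the RCT is to avoid exactly this linear scan.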

Keywords: K-nearest neighbor search, intrinsic dimensionality, rank-based search, RCT.
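The abstract's claim that RCT pruning relies only on rank comparisons, rather than on numerical distance values, can be illustrated with a small sketch: the rank ordering of objects relative to a query is unchanged by any monotone transformation of the distance, which is what makes rank-based tests insensitive to the numerical distance values themselves. The helper below is purely illustrative (it is not the RCT algorithm), and all names in it are my own.

```python
import math

def rank_order(query, points, dist):
    """Return the indices of `points` sorted by increasing
    distance to `query` under the given distance function."""
    return sorted(range(len(points)), key=lambda i: dist(query, points[i]))

points = [(1, 0), (3, 0), (2, 0), (10, 0)]
d = lambda q, p: math.dist(q, p)          # Euclidean distance
d_sq = lambda q, p: math.dist(q, p) ** 2  # a monotone transform of it

# The rank ordering is identical under both, so any test phrased
# purely in terms of ranks gives the same answer for either.
print(rank_order((0, 0), points, d))     # -> [0, 2, 1, 3]
print(rank_order((0, 0), points, d_sq))  # -> [0, 2, 1, 3]
```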

