ISSN (Online) : 2456 - 0774

Email : ijasret@gmail.com

Reducing the Dimensional Dependence Using Rank-Based Similarity Search

Abstract

K-nearest-neighbour (K-NN) classification assigns a class to a query object based on its nearest training examples in the feature space. It is among the simplest classification methods in data mining, and it requires no prior knowledge of the distribution of the data objects. The performance of K-NN classification depends on the choice of K, the neighbourhood size, and on the distance metric used for the query; selecting a suitable K is a key issue for classification. This paper proposes a data structure for K-NN search, the Rank Cover Tree (RCT), that reduces the computational cost of K-NN search. The RCT's pruning tests involve only comparisons of objects' similarity values relative to the query: each object is assigned a rank, and objects are selected according to their ranks with respect to the query object. This provides strong control over the overall execution cost of queries. Experimental results show that this non-metric pruning strategy for similarity search remains effective even when high-dimensional data are used. The RCT returns correct query results in time that depends on the intrinsic dimensionality of the objects in the data set, and it can exceed the performance of methods that involve metric pruning or selection tests imposing numerical constraints on distance values.
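To make the rank-based selection idea concrete, the following is a minimal sketch (not the authors' RCT implementation) of K-NN classification in which neighbours are chosen purely by their rank with respect to the query, so the decision uses only comparisons of similarity values rather than numerical constraints on distances. All function and variable names here are illustrative assumptions.

```python
# Minimal rank-based K-NN classification sketch (illustrative only;
# the Rank Cover Tree is an index that accelerates the ranking step).
from collections import Counter

def knn_classify(train, labels, query, k=3):
    """Classify `query` by majority vote among its k rank-nearest neighbours.

    train  : list of feature vectors (tuples of floats)
    labels : class label for each training vector
    k      : neighbourhood size -- the key parameter discussed above
    """
    # Squared Euclidean distance; any measure inducing the same
    # ranking of neighbours would yield the same classification.
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    # Rank all training objects by similarity to the query and keep
    # only the k top-ranked ones; only comparisons between similarity
    # values are needed, never absolute distance thresholds.
    ranked = sorted(range(len(train)), key=lambda i: dist2(train[i], query))
    top_k = ranked[:k]

    # Majority vote over the selected neighbours' labels.
    return Counter(labels[i] for i in top_k).most_common(1)[0][0]
```

For example, with training points clustered near the origin (class 'a') and near (5, 5) (class 'b'), a query at (0.5, 0.5) is classified 'a' because all of its top-ranked neighbours belong to that cluster. In a brute-force scan the ranking step costs O(n log n) per query; an index such as the RCT exists precisely to avoid ranking the entire data set.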


Keywords: K-nearest neighbour search, intrinsic dimensionality, rank-based search, RCT.

