
Accuracy Analysis


Accuracy analysis is performed to determine the accuracy of the classes formed after the classification process.

Accuracy analysis is a control method based on the principle of statistical comparison with reference maps or a source providing accurate information about the land.

In the analysis, the user enters the actual class values of selected points, and an error (confusion) matrix is generated. The size of this matrix is [number of classes] × [number of classes].

Errors occur when pixels are classified inaccurately. In the analysis, the degree of accuracy can be examined instead of the degree of error.

If there are many unclassified pixels, the accuracy with which the training data set represents the image decreases.

The i-th element of row i of the matrix contains the number of pixels labeled as class i by the classifier and assigned to class i by the operator (the reference data); the elements in the other columns of the same row show the number and distribution of the inaccurately classified pixels. The accuracy of class i is obtained by dividing the diagonal element by the reference data total of the same row. Overall classification accuracy (as a percentage) is the sum of the diagonal elements divided by the total number of reference pixels.
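As a minimal sketch of these calculations (the 3×3 error matrix below is hypothetical, invented only for demonstration), per-class and overall accuracy follow directly from the matrix diagonal and the row totals:

```python
import numpy as np

# Hypothetical error matrix: rows = reference classes, columns = classifier output.
# Element [i][j] = number of pixels of reference class i assigned to class j.
error_matrix = np.array([
    [50,  3,  2],   # reference class 0
    [ 4, 60,  6],   # reference class 1
    [ 1,  5, 69],   # reference class 2
])

diagonal = np.diag(error_matrix)        # correctly classified pixels per class
row_totals = error_matrix.sum(axis=1)   # reference data total of each row

# Accuracy of class i = diagonal element / reference total of the same row.
per_class_accuracy = diagonal / row_totals

# Overall accuracy = sum of diagonal elements / total number of reference pixels.
overall_accuracy = diagonal.sum() / error_matrix.sum()

print(per_class_accuracy)
print(overall_accuracy * 100)  # overall accuracy in percent
```

Here the diagonal sums to 179 out of 200 reference pixels, so the overall accuracy is 89.5%, analogous to the 350/410 example below.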

 

In the example table, the overall classification accuracy is (350/410) × 100 = 85.4%.

  • I refers to the row total;
  • II refers to the classification accuracy of each class, in percent;
  • III refers to the number of pixels that belong to class i in the reference data but were not assigned to class i by the classification (omission errors);
  • IV refers to the number of pixels assigned to class i by the classification but that do not belong to class i in the reference data (commission errors);
  • V refers to the column total.
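With the same conventions (rows = reference data, columns = classification; the matrix values are again hypothetical), the quantities I–V can be derived directly from the error matrix:

```python
import numpy as np

# Hypothetical error matrix: rows = reference classes, columns = classifier output.
error_matrix = np.array([
    [50,  3,  2],
    [ 4, 60,  6],
    [ 1,  5, 69],
])

diagonal = np.diag(error_matrix)
row_totals = error_matrix.sum(axis=1)              # I:  reference total per class
col_totals = error_matrix.sum(axis=0)              # V:  classified total per class
class_accuracy_pct = 100 * diagonal / row_totals   # II: per-class accuracy (%)
omission_errors = row_totals - diagonal            # III: reference pixels of class i missed
commission_errors = col_totals - diagonal          # IV: pixels wrongly assigned to class i

print(omission_errors)
print(commission_errors)
```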


Another accuracy analysis approach is the Kappa (κ) coefficient.

The Kappa (κ) coefficient is a statistical measure of agreement between the classification and the reference data that corrects for chance agreement. For an error matrix it is computed as

κ = (N · Σ xii − Σ xi+ · x+i) / (N² − Σ xi+ · x+i)

where r is the number of classes; xi+ is the row total; xii are the diagonal elements of the error matrix; x+i is the column total; N is the total number of pixels in the error matrix. The sums run over i = 1, …, r.
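This formula can be applied to the same hypothetical error matrix used above:

```python
import numpy as np

# Hypothetical error matrix: rows = reference classes, columns = classifier output.
error_matrix = np.array([
    [50,  3,  2],
    [ 4, 60,  6],
    [ 1,  5, 69],
])

N = error_matrix.sum()                 # total number of pixels in the error matrix
observed = np.trace(error_matrix)      # sum of diagonal elements, Σ xii
row_totals = error_matrix.sum(axis=1)  # xi+
col_totals = error_matrix.sum(axis=0)  # x+i
chance = (row_totals * col_totals).sum()  # Σ xi+ · x+i

kappa = (N * observed - chance) / (N**2 - chance)
print(kappa)
```

For this matrix κ ≈ 0.84, slightly below the 89.5% overall accuracy because chance agreement has been subtracted out.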

The Kappa (κ) value obtained at the end is interpreted as follows:

  • 1.00 indicates full agreement,
  • 0.75 and higher indicates very good classification performance,
  • below 0.40 indicates insufficient performance,
  • 0.00 indicates no agreement between the classified and reference data.
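The scale above can be wrapped in a small helper; note that the text does not name the band between 0.40 and 0.75, so the "moderate" label there is our own assumption:

```python
def interpret_kappa(kappa: float) -> str:
    """Map a Kappa value to the qualitative scale above (sketch; the
    0.40-0.75 band is unnamed in the text, 'moderate' is assumed)."""
    if kappa >= 1.0:
        return "full agreement"
    if kappa >= 0.75:
        return "very good"
    if kappa >= 0.40:
        return "moderate"
    if kappa > 0.0:
        return "insufficient"
    return "no agreement"

print(interpret_kappa(0.84))
```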

