Unbalanced data can be handled with up-sampling (replicating minority-class samples) or down-sampling (discarding majority-class samples). K-fold cross-validation with metrics such as precision and recall, rather than plain accuracy, should then be used to evaluate the model's performance.
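As a minimal sketch of the up-sampling idea, assuming scikit-learn and a toy synthetic dataset (the sizes and the logistic-regression model are illustrative choices, not prescribed by the text):

```python
import numpy as np
from sklearn.utils import resample
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Toy imbalanced dataset: 90 majority samples (class 0), 10 minority (class 1).
rng = np.random.RandomState(0)
X = rng.randn(100, 2)
y = np.array([0] * 90 + [1] * 10)

# Up-sample the minority class (sampling with replacement) until both
# classes have equal counts.
X_min_up, y_min_up = resample(
    X[y == 1], y[y == 1], replace=True, n_samples=90, random_state=0
)
X_bal = np.vstack([X[y == 0], X_min_up])
y_bal = np.concatenate([y[y == 0], y_min_up])
print(np.bincount(y_bal))  # [90 90] -- classes are now balanced

# Evaluate with k-fold cross-validation using recall instead of accuracy.
recall_scores = cross_val_score(
    LogisticRegression(), X_bal, y_bal, cv=5, scoring="recall"
)
```

Down-sampling works the same way in reverse: pass the majority-class rows to `resample` with `replace=False` and `n_samples` equal to the minority count.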

Model-Tuning: The simplest approach to counteracting the negative effects of class imbalance is to tune the model's parameters to maximize accuracy on the minority classes, rather than overall accuracy, which a trivial majority-class predictor can already achieve.
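One common way to do this tuning (a sketch, assuming scikit-learn; the choice of logistic regression, the `C` grid, and `f1` as the minority-sensitive metric are all illustrative):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

# Synthetic imbalanced data: roughly 95% class 0, 5% class 1.
X, y = make_classification(n_samples=500, weights=[0.95, 0.05], random_state=0)

# Tune the regularization strength against F1, which reflects
# minority-class performance, instead of plain accuracy.
grid = GridSearchCV(
    LogisticRegression(max_iter=1000),
    param_grid={"C": [0.01, 0.1, 1.0, 10.0]},
    scoring="f1",
    cv=5,
)
grid.fit(X, y)
print(grid.best_params_, grid.best_score_)
```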

Alternate Cutoffs: The ROC curve is useful here because it traces sensitivity and specificity across a continuum of classification cutoffs. From this curve, a cutoff that strikes an appropriate balance between sensitivity and specificity can be chosen.
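A sketch of cutoff selection from the ROC curve, assuming scikit-learn; picking the point maximizing Youden's J (sensitivity + specificity − 1) is one reasonable balancing rule among several:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_curve
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
scores = clf.predict_proba(X_te)[:, 1]

# roc_curve sweeps a continuum of cutoffs over the predicted scores:
# fpr = 1 - specificity, tpr = sensitivity.
fpr, tpr, thresholds = roc_curve(y_te, scores)

# Youden's J statistic: the cutoff maximizing tpr - fpr balances
# sensitivity against specificity instead of using the default 0.5.
best = np.argmax(tpr - fpr)
print(f"chosen cutoff: {thresholds[best]:.3f}")
```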

In the case of Naive Bayes, the prior probabilities can be adjusted so the minority class is not penalized up front. Another way to rebalance the training set is to increase the weights of the minority-class samples in the model's loss function. Sampling methods are a third approach.
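Both ideas can be sketched in scikit-learn (the uniform priors and the "balanced" weighting scheme are illustrative assumptions, not the only valid choices):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=500, weights=[0.9, 0.1], random_state=0)

# Naive Bayes: override the priors that would otherwise be estimated
# from the imbalanced class frequencies with uniform ones.
nb = GaussianNB(priors=[0.5, 0.5]).fit(X, y)

# Weighting: class_weight="balanced" gives minority-class samples
# proportionally larger weight in the loss, rebalancing the training
# signal without changing the data itself.
lr = LogisticRegression(class_weight="balanced", max_iter=1000).fit(X, y)
preds = lr.predict(X)
```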

If you search for how to deal with class imbalance you will find many research papers proposing new approaches, but these are the most popular ones.
