
Journal of Applied Sciences

Year: 2013 | Volume: 13 | Issue: 19 | Page No.: 3903-3908
DOI: 10.3923/jas.2013.3903.3908
Application and Study of Ordinal Decision Tree in the Teaching Quality Evaluation
Hong-Yan Ma, Jian-Kai Chen, Nan Yang and Li-Ling Wang

Abstract: Ordinal decision trees are an important approach to ordinal classification tasks, and the ordinal decision tree based on rank mutual information is a representative ordinal decision tree learning algorithm. Rank mutual information reflects the monotonic relevance between features and the decision; in other words, it measures the importance of attributes in ordinal classification. This study applies the ordinal decision tree to the teaching quality evaluation of colleges and universities in order to raise the level of teaching evaluation, with the aim of making teaching quality evaluation fair, reasonable and effective.
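As a rough illustration of the measure discussed in the abstract, the sketch below (not the authors' implementation) estimates the ascending rank mutual information between one numeric feature and an ordinal decision, following the rank entropy-based formulation used for monotonic decision trees. The function name rank_mutual_information and the toy teaching-grade data are assumptions made for this example only.

import numpy as np

def rank_mutual_information(feature, decision):
    # Estimate the (ascending) rank mutual information between a numeric
    # feature and an ordinal decision observed on the same n samples.
    feature = np.asarray(feature, dtype=float)
    decision = np.asarray(decision, dtype=float)
    n = len(feature)
    rmi = 0.0
    for i in range(n):
        # Dominated sets: samples ranked no higher than sample i.
        below_f = feature <= feature[i]
        below_d = decision <= decision[i]
        both = np.logical_and(below_f, below_d)
        # Accumulate -log(|A_i| * |D_i| / (n * |A_i and D_i|)).
        rmi -= np.log((below_f.sum() * below_d.sum()) / (n * both.sum()))
    return rmi / n

# Toy example: a feature that rises monotonically with the (hypothetical)
# ordinal teaching-quality grade should score higher than an unrelated one.
grades  = [1, 1, 2, 2, 3, 3]
aligned = [0.2, 0.3, 0.5, 0.6, 0.8, 0.9]
noisy   = [0.9, 0.1, 0.7, 0.2, 0.4, 0.3]
print(rank_mutual_information(aligned, grades))  # larger value
print(rank_mutual_information(noisy, grades))    # smaller value

In an ordinal decision tree learner, a score of this kind would be computed for every candidate attribute and the attribute with the largest value chosen as the split, analogously to information gain in classical decision trees.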


How to cite this article
Hong-Yan Ma, Jian-Kai Chen, Nan Yang and Li-Ling Wang, 2013. Application and Study of Ordinal Decision Tree in the Teaching Quality Evaluation. Journal of Applied Sciences, 13: 3903-3908.

Keywords: Ordinal decision tree, rank mutual information and teaching evaluation
