Research Article
 

Feature Extraction by Using Non-linear and Unsupervised Neural Networks



A. Jalil, I.M. Qureshi, A. Naveed and T.A. Cheema
 
ABSTRACT

Feature extraction is widely used in pattern recognition and image classification. In this paper we propose an unsupervised learning algorithm for neural networks applied to the feature extraction problem. The learning algorithm uses a genetic algorithm to search for the global minimum of the error performance surface and the LMS algorithm for final convergence to that minimum. It uses Sammon's stress as the criterion for obtaining features with maximum inter-pattern distances and minimum intra-pattern distances. A common attribute of such learning algorithms is that they are adaptive in nature, which makes them suitable for environments in which the distribution of patterns in the feature space changes over time.
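The abstract outlines a hybrid search strategy: a genetic algorithm explores the weight space of a non-linear projection network for the global minimum of Sammon's stress, and a gradient-based (LMS-style) step performs the final convergence. Sammon's stress is E = (1/Σ_{i<j} d*_{ij}) Σ_{i<j} (d*_{ij} - d_{ij})² / d*_{ij}, where d*_{ij} are pairwise distances between patterns in the input space and d_{ij} the corresponding distances in the extracted feature space. The sketch below (Python/NumPy) is an illustrative reconstruction of that idea, not the authors' implementation; the one-hidden-layer tanh network, the GA settings (truncation selection, uniform crossover, Gaussian mutation), and the finite-difference gradient used for the refinement step are all assumptions made for the example.

# A minimal sketch (not the authors' implementation) of the hybrid scheme the
# abstract describes: a genetic algorithm searches the weight space of a small
# non-linear projection network for a good basin of Sammon's stress, then a
# gradient (LMS-style) refinement converges to the nearby minimum.
# Network size, GA settings and learning rate below are illustrative guesses.
import numpy as np

rng = np.random.default_rng(0)

def project(W, X, n_hidden, n_out):
    """Two-layer tanh network mapping d-dimensional patterns X to n_out features."""
    d = X.shape[1]
    W1 = W[: d * n_hidden].reshape(d, n_hidden)
    W2 = W[d * n_hidden:].reshape(n_hidden, n_out)
    return np.tanh(X @ W1) @ W2

def sammon_stress(W, X, D_star, n_hidden, n_out):
    """Sammon's stress between input-space distances D_star and feature-space distances."""
    Y = project(W, X, n_hidden, n_out)
    D = np.sqrt(((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1))
    iu = np.triu_indices(len(X), k=1)
    d_star, d = D_star[iu], D[iu]
    return np.sum((d_star - d) ** 2 / (d_star + 1e-12)) / np.sum(d_star)

def ga_then_gradient(X, n_hidden=6, n_out=2, pop=30, gens=40, steps=300, lr=0.05):
    d = X.shape[1]
    n_w = d * n_hidden + n_hidden * n_out
    D_star = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    loss = lambda w: sammon_stress(w, X, D_star, n_hidden, n_out)

    # --- genetic search for a promising basin of the error surface ---
    popu = rng.normal(scale=0.5, size=(pop, n_w))
    for _ in range(gens):
        fitness = np.array([loss(w) for w in popu])
        parents = popu[np.argsort(fitness)[: pop // 2]]          # truncation selection
        children = []
        for _ in range(pop - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            mask = rng.random(n_w) < 0.5                          # uniform crossover
            child = np.where(mask, a, b) + rng.normal(scale=0.05, size=n_w)  # mutation
            children.append(child)
        popu = np.vstack([parents, np.array(children)])
    best = popu[np.argmin([loss(w) for w in popu])]

    # --- gradient refinement (finite-difference stand-in for the LMS-style step) ---
    w = best.copy()
    eps = 1e-4
    for _ in range(steps):
        grad = np.zeros(n_w)
        for i in range(n_w):                                      # numerical gradient
            e = np.zeros(n_w); e[i] = eps
            grad[i] = (loss(w + e) - loss(w - e)) / (2 * eps)
        w -= lr * grad
    return w, loss(w)

if __name__ == "__main__":
    X = rng.normal(size=(40, 5))                                  # toy 5-D patterns
    w, stress = ga_then_gradient(X)
    print(f"final Sammon stress: {stress:.4f}")

Because the genetic search supplies the starting point, the gradient step only has to descend within the basin the GA has already located, which is the role the abstract assigns to the LMS stage of the combined scheme.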


 
  How to cite this article:

A. Jalil, I.M. Qureshi, A. Naveed and T.A. Cheema, 2003. Feature Extraction by Using Non-linear and Unsupervised Neural Networks. Information Technology Journal, 2: 40-43.

DOI: 10.3923/itj.2003.40.43

URL: https://scialert.net/abstract/?doi=itj.2003.40.43

