Open Access, published by De Gruyter, September 25, 2013 (CC BY-NC-ND 3.0 license)

Pruning algorithms of neural networks — a comparative study

M. Augasta and T. Kathirvalavakumar
From the journal Open Computer Science


A neural network with an optimal architecture speeds up the learning process and generalizes well, which supports further knowledge extraction. Consequently, researchers have developed various techniques for pruning neural networks. This paper surveys existing pruning techniques that optimize the architecture of neural networks and discusses their advantages and limitations. It also evaluates the effectiveness of these techniques by comparing the performance of several traditional and recent pruning algorithms, based on sensitivity analysis, mutual information, and significance, on four real datasets: Iris, Wisconsin Breast Cancer, Hepatitis Domain, and Pima Indian Diabetes.
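The pruning methods compared in the paper share a common skeleton: score each weight or unit by some importance measure, remove the least important ones, and optionally retrain. As a minimal sketch of that idea, and not any specific algorithm from this survey, the following uses simple weight-magnitude scoring (the function name and the toy weights are illustrative assumptions):

```python
def prune_by_magnitude(weights, fraction):
    """Zero out the given fraction of weights with the smallest magnitudes."""
    # Sort magnitudes to find the cutoff below which weights are removed.
    magnitudes = sorted(abs(w) for w in weights)
    k = int(len(magnitudes) * fraction)
    if k == 0:
        return list(weights)
    threshold = magnitudes[k - 1]
    # Weights at or below the cutoff are pruned (set to zero); others survive.
    return [0.0 if abs(w) <= threshold else w for w in weights]

# Example: prune the smallest half of a toy weight vector.
weights = [0.8, -0.05, 0.3, -0.9, 0.02, 0.4, -0.1, 0.7]
pruned = prune_by_magnitude(weights, 0.5)
```

Sensitivity- and significance-based methods differ only in the scoring step, replacing raw magnitude with an estimate of how much the error changes when the weight or unit is removed.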


Published Online: 2013-9-25
Published in Print: 2013-9-1

© 2013 Versita Warsaw

This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 3.0 License.
