Evolving Character-Level DenseNet Architectures Using Genetic Programming

  • Conference paper
Applications of Evolutionary Computation (EvoApplications 2021)

Abstract

Densely Connected Convolutional Networks (DenseNet) have demonstrated impressive performance on image classification tasks, but limited research has been conducted on using character-level DenseNet (char-DenseNet) architectures for text classification, and it is not clear which DenseNet architectures are optimal for such tasks. The iterative process of designing, training and testing char-DenseNets is time-consuming and requires expert domain knowledge. Evolutionary deep learning (EDL) has been used to automatically design CNN architectures in the image classification domain, thereby mitigating the need for expert domain knowledge. This study presents the first work on using EDL to evolve char-DenseNet architectures for text classification tasks. A novel genetic programming-based algorithm (GP-Dense), coupled with an indirect encoding scheme, facilitates the evolution of performant char-DenseNet architectures. The algorithm is evaluated on two popular text datasets, and the best evolved models are benchmarked against four current state-of-the-art character-level CNN and DenseNet models. Results indicate that the algorithm evolves performant models for both datasets, outperforming two of the state-of-the-art models in terms of accuracy and three of them in terms of parameter size.
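To make the idea of evolving architectures with an indirect encoding concrete, below is a minimal, hypothetical Python sketch. It is not the paper's GP-Dense algorithm: the genotype layout, the decode step, the mutation operator and the stand-in fitness function are assumptions made purely for illustration, with a random score used in place of training a char-DenseNet and measuring validation accuracy.

import random

# Illustrative sketch only (assumed encoding, NOT the authors' GP-Dense).
# Genotype: a variable-length list of (num_layers, growth_rate) genes,
# one gene per dense block. The genotype is decoded (indirect encoding)
# into a char-DenseNet-like architecture description.

def random_genotype(max_blocks=4):
    # Sample a random genotype with 1..max_blocks dense-block genes.
    n_blocks = random.randint(1, max_blocks)
    return [(random.randint(2, 8), random.choice([12, 24, 32]))
            for _ in range(n_blocks)]

def decode(genotype, vocab_size=70, seq_len=1014):
    # Decode the genotype into a textual description of the network.
    arch = ["embedding(vocab={}, seq_len={})".format(vocab_size, seq_len)]
    for layers, growth in genotype:
        arch.append("dense_block(layers={}, growth_rate={})".format(layers, growth))
        arch.append("transition(pool=2)")
    arch.append("global_pool + softmax")
    return arch

def fitness(genotype):
    # Stand-in fitness: a real implementation would train the decoded
    # network on a text-classification dataset and return validation accuracy.
    return random.random()

def mutate(genotype):
    # Perturb the layer count of one randomly chosen dense block.
    g = list(genotype)
    i = random.randrange(len(g))
    layers, growth = g[i]
    g[i] = (max(2, layers + random.choice([-1, 1])), growth)
    return g

def evolve(pop_size=10, generations=5):
    # Simple truncation-selection evolutionary loop over genotypes.
    population = [random_genotype() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=fitness, reverse=True)
        parents = scored[: pop_size // 2]
        population = parents + [mutate(random.choice(parents))
                                for _ in range(pop_size - len(parents))]
    return max(population, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print("\n".join(decode(best)))

Running the script prints one decoded architecture description; in the paper, the evaluation step instead involves training each candidate char-DenseNet and selecting on measured accuracy.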

Author information

Corresponding author

Correspondence to Trevor Londt.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper

Cite this paper

Londt, T., Gao, X., Andreae, P. (2021). Evolving Character-Level DenseNet Architectures Using Genetic Programming. In: Castillo, P.A., Jiménez Laredo, J.L. (eds.) Applications of Evolutionary Computation. EvoApplications 2021. Lecture Notes in Computer Science, vol. 12694. Springer, Cham. https://doi.org/10.1007/978-3-030-72699-7_42

  • DOI: https://doi.org/10.1007/978-3-030-72699-7_42

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-72698-0

  • Online ISBN: 978-3-030-72699-7

  • eBook Packages: Computer Science, Computer Science (R0)
