
An effective integrated genetic programming and neural network model for electronic nose calibration of air pollution monitoring application

  • Original Article
  • Published:
Neural Computing and Applications

Abstract

Air quality control requires real-time monitoring of pollutant concentration distributions in large urban areas. Estimation models are used for the soft-calibration of low-cost multisensor data to improve the precision of pollutant concentration measurements. This study introduces an integrated genetic programming dynamic neural network model for more accurate estimation of carbon monoxide and nitrogen dioxide pollutant concentrations from multisensor measurement data. The model combines a genetic programming-based estimation model with a neural estimator model to improve estimation performance. In this structure, a genetic programming-based polynomial model works as a former estimator and feeds the neural estimator model via a short-term former-estimation memory. The neural model then utilizes this former-estimation memory to enhance pollutant concentration estimations. This integration approach benefits from the correlation enrichment strategy performed by the former model, and a proposed two-stage training procedure facilitates the training of the integrated models. In the experimental study, the standalone genetic programming model, the artificial neural network model, and the proposed integrated model are implemented to estimate carbon monoxide and nitrogen dioxide pollutant concentrations from experimental multisensor air quality data. Results demonstrate that the proposed integrated model can decrease mean relative error by about 10% compared to the standalone artificial neural network model and by about 28% compared to the standalone genetic programming estimation model. The authors suggest that the integrated estimation model can be used for more accurate soft-calibration of multisensor electronic noses in wide-area air-quality monitoring applications.
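The two-stage structure described above can be illustrated with a minimal sketch. The polynomial below stands in for a GP-evolved expression (in the paper this expression is found by genetic programming; the coefficients, the two sensor channels, and the memory depth here are all hypothetical), and the short-term memory simply stacks lagged copies of the stage-1 estimates as inputs for the neural stage:

```python
import numpy as np

# Stage 1: hypothetical GP-evolved polynomial former estimator.
# The actual expression would be evolved by genetic programming;
# this fixed polynomial of two sensor channels is only illustrative.
def gp_estimate(s1, s2):
    return 0.8 * s1 + 0.1 * s1 * s2 - 0.05 * s2**2

def build_memory(former, depth=3):
    """Short-term former-estimation memory: at each step t the neural
    stage sees [y_hat[t], y_hat[t-1], ..., y_hat[t-depth+1]]
    (missing past values are zero-padded)."""
    former = np.asarray(former, dtype=float)
    mem = np.zeros((former.size, depth))
    for d in range(depth):
        mem[d:, d] = former[: former.size - d]
    return mem

# Synthetic multisensor readings for demonstration only.
rng = np.random.default_rng(0)
s1 = rng.random(10)
s2 = rng.random(10)

former = gp_estimate(s1, s2)       # stage-1 (former) estimates
memory = build_memory(former, 3)   # input features for the neural stage
print(memory.shape)                # (10, 3)
```

In the paper's two-stage training procedure, the GP stage is fitted first and frozen, after which the neural stage is trained on this memory matrix; here the neural stage itself is omitted for brevity.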




Data availability

Available.

Code availability

Available.


Funding

This study did not receive any funding.

Author information

Corresponding author

Correspondence to Baris Baykant Alagoz.

Ethics declarations

Conflict of interest

The authors declare that there are no conflicts of interest or competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix

Detailed performance analyses of the IGPD-NN, GP, and ANN models on randomly sampled test datasets, after 100 repeated training sessions, are presented below. (Average performance indices in the tables were validated at the 5% statistical significance level.) See Tables 6, 7, 8, 9, 10, 11 and 12.

Table 6 MSE performance analyses of CO estimation models
Table 7 MSE performance analyses of NO2 estimation models
Table 8 MRE performance analyses of CO estimation models
Table 9 MRE performance analyses of NO2 estimation models
Table 10 R2 score performance analyses of CO estimation models
Table 11 R2 score performance analyses of NO2 estimation models
Table 12 Effects of hidden layer neuron numbers on average MSE performances of CO and NO2 estimation models
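The MSE, MRE, and R2 indices reported in Tables 6 through 11 can be computed as follows. This is a generic sketch of the standard definitions; the paper's exact conventions (e.g. how MRE treats near-zero reference concentrations) may differ:

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean squared error."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return np.mean((y_true - y_pred) ** 2)

def mre(y_true, y_pred):
    """Mean relative error (undefined where y_true is zero)."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return np.mean(np.abs(y_true - y_pred) / np.abs(y_true))

def r2(y_true, y_pred):
    """Coefficient of determination (R2 score)."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1.0 - ss_res / ss_tot

# Toy example: three reference concentrations vs. model estimates.
y_true = [2.0, 4.0, 6.0]
y_pred = [2.5, 4.0, 5.5]
print(mse(y_true, y_pred))  # ≈ 0.1667
print(r2(y_true, y_pred))   # 0.9375
```

The roughly 10% and 28% MRE reductions quoted in the abstract refer to averages of the `mre` index over the repeated training sessions.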


About this article


Cite this article

Ari, D., Alagoz, B.B. An effective integrated genetic programming and neural network model for electronic nose calibration of air pollution monitoring application. Neural Comput & Applic 34, 12633–12652 (2022). https://doi.org/10.1007/s00521-022-07129-0
