Multilayered neural architectures evolution for computing sequences of orthogonal polynomials

Barrios Rolanía, Maria Dolores ORCID: https://orcid.org/0000-0002-4060-965X, Delgado Martínez, Guillermo and Manrique Gamo, Daniel ORCID: https://orcid.org/0000-0002-0792-4156 (2018). Multilayered neural architectures evolution for computing sequences of orthogonal polynomials. "Annals of Mathematics and Artificial Intelligence", v. 84 ; pp. 161-184. ISSN 1012-2443. https://doi.org/10.1007/s10472-018-9601-2.

Description

Title: Multilayered neural architectures evolution for computing sequences of orthogonal polynomials
Author(s):
Document Type: Article
Journal/Publication Title: Annals of Mathematics and Artificial Intelligence
Date: 18 September 2018
ISSN: 1012-2443
Volume: 84
Subjects:
SDGs:
Informal Keywords: Evolutionary computation; Grammar-guided genetic programming; Artificial neural networks; Orthogonal polynomials
School: E.T.S. de Ingenieros Informáticos (UPM)
Department: Inteligencia Artificial
UPM Research Group: Inteligencia Artificial LIA
Creative Commons License: Attribution - NoDerivatives - NonCommercial

Full text

PDF (author manuscript; the final version has been published in Annals of Mathematics and Artificial Intelligence) - Download (1MB)

Abstract

This article presents an evolutionary algorithm that autonomously constructs fully connected multilayered feedforward neural architectures. The algorithm employs grammar-guided genetic programming with a context-free grammar specifically designed to satisfy three important restrictions. First, the sentences of the language generated by the grammar encode all valid neural architectures, and only valid ones. Second, fully connected feedforward neural architectures of any size can be generated. Third, smaller neural architectures are favored to avoid overfitting. The proposed evolutionary neural architecture construction system is applied to compute the terms of the two sequences that define the three-term recurrence relation associated with a sequence of orthogonal polynomials. This application imposes an important constraint: training datasets are always very small. An adequately sized neural architecture therefore has to be evolved to achieve satisfactory results, which are reported in terms of accuracy, size of the evolved neural architectures, and convergence speed of the evolutionary process.
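For readers unfamiliar with the target problem, the three-term recurrence mentioned in the abstract can be sketched as follows. This is a minimal illustration of the mathematical object the evolved networks approximate coefficients for, not code from the paper; the function names and the choice of the monic Legendre family are my own.

```python
def three_term_recurrence(n, x, a, b):
    """Evaluate the degree-n monic orthogonal polynomial p_n at x using
    the recurrence p_{k+1}(x) = (x - b(k)) * p_k(x) - a(k) * p_{k-1}(x),
    with initial conditions p_{-1} = 0 and p_0 = 1.

    The two coefficient sequences a and b are exactly the sequences the
    paper's evolved networks are trained to compute."""
    p_prev, p_curr = 0.0, 1.0
    for k in range(n):
        p_prev, p_curr = p_curr, (x - b(k)) * p_curr - a(k) * p_prev
    return p_curr

# Example sequences: monic Legendre polynomials have
# b_k = 0 and a_k = k^2 / (4k^2 - 1) (a_0 multiplies p_{-1} = 0, so it is unused).
legendre_a = lambda k: k * k / (4.0 * k * k - 1.0) if k > 0 else 0.0
legendre_b = lambda k: 0.0
```

For instance, `three_term_recurrence(3, x, legendre_a, legendre_b)` evaluates the monic Legendre polynomial x³ − (3/5)x.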

Associated projects

Type: Gobierno de España
Code: MTM2014-54053-P
Acronym: DECATA
Lead researcher: D. Barrios Rolanía
Title: Ecuaciones en diferencias y aproximación constructiva: teoría y aplicaciones (Difference equations and constructive approximation: theory and applications)

More information

Record ID: 90404
DC Identifier: https://oa.upm.es/90404/
OAI Identifier: oai:oa.upm.es:90404
Scientific Portal URL: https://portalcientifico.upm.es/es/ipublic/item/5497982
DOI: 10.1007/s10472-018-9601-2
Official URL: https://link.springer.com/article/10.1007/s10472-0...
Deposited by: Dr Daniel Manrique
Deposited on: 19 Aug 2025 05:50
Last Modified: 19 Aug 2025 05:50