Evolutionary regularization of deep neural networks

Griñán Martínez, David (2019). Evolutionary regularization of deep neural networks. Thesis (Master thesis), E.T.S. de Ingenieros Informáticos (UPM).

Description

Title: Evolutionary regularization of deep neural networks
Author/s:
  • Griñán Martínez, David
Contributor/s:
  • Manrique Gamo, Daniel
Item Type: Thesis (Master thesis)
Masters title: Inteligencia Artificial
Date: 2019
Subjects:
Faculty: E.T.S. de Ingenieros Informáticos (UPM)
Department: Inteligencia Artificial
Creative Commons Licenses: Attribution - NonCommercial - NoDerivatives

Full text

PDF (3MB) available for download.

Abstract

We propose a novel regularization strategy for deep neural networks based on the notion that removing connections from the network topology not only yields a less complex structure but also one that generalizes better than a fully connected one. This partially connected structure is produced by a grammar-guided genetic programming algorithm that searches the space of valid topologies until a suitable one is found. To this end, a metagrammar has been developed that, given a maximum possible structure, produces correct substructures forming partially connected neural networks contained in that superstructure. Once instantiated, the grammar produces derivation trees that encode different paths from a neuron in the input layer to a neuron in the output layer, passing through exactly one neuron in each hidden layer. The final topology is then obtained by overlaying all paths, keeping only a single connection wherever a connection is repeated across the path collection defined by the derivation tree. The goal of this strategy is to offer an alternative to standard regularization techniques such as dropout that not only reduces the number of parameters to estimate but also requires no hyperparameter tuning.
To test the approach, an oversized neural network is first trained on a given dataset to confirm that overfitting occurs; dropout is then applied to observe its effect; finally, our strategy is applied to the oversized network and the results are compared with those obtained via dropout.
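The abstract describes building the final topology by overlaying input-to-output paths that pass through exactly one neuron per hidden layer, collapsing repeated connections to a single edge. The following is a minimal Python sketch of that overlay step only; the random path sampling here merely stands in for the grammar-guided genetic programming search, and all names are illustrative rather than taken from the thesis:

```python
import random

def random_path(layer_sizes):
    # A path picks exactly one neuron index per layer, input through output.
    return [random.randrange(n) for n in layer_sizes]

def overlay_paths(layer_sizes, paths):
    # One binary connectivity mask per pair of adjacent layers.
    # Connections repeated across paths collapse to a single edge.
    masks = [[[0] * layer_sizes[i + 1] for _ in range(layer_sizes[i])]
             for i in range(len(layer_sizes) - 1)]
    for path in paths:
        for i in range(len(path) - 1):
            masks[i][path[i]][path[i + 1]] = 1
    return masks

layer_sizes = [3, 4, 2]  # input, one hidden layer, output
paths = [random_path(layer_sizes) for _ in range(5)]
masks = overlay_paths(layer_sizes, paths)
active = sum(sum(row) for mask in masks for row in mask)
total = sum(layer_sizes[i] * layer_sizes[i + 1]
            for i in range(len(layer_sizes) - 1))
print(active, "of", total, "possible connections kept")
```

The masks could then gate the weight matrices of a dense network, so that only the connections selected by the evolved derivation tree are trained.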

More information

Item ID: 56023
DC Identifier: http://oa.upm.es/56023/
OAI Identifier: oai:oa.upm.es:56023
Deposited by: Biblioteca Facultad de Informatica
Deposited on: 03 Sep 2019 07:22
Last Modified: 03 Sep 2019 07:25