Please use this identifier to cite or link to this item: http://ricaxcan.uaz.edu.mx/jspui/handle/20.500.11845/721
Full metadata record
DC Field | Value | Language
dc.contributor | 6207 | es_ES
dc.contributor.other | https://orcid.org/0000-0002-7081-9084 | es_ES
dc.coverage.spatial | Global | es_ES
dc.creator | Ortíz Rodríguez, José Manuel | -
dc.creator | Martínez Blanco, María del Rosario | -
dc.creator | Cervantes Miramontes, José Manuel | -
dc.creator | Vega Carrillo, Héctor René | -
dc.date.accessioned | 2019-03-13T15:37:06Z | -
dc.date.available | 2019-03-13T15:37:06Z | -
dc.date.issued | 2013-01 | -
dc.identifier | info:eu-repo/semantics/publishedVersion | es_ES
dc.identifier.isbn | 978-953-51-0935-8 | es_ES
dc.identifier.uri | http://localhost/xmlui/handle/20.500.11845/721 | -
dc.identifier.uri | https://doi.org/10.48779/dezm-yc98 | es_ES
dc.description.abstract | Applications of artificial neural networks (ANNs) have been reported in the literature across many areas. [1–5] Their wide use stems from their robustness, fault tolerance, and ability to learn and generalize, through a training process, complex nonlinear, multi-input/output relationships between process parameters directly from process data. [6–10] ANNs offer further advantageous characteristics, including generalization, adaptation, universal function approximation, and parallel data processing. The multilayer perceptron (MLP) trained with the backpropagation (BP) algorithm is the most widely used ANN for modeling, optimization, classification, and prediction. [11, 12] Although the BP algorithm has proved efficient, its convergence tends to be very slow, and it can become trapped in an undesired local minimum. [4, 10, 11, 13] Most of the literature on ANNs focuses on specific applications and their results rather than on the methodology for developing and training the networks. In general, the quality of a developed ANN depends not only on the training algorithm and its parameters but also on many architectural parameters, such as the number of hidden layers and the number of nodes per layer; these settings must be chosen as part of the training process and are crucial to the accuracy of the ANN model. [8, 14–19] | es_ES
dc.language.iso | eng | es_ES
dc.publisher | IntechOpen | es_ES
dc.relation | https://www.intechopen.com/books/artificial-neural-networks-architectures-and-applications/robust-design-of-artificial-neural-networks-methodology-in-neutron-spectrometry | es_ES
dc.relation.uri | generalPublic | es_ES
dc.rights | Attribution-NonCommercial-ShareAlike 3.0 United States | *
dc.rights.uri | http://creativecommons.org/licenses/by-nc-sa/3.0/us/ | *
dc.source | Artificial Neural Networks - Architectures and Applications, edited by Kenji Suzuki, University of Chicago | es_ES
dc.subject.classification | PHYSICAL-MATHEMATICAL SCIENCES AND EARTH SCIENCES [1] | es_ES
dc.subject.other | Artificial Neural Networks | es_ES
dc.subject.other | Neutron Spectrometry | es_ES
dc.subject.other | Multilayer perceptron | es_ES
dc.title | Robust Design of Artificial Neural Networks Methodology in Neutron Spectrometry | es_ES
dc.type | info:eu-repo/semantics/article | es_ES
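
The abstract above centers on multilayer perceptrons trained with backpropagation and on the architectural settings (hidden layers, nodes per layer) that govern model accuracy. The following is a minimal, purely illustrative sketch of that general technique in Python/NumPy; the layer sizes, learning rate, and synthetic data are assumptions made for demonstration and are not taken from the chapter itself.

    # Minimal sketch (not from the chapter): an MLP trained with plain
    # backpropagation, showing the architectural parameters the abstract
    # calls crucial. All data below is synthetic.
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical stand-in for a spectrometry-style mapping:
    # 7 input readings -> 31 output bins, generated at random.
    X = rng.random((200, 7))
    Y = X @ rng.random((7, 31))           # synthetic targets, illustration only

    n_in, n_hidden, n_out = 7, 10, 31      # architectural parameters chosen by the designer
    lr, epochs = 0.05, 2000                # training parameters

    W1 = rng.normal(0, 0.1, (n_in, n_hidden));  b1 = np.zeros(n_hidden)
    W2 = rng.normal(0, 0.1, (n_hidden, n_out)); b2 = np.zeros(n_out)

    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    for epoch in range(epochs):
        # Forward pass: sigmoid hidden layer, linear output layer.
        H = sigmoid(X @ W1 + b1)
        Y_hat = H @ W2 + b2

        # Backward pass: gradients of the mean-squared error.
        err = Y_hat - Y
        dW2 = H.T @ err / len(X)
        db2 = err.mean(axis=0)
        dH = (err @ W2.T) * H * (1 - H)    # chain rule through the sigmoid
        dW1 = X.T @ dH / len(X)
        db1 = dH.mean(axis=0)

        # Gradient-descent update; slow convergence and local minima are the
        # BP weaknesses the abstract points out.
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2

    print("final MSE:", float(np.mean((sigmoid(X @ W1 + b1) @ W2 + b2 - Y) ** 2)))
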
Appears in Collections: Documentos Académicos -- UA Ciencias Nucleares

Files in This Item:
File | Description | Size | Format
Robust design.pdf | | 1,88 MB | Adobe PDF


This item is licensed under a Creative Commons License.