University of Leicester

Approximation with Random Bases: Pro et Contra

Journal contribution, posted on 2015-06-26, authored by Alexander N. Gorban, Ivan Yu. Tyukin, D. V. Prokhorov, Konstantin I. Sofeikov
In this work we discuss the problem of selecting suitable approximators from families of parameterized elementary functions that are known to be dense in a Hilbert space of functions. We consider and analyze published procedures, both randomized and deterministic, for selecting elements from these families that have been shown to ensure a convergence rate of order $O(1/N)$ in the $L_2$ norm, where $N$ is the number of elements. We show that both strategies are successful provided that additional information about the families of functions to be approximated is available at the stages of learning and practical implementation. In the absence of such additional information, the number of terms needed to approximate the function may grow exponentially, and/or the outcome of the approximation may be extremely sensitive to parameters. Implications of our analysis for applications of neural networks in modeling and control are illustrated with examples.
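To make the randomized strategy mentioned in the abstract concrete, the short Python sketch below approximates a one-dimensional target with N sigmoidal elements whose inner parameters are drawn at random and whose outer coefficients are then fitted by least squares. This is an illustrative sketch only, not the authors' procedure: the target function, the sigmoid nonlinearity, and the sampling ranges for the weights and biases are assumptions made for the example.

    # Minimal sketch (not the authors' procedure): approximate a 1-D target with
    # N sigmoidal elements whose inner parameters (w, b) are drawn at random and
    # whose outer coefficients are fitted by least squares.
    import numpy as np

    rng = np.random.default_rng(0)

    def target(x):
        # Example function to approximate (an assumption for this sketch).
        return np.sin(2 * np.pi * x) * np.exp(-x)

    def random_basis(x, N):
        """Evaluate N randomly parameterized sigmoids sigma(w*x + b) at points x."""
        w = rng.uniform(-10.0, 10.0, size=N)   # assumed sampling range for weights
        b = rng.uniform(-10.0, 10.0, size=N)   # assumed sampling range for biases
        return 1.0 / (1.0 + np.exp(-(np.outer(x, w) + b)))  # shape (len(x), N)

    x = np.linspace(0.0, 1.0, 2000)
    y = target(x)

    for N in (10, 50, 250, 1250):
        Phi = random_basis(x, N)
        coeffs, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # fit outer coefficients only
        err = np.sqrt(np.mean((Phi @ coeffs - y) ** 2))   # empirical L2 (RMS) error
        print(f"N = {N:5d}   L2 error ~ {err:.3e}")

Running the loop for increasing N gives an empirical sense of how the error behaves as more randomly drawn elements are added; the paper's point is that, without additional information about the family of functions to be approximated, the number of terms required may grow far faster than the $O(1/N)$ rate achievable with informed selection suggests.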

History

Citation

arXiv:1506.04631 [cs.NA]

Author affiliation

College of Science and Engineering, Department of Mathematics

Version

  • AO (Author's Original)

Published in

arXiv:1506.04631 [cs.NA]

Available date

2015-06-26

Publisher version

http://arxiv.org/abs/1506.04631

Notes

arXiv admin note: text overlap with arXiv:0905.0677. MSC classes: 41A45, 90C59, 92B20, 68W20

Language

English
