University of Leicester

One-trial correction of legacy AI systems and stochastic separation theorems

journal contribution
posted on 2019-04-04, 12:09 authored by AN Gorban, R Burton, I Romanenko, IY Tyukin
We consider the problem of efficient “on the fly” tuning of existing, or legacy, Artificial Intelligence (AI) systems. The legacy AI systems may be of arbitrary class, provided that the data they use for computing interim or final decision responses possess the underlying structure of a high-dimensional topological real vector space. The tuning method we propose enables errors to be dealt with without re-training the system. Instead of re-training, a simple cascade of perceptron nodes is added to the legacy system. The added cascade modulates the legacy AI system's decisions. Applied repeatedly, the process results in a network of modulating rules “dressing up” existing AI systems and improving their performance. The mathematical rationale behind the method rests on the fundamental property of measure concentration in high-dimensional spaces. The method is illustrated with an example of fine-tuning a deep convolutional network pre-trained to detect pedestrians in images.
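The abstract's core idea, that in a high-dimensional feature space a single linear threshold node can separate one erroneous sample from a large set of correctly handled samples, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the Gaussian data, the dimensions, and the simple mean-centred Fisher-style discriminant are all assumptions made for the sake of a runnable example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: feature vectors drawn from a legacy AI system's internal
# representation, living in a high-dimensional real vector space (an assumption
# for illustration; the paper only requires such a structure to exist).
dim = 400
correct = rng.standard_normal((1000, dim))  # samples the legacy system handles correctly
error = rng.standard_normal(dim)            # one sample on which it errs

# One-trial corrector: a single linear threshold node ("perceptron") whose
# weight vector points from the data mean towards the error sample.
# By measure concentration, a fixed point in high dimension is nearly
# orthogonal to the bulk of a random sample, so this hyperplane isolates
# the error with high probability.
mean = correct.mean(axis=0)
w = error - mean
w /= np.linalg.norm(w)
# Place the threshold halfway between the bulk (projection ~ 0) and the
# error sample's own projection onto w.
threshold = 0.5 * (w @ (error - mean))

def corrector_fires(x):
    """True when the added node should modulate the legacy system's decision."""
    return w @ (x - mean) > threshold

# The corrector fires on the error sample...
assert corrector_fires(error)
# ...and stays silent on the correctly handled samples.
false_fires = sum(corrector_fires(x) for x in correct)
print(false_fires, "false activations out of", len(correct))
```

Repeating this construction for each newly observed error yields the cascade of modulating rules described above; each node only intercepts the region of feature space around its own error sample and leaves the legacy system's remaining decisions untouched.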

Funding

The work was supported by the Ministry of Education and Science of Russia (Project no. 14.Y26.31.0022) and Innovate UK Knowledge Transfer Partnership grants KTP009890 and KTP010522.

History

Citation

Information Sciences, 2019, 484, pp. 237-254

Author affiliation

/Organisation/COLLEGE OF SCIENCE AND ENGINEERING/Department of Mathematics

Version

  • AM (Accepted Manuscript)

Published in

Information Sciences

Publisher

Elsevier

ISSN

0020-0255

eISSN

1872-6291

Acceptance date

2019-02-01

Copyright date

2019

Publisher version

https://www.sciencedirect.com/science/article/pii/S0020025519300969?via=ihub

Notes

The file associated with this record is under embargo until 12 months after publication, in accordance with the publisher's self-archiving policy. The full text may be available through the publisher links provided above.

Language

en
