Scientists cannot reproduce AI research, and this is a serious problem


At a recent meeting of the Association for the Advancement of Artificial Intelligence, scientist Odd Erik Gundersen presented a report whose central claim is that the industry is driving itself into a dangerous impasse: most existing AI systems do not satisfy the fundamental principle of replication (reproducibility) of their own results. There are two reasons for this, and no solutions are yet in sight.

By replication we mean obtaining identical results from an AI system when it is given identical tasks. The user wants to be sure that the control system of his laptop or nuclear reactor works not only efficiently but also predictably. So far this holds only for very simple, template tasks, and deviations are already appearing in the behavior of real systems.
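The idea of replication can be illustrated with a toy sketch (the function, seed, and numbers below are invented for illustration, not taken from the report): a stochastic training run only produces identical results on identical tasks if every source of randomness is pinned down, for example by fixing the random seed.

```python
import random

def train_model(seed):
    """Toy stand-in for a stochastic training run: the final 'weights'
    depend on randomly drawn updates, so the outcome varies from run
    to run unless the random seed is fixed."""
    rng = random.Random(seed)  # fixed seed -> deterministic draw sequence
    weights = 0.0
    for _ in range(1000):
        weights += rng.uniform(-1, 1)  # simulated stochastic update
    return weights

# Two runs with the same seed on the same task replicate exactly:
assert train_model(42) == train_model(42)
# Runs with different seeds generally diverge:
assert train_model(1) != train_model(2)
```

In real systems the sources of non-determinism are far harder to control than a single seed (hardware, parallelism, continuously updated training data), which is part of why replication fails in practice.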

The first reason: modern AI systems learn continuously, changing their style of work, their tactics, and their strategy. They become individual, and as a result must be forcibly re-trained to operate under new conditions. But this is extremely difficult to do because of the second reason: the source code and algorithms of almost all systems are kept closed by their developers.

Gundersen's report indicates that of the 400 AI systems presented to the community in the last two years, only 6% came with algorithms open enough to be examined and studied. Fewer than a third of the programs exposed their intermediate data, which made debugging and tuning them an incredibly difficult task. Gundersen acknowledges that the authors of AI algorithms have the right to keep their intellectual work secret, but calls on everyone to think about how to start working together. Otherwise, the future of AI looks very murky.

