The global optimization properties of a cellular neural network (CNN) with a slowly varying slope of the output characteristic are studied. It is shown that a two-cell CNN is able to find the global minimum of a quadratic function over the unit hypercube for any values of the input parameters. It is then proved that if the dimension is higher than 2, even the CNN described by the simplest one-dimensional space-invariant template [A1, A0, A1] fails to find the global minimum in a subset of the parameter space. Finally, extensive simulations show that the CNN described by the above three-element template works correctly within several parameter ranges, but that if the parameters are chosen according to a random algorithm, the error rate increases with the number of cells.
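The abstract does not give the network equations, but the setting described is the standard Chua-Yang CNN dynamics with a piecewise-linear output whose slope is ramped slowly, so that the saturated outputs settle on a corner of the hypercube minimizing a quadratic cost. The sketch below illustrates this slope-annealing idea on a hypothetical two-cell instance; the matrix `A`, the bias `b`, the gain schedule, and all function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sat(x, gain=1.0):
    # Piecewise-linear CNN output with adjustable slope (gain):
    # y = 0.5 * (|g*x + 1| - |g*x - 1|), i.e. g*x clipped to [-1, 1].
    return 0.5 * (np.abs(gain * x + 1.0) - np.abs(gain * x - 1.0))

def anneal_cnn(A, b, g0=0.05, g1=2.0, dt=0.01, steps=60000):
    # Euler integration of the Chua-Yang dynamics
    #   dx/dt = -x + A @ sat(x; g) + b
    # while the output slope g is ramped slowly from g0 to g1
    # ("slowly varying slope of the output characteristic").
    x = np.zeros(len(b))
    for k in range(steps):
        g = g0 + (g1 - g0) * k / steps
        x += dt * (-x + A @ sat(x, g) + b)
    return sat(x, g1)

def energy(y, A, b):
    # Quadratic cost minimized over the corners of the hypercube [-1, 1]^n.
    return -0.5 * y @ A @ y - b @ y

# Hypothetical two-cell instance (A, b chosen for illustration only).
A = np.array([[2.0, 1.0], [1.0, 2.0]])
b = np.array([0.3, -0.2])
y = anneal_cnn(A, b)

# Brute-force check: enumerate the four corners of [-1, 1]^2.
corners = [np.array([s1, s2]) for s1 in (-1.0, 1.0) for s2 in (-1.0, 1.0)]
best = min(corners, key=lambda c: energy(c, A, b))
```

With the fixed-slope network (gain held at 1) this example has a spurious stable corner at (1, -1), so the trajectory can get trapped away from the global minimum; ramping the slope lets the state first align with the dominant linear mode and only then saturate, which is the mechanism the paper exploits in the two-cell case.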
Number of pages: 6
Publication status: Published - Dec 1 1996
Event: Proceedings of the 1996 4th IEEE International Workshop on Cellular Neural Networks and Their Applications, CNNA-96 - Seville, Spain
Duration: Jun 24 1996 → Jun 26 1996