### Abstract

The global optimization properties of a cellular neural network (CNN) with a slowly varying slope of the output characteristic (see [1]) are studied. It is shown that a two-cell CNN is able to find the global minimum of a quadratic function over the unit hypercube for all values of the input parameters. It is then proved that if the dimension is higher than 2, even the CNN described by the simplest one-dimensional space-invariant template [A_{-1}, A_{0}, A_{1}] fails to find the global minimum in a subset of the parameter space. Finally, extensive simulations show that the CNN described by the above three-element template works correctly within several parameter ranges, but that if the parameters are chosen by a random algorithm, the error rate increases with the number of cells.
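The two-cell result can be illustrated with a minimal numerical sketch. The code below assumes the standard CNN state equation dx/dt = −x + A·y + b with a piecewise-linear output y = clip(m·x, −1, 1), and slowly ramps the slope m during integration, in the spirit of the slowly varying output characteristic the abstract refers to; the specific template, bias, and annealing schedule are illustrative choices, not the paper's exact parameters. The network settles on a vertex of the hypercube that minimizes the quadratic E(y) = −½ yᵀA y − bᵀy.

```python
import numpy as np

def cnn_anneal(A, b, steps=20000, dt=0.005, m0=0.05, m1=20.0):
    """Euler-integrate dx/dt = -x + A @ g_m(x) + b while the output
    slope m is slowly ramped from m0 to m1 (a simple annealing sketch)."""
    x = np.zeros(len(b))
    for k in range(steps):
        m = m0 + (m1 - m0) * k / steps     # slowly varying slope
        y = np.clip(m * x, -1.0, 1.0)      # piecewise-linear output
        x = x + dt * (-x + A @ y + b)
    return np.clip(m1 * x, -1.0, 1.0)

# Two-cell example: minimize E(y) = -1/2 y^T A y - b^T y over [-1, 1]^2.
# A and b here are hypothetical values chosen so the global minimum
# sits at the corner (1, 1).
A = np.array([[0.0, 1.0], [1.0, 0.0]])
b = np.array([0.1, 0.1])
y = cnn_anneal(A, b)
```

For this symmetric coupling the annealed trajectory saturates at y = (1, 1), which coincides with the brute-force minimizer of E over the four corners of the square.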

| Original language | English |
|---|---|
| Pages | 417-422 |
| Number of pages | 6 |
| Publication status | Published - Dec 1 1996 |
| Event | Proceedings of the 1996 4th IEEE International Workshop on Cellular Neural Networks, and Their Applications, CNNA-96 - Seville, Spain; Duration: Jun 24 1996 → Jun 26 1996 |

### Other

| Other | Proceedings of the 1996 4th IEEE International Workshop on Cellular Neural Networks, and Their Applications, CNNA-96 |
|---|---|
| City | Seville, Spain |
| Period | 6/24/96 → 6/26/96 |

### ASJC Scopus subject areas

- Software

### Cite this

*Global optimization through time-varying cellular neural networks*. 417-422. Paper presented at the 1996 4th IEEE International Workshop on Cellular Neural Networks, and Their Applications, CNNA-96, Seville, Spain.