The mathematical foundations of modern Soft Computing (SC) techniques go back to Kolmogorov's approximation theorem, which states that every multi-variable continuous function on a compact domain can be approximated with arbitrary accuracy by compositions of single-variable continuous functions. Since the late eighties, several authors have proved that various types of neural networks possess the universal approximation property (e.g. ). Similar results have been published since the early nineties in fuzzy theory, showing that various fuzzy reasoning methods correspond to universal approximators (e.g. ). Because Kolmogorov's theorem addresses the very wide class of continuous functions, the functions to be constructed are often very complicated and highly non-smooth, which makes their construction difficult. As is well known, continuity permits very extreme behavior even in the case of single-variable functions. The first example of a function that is everywhere continuous but nowhere differentiable was given by Weierstraß in 1872. Mathematicians of that time believed that such functions were rare pathological examples, but it has since become clear that the great majority of continuous functions exhibit such extreme properties. The seeming contradiction between the complicated nature of the universal approximators and their successful practical applications leads to the conclusion that if we restrict our models to the far better behaved "everywhere differentiable" functions, these problems can ab ovo be avoided, or at least reduced.
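For concreteness, Kolmogorov's superposition theorem is commonly stated in the following form (an exact representation, from which the approximation results mentioned above follow); the precise formulation varies slightly between sources:

```latex
% Kolmogorov's superposition theorem (1957), one common formulation:
% every continuous function f on the n-dimensional unit cube admits an
% exact representation built from single-variable continuous functions.
\[
  f(x_1,\dots,x_n) \;=\; \sum_{q=0}^{2n} \Phi_q\!\left( \sum_{p=1}^{n} \varphi_{q,p}(x_p) \right),
\]
% where the inner functions \varphi_{q,p}\colon [0,1]\to\mathbb{R} are
% continuous and can be chosen independently of f, while only the
% continuous outer functions \Phi_q depend on f.
```

Note that both the inner and the outer functions are merely continuous; in general they cannot be taken smooth, which is precisely the difficulty the remainder of the paragraph alludes to.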