|Title:||Generating Random Variates via Kernel Density Estimation and Radial Basis Function Based Neural Networks|
|Authors:||Forero Vargas, Manuel Guillermo|
|Keywords:||General regression neural network|
Probabilistic neural network
Kernel density estimation
|Publisher:||Springer (Lecture Notes in Computer Science)|
|Citation:||Candia-García C., Forero M.G., Herrera-Rivera S. (2019) Generating Random Variates via Kernel Density Estimation and Radial Basis Function Based Neural Networks. In: Vera-Rodriguez R., Fierrez J., Morales A. (eds) Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications. CIARP 2018. Lecture Notes in Computer Science, vol 11401. Springer, Cham|
|Abstract:||When modeling phenomena that cannot be studied by deterministic analytical approaches, one of the main tasks is to generate random variates. Widely used techniques such as the inverse-transformation, convolution, and acceptance-rejection methods involve a significant amount of statistical work and do not provide satisfactory results when the data do not conform to known probability density functions. This study proposes an alternative nonparametric method for generating random variates that combines kernel density estimation (KDE) and radial basis function based neural networks (RBFBNNs). We evaluated the method’s performance on Poisson, triangular, and exponential probability distributions and assessed its utility for unknown distributions. The results show that the model’s effectiveness depends substantially on selecting an appropriate bandwidth value for KDE and on a certain minimum number of data points to train the algorithm. The proposed method achieved an R² value between 0.91 and 0.99 for the analyzed distributions.|
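The KDE half of the approach described in the abstract can be illustrated with a short sketch. The paper itself couples KDE with an RBF-based neural network; the code below shows only the standard KDE sampling step (a smoothed bootstrap with a Gaussian kernel), not the authors' full method. The function name, the fixed `bandwidth` parameter, and the exponential test data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sample_from_kde(data, n_samples, bandwidth, seed=None):
    """Draw variates from a Gaussian KDE fitted to `data` (smoothed
    bootstrap, not the paper's KDE+RBFBNN method): pick an observed
    point uniformly at random, then perturb it with Gaussian noise
    whose standard deviation is the KDE bandwidth."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(np.asarray(data, dtype=float),
                         size=n_samples, replace=True)
    return centers + rng.normal(scale=bandwidth, size=n_samples)

# Hypothetical usage: regenerate exponential-like variates from a sample,
# echoing one of the distributions evaluated in the study.
observed = np.random.default_rng(0).exponential(scale=2.0, size=1000)
synthetic = sample_from_kde(observed, n_samples=1000,
                            bandwidth=0.3, seed=1)
```

As the abstract notes, results depend strongly on the bandwidth: too small reproduces the sample's noise, too large over-smooths the estimated density.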
|Appears in Collections:||Artículos|
Files in This Item:
There are no files associated with this item.