* Added regularization support for neural network models: new `penalty` (regularization strength) and `mixture` (L1/L2 balance) parameters, consistent with `glmnet` and other packages. `mixture = 1` gives pure L1 (lasso), `mixture = 0` gives pure L2 (ridge), and `0 < mixture < 1` blends the two; see the sketch below.
* `n_hlayers()` now fully supports tuning the number of hidden layers.
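A minimal usage sketch, assuming the new arguments map onto the standard `dials::penalty()` and `dials::mixture()` parameter objects (illustrative only, not code from the package):

```r
library(dials)

# penalty() is the overall regularization strength (log10 scale in dials);
# mixture() balances L1 vs L2: 1 = pure L1 (lasso), 0 = pure L2 (ridge).
reg_grid <- grid_regular(
  penalty(range = c(-4, -1)),  # 1e-4 to 1e-1 on the original scale
  mixture(range = c(0, 1)),
  levels = c(4, 3)
)
reg_grid
```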
* `hidden_neurons()` gains support for discrete values via the `disc_values` argument (e.g., `hidden_neurons(disc_values = c(32L, 64L, 128L, 256L))` is now allowed).
* Tuning methods and `grid_depth()` are now fixed for {kindling}'s own 'dials' parameter `n_hlayers` (no more invalid sampling when x > 1, which effectively restricted tuning to `n_hlayers = 1`). Grids are now built with `tidyr::expand_grid()`, not `purrr::cross*()`.
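For illustration only (not a call into the package internals), this is the kind of full factorial grid that `tidyr::expand_grid()` produces over depth and width candidates:

```r
library(tidyr)

# Every layer-depth candidate paired with every neuron-count candidate,
# built with expand_grid() rather than purrr::cross*().
depth_width_grid <- expand_grid(
  n_hlayers      = 1:3,
  hidden_neurons = c(32L, 64L, 128L, 256L)
)
depth_width_grid  # 12 rows
```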
* The supported models now use `hardhat::mold()` instead of `model.frame()` and `model.matrix()`.
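A small, generic illustration of what `hardhat::mold()` provides compared with `model.frame()`/`model.matrix()` (example data only, not kindling code):

```r
library(hardhat)

# mold() preprocesses the data once and records a blueprint that forge()
# can replay on new data at prediction time, replacing manual
# model.frame()/model.matrix() handling.
processed <- mold(mpg ~ cyl + disp + hp, mtcars)

names(processed)            # predictors, outcomes, blueprint, extras
head(processed$predictors)  # predictor tibble ready for the engine
head(processed$outcomes)    # outcome column(s)
```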
* Added a vignette showcasing the comparison with other similar packages.
* The package description got a few clarifications.
* The `hidden_neurons` parameter now supports discrete values specification via the `values` parameter (e.g., `hidden_neurons(values = c(32, 64, 128))`), alongside the range form (`hidden_neurons(range = c(8L, 512L))` / `hidden_neurons(c(8L, 512L))`).
* Added `\value` documentation to `kindling-nn-wrappers` for CRAN compliance.
* Documented argument handling and list-column unwrapping in tidymodels wrapper functions.
* Clarified the relationship between `grid_depth()` and wrapper functions.