.. _library_elastic_net_regression:
elastic_net_regression
======================
Elastic net regressor supporting continuous and mixed-feature
datasets. The library implements the regressor_protocol defined in
the regression_protocols library and learns a linear model using
cyclic coordinate descent with coefficient-wise soft-thresholding
updates and an additional L2 term, minimizing the mean squared error
plus a standard elastic net penalty.
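A minimal sketch of the coordinate-wise update, assuming the common
parameterization with overall penalty strength lambda and mixing ratio
alpha (the l1_ratio option); the exact objective scaling used by the
library is an assumption:

.. math::

   w_j \leftarrow
   \frac{S\left(\frac{1}{n}\sum_{i=1}^{n} x_{ij}\, r_i^{(j)},\ \lambda\alpha\right)}
        {\frac{1}{n}\sum_{i=1}^{n} x_{ij}^{2} + \lambda(1-\alpha)}
   \qquad
   S(z, \gamma) = \operatorname{sign}(z)\,\max(|z| - \gamma,\ 0)

where r_i^(j) is the partial residual of example i computed with all
current coefficients except w_j, and S is the soft-thresholding
operator. The lambda*(1 - alpha) term in the denominator is the
additional L2 term mentioned above.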
Open the `../../apis/library_index.html#elastic_net_regression <../../apis/library_index.html#elastic_net_regression>`__ link in a web browser.
To load this library, load the loader.lgt file:
::
   | ?- logtalk_load(elastic_net_regression(loader)).
To test this library's predicates, load the tester.lgt file:
::
   | ?- logtalk_load(elastic_net_regression(tester)).
The unit test suite covers the default mixed-penalty behavior together
with the boundary cases l1_ratio(0.0) and l1_ratio(1.0).
To run the performance benchmark suite, load the
tester_performance.lgt file:
::
   | ?- logtalk_load(elastic_net_regression(tester_performance)).
The benchmark suite covers the ridge-like l1_ratio(0.0) and lasso-like
l1_ratio(1.0) endpoints.

The learned regressor is represented by default as:

::

   elastic_net_regressor(Encoders, Bias, Weights, Diagnostics)

The exported predicate clauses therefore use the shape:

::

   Functor(Encoders, Bias, Weights, Diagnostics)

The diagnostics/2 predicate returns a list of metadata terms with the
form:
::
   [
       model(elastic_net_regression),
       target(Target),
       training_example_count(TrainingExampleCount),
       options(Options),
       convergence(Status),
       iterations(Iterations),
       final_delta(FinalDelta),
       encoded_feature_count(FeatureCount)
   ]
Where:

- model(elastic_net_regression) identifies the learning algorithm
  that produced the regressor.
- target(Target) stores the target attribute name declared by the
  training dataset.
- training_example_count(TrainingExampleCount) stores the number of
  examples used during training.
- options(Options) stores the effective learning options after
  merging the user options with the library defaults.
- convergence(Status) records the optimization stop condition. The
  current values are tolerance, when the maximum Karush-Kuhn-Tucker
  optimality violation across the intercept and all encoded features
  is within the configured tolerance, and
  maximum_iterations_exhausted, when training stops because the
  iteration cap is reached.
- iterations(Iterations) stores the number of coordinate-descent
  sweeps completed during training.
- final_delta(FinalDelta) stores the maximum Karush-Kuhn-Tucker
  optimality violation measured during the final optimization check.
- encoded_feature_count(FeatureCount) stores the number of numeric
  features induced by the encoder list, including missing-value
  indicator features.
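For example, a training and inspection session could look like the
following sketch, where the house_prices dataset object and the
learn/3 and diagnostics/2 argument orders are assumptions rather than
documented details:

::

   % house_prices is a hypothetical dataset object; the assumed shape
   % is learn(Dataset, Options, Regressor)
   | ?- elastic_net_regression::learn(house_prices, [], Regressor),
        elastic_net_regression::diagnostics(Regressor, Diagnostics).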
Use the regression_protocols diagnostic/2 and
regressor_options/2 helper predicates when you only need a single
metadata term or the effective options.
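For instance, assuming these helpers take the regressor term as their
first argument and are reachable through the elastic_net_regression
object, a query could look like:

::

   % hypothetical house_prices dataset object and assumed argument orders
   | ?- elastic_net_regression::learn(house_prices, [], Regressor),
        elastic_net_regression::diagnostic(Regressor, iterations(Iterations)),
        elastic_net_regression::regressor_options(Regressor, Options).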
The learn/3 predicate accepts the following options (see the example
after this list):

- the maximum number of coordinate-descent sweeps (the iteration
  cap). The default is 2000.
- the convergence tolerance for the maximum Karush-Kuhn-Tucker
  optimality violation. The default is 1.0e-7.
- the overall elastic net penalty strength. The default is 0.01.
- l1_ratio(L1Ratio) - the penalty mixing ratio. Valid values are in
  the interval [0.0, 1.0], where 0.0 gives the ridge endpoint and 1.0
  gives the lasso endpoint. The default is 0.5.
- a boolean option whose valid values are true and false. The default
  is true.
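For example, to train at the lasso endpoint while keeping the other
defaults (the house_prices dataset object and the learn/3 argument
order are assumptions):

::

   % pure L1 penalty; all other options keep their default values
   | ?- elastic_net_regression::learn(house_prices, [l1_ratio(1.0)], Regressor).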