| Pack logtalk -- logtalk-3.100.1/docs/handbook/_sources/libraries/ica_projection.rst.txt |
.. _library_ica_projection:

``ica_projection``
==================
Independent Component Analysis reducer for continuous datasets (missing
or non-numeric values are rejected). The library implements the
``dimension_reducer_protocol`` protocol defined in the
``dimension_reduction_protocols`` library. It learns a linear unmixing
projection by centering the training data, optionally standardizing the
continuous attributes, whitening the covariance matrix using the shared
deterministic symmetric eigen-decomposition from the ``linear_algebra``
library, and then extracting independent components using a
deterministic cubic FastICA fixed-point iteration with orthogonal
deflation.
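As a rough illustration of the pipeline described above (centering, whitening via a symmetric eigen-decomposition, and a cubic FastICA fixed-point iteration with orthogonal deflation), here is a minimal NumPy sketch. It is not the library's code: the function name, the deterministic unit-vector initialization, and the data layout are assumptions made for the example.

```python
# Illustrative sketch only; not the ica_projection implementation.
import numpy as np

def ica_components(X, n_components, max_iter=1000, tol=1.0e-8):
    # Center the data (rows are samples, columns are features).
    X = X - X.mean(axis=0)
    # Whiten via a symmetric eigen-decomposition of the covariance matrix.
    cov = np.cov(X, rowvar=False)
    values, vectors = np.linalg.eigh(cov)
    order = np.argsort(values)[::-1][:n_components]
    values, vectors = values[order], vectors[:, order]
    Z = (X @ vectors) / np.sqrt(values)   # whitened coordinates
    components = []
    for k in range(n_components):
        w = np.zeros(n_components)
        w[k] = 1.0   # deterministic start instead of a random vector
        for _ in range(max_iter):
            # Cubic (kurtosis-based) fixed-point update:
            #   w+ = E[z (w.z)^3] - 3 w
            w_new = (Z * ((Z @ w) ** 3)[:, None]).mean(axis=0) - 3.0 * w
            # Orthogonal deflation against previously extracted components.
            for u in components:
                w_new -= (w_new @ u) * u
            w_new /= np.linalg.norm(w_new)
            # Update magnitude, invariant to the sign ambiguity of w.
            delta = min(np.linalg.norm(w_new - w), np.linalg.norm(w_new + w))
            w = w_new
            if delta < tol:
                break
        components.append(w)
    return np.array(components), Z
```

The deflation step keeps the extracted directions mutually orthogonal in whitened space, and the fixed starting vectors make the iteration deterministic, mirroring the deterministic behavior the library documents.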
API documentation
-----------------

Open the `ica_projection <../../apis/library_index.html#ica_projection>`__
link in a web browser.
Loading
-------

To load this library, load the ``loader.lgt`` file:

::

   | ?- logtalk_load(ica_projection(loader)).
Testing
-------

To test this library's predicates, load the ``tester.lgt`` file:

::

   | ?- logtalk_load(ica_projection(tester)).
Usage
-----

Learning centers the training data, optionally standardizes the
continuous attributes, whitens the covariance matrix using the shared
deterministic symmetric eigen-decomposition from the ``linear_algebra``
library, and then extracts independent directions using deterministic
cubic FastICA with orthogonal deflation. The ``transform/3`` predicate
returns reduced instances as lists of ``component_N-Value`` pairs.

The ``learn/3`` predicate accepts the following options:

- ``n_components(Integer)``: number of independent components to
  extract. Requests exceeding ``min(feature_count, sample_count - 1)``
  raise a ``domain_error(component_count, Requested-Maximum)``
  exception. Requests that still exceed the numerical rank of the
  whitened covariance matrix raise a
  ``domain_error(component_count, Requested-Extracted)`` exception. The
  default is ``2``.
- ``feature_scaling(Boolean)``: whether to standardize the continuous
  attributes before whitening; either ``true`` or ``false`` (the
  default).
- ``maximum_iterations(Integer)``: maximum number of FastICA fixed-point
  iterations per component. The default is ``1000``.
- ``tolerance(Float)``: convergence tolerance for the per-component
  update magnitude. The default is ``1.0e-8``.
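The two documented component-count checks can be sketched as follows. This is a hypothetical helper written for illustration, using Python exceptions as stand-ins for the ``domain_error/2`` terms; the function name and signature are not part of the library.

```python
# Illustrative sketch of the documented component-count validation;
# ValueError stands in for the Logtalk domain_error(component_count, _) terms.
def check_component_count(requested, feature_count, sample_count, rank):
    # Hard upper bound from the data shape: min(feature_count, sample_count - 1).
    maximum = min(feature_count, sample_count - 1)
    if requested > maximum:
        # Analogue of domain_error(component_count, Requested-Maximum).
        raise ValueError(f"component_count: {requested}-{maximum}")
    if requested > rank:
        # Analogue of domain_error(component_count, Requested-Extracted),
        # where rank is the numerical rank of the whitened covariance matrix.
        raise ValueError(f"component_count: {requested}-{rank}")
    return requested
```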
The learned diagnostics also include:

- ``whitening_eigenvalues(Values)``: eigenvalues used to build the
  whitening transform, aligned with the extracted components.
- ``convergence(Statuses)``: per-component stop reasons, such as
  ``tolerance`` or ``maximum_iterations_exhausted``.
- ``iterations(Counts)``: per-component iteration counts aligned with
  the extracted components.
- ``final_delta(Deltas)``: per-component final update magnitudes
  aligned with the extracted components.
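A fixed-point loop naturally yields the per-component stop reason, iteration count, and final update magnitude listed above. The following sketch shows one way such diagnostics could be produced; the helper and its scalar ``step`` callback are hypothetical, not the library's internals.

```python
# Illustrative sketch: a fixed-point loop that reports convergence
# diagnostics (stop reason, iteration count, final update magnitude).
def iterate_component(step, w, maximum_iterations=1000, tolerance=1.0e-8):
    delta = None
    for count in range(1, maximum_iterations + 1):
        w_next = step(w)          # one fixed-point update
        delta = abs(w_next - w)   # update magnitude for this iteration
        w = w_next
        if delta < tolerance:
            # Stopped because the update fell below the tolerance.
            return w, "tolerance", count, delta
    # Stopped because the iteration budget ran out.
    return w, "maximum_iterations_exhausted", maximum_iterations, delta
```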
The following examples use the sample dataset shipped with the
``dimension_reduction_protocols`` library:

::

   | ?- logtalk_load(dimension_reduction_protocols('test_datasets/mixed_independent_sources')).
Learning a reducer
~~~~~~~~~~~~~~~~~~

::

   | ?- ica_projection::learn(mixed_independent_sources, DimensionReducer).

   | ?- ica_projection::learn(mixed_independent_sources, DimensionReducer, [n_components(2), feature_scaling(true), maximum_iterations(200), tolerance(1.0e-7)]).
Transforming new instances
~~~~~~~~~~~~~~~~~~~~~~~~~~

::

   | ?- ica_projection::learn(mixed_independent_sources, DimensionReducer),
        ica_projection::transform(DimensionReducer, [x1-(-5.0), x2-(-4.0), x3-(-4.0)], ReducedInstance).
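As a rough illustration of the arithmetic a transformation performs, the sketch below applies encoders (each storing an attribute name, a centering offset, and a scale factor, as described in the representation section of this page) and then projects onto each unmixing vector. The function, the tuple-based data layout, and the component naming are assumptions for the example, not the library's internal representation.

```python
# Illustrative sketch only; not the ica_projection implementation.
def transform(encoders, components, instance):
    # encoders:   list of (attribute_name, offset, scale) triples
    # components: list of unmixing vectors (lists of floats)
    # instance:   list of (attribute_name, value) pairs
    values = dict(instance)
    # Center and scale each attribute using its encoder.
    scaled = [(values[name] - offset) / scale for name, offset, scale in encoders]
    # One dot product per unmixing vector gives one component value.
    return [
        (f"component_{index}", sum(w * x for w, x in zip(vector, scaled)))
        for index, vector in enumerate(components, start=1)
    ]
```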
Exporting and reusing the reducer
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

::

   | ?- ica_projection::learn(mixed_independent_sources, DimensionReducer, [n_components(2)]),
        ica_projection::export_to_file(mixed_independent_sources, DimensionReducer, reducer, 'ica_reducer.pl').

   | ?- logtalk_load('ica_reducer.pl'),
        reducer(Reducer),
        ica_projection::transform(Reducer, [x1-(-5.0), x2-(-4.0), x3-(-4.0)], ReducedInstance).
The learned dimension reducer is represented by a compound term with the functor chosen by the implementation and arity 3. For example:
::

   ica_reducer(Encoders, Components, Diagnostics)
Where:

- ``Encoders``: list of continuous attribute encoders storing the
  attribute name, centering offset, and scale factor.
- ``Components``: list of learned unmixing vectors in feature space.
- ``Diagnostics``: learned reducer metadata including the effective
  training options, whitening eigenvalues, and per-component
  convergence information.