doc: fix bugs in doc (#105)
bbayukari authored Jun 18, 2024
1 parent 6d1627b commit 2da8f19
Showing 10 changed files with 297 additions and 329 deletions.
4 changes: 2 additions & 2 deletions docs/source/feature/DataScienceTool.rst
@@ -79,9 +79,9 @@ Information Criterion


An information criterion is a statistical measure used to assess the goodness of fit of a model while penalizing model complexity. It helps in selecting the optimal model from a set of competing models. In the context of sparsity-constrained optimization, information criteria can be used to evaluate different sparsity levels and identify the most suitable support size.
.. There is another way to evaluate sparsity levels, which is information criterion. The smaller the information criterion, the better the model.
Another way to evaluate sparsity levels is the information criterion: the smaller the information criterion, the better the model.
Four types of information criteria are implemented in ``skscope.utilities``: the Akaike information criterion `[1]`_, the Bayesian information criterion (BIC, `[2]`_), the extended BIC `[3]`_, and the special information criterion (SIC, `[4]`_).
.. If sparsity is list and ``cv=None``, the solver will use information criterions to evaluate the sparsity level.
If ``sparsity`` is a list and ``cv=None``, the solver will use an information criterion to evaluate each sparsity level.
The input parameter ``ic_method`` in the solvers of ``skscope`` can be used to choose the information criterion. It should be a method that computes an information criterion and has the same parameters as this example:

.. code-block:: python
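    # NOTE: the original example is collapsed in this diff view; this body is
    # an illustrative sketch only, and the parameter names below are
    # assumptions rather than skscope's documented signature.
    import numpy as np

    def my_information_criterion(objective_value, dimensionality, effective_params_num, train_size):
        # A BIC-style criterion, assuming ``objective_value`` is the training
        # loss, ``effective_params_num`` the support size, and ``train_size``
        # the number of samples: n * log(loss / n) + log(n) * s.
        return train_size * np.log(objective_value / train_size) + np.log(train_size) * effective_params_num

    # Usage sketch (also assumed): pass the function via ``ic_method`` together
    # with a list of candidate sparsity levels and ``cv=None``, e.g.
    # ScopeSolver(dimensionality=p, sparsity=[1, 2, 3], ic_method=my_information_criterion).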
6 changes: 3 additions & 3 deletions docs/source/feature/Variants.rst
@@ -8,7 +8,7 @@ In addition to standard sparsity-constrained optimization (SCO) problems, ``sksc
Group-structured parameters
----------------------------

In certain cases, we may encounter group-structured parameters where all parameters are divided into non-overlapping groups. Examples of such scenarios include group variable selection under linear model `[1]`_, `multitask learning <../userguide/examples/GeneralizedLinearModels/multiple-response-linear-regression.html>`__, and so on.
In certain cases, we may encounter group-structured parameters, where all parameters are divided into non-overlapping groups. Examples of such scenarios include group variable selection under the linear model `[1]`_, `multitask learning <../gallery/GeneralizedLinearModels/multiple-response-linear-regression.html>`__, and so on.

When dealing with group-structured parameters, we treat each parameter group as a unit, selecting or deselecting all the parameters in the group simultaneously. This problem is referred to as group SCO (GSCO).
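For intuition, here is a minimal sketch of how a group structure might be passed to a solver. The ``group`` parameter and its form (a length-``p`` array assigning a group index to each parameter) are assumptions for illustration; consult the solver's API reference for the exact interface.

.. code-block:: python

    from skscope import ScopeSolver

    # 6 parameters arranged in 3 non-overlapping groups: (0, 1), (2, 3), (4, 5).
    # With sparsity=2, the solver would select 2 whole groups rather than
    # 2 individual parameters.
    solver = ScopeSolver(
        dimensionality=6,
        sparsity=2,
        group=[0, 0, 1, 1, 2, 2],  # assumed: group index of each parameter
    )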

@@ -174,8 +174,8 @@ In some cases, there may be additional constraints on the intrinsic structure of

.. math::
   \arg\min_{\theta \in \mathbb{R}^s,\ \theta \in \mathcal{C}} f(\theta).
A typical example is the Gaussian graphical model for continuous random variables, which constrains :math:`\theta` on symmetric positive-definite spaces (see this example `<../userguide/examples/GraphicalModels/sparse-gaussian-precision-matrix.html>`__). Although the default numeric solver cannot solve this problem, ``skscope`` provides a flexible interface that allows for its replacement. Specifically, users can change the default numerical optimization solver by properly setting the ``numeric_solver`` in the solver.
A typical example is the Gaussian graphical model for continuous random variables, which constrains :math:`\theta` to the space of symmetric positive-definite matrices (see the example `gaussian precision matrix <../gallery/GraphicalModels/sparse-gaussian-precision-matrix.html>`__). Although the default numeric solver cannot solve this problem, ``skscope`` provides a flexible interface that allows for its replacement. Specifically, users can change the default numerical optimization solver by properly setting the ``numeric_solver`` parameter of the solver.

> Note that the accepted input of ``numeric_solver`` should have the same interface as ``skscope.numeric_solver.convex_solver_LBFGS``.
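As an illustration, a hedged sketch of a replacement solver follows. The call signature and return value below are assumptions chosen to mirror the description above, not skscope's documented API; the authoritative interface is that of ``skscope.numeric_solver.convex_solver_LBFGS``.

.. code-block:: python

    import numpy as np
    from scipy.optimize import minimize

    def my_numeric_solver(objective_func, value_and_grad, init_params, optim_variable_set, data):
        # ASSUMED interface: optimize only the coordinates listed in
        # ``optim_variable_set`` while keeping all other coordinates fixed.
        params = np.asarray(init_params, dtype=float).copy()

        def restricted_loss(active_values):
            params[optim_variable_set] = active_values
            return float(objective_func(params, data))  # assumed call convention

        res = minimize(restricted_loss, params[optim_variable_set], method="SLSQP")
        params[optim_variable_set] = res.x
        return res.fun, params  # assumed return: (loss, full parameter vector)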

113 changes: 1 addition & 112 deletions docs/source/gallery/GeneralizedLinearModels/gamma-regression.ipynb
@@ -317,110 +317,7 @@
"id": "c4d3720f",
"metadata": {},
"source": [
"Now the `solver.params` contains the coefficients of gamma model with no more than 5 variables. That is, those variables with a coefficient 0 is unused in the model:"
]
},
{
"cell_type": "code",
"execution_count": 10,
"id": "e416367f",
"metadata": {
"scrolled": true
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[10.96270773 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 1.2021258 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0.99600871 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 1.74258709 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 1.18841825 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 1.46535362 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. 0. 0. 0.\n",
" 0. 0. 0. ]\n"
]
}
],
"source": [
"print(solver.params)"
"Now the `solver.params` contains the coefficients of gamma model with no more than 5 variables."
]
},
{
@@ -557,14 +454,6 @@
"- [2] Abess docs, \"make_glm_data\".\n",
"https://abess.readthedocs.io/en/latest/Python-package/datasets/glm.html\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "84ae9a94",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
1 change: 1 addition & 0 deletions docs/source/gallery/GeneralizedLinearModels/index.rst
@@ -12,4 +12,5 @@ Generalized Linear Models
gamma-regression
multiple-response-linear-regression
multinomial-logistic-regression
poisson-identity-link
.. Inverse-gaussian-regression
211 changes: 67 additions & 144 deletions docs/source/gallery/GeneralizedLinearModels/logistic-regression.ipynb

Large diffs are not rendered by default.

11 changes: 0 additions & 11 deletions docs/source/userguide/install.rst
@@ -52,17 +52,6 @@ Note that ``--recurse-submodules`` is required since there are some submodules i
Thanks to the editable mode enabled by the flag ``-e``, we need not rebuild the package :ref:`skscope <skscope_package>` when the source Python code changes.

If the dependence packages has been installed, we can build the package faster by

.. code-block:: Bash
python setup.py develop
where the function of the flag ``develop`` is similar with ``-e`` of command ``pip``.

This command will not check or prepare the required environment, so it can save a lot of time.
Thus, we can use ``pip`` with first building and ``python`` with re-building.




2 changes: 1 addition & 1 deletion docs/source/userguide/quickstart.rst
@@ -147,7 +147,7 @@ Further reading

- `JAX library <https://jax.readthedocs.io/en/latest/index.html>`__

- A bunch of `machine learning methods <examples/index.html>`__ implemented on the ``skscope``
- A bunch of `machine learning methods <gallery/index.html>`__ implemented with ``skscope``

- More `advanced features <../feature/index.html>`__ implemented in ``skscope``

12 changes: 6 additions & 6 deletions docs/source/userguide/whatscope.rst
@@ -6,39 +6,39 @@ What is ``skscope``?

``skscope`` is a powerful open-source Python package specifically developed to tackle sparsity-constrained optimization (SCO) problems with utmost efficiency. With SCO's broad applicability in machine learning, statistics, signal processing, and other related domains, ``skscope`` can find extensive usage in these fields. For example, it excels at solving classic SCO problems like variable selection (also known as feature selection or compressed sensing). Even more impressively, it goes beyond that and handles a diverse range of intriguing real-world problems:

1. `Robust variable selection <examples/LinearModelAndVariants/robust-regression.html>`__
1. `Robust variable selection <gallery/LinearModelAndVariants/robust-regression.html>`__

.. image:: figure/variable_selection.png
:width: 300
:align: center

2. `Nonlinear variable selection <examples/Miscellaneous/hsic-splicing.html>`__
2. `Nonlinear variable selection <gallery/Miscellaneous/hsic-splicing.html>`__

.. image:: figure/nonlinear_variable_selection.png
:width: 666
:align: center


3. `Spatial trend filtering <examples/FusionModels/spatial-trend-filtering.html>`__
3. `Spatial trend filtering <gallery/FusionModels/spatial-trend-filtering.html>`__

.. image:: figure/trend_filter.png
:width: 666
:align: center

4. `Network reconstruction <examples/GraphicalModels/sparse-gaussian-precision.html>`__
4. `Network reconstruction <gallery/GraphicalModels/sparse-gaussian-precision.html>`__

.. image:: figure/precision_matrix.png
:width: 666
:align: center

5. `Portfolio selection <examples/Miscellaneous/portfolio-selection.html>`__
5. `Portfolio selection <gallery/Miscellaneous/portfolio-selection.html>`__

.. image:: figure/portfolio_selection.png
:width: 300
:align: center


These above examples represent just a glimpse of the practical problems that ``skscope`` can effectively address. With its efficient optimization algorithms and versatility, ``skscope`` proves to be an invaluable tool for a wide range of disciplines. Currently, we offer over 20 examples in our comprehensive `example gallery <examples/index.html>`__.
The examples above represent just a glimpse of the practical problems that ``skscope`` can effectively address. With its efficient optimization algorithms and versatility, ``skscope`` proves to be an invaluable tool for a wide range of disciplines. Currently, we offer over 20 examples in our comprehensive `example gallery <gallery/index.html>`__.
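To make the workflow concrete, here is a minimal variable-selection sketch in the spirit of the quickstart; the data and loss are synthetic, and ``ScopeSolver(dimensionality, sparsity)`` with ``solver.solve`` follows the package's basic interface.

.. code-block:: python

    import numpy as np
    import jax.numpy as jnp
    from skscope import ScopeSolver

    # Synthetic data: 100 samples, 10 features, 3 of them truly active.
    np.random.seed(0)
    X = np.random.randn(100, 10)
    beta = np.zeros(10)
    beta[:3] = 1.0
    y = X @ beta + 0.1 * np.random.randn(100)

    def ols_loss(params):
        # Least-squares objective written with jax.numpy so that
        # skscope can differentiate it automatically.
        return jnp.sum((y - X @ params) ** 2)

    solver = ScopeSolver(dimensionality=10, sparsity=3)
    params = solver.solve(ols_loss)  # sparse estimate with 3 nonzeros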


.. How does ``skscope`` work?
