The source files for all examples can be found in /examples.

Lasso

This example uses the LassoSolver to solve a lasso regression problem, and then shows how to solve the same problem with the more general MLSolver interface.

The lasso regression problem is

\[\begin{array}{ll} \text{minimize} & (1/2)\|Ax - b\|_2^2 + \lambda \|x\|_1. \end{array}\]
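Before diving into the code, it may help to see how the $\ell_1$ term is typically handled in solvers for this problem class: its proximal operator is elementwise soft-thresholding, which is what produces exact zeros in the solution. A minimal, self-contained sketch (not GeNIOS internals):

```julia
# Soft-thresholding: the proximal operator of τ‖·‖₁, applied elementwise:
#   prox_{τ‖·‖₁}(v)_i = sign(v_i) * max(|v_i| - τ, 0)
# Entries with |v_i| ≤ τ are set exactly to zero, which is why ℓ1
# regularization produces sparse solutions.
soft_threshold(v, τ) = sign.(v) .* max.(abs.(v) .- τ, 0)

soft_threshold([0.3, -0.05, -1.2], 0.1)   # the small middle entry is zeroed out
```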

using GeNIOS
using Random, LinearAlgebra, SparseArrays

Generating the problem data

Random.seed!(1)
m, n = 200, 400
A = randn(m, n)
A .-= sum(A, dims=1) ./ m
normalize!.(eachcol(A))
xstar = sprandn(n, 0.1)
b = A*xstar + 1e-3*randn(m)
λ = 0.05*norm(A'*b, Inf)
0.10902103801870822
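The scaling of $\lambda$ above is relative to $\lambda_{\max} = \|A^Tb\|_\infty$, the smallest regularization weight for which $x = 0$ is already optimal (the subgradient optimality condition at zero reads $\|A^T(A \cdot 0 - b)\|_\infty \le \lambda$). A small sketch verifying this choice on the same data:

```julia
using Random, LinearAlgebra, SparseArrays

Random.seed!(1)
m, n = 200, 400
A = randn(m, n)
A .-= sum(A, dims=1) ./ m      # center each column
normalize!.(eachcol(A))        # rescale each column to unit ℓ2 norm
xstar = sprandn(n, 0.1)
b = A*xstar + 1e-3*randn(m)

# λ_max = ‖A'b‖∞: for λ ≥ λ_max, x = 0 satisfies the lasso optimality
# condition ‖A'(A*0 - b)‖∞ ≤ λ, so the solution is identically zero.
λmax = norm(A'*b, Inf)
λ = 0.05*λmax                  # well below λ_max, as in the example above
@assert λ < λmax
```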

LassoSolver interface

The easiest interface for this problem is the LassoSolver, where we just need to specify the regularization parameter (in addition to the problem data).

λ1 = λ
solver = GeNIOS.LassoSolver(λ1, A, b)
res = solve!(solver; options=GeNIOS.SolverOptions(use_dual_gap=true, dual_gap_tol=1e-4, verbose=true))
rmse = sqrt(1/m*norm(A*solver.zk - b, 2)^2)
println("Final RMSE: $(round(rmse, digits=8))")
Starting setup...
Setup in  0.118s

──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
    Iteration      Objective           RMSE       Dual Gap       r_primal         r_dual              ρ           Time
──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
            0      1.682e+01      2.900e-01      9.256e+00            Inf            Inf      1.000e+00         0.000
            1      1.682e+01      2.900e-01      9.256e+00      0.000e+00      9.692e+00      1.000e+00         0.103
           20      2.963e+00      3.613e-02      7.393e-02      1.897e-02      6.309e-02      1.000e+00         0.108
           39      2.963e+00      3.602e-02      9.329e-05      1.210e-04      2.713e-04      1.000e+00         0.113

SOLVED in  0.114s, 39 iterations
Total time:  0.231s
──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────

Final RMSE: 0.05094243
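Since b was generated from a sparse xstar, we can also check how well the lasso solution recovers its support. The sketch below reuses the setup above and reads the solution from solver.zk, as in the RMSE computation; verbose=false is assumed to silence the iteration log:

```julia
using GeNIOS
using Random, LinearAlgebra, SparseArrays

Random.seed!(1)
m, n = 200, 400
A = randn(m, n)
A .-= sum(A, dims=1) ./ m
normalize!.(eachcol(A))
xstar = sprandn(n, 0.1)
b = A*xstar + 1e-3*randn(m)
λ = 0.05*norm(A'*b, Inf)

solver = GeNIOS.LassoSolver(λ, A, b)
solve!(solver; options=GeNIOS.SolverOptions(use_dual_gap=true, dual_gap_tol=1e-4, verbose=false))

# Indices with magnitude above a small threshold, treated as "nonzero"
support(x; tol=1e-6) = findall(v -> abs(v) > tol, x)
println("true nonzeros:      ", length(support(xstar)))
println("recovered nonzeros: ", length(support(solver.zk)))
```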

MLSolver interface

Under the hood, the LassoSolver is a thin wrapper around the MLSolver interface. This interface is more general and lets us specify the per-sample loss used in the machine learning problem. Specifically, it solves problems of the form

\[\begin{array}{ll} \text{minimize} & \sum_{i=1}^N f(a_i^Tx - b_i) + \lambda_1 \|x\|_1 + (\lambda_2/2) \|x\|_2^2. \end{array}\]

The lasso problem is the special case with $f(x) = (1/2)x^2$, $\lambda_1 = \lambda$, and $\lambda_2 = 0$.

f(x) = 0.5*x^2
fconj(x) = 0.5*x^2
λ1 = λ
λ2 = 0.0
solver = GeNIOS.MLSolver(f, λ1, λ2, A, b; fconj=fconj)
res = solve!(solver; options=GeNIOS.SolverOptions(relax=true, use_dual_gap=true, dual_gap_tol=1e-3, verbose=true))
rmse = sqrt(1/m*norm(A*solver.zk - b, 2)^2)
println("Final RMSE: $(round(rmse, digits=8))")
Starting setup...
Setup in  0.153s

──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
    Iteration      Objective           RMSE       Dual Gap       r_primal         r_dual              ρ           Time
──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
            0      1.682e+01      2.900e-01      9.256e+00            Inf            Inf      1.000e+00         0.000
            1      1.682e+01      2.900e-01      9.256e+00      0.000e+00      9.692e+00      1.000e+00         0.166
           20      2.963e+00      3.613e-02      7.393e-02      1.897e-02      6.309e-02      1.000e+00         0.171
           35      2.963e+00      3.602e-02      6.247e-04      6.305e-04      1.270e-03      1.000e+00         0.176

SOLVED in  0.176s, 35 iterations
Total time:  0.329s
──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────

Final RMSE: 0.05093825

Note that we also supplied the conjugate function of $f$, defined as

\[f^*(y) = \sup_x \{yx - f(x)\},\]

which allows us to use the dual gap as a stopping criterion (see our paper for a derivation). Specifying the conjugate function is optional, and the solver will fall back to using the primal and dual residuals if it is not specified.
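For the quadratic loss used here, the conjugate has a simple closed form: the supremum of $yx - (1/2)x^2$ over $x$ is attained at $x = y$, giving

\[f^*(y) = \sup_x \{yx - (1/2)x^2\} = (1/2)y^2,\]

which matches the fconj defined above.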


This page was generated using Literate.jl.