# StructuredOptimization.jl

`StructuredOptimization.jl` is a high-level modeling language
whose syntax closely mirrors the mathematical formulation of an optimization problem.

This user-friendly interface acts as a parser built on top of three different packages:

* [`ProximalOperators.jl`](https://github.com/kul-forbes/ProximalOperators.jl)

* [`AbstractOperators.jl`](https://github.com/kul-forbes/AbstractOperators.jl)

* [`ProximalAlgorithms.jl`](https://github.com/kul-forbes/ProximalAlgorithms.jl)

`StructuredOptimization.jl` can handle large-scale convex and nonconvex problems with nonsmooth cost functions.

It supports optimization over complex variables as well.
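
For instance, a regularized least squares problem with a complex unknown might be written as in the following sketch (the `Variable(Complex{Float64}, n)` constructor and all problem data here are illustrative assumptions, not taken from this README):

```julia
# Sketch (assumed API): least squares over a complex variable,
# e.g. estimating a complex frequency response from linear measurements.
using StructuredOptimization

n, m = 64, 32
A = randn(m, n) + im*randn(m, n)   # complex measurement matrix
y = randn(m) + im*randn(m)         # complex data

x = Variable(Complex{Float64}, n)  # complex optimization variable (assumed constructor)
@minimize ls(A*x - y) + 1e-2*norm(x, 1)
```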

## Installation

From the Julia command line, hit `Pkg.clone("https://github.com/nantonel/StructuredOptimization.jl.git")`.
Once the package is installed, you can update it along with the others by issuing `Pkg.update()` at the command line.

## Usage

A *least absolute shrinkage and selection operator* (LASSO) problem can be solved with only a few lines of code:

```julia
julia> using StructuredOptimization

julia> n, m = 100, 10; # define problem size

julia> A, y = randn(m,n), randn(m); # random problem data

julia> x = Variable(n); # initialize optimization variable

julia> λ = 1e-2*norm(A'*y,Inf); # set the regularization parameter λ

julia> @minimize ls( A*x - y ) + λ*norm(x, 1); # formulate and solve the problem

```
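
After `@minimize` returns, the result is stored in the optimization variable; `~x` is assumed here as the accessor that reads the solution array back out (a sketch continuing the session above):

```julia
julia> x_sol = ~x;        # read the solution array out of the variable (assumed accessor)

julia> norm(A*x_sol - y)  # inspect the residual of the fit
```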

See the [documentation]() for more details about the types of problems `StructuredOptimization.jl` can handle, and check out the [demos]() for some examples.