
Commit fe38ee4: changed README
1 parent a3aab8f

3 files changed (+31, -62 lines)

README.md (+24, -61)

````diff
@@ -1,82 +1,45 @@
 # StructuredOptimization.jl
 
-Convex and nonconvex regularized least squares in Julia.
+`StructuredOptimization.jl` is a high-level modeling language
+that utilizes a syntax that is very close to
+the mathematical formulation of an optimization problem.
 
-## Installation
-
-From the Julia command line hit `Pkg.clone("https://github.com/nantonel/StructuredOptimization.jl.git")`.
-Once the package is installed you can update it along with the others issuing
-`Pkg.update()` in the command line.
-
-## Usage
+This user-friendly interface
+acts as a parser to utilize
+three different packages:
 
-With StructuredOptimization.jl you can solve problems of the form
+* [`ProximalOperators.jl`](https://github.com/kul-forbes/ProximalOperators.jl)
 
-```
-minimize (1/2)*||L(x) - b||^2 + g(x)
-```
+* [`AbstractOperators.jl`](https://github.com/kul-forbes/AbstractOperators.jl)
 
-Here `L` is a linear operator, `b` is an `Array` of data, and `g` is a regularization
-taken from [ProximalOperators.jl](https://github.com/kul-forbes/ProximalOperators.jl).
-You can use any `AbstractMatrix` object to describe `L`, or any matrix-like object
-implementing the matrix-vector product and transpose operations
-(see for example [LinearOperators.jl](https://github.com/JuliaSmoothOptimizers/LinearOperators.jl)).
-Alternatively, you can provide the direct and adjoint mappings in the form of `Function` objects.
+* [`ProximalAlgorithms.jl`](https://github.com/kul-forbes/ProximalAlgorithms.jl)
 
-```julia
-x, info = solve(L, b, g) # L is a matrix-like object
-x, info = solve(Op, OpAdj, b, g, x0) # Op and OpAdj are of type Function
-```
+`StructuredOptimization.jl` can handle large-scale convex and nonconvex problems with nonsmooth cost functions.
 
-The dimensions of `b` must match the ones of `L` or `Op` and `OpAdj`.
-Argument `x0` (the initial iterate for the algorithm) is compulsory when
-mappings `Op` and `OpAdj` are provided.
+It supports complex variables as well.
 
-## Example: sparse signal reconstruction
+## Installation
 
-Consider the problem of recovering a sparse signal, observed through a measurement
-matrix `A` which is orthogonal. A random instance of such problem is generated as
-follows, where measurements are affected by Gaussian noise:
+From the Julia command line hit `Pkg.clone("https://github.com/nantonel/StructuredOptimization.jl.git")`.
+Once the package is installed you can update it along with the others by issuing `Pkg.update()` in the command line.
 
-```julia
-m, n, k = 1024, 4096, 160 # problem parameters
-B = randn(m, n)
-Q, R = qr(B')
-A = Q' # measurement matrix
-x_orig = sign(randn(n))
-J = randperm(n)
-x_orig[J[k+1:end]] = 0 # generate sparse -1/+1 signal
-sigma = 1e-2
-y = A*x_orig + sigma*randn(m) # add noise to measurement
-```
+## Usage
 
-One way to approximately reconstruct `x_orig` is to solve an L1-regularized
-least squares problem using the `NormL0` function, as in the following snippet
-(parameters here are taken from [this paper](http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=4407767)):
+A *least absolute shrinkage and selection operator* (LASSO) problem can be solved with only a few lines of code:
 
 ```julia
-using StructuredOptimization
-using ProximalOperators
-lambda_max = norm(A'*y, Inf)
-lambda = 0.01*lambda_max # regularization parameter
-x_L1, info = solve(A, y, normL1(lambda), zeros(n))
-```
+julia> using StructuredOptimization
 
-Alternatively, one can use the `indBallL0` regularizer to look for the best
-approximation with a given number of nonzero coefficients:
+julia> n, m = 100, 10; # define problem size
 
-```julia
-x_L0c, info = solve(A, y, indBallL0(200), zeros(n))
-```
+julia> A, y = randn(m,n), randn(m); # random problem data
 
-## References
+julia> x = Variable(n); # initialize optimization variable
 
-The algorithms implemented in StructuredOptimization.jl are described in the following papers.
+julia> λ = 1e-2*norm(A'*y,Inf); # define λ
 
-1. L. Stella, A. Themelis, P. Patrinos, “Forward-backward quasi-Newton methods for nonsmooth optimization problems,” [arXiv:1604.08096](http://arxiv.org/abs/1604.08096) (2016).
+julia> @minimize ls( A*x - y ) + λ*norm(x, 1); # solve the problem
 
-2. A. Themelis, L. Stella, P. Patrinos, “Forward-backward envelope for the sum of two nonconvex functions: Further properties and nonmonotone line-search algorithms,” [arXiv:1606.06256](http://arxiv.org/abs/1606.06256) (2016).
-
-## Credits
+```
 
-StructuredOptimization.jl is developed by [Lorenzo Stella](https://lostella.github.io) and [Niccolò Antonello](http://homes.esat.kuleuven.be/~nantonel/) at [KU Leuven, ESAT/Stadius](https://www.esat.kuleuven.be/stadius/).
+See the [documentation]() for more details about the type of problems `StructuredOptimization.jl` can handle and the [demos]() to check out some examples.
````
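For readers who want to try the new README example directly, here it is assembled into a standalone script. This is a minimal sketch: `Variable`, `ls`, and `@minimize` are the API shown in the added lines above, while the final `~x` accessor for reading the solution is an assumption (it is not part of this commit), and the `LinearAlgebra` import is only needed on Julia versions newer than the `Pkg.clone` era this commit targets.

```julia
# Sketch: the new README usage example as a runnable script.
using StructuredOptimization
using LinearAlgebra: norm     # Base.norm sufficed on the Julia this commit targets

n, m = 100, 10                # problem size: 10 measurements, 100 unknowns
A, y = randn(m, n), randn(m)  # random problem data
x = Variable(n)               # the optimization variable

λ = 1e-2 * norm(A' * y, Inf)  # regularization weight

# LASSO: minimize (1/2)*||A*x - y||^2 + λ*||x||_1
@minimize ls(A*x - y) + λ*norm(x, 1)

x_sol = ~x                    # assumed accessor: `~` reads a Variable's current value
```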

docs/src/index.md (+6)

````diff
@@ -16,6 +16,12 @@ three different packages:
 
 `StructuredOptimization.jl` can handle large-scale convex and nonconvex problems with nonsmooth cost functions. It supports complex variables as well. See the demos and the [Quick tutorial guide](@ref).
 
+## Installation
+
+From the Julia command line hit `Pkg.clone("https://github.com/nantonel/StructuredOptimization.jl.git")`.
+Once the package is installed you can update it along with the others by issuing
+`Pkg.update()` in the command line.
+
 ## Citing
 
 If you use `StructuredOptimization.jl` for published work, we encourage you to cite:
````
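The installation steps added to the docs, restated as they would be entered at the (pre-1.0) Julia REPL:

```julia
julia> Pkg.clone("https://github.com/nantonel/StructuredOptimization.jl.git")

julia> Pkg.update()  # later: update it along with the other installed packages
```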

docs/src/tutorial.md (+1, -1)

````diff
@@ -20,7 +20,7 @@ The *least absolute shrinkage and selection operator* (LASSO) belongs to this class
 
 Here the squared norm $\tfrac{1}{2} \| \mathbf{A} \mathbf{x} - \mathbf{y} \|^2$ is a *smooth* function $f$, whereas the $l_1$-norm is a *nonsmooth* function $g$.
 
-This problem can be solved using `StructuredOptimization.jl` using only few lines of code:
+This problem can be solved with only a few lines of code:
 
 ```julia
 julia> using StructuredOptimization
````
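The hunk's context describes the objective's smooth/nonsmooth split; written out from the tutorial's surrounding text (with $\lambda$ as the regularization weight), the LASSO problem being referenced is:

$$
\underset{\mathbf{x}}{\operatorname{minimize}}\;\;
\underbrace{\tfrac{1}{2}\,\|\mathbf{A}\mathbf{x}-\mathbf{y}\|^2}_{f(\mathbf{x})\ \text{(smooth)}}
\;+\;
\underbrace{\lambda\,\|\mathbf{x}\|_1}_{g(\mathbf{x})\ \text{(nonsmooth)}}
$$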
