# Using Julia and Convex.jl

## Installing Julia

The easiest way to run Julia locally for small bits of code is to use notebooks. You can follow the tutorial here to set up your local environment.

### VSCode

I prefer the VSCode editor, which has a Julia plugin. Here are the getting started instructions. As with a notebook, you can run individual lines of code or cells (delimited by double comments, `##`). The keybindings for this are below.

- Julia: Execute Code in REPL and Move: Shift+Enter
- Julia: Execute Code in REPL: Ctrl+Enter
- Julia: Execute Code Cell in REPL: Alt+Enter
- Julia: Execute Code Cell in REPL and Move: Alt+Shift+Enter
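Concretely, a script broken into `##` cells might look like the following (the contents are purely illustrative):

```julia
## Cell 1: set up some data
xs = collect(1:5)

## Cell 2: compute a summary (run just this cell with Alt+Enter)
total = sum(xs)
println(total)   # prints 15
```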

### Why Julia

Steven Johnson explains it better than I can, but I'll add that the mathematical optimization community is coalescing quickly around Julia. The optimization problems you encounter in practice can be massive, especially if the problem depends on some collected real-world data. Julia will better equip you to tackle these problems, engage with the research community building the next generation of optimization tools, and contribute to the open source tooling.

## Using Convex.jl

First, you'll need to install Convex.jl and at least one of the solvers. From the Julia REPL, use `]` to bring up the package prompt and type

```
(v1.8) pkg> add Convex SCS
```

As an example (from the Convex.jl docs) consider the constrained least squares problem

$$\begin{aligned} & \mathrm{minimize} && \|Ax - b\|_2^2 \\ & \mathrm{subject\ to} && x \ge 0 \end{aligned}$$

with variable $$x \in \mathbf{R}^n$$ and problem data $$A \in \mathbf{R}^{m \times n}$$, $$b \in \mathbf{R}^m$$. The following code solves this problem in Convex.jl.

```julia
# Make the Convex.jl module available
using Convex, SCS

# Generate random problem data
m = 4;  n = 5
A = randn(m, n); b = randn(m, 1)

# Create a (column vector) variable of size n x 1.
x = Variable(n)

# The problem is to minimize ||Ax - b||^2 subject to x >= 0
# This can be done by: minimize(objective, constraints)
problem = minimize(sumsquares(A * x - b), [x >= 0])

# Solve the problem by calling solve!
solve!(problem, SCS.Optimizer)

# If you don't want output, instead call
solve!(problem, SCS.Optimizer, silent_solver = true)

# Check the status of the problem
problem.status # :Optimal, :Infeasible, :Unbounded etc.

# Get the optimum value
problem.optval

# Get the value of x
x.value
```

You should run this example! If you're using VSCode, use Shift+Enter to step through and execute each line of code.
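If you want to sanity-check the result programmatically, something like the following sketch works (it re-solves the same problem; the `1e-4` tolerance is an arbitrary allowance for solver accuracy):

```julia
using Convex, SCS, LinearAlgebra

m = 4; n = 5
A = randn(m, n); b = randn(m)
x = Variable(n)
problem = minimize(sumsquares(A * x - b), [x >= 0])
solve!(problem, SCS.Optimizer, silent_solver = true)

x_opt = vec(x.value)
@assert all(x_opt .>= -1e-4)       # constraint holds up to solver tolerance
residual = norm(A * x_opt - b)^2   # should be close to problem.optval
```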

### Variables

Problem variables in Convex.jl are declared with the `Variable` constructor. Dimensions must be specified at declaration. In addition, you can specify properties like positivity (this will be important later).

```julia
# Scalar variable
x = Variable()

# Column vector variable
x = Variable(5)

# Column vector variable with positive entries
x = Variable(5, Positive())

# Matrix variable
X = Variable(4, 6)

# Semidefinite matrix variable
X = Semidefinite(4)
```
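Variables carry their size and any declared properties, which you can inspect. A small sketch, assuming the declarations above:

```julia
using Convex

x = Variable(5, Positive())
size(x)             # (5, 1) -- Convex.jl treats vectors as n-by-1 columns
X = Semidefinite(4)
size(X)             # (4, 4)
```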

### Expressions

Convex.jl uses sign information and a set of atoms to build expressions with known convexity. For example

```julia
objective = sumsquares(A * x - b)
```

in the code above. Much of Julia's core syntax is overloaded, so for example,

```julia
norm(A * x - b)
```

will be recognized as convex. However, non-DCP-compliant expressions will not work. Thus there are a number of predefined operations (atoms) which you may need to use for homework problems.
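One way to see what Convex.jl infers is its `vexity` function, which reports the curvature deduced for an expression. A small sketch:

```julia
using Convex

x = Variable(3)
vexity(sumsquares(x))   # convex
vexity(sum(x))          # affine
vexity(-norm(x, 2))     # concave
```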

### Constraints

Constraints, which are relations between expressions, are declared with the comparison operators >=, <=, and ==. For example, above we had the constraint

```julia
constraint = x >= 0
```

### Problem

Finally, problems are declared with a sense (minimize, maximize, or satisfy), an objective, and optionally a list of constraints:

```julia
problem = minimize(objective, [constraint1, constraint2, ...])
```
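The other senses work the same way. For example, here is a tiny `maximize` problem with illustrative data, whose optimum is 2:

```julia
using Convex, SCS

x = Variable(2)
problem = maximize(sum(x), [x <= 1, x >= 0])
solve!(problem, SCS.Optimizer, silent_solver = true)
problem.optval   # approximately 2.0
```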

You can also create a problem then add the constraints:

```julia
# No constraints given
problem = minimize(objective)
problem.constraints += [constraint1, constraint2, ...]
solve!(problem, solver)
```
JuMP.jl is another modeling language in Julia; unlike Convex.jl, it does not support DCP or automatic conversion of problems into a standard conic form. Both rely on MathOptInterface.jl as the backend and live under the same JuMP umbrella organization (see jump.dev). JuMP also prioritizes scalar-based constructions over linear algebraic ones. More about the differences is here.
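For comparison, here is a sketch of the same constrained least squares problem written in JuMP's scalar-friendly style (the model and variable names are illustrative; MathOptInterface bridges the quadratic objective into conic form for SCS):

```julia
using JuMP, SCS

m = 4; n = 5
A = randn(m, n); b = randn(m)

model = Model(SCS.Optimizer)
set_silent(model)
@variable(model, x[1:n] >= 0)
@objective(model, Min, sum((A * x .- b) .^ 2))
optimize!(model)
value.(x)   # optimal x as a plain vector
```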