Sum of squares decompositions


YALMIP has a built-in module for sum of squares calculations. In its most basic formulation, a sum of squares (SOS) problem takes a polynomial p(x) and tries to find a real-valued vector function h(x) such that p(x) = h^T(x)h(x) (or, equivalently, p(x) = v^T(x)Qv(x) with Q positive semidefinite), hence proving non-negativity of the polynomial p(x). Read more about sum of squares decompositions in [Parrilo].
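
To fix ideas with a small hand-worked example (not taken from the YALMIP distribution): the polynomial p(x) = x^2 - 2x + 2 can be written as (x-1)^2 + 1^2, i.e. h(x) = [x-1; 1], or equivalently as v^T(x)Qv(x) with v(x) = [1; x] and Q = [2 -1; -1 1], which is positive definite.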

Options

In the examples below, we have used the default settings when solving the SOS problems, but there are a couple of options that can be changed to alter the computations. (A detailed text on the sum of squares solver will hopefully be available soon; contact the author for a possible pre-print.)

sdpsettings('sos.model',[0|1|2]) The SDP formulation of a SOS problem is not unique but can be done in several ways. YALMIP supports two versions, here called the image and kernel representations. If you set the value to 1, a kernel representation will be used, while 2 will result in an image representation. If the option is set to the default value 0, YALMIP will automatically select the representation (kernel by default).

The kernel representation will most often give a smaller and numerically more robust semidefinite program, but cannot be used for nonlinearly parameterized SOS programs (i.e. problems resulting in BMIs etc.) or problems involving second-order cone, LMI or integrality constraints on parametric variables.

sdpsettings('sos.newton',[0|1]) To reduce the size of the resulting SDP, a Newton polytope reduction algorithm is applied by default. For efficiency, you must have CDDMEX or GLPKMEX installed.
sdpsettings('sos.congruence',[0|1]) A useful feature in YALMIP is the use of symmetry of the polynomial to block-diagonalize the SDP. This can make a huge difference for some SOS problems and is applied by default.
sdpsettings('sos.inconsistent',[0|1]) This option can be used to further reduce the size of the SDP. It is turned off by default since it typically does not give any major reduction in problem size once the Newton polytope reduction has been applied. In some situations, it might however be useful to use this strategy instead of the linear programming based Newton reduction (it cannot suffer from numerical problems and does not require an efficient LP solver), and for some problems it can reduce models that are minimal in the Newton polytope sense, leading to a more numerically robust solution of the resulting SDP. A sketch of how these options are supplied to solvesos is shown below.
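
As an illustration of the call syntax (using the same polynomial as in the examples below), the options are collected in an sdpsettings structure and passed as the third argument to solvesos. The settings here force a kernel representation and keep the Newton polytope and congruence reductions turned on.

x = sdpvar(1,1);y = sdpvar(1,1);
p = (1+x)^4 + (1-y)^2;
F = set(sos(p));
ops = sdpsettings('sos.model',1,'sos.newton',1,'sos.congruence',1);
solvesos(F,[],ops);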

Simple sum of squares problems

The following lines of code present some typical manipulations when working with SOS calculations. Note the use of the commands sos (to define a SOS constraint) and solvesos (to solve the problem).

x = sdpvar(1,1);y = sdpvar(1,1);
p = (1+x)^4 + (1-y)^2;
F = set(sos(p));
solvesos(F);

The sum of squares decomposition is extracted with the command sosd.

h = sosd(F);
sdisplay(h)

  ans = 
   '-1.203-1.9465x+0.22975y-0.97325x^2'
   '0.7435-0.45951x-0.97325y-0.22975x^2'
   '0.0010977+0.00036589x+0.0010977y-0.0018294x^2'

To see if the decomposition was successful, we simply calculate p(x) - h^T(x)h(x), which should be 0. However, due to numerical errors, the difference will not be exactly zero. A useful command then is clean. Using clean, we remove all monomials with coefficients smaller than, e.g., 1e-6.

clean(p-h'*h,1e-6)

 ans =
     0

The decomposition p(x) = v^T(x)Qv(x) can also be obtained easily.

x = sdpvar(1,1);y = sdpvar(1,1);
p = (1+x)^4 + (1-y)^2;
F = set(sos(p));
[sol,v,Q] = solvesos(F);
clean(p-v'*Q*v,1e-6)
 ans =
     0
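
Since there is only one SOS constraint here, Q is returned as a numerical matrix (as the call clean(p-v'*Q*v,1e-6) above already relies on), so its positive semidefiniteness can be verified directly, for instance by inspecting its smallest eigenvalue. This is just a small sanity check added for illustration.

% Smallest eigenvalue of the Gramian, should be non-negative (up to numerical precision)
min(eig(Q))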

Parameterized problems

The involved polynomials can be parameterized, and we can optimize the parameters under the constraint that the polynomial is a sum of squares. As an example, we can calculate a lower bound on a polynomial.

The second argument to solvesos can be used for an objective function to be minimized. Here, we maximize the lower bound, i.e. minimize the negative lower bound. The third argument is an options structure, while the fourth argument is a vector containing all parametric variables in the polynomial (in this example we only have one parametric variable, but several can be defined by simply concatenating them).

sdpvar x y lower
p = (1+x*y)^2-x*y+(1-y)^2;
F = set(sos(p-lower));
solvesos(F,-lower,[],lower);
double(lower)

 ans =
     0.75

YALMIP automatically declares all variables appearing in the objective function and in non-SOS constraints as parametric variables. Hence, the following code is equivalent.

solvesos(F,-lower);
double(lower)

 ans =
     0.75
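
If a problem contains several parametric variables, they are supplied as a concatenated vector in the fourth argument. The following sketch (the polynomial and the parameters t1 and t2 are made up for illustration) minimizes t1+t2 subject to x^4+t1*x^2+t2 being a sum of squares; the optimal value is -1, attained by (x^2-1)^2. Strictly speaking, the fourth argument is redundant here since t1 and t2 appear in the objective, but it shows the syntax.

sdpvar x t1 t2
p = x^4 + t1*x^2 + t2;
F = set(sos(p));
solvesos(F,t1+t2,[],[t1;t2]);
double(t1+t2)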

Multiple SOS constraints can also be used. Consider the following problem of finding the smallest possible t such that the polynomials t(1+xy)^2 - xy + (1-y)^2 and (1-xy)^2 + xy + t(1+y)^2 are both sums of squares.

sdpvar x y t
p1 = t*(1+x*y)^2-x*y+(1-y)^2;
p2 = (1-x*y)^2+x*y+t*(1+y)^2;
F = set(sos(p1))+set(sos(p2));
solvesos(F,t);
double(t)

 ans = 

   0.2500
sdisplay(sosd(F(1)))
ans = 
    '-1.102+0.95709y+0.14489xy'
    '-0.18876-0.28978y+0.47855xy'
sdisplay(sosd(F(2)))
ans = 
    '-1.024-0.18051y+0.76622xy'
    '-0.43143-0.26178y-0.63824xy'
    '0.12382-0.38586y+0.074568xy'
 

Polynomial parameterizations

A special feature of the sum of squares package in YALMIP is the possibility to work with nonlinear SOS parameterizations, i.e. SOS problems resulting in PMIs (polynomial matrix inequalities) instead of LMIs. The following piece of code solves a nonlinear control synthesis problem using sum of squares. Note that this requires the solver PENBMI.

clear all
yalmip('clear');

% States...
sdpvar x1 x2
x = [x1;x2];

% Non-quadratic Lyapunov z'Pz 
z = [x1;x2;x1^2];
P = sdpvar(3,3);
V = z'*P*z;

% Non-linear state feedback
v = [x1;x2;x1^2];
K = sdpvar(1,3);
u = K*v;

% System x' = f(x)+Bu
f = [1.5*x1^2-0.5*x1^3-x2; 3*x1-x2];
B = [0;1];

% Closed loop system, u = Kv
fc = f+B*K*v;

% Stability and performance constraint dVdt < -x'x-u'u
% NOTE : This polynomial is bilinear in P and K
F = set(sos(-jacobian(V,x)*fc-x'*x-u'*u));

% P is positive definite, bound P and K for numerical reasons
F = F + set(P>0)+set(25>P(:)>-25)+set(25>K>-25);

% Minimize trace(P)
% Parametric variables P and K automatically detected
% by YALMIP since they are both constrained
solvesos(F,trace(P));
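
If PENBMI succeeds, the numerical Lyapunov matrix and feedback gain can be recovered with double. The following is only a usage sketch; Pval and Kval are names introduced here for illustration, and the actual numbers depend on the solver run.

% Extract the computed Lyapunov matrix and controller gain
Pval = double(P);
Kval = double(K);
% Display the synthesized nonlinear feedback u = Kval*[x1;x2;x1^2]
sdisplay(clean(Kval*v,1e-6))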