Homework 7: Convex programs


    1. Moving averages. There are many ways to model the relationship between an input sequence $\{u_1, u_2, \ldots\}$ and an output sequence $\{y_1, y_2, \ldots\}$. In class, we saw the moving average (MA) model, where each output is approximated by a linear combination of the $k$ most recent inputs:

MA:    $y_t \approx b_1 u_t + b_2 u_{t-1} + \cdots + b_k u_{t-k+1}$

We then used least squares to find the coefficients $b_1, \ldots, b_k$. What if we didn't have access to the inputs at all, and we were asked to predict future $y$ values based only on the previous $y$ values? One way to do this is by using an autoregressive (AR) model, where each output is approximated by a linear combination of the $\ell$ most recent outputs (excluding the present one):

AR:    $y_t \approx a_1 y_{t-1} + a_2 y_{t-2} + \cdots + a_\ell y_{t-\ell}$

Of course, if the inputs contain pertinent information, we shouldn't expect the AR method to outperform the MA method!

    a) Using the same dataset from class, uy_data.csv, plot the true $y$, and on the same axes, also plot the estimated $\hat y$ using the MA model and the estimated $\hat y$ using the AR model. Use $k = \ell = 5$ (five coefficients) for both models. To quantify the difference between estimates, also compute $\|y - \hat y\|$ for both cases.

    b) Yet another possible modeling choice is to combine both AR and MA. Unsurprisingly, this is called the autoregressive moving average (ARMA) model:

ARMA:    $y_t \approx a_1 y_{t-1} + a_2 y_{t-2} + \cdots + a_\ell y_{t-\ell} + b_1 u_t + b_2 u_{t-1} + \cdots + b_k u_{t-k+1}$

Solve the problem once more, this time using an ARMA model with $k = \ell = 1$. Plot $y$ and $\hat y$ as before, and also compute the error $\|y - \hat y\|$.
Note: For the problems in this question you don't need to use optimization codes; you can just use the backslash (\) notation for solving linear least squares.
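Here is a minimal sketch of how the MA and AR fits might be set up with backslash. The column names u and y in uy_data.csv are an assumption; adjust them to match the actual file.

```julia
# Least-squares fits for the MA and AR models via backslash.
using CSV, DataFrames, LinearAlgebra

df = CSV.read("uy_data.csv", DataFrame)   # assumed columns: u, y
u, y = df.u, df.y
T, k = length(y), 5

# MA: the row for time t holds the k most recent inputs u_t, ..., u_{t-k+1}.
A_ma = [u[t-j+1] for t in k:T, j in 1:k]
b = A_ma \ y[k:T]                         # coefficients b_1, ..., b_k
println("MA error: ", norm(y[k:T] - A_ma*b))

# AR: the row for time t holds the previous k outputs y_{t-1}, ..., y_{t-k}.
A_ar = [y[t-j] for t in k+1:T, j in 1:k]
a = A_ar \ y[k+1:T]                       # coefficients a_1, ..., a_k
println("AR error: ", norm(y[k+1:T] - A_ar*a))
```

For the ARMA fit in part (b) with $k = \ell = 1$, the data matrix simply stacks one lagged-output column next to one input column, e.g. [y[1:T-1] u[2:T]], fit against y[2:T].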


2. The Huber loss. In statistics, we frequently encounter data sets containing outliers, which are bad data points arising from experimental error or abnormally high noise. Consider for example the following data set consisting of 15 pairs $(x, y)$.

x |    1 |    2 |  3 |    4 |    5 |    6 |    7 |    8 |    9 |   10 |   11 | 12 |   13 |   14 |   15
y | 6.31 | 3.78 | 24 | 1.71 | 2.99 | 4.53 | 2.11 | 3.88 | 4.67 | 4.25 | 2.06 | 23 | 1.58 | 2.17 | 0.02

The y values corresponding to x = 3 and x = 12 are outliers because they are far outside the expected range of values for the experiment.
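For use in the code sketches below, here is the data transcribed as Julia arrays:

```julia
# The 15 data points from the table above.
x = collect(1.0:15.0)
y = [6.31, 3.78, 24.0, 1.71, 2.99, 4.53, 2.11, 3.88, 4.67, 4.25,
     2.06, 23.0, 1.58, 2.17, 0.02]
```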

    a) Compute the best linear fit to the data using an $\ell_2$ cost (least squares). In other words, we are looking for the $a$ and $b$ that minimize the expression:

$\ell_2$ cost:    $\displaystyle\sum_{i=1}^{15} (y_i - ax_i - b)^2$

Repeat the linear fit computation, but this time exclude the outliers from your data set. On a single plot, show the data points and both linear fits. Explain the difference between both fits.
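A minimal sketch of both $\ell_2$ fits using backslash, assuming the x and y arrays defined above:

```julia
# Least-squares fit y ≈ a*x + b, with and without the outliers at x = 3, 12.
using LinearAlgebra

A = [x ones(15)]                 # columns: x values and the all-ones vector
a, b = A \ y                     # fit on the full data

keep = setdiff(1:15, [3, 12])    # indices with the outliers removed
a2, b2 = A[keep, :] \ y[keep]

println("with outliers:    a = $a, b = $b")
println("without outliers: a = $a2, b = $b2")
```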



    b) It's not always practical to remove outliers from the data manually, so we'll investigate ways of automatically dealing with outliers by changing our cost function. Find the best linear fit again (including the outliers), but this time use the $\ell_1$ cost function:

$\ell_1$ cost:    $\displaystyle\sum_{i=1}^{15} |y_i - ax_i - b|$

Include a plot containing the data and the best $\ell_1$ linear fit. Does the $\ell_1$ cost handle outliers better or worse than least squares? Explain why.
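A sketch of the $\ell_1$ fit posed as a linear program, using the standard trick of bounding each absolute residual by an auxiliary variable. The choice of Clp as the LP solver is an assumption; any LP solver that works with JuMP will do.

```julia
# ℓ1 linear fit as an LP: minimize sum(t) with t_i ≥ |y_i - a*x_i - b|.
using JuMP, Clp

m = Model(Clp.Optimizer)
@variable(m, a)
@variable(m, b)
@variable(m, t[1:15] >= 0)
@constraint(m, [i in 1:15],   y[i] - a*x[i] - b <= t[i])
@constraint(m, [i in 1:15], -(y[i] - a*x[i] - b) <= t[i])
@objective(m, Min, sum(t))
optimize!(m)
println("ℓ1 fit: a = ", value(a), ", b = ", value(b))
```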

    c) Another approach is to use an $\ell_2$ penalty for points that are close to the line but an $\ell_1$ penalty for points that are far away. Specifically, we'll use something called the Huber loss, defined as:

$\phi(x) = \begin{cases} x^2 & \text{if } -M \le x \le M \\ 2M|x| - M^2 & \text{otherwise} \end{cases}$

Here, $M$ is a parameter that determines where the quadratic function transitions to a linear function. The plot below shows what the Huber loss function looks like for $M = 1$.


[Figure: the Huber loss $\phi(x)$ for $M = 1$, plotted over $-3 \le x \le 3$; the vertical axis runs from 0 to 6.]

The formula above is simple, but not in a form that is useful for us. As it turns out, we can evaluate the Huber loss function at any point x by solving the following convex QP instead:
$\phi(x) \;=\; \min_{v,\,w}\ \left\{\, w^2 + 2Mv \;:\; |x| \le w + v,\ \ v \ge 0,\ \ w \le M \,\right\}$

Verify this fact by solving the above QP (with $M = 1$) for many values of $x$ in the interval $-3 \le x \le 3$ and reproducing the plot above. Finally, find the best linear fit to our data using a Huber loss with $M = 1$ and produce a plot showing your fit. The cost function is:


Huber loss:    $\displaystyle\sum_{i=1}^{15} \phi(y_i - ax_i - b)$
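A sketch of both steps, assuming the x and y arrays defined earlier. Ipopt is used here since it also appears in problem 3, but any solver that handles convex QPs works.

```julia
using JuMP, Ipopt

# Evaluate φ(xval) by solving the QP above (default M = 1).
function huber_qp(xval; M = 1.0)
    m = Model(Ipopt.Optimizer)
    set_silent(m)
    @variable(m, v >= 0)
    @variable(m, w <= M)
    @constraint(m, abs(xval) <= w + v)
    @objective(m, Min, w^2 + 2M*v)
    optimize!(m)
    return objective_value(m)
end

xs = range(-3, 3, length = 121)
phis = [huber_qp(xi) for xi in xs]   # plot xs vs phis to reproduce the figure

# Huber linear fit: one (v_i, w_i) pair per residual y_i - a*x_i - b.
M = 1.0
m = Model(Ipopt.Optimizer)
set_silent(m)
@variable(m, a)
@variable(m, b)
@variable(m, v[1:15] >= 0)
@variable(m, w[1:15] <= M)
@constraint(m, [i in 1:15],   y[i] - a*x[i] - b <= w[i] + v[i])
@constraint(m, [i in 1:15], -(y[i] - a*x[i] - b) <= w[i] + v[i])
@objective(m, Min, sum(w[i]^2 + 2M*v[i] for i in 1:15))
optimize!(m)
println("Huber fit: a = ", value(a), ", b = ", value(b))
```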




    3. Heat pipe design. A heated fluid at temperature $T$ (degrees above ambient temperature) flows in a pipe with fixed length and circular cross section with radius $r$. A layer of insulation, with thickness $w$, surrounds the pipe to reduce heat loss through the pipe walls ($w$ is much smaller than $r$). The design variables in this problem are $T$, $r$, and $w$.

The energy cost due to heat loss is roughly equal to $\alpha_1 T r / w$. The cost of the pipe, which has a fixed wall thickness, is approximately proportional to the total material, i.e., it is given by $\alpha_2 r$. The cost of the insulation is also approximately proportional to the total insulation material, i.e., roughly $\alpha_3 r w$. The total cost is the sum of these three costs.

The heat flow down the pipe is entirely due to the flow of the fluid, which has a fixed velocity, i.e., it is given by $\alpha_4 T r^2$. The constants $\alpha_i$ are all positive, as are the variables $T$, $r$, and $w$.

Now the problem: maximize the total heat flow down the pipe, subject to an upper limit $C_{\max}$ on total cost, and the constraints

$T_{\min} \le T \le T_{\max}, \qquad r_{\min} \le r \le r_{\max}, \qquad w_{\min} \le w \le w_{\max}, \qquad w \le 0.1r$

    a) Express this problem as a geometric program, and convert it into a convex optimization problem.
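As a sketch of one standard conversion (the substitution and grouping below are my own working, not part of the problem statement): writing $T = e^x$, $r = e^y$, $w = e^z$ turns every monomial into the exponential of a linear function, so maximizing $\alpha_4 T r^2$ amounts to maximizing the linear function $x + 2y$, and the posynomial cost constraint becomes a convex log-sum-exp constraint:

```latex
\begin{align*}
\text{maximize}\quad   & x + 2y \\
\text{subject to}\quad & \log\left(\alpha_1 e^{x+y-z} + \alpha_2 e^{y}
                          + \alpha_3 e^{y+z}\right) \le \log C_{\max},\\
                       & \log T_{\min} \le x \le \log T_{\max},\qquad
                         \log r_{\min} \le y \le \log r_{\max},\\
                       & \log w_{\min} \le z \le \log w_{\max},\qquad
                         z \le \log(0.1) + y.
\end{align*}
```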



    b) Consider a simple instance of this problem, where $C_{\max} = 500$ and $\alpha_1 = \alpha_2 = \alpha_3 = \alpha_4 = 1$. Also assume for simplicity that each variable has a lower bound of zero and no upper bound. Solve this problem using JuMP. Use the Ipopt solver and the command @NLconstraint(...) to specify nonlinear constraints such as log-sum-exp functions. Have your code print the optimal values of $T$, $r$, and $w$, as well as the optimal objective value.
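A minimal sketch of the convex form in JuMP (the variable names x, y, z for $\log T$, $\log r$, $\log w$ are my own notation):

```julia
using JuMP, Ipopt

m = Model(Ipopt.Optimizer)
@variable(m, x)   # log T
@variable(m, y)   # log r
@variable(m, z)   # log w

# Cost constraint T*r/w + r + r*w <= 500 in log-sum-exp form:
@NLconstraint(m, log(exp(x + y - z) + exp(y) + exp(y + z)) <= log(500))
# w <= 0.1*r is linear in log space:
@constraint(m, z <= log(0.1) + y)
# The heat flow T*r^2 has logarithm x + 2y, so maximize that:
@objective(m, Max, x + 2y)
optimize!(m)

T, r, w = exp(value(x)), exp(value(y)), exp(value(z))
println("T = $T, r = $r, w = $w, heat flow = ", T*r^2)
```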




























































