### Introduction

In this tutorial, we will:

- Use the Gibbs sampler to generate bivariate normal draws.
- Create side-by-side plots of the parameter paths.
- Calculate the drawn distribution's mean and variance-covariance matrix.

### Background

The *Gibbs sampler* is a popular Bayesian algorithm. It draws iteratively from the posterior conditional distributions rather than directly from the joint posterior distribution. This makes the Gibbs sampler particularly useful when the joint posterior is difficult to work with.

The Gibbs sampler works by restructuring the joint estimation problem as a series of smaller, easier estimation problems. For example, consider the case where the parameter vector can be broken into two blocks: $ \theta' = [\theta_{1}'\text{ }\theta_{2}'] $. The general bivariate Gibbs sampler can be broken down into a few simple steps:

- Set up the sampler specifications, including the number of iterations and the number of burn-in draws.
- Choose a starting value $\theta_1^{(0)}$.
- Draw $\theta_2^{(r)}$ from $p(\theta_2|y,\theta_1^{(r-1)})$.
- Draw $\theta_1^{(r)}$ from $p(\theta_1|y,\theta_2^{(r)})$.
- Repeat steps 3-4 for the desired number of iterations.
- Perform final calculations, including means, standard deviations, and plots.

In this particular tutorial, we consider a bivariate normal posterior distribution such that

$$ \begin{pmatrix} \theta_1\\ \theta_2 \end{pmatrix} \sim N\left[\begin{pmatrix} 0\\ 0 \end{pmatrix},\begin{pmatrix} 1 & \rho\\ \rho & 1 \end{pmatrix}\right] $$

where $\theta_1$ and $\theta_2$ are unknown parameters of the model, while $\rho$ is the known posterior correlation between $\theta_1$ and $\theta_2$.

Using the properties of the multivariate normal distribution, the conditional posteriors are

$$\theta_1|\theta_2, y \sim N(\rho\theta_2,\ 1-\rho^2) \stackrel{d}{=} \rho\theta_2 + \sqrt{1-\rho^2}\,N(0,1)$$

and

$$\theta_2|\theta_1, y \sim N(\rho\theta_1,\ 1-\rho^2) \stackrel{d}{=} \rho\theta_1 + \sqrt{1-\rho^2}\,N(0,1)$$
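These conditional distributions come from the standard conditioning property of the multivariate normal (stated here in general form as a reminder; the symbols $\mu_1, \mu_2, \sigma_1, \sigma_2$ are not part of this tutorial's model). If

$$\begin{pmatrix} \theta_1\\ \theta_2 \end{pmatrix} \sim N\left[\begin{pmatrix} \mu_1\\ \mu_2 \end{pmatrix},\begin{pmatrix} \sigma_1^2 & \rho\sigma_1\sigma_2\\ \rho\sigma_1\sigma_2 & \sigma_2^2 \end{pmatrix}\right]$$

then

$$\theta_1|\theta_2 \sim N\left(\mu_1 + \rho\frac{\sigma_1}{\sigma_2}(\theta_2 - \mu_2),\ \sigma_1^2(1-\rho^2)\right).$$

Setting $\mu_1 = \mu_2 = 0$ and $\sigma_1 = \sigma_2 = 1$ yields the two conditionals used in this tutorial.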

## Set Gibbs sampler options

Our first step is to set the specifications of the Gibbs sampler. We will keep 10,000 draws after discarding 1,000 burn-in draws, and we will set the known correlation $\rho$ equal to 0.6.

```
//Gibbs Sampler Specifications
//Known correlation parameter
rho = 0.6;
//Burn-in for the Gibbs sampler
burn_in = 1000;
//Draws to keep from sampler
keep_draws = 10000;
```

## Pre-initialize vectors to hold the draws

Next, we need to initialize the storage matrices for our parameters, $\theta_1$ and $\theta_2$. The GAUSS function `zeros` allows us to build zero matrices with specified numbers of rows and columns.

```
// Initialize the variables for the do loop
theta_1_do = zeros(burn_in + keep_draws, 1);
theta_2_do = zeros(burn_in + keep_draws, 1);
```

The last step before starting our Gibbs sampler is to set the initial values for our parameters. In this example, we will start with $\theta_1^{(1)} = \theta_2^{(1)} = 0$. Since we have already set our matrices to zero for all iterations, we can effectively set our initial values by starting our loop at `r = 2`.

## Run Gibbs sampler with a `do` loop

At this point, we are ready to write our iterative loop. In GAUSS, as in most programming languages, we have several choices for how to construct our loops. One option is the *do loop*, which repeats the commands inside the loop until a given condition is met. The *do loop* is a flexible programming tool that allows us to control looping using simple counting or more complex conditions.

```
//General anatomy of a 'do' loop
//Do not run this code snippet
i = start;
do while i <= end_condition;
    //loop body work
    i = i + 1;
endo;
```

Our bivariate normal Gibbs sampler will include two commands inside the *do loop*: one that draws $\theta_2^{(r)}$ from $p(\theta_2|y,\theta_1^{(r-1)})$ and one that draws $\theta_1^{(r)}$ from $p(\theta_1|y,\theta_2^{(r)})$. The GAUSS function `rndn` allows us to draw from the standard normal distribution. `rndn` takes two input arguments: the number of rows and the number of columns of the output matrix of random normal numbers.

```
//Set random seed for repeatable random numbers
rndseed 45352;

//Start counting at 2, using theta_1_do[1,1] = 0 and theta_2_do[1,1] = 0 as initial values
i = 2;

//Start do loop
do while i <= burn_in + keep_draws;
    //Draw theta_2 given theta_1
    theta_2_do[i, 1] = ((1 - rho^2)^0.5)*rndn(1, 1) + rho*theta_1_do[i-1, 1];

    //Draw theta_1 given theta_2
    theta_1_do[i, 1] = ((1 - rho^2)^0.5)*rndn(1, 1) + rho*theta_2_do[i, 1];

    //Advance the loop counter
    i = i + 1;
endo;

print "The first 5 values of 'theta_1_do' and 'theta_2_do' are:";
print theta_1_do[1:5]~theta_2_do[1:5];
```

The above code will print the following output:

```
The first 5 values of 'theta_1_do' and 'theta_2_do' are:
       0.0000000        0.0000000
     -0.65117043      -0.15106836
     -0.32630327       -1.6688055
     -0.98322799      -0.62352043
     -0.45213979      -0.24999669
```

## Run Gibbs sampler with a `for` loop

The *for loop* offers an alternative method for looping which, in general, is the preferred method for situations like the Gibbs sampler. The *for loop* often provides a faster, more compact method than the *do loop*. Rather than specifying a condition as done with the *do loop*, the *for loop* repeats a set of commands for a specified number of iterations based on a start, stop and step specification.

```
//General anatomy of a 'for' loop
//'i' is the loop counter
//Do not run this code snippet
for i(start, stop, step);
    //loop body work
endfor;
```

```
//Initialize the variables for the loop to zero again
theta_1 = zeros(burn_in + keep_draws, 1);
theta_2 = zeros(burn_in + keep_draws, 1);

//Set random seed for repeatable random numbers
rndseed 45352;

//Start for loop
for i(2, burn_in + keep_draws, 1);
    //Draw theta_2 given theta_1
    theta_2[i, 1] = ((1 - rho^2)^0.5)*rndn(1, 1) + rho*theta_1[i-1, 1];

    //Draw theta_1 given theta_2
    theta_1[i, 1] = ((1 - rho^2)^0.5)*rndn(1, 1) + rho*theta_2[i, 1];
endfor;

print "The first 5 values of 'theta_1' and 'theta_2' are:";
print theta_1[1:5]~theta_2[1:5];
```

Since we set the same seed for the random number generator, we should get the same results from the `do` loop code as we get from the `for` loop code. That is exactly what we see in the output below:

```
The first 5 values of 'theta_1' and 'theta_2' are:
       0.0000000        0.0000000
     -0.65117043      -0.15106836
     -0.32630327       -1.6688055
     -0.98322799      -0.62352043
     -0.45213979      -0.24999669
```

## Plot the variable paths

To check the behavior of our Gibbs sampler, we can again plot the path of our variables. We will first set the plot appearance using a GAUSS control structure.

```
//Declare 'myGibbsPlot' to be a plotControl structure
struct plotControl myGibbsPlot;

//Fill 'myGibbsPlot' with 'XY' defaults
myGibbsPlot = plotGetDefaults("xy");

//Set the text interpreter to LaTeX for axis labels only
plotSetTextInterpreter(&myGibbsPlot, "latex", "axes");

//Set plot title
plotSetTitle(&myGibbsPlot, "Bivariate Normal Gibbs Sampler", "Arial", 18);

//Set y-axis label
plotSetYLabel(&myGibbsPlot, "\\theta_1", "Arial", 14);
```

We will again combine two separate plots, one for $\theta_1$ and one for $\theta_2$, using the `plotLayout` command. In our first cell, we will plot the path of $\theta_1$ against the iteration number. `plotLayout` takes the following input arguments:

- `r`: Scalar, the number of rows in the subplot grid.
- `c`: Scalar, the number of columns in the subplot grid.
- `location`: Scalar, the cell location in which to place the next created graph.

```
//Change plot layout to 2 x 1 grid and place
//next plot in 1st cell
plotLayout(2, 1, 1);
//Create a sequential vector counting iteration numbers
x_plot = seqa(1, 1, burn_in + keep_draws);
//Plot theta_1
plotXY(myGibbsPlot, x_plot, theta_1);
```

Now plot $\theta_2$ next to the graph of $\theta_1$.

```
//Turn off title for second plot
plotSetTitle(&myGibbsPlot, "");
//Set axis labels
plotSetYLabel(&myGibbsPlot, "\\theta_2", "Arial", 14);
//Change plot layout to 2 x 1 grid and place
//next plot in 2nd cell
plotLayout(2, 1, 2);
//Plot theta_2
plotXY(myGibbsPlot, x_plot, theta_2);
//Clear the 2 x 1 layout so it does not affect future plots
plotClearLayout();
```

The above code should produce a 2 x 1 grid of plots showing the paths of $\theta_1$ and $\theta_2$ across iterations.

Finally, we finish our Gibbs sampler by examining the mean and variance-covariance matrix of the Gibbs sampler distribution.

```
//Output matrix concatenating theta_1 and theta_2
ygs = theta_1~theta_2;
//Discard the burn-in draws
ygs = ygs[burn_in+1:burn_in+keep_draws,.];
"---------------------------------------------------";
"Variance-Covariance Matrix from Gibbs Sampler";
"---------------------------------------------------";
"Sample Mean: ";; meanc(ygs);
"Sample Variance-Covariance Matrix: ";
vcx(ygs);
```

The above code should print the following output:

```
---------------------------------------------------
Variance-Covariance Matrix from Gibbs Sampler
---------------------------------------------------
Sample Mean:
      0.0060170250
      0.0032377421
Sample Variance-Covariance Matrix:
      0.97698091       0.58179239
      0.58179239       0.98180547
```
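As expected, the sample mean is close to zero and the sample covariance matrix is close to the true posterior covariance, with off-diagonal entries near $\rho = 0.6$. For readers who want to sanity-check the sampler outside GAUSS, here is a minimal Python/NumPy sketch of the same algorithm. The function name and seed are our own choices, and NumPy's generator will not reproduce GAUSS's exact draws, but the estimated moments should agree closely.

```python
# Minimal Python/NumPy sketch of the bivariate normal Gibbs sampler
# above (a cross-check, not the tutorial's GAUSS code).
import numpy as np

def gibbs_bivariate_normal(rho, burn_in, keep_draws, seed=45352):
    rng = np.random.default_rng(seed)
    n = burn_in + keep_draws
    theta = np.zeros((n, 2))          # column 0: theta_1, column 1: theta_2
    sd = np.sqrt(1.0 - rho**2)        # conditional standard deviation
    for i in range(1, n):
        # Draw theta_2 given the previous theta_1
        theta[i, 1] = rho * theta[i - 1, 0] + sd * rng.standard_normal()
        # Draw theta_1 given the current theta_2
        theta[i, 0] = rho * theta[i, 1] + sd * rng.standard_normal()
    return theta[burn_in:]            # discard the burn-in draws

draws = gibbs_bivariate_normal(rho=0.6, burn_in=1000, keep_draws=10000)
print("Sample mean:", draws.mean(axis=0))
print("Sample covariance:\n", np.cov(draws, rowvar=False))
```

The mean should be near `[0, 0]` and the covariance near the true matrix `[[1, 0.6], [0.6, 1]]`, up to Monte Carlo error.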