**Author:** Tanya Rawat

Risk is minimized when one invests in markets that show a low level of correlation with each other. Cointegration, on the other hand, helps determine long-term relationships between these markets, with or without correlation. Thus, whilst correlation speaks to short-term co-movements, cointegration captures long-run common trends, and its presence makes a risk diversification exercise more difficult.

This essay forms part one of a four-part series, wherein the level of correlation and cointegration in the Middle East and Turkey markets will be examined to determine the implications for hedging purposes. This first part focuses primarily on one of the most popular measures of dependence, viz. correlation.

Stochastic dependence refers to the relationship in distribution that may exist between two or more random variables. Random variables are dependent when they fail to satisfy the property of stochastic independence. Thus if X1 and X2 are random variables, they are independent if and only if, for all x1, x2 ∈ R,

P(X1 ≤ x1, X2 ≤ x2) = P(X1 ≤ x1) P(X2 ≤ x2)

When the above equality fails to hold, the random variables are said to be dependent. Since this definition of stochastic dependence is rather broad, an all-encompassing measure would have to be non-parametric in nature, and most measures are only able to capture a portion of this dependence. Most copulas reduce this non-parametric question to more manageable proportions by assuming a parametric model for the dependence structure. Let us examine some basic properties that a dependence measure should satisfy, and check which measures (Pearson's or linear correlation, Spearman's rank correlation, and tail dependence) meet these conditions.
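As a minimal numerical illustration (a sketch assuming Python with NumPy and SciPy, not part of the original post): take Y = X², which is fully determined by X and therefore dependent on it, yet both Pearson's and Spearman's coefficients come out near zero, showing that each captures only part of the dependence structure.

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

rng = np.random.default_rng(0)
x = rng.standard_normal(200_000)
y = x**2  # y is fully determined by x, so the pair is dependent

# The independence factorisation P(X <= a, Y <= b) = P(X <= a) P(Y <= b)
# fails for this pair:
a, b = 0.5, 1.0
joint = np.mean((x <= a) & (y <= b))
product = np.mean(x <= a) * np.mean(y <= b)
# joint differs visibly from product, so X and Y are dependent

# Yet both Pearson and Spearman correlation are ~0 by symmetry:
r, _ = pearsonr(x, y)
rho, _ = spearmanr(x, y)
```

Because X is symmetric about zero, the linear and rank statistics both average away the (purely quadratic) dependence, which is precisely why broader notions such as tail dependence are worth considering.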

**I. Properties for Dependence Measures**

**Property 1: Comonotonicity and Countermonotonicity**

A monotonic function is a function between ordered sets that preserves the given order.

Comonotonicity refers to perfect positive dependence between the components of a random vector: all of them can be represented as increasing functions of a single random variable.

δ(X,Y) = 1 ↔ X, Y are comonotonic, i.e.

X = f(Z), Y = g(Z) a.s., where f and g are two increasing real-valued functions

In our case this implies that the sum X1 + X2 + … + Xn is the riskiest when the joint probability distribution of the random vector (X1, X2, …, Xn) is comonotonic. Furthermore, the α-quantile of the sum equals the sum of the α-quantiles of its components; hence comonotonic random variables are quantile-additive. This property extends the concept of (perfect) positive correlation to variables with arbitrary distributions.

Countermonotonicity implies that X and −Y are comonotonic:

δ(X,Y) = −1 ↔ X, Y are countermonotonic, i.e.

X = f(Z), −Y = g(Z) a.s., where f and g are two increasing real-valued functions
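Both behaviours, along with quantile additivity, are easy to verify on simulated data (a minimal sketch assuming Python with NumPy and SciPy; the specific functions f and g below are arbitrary illustrative choices):

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
z = rng.standard_normal(100_000)

# Comonotonic pair: both components are increasing functions of the same Z
x = np.exp(z)
y = 2.0 * z + 5.0
rho_co, _ = spearmanr(x, y)  # rank correlation of a comonotonic pair is 1

# Quantile additivity: the alpha-quantile of the sum equals the sum of
# the alpha-quantiles of the components for a comonotonic pair
alpha = 0.95
q_sum = np.quantile(x + y, alpha)
q_parts = np.quantile(x, alpha) + np.quantile(y, alpha)

# Countermonotonic pair: X and -Y are comonotonic
y_counter = -y
rho_counter, _ = spearmanr(x, y_counter)  # rank correlation is -1
```

Note that the equality of quantiles holds here sample by sample because x + y is itself an increasing function of the common factor z.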

**Property 2: Symmetry**

δ(X,Y)=δ(Y,X)

**Property 3: Boundedness**

−1 ≤ δ(X,Y) ≤ 1

**Property 4: Invariance under Monotonic Transformations**

For h a strictly monotonic function on the range of X:

δ(h(X),Y) = δ(X,Y) if h is increasing,

δ(h(X),Y) = −δ(X,Y) if h is decreasing.
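This property already separates the candidate measures: Spearman's rank correlation satisfies it, while Pearson's linear correlation does not. A quick numerical check (a sketch assuming Python with NumPy and SciPy; the transform h(x) = exp(x) is an arbitrary strictly increasing example):

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

rng = np.random.default_rng(2)
x = rng.standard_normal(50_000)
y = x + 0.5 * rng.standard_normal(50_000)

rho, _ = spearmanr(x, y)

# Increasing transform h(x) = exp(x): ranks are unchanged,
# so Spearman's coefficient is unchanged
rho_inc, _ = spearmanr(np.exp(x), y)

# Decreasing transform h(x) = -x: ranks are reversed,
# so Spearman's coefficient flips sign
rho_dec, _ = spearmanr(-x, y)

# Pearson's linear correlation is NOT invariant under the
# nonlinear increasing transform
r, _ = pearsonr(x, y)
r_inc, _ = pearsonr(np.exp(x), y)
```

Spearman depends only on the ranks, which any strictly increasing h preserves; Pearson depends on the actual values, so a nonlinear h changes it.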

**Property 5: Zero Implies Independence**

δ(X,Y) = 0 ↔ X, Y are independent

Properties 4 and 5 are contradictory: no measure that depends only on the joint distribution can satisfy both. For instance, if (X, Y) is uniformly distributed on the unit circle, then (−X, Y) has the same distribution as (X, Y), so δ(−X, Y) = δ(X, Y); but Property 4 with the decreasing function h(x) = −x gives δ(−X, Y) = −δ(X, Y). Hence δ(X, Y) = 0, even though X and Y are clearly dependent, contradicting Property 5.

Established measures like Pearson's product-moment correlation coefficient and Spearman's rank-order correlation coefficient are driven by small movements around the mean and thus fail to describe dependence between extreme events. In asset selection and allocation for portfolio management, extreme events are primarily represented by jump risks and default risks. The target is therefore to diversify away extreme risks by minimizing extreme (tail) dependence between the assets within a portfolio.
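The gap between ordinary correlation and tail behaviour can be made concrete with a small simulation (a sketch assuming Python with NumPy; the correlation level 0.8 and the quantile levels are arbitrary illustrative choices). A bivariate normal pair is strongly correlated, yet the probability of a joint crash, conditional on one asset crashing, shrinks as we look further into the tail:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500_000

# Bivariate normal pair with correlation 0.8: despite the strong linear
# correlation, the Gaussian copula is asymptotically tail-independent
z1 = rng.standard_normal(n)
z2 = 0.8 * z1 + np.sqrt(1.0 - 0.8**2) * rng.standard_normal(n)

def lower_tail_dep(u, v, q):
    """Empirical P(V <= its q-quantile | U <= its q-quantile)."""
    qu, qv = np.quantile(u, q), np.quantile(v, q)
    return np.mean((u <= qu) & (v <= qv)) / q

# The conditional joint-crash probability shrinks as q -> 0
lam_5pct = lower_tail_dep(z1, z2, 0.05)
lam_05pct = lower_tail_dep(z1, z2, 0.005)
```

Pearson's coefficient reports 0.8 at every scale, while the empirical tail-dependence estimate decays toward zero, which is exactly the extreme-event dependence that a hedging exercise cares about.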

© 2013 Tanya Rawat. By posting content to and from this blog, you agree to transfer copyright to blog owner.


Good work. Looking forward to the rest of the series.


If you have three random variables A, B, and C, what correlation x between A and B, and between B and C, would guarantee a positive correlation between A and C?

(x=corr(A,B)=corr(B,C))


I quite enjoyed the explanation on the following link:

http://stats.stackexchange.com/questions/5747/if-a-and-b-are-correlated-with-c-why-are-a-and-b-not-necessarily-correlated
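The bound discussed in that thread can also be checked numerically. With corr(A,B) = corr(B,C) = x, positive semidefiniteness of the 3×3 correlation matrix forces corr(A,C) ≥ 2x² − 1, so |x| > 1/√2 ≈ 0.707 guarantees a positive corr(A,C). A minimal sketch (assuming NumPy), using the smallest eigenvalue as the validity test:

```python
import numpy as np

def min_eig(x, y):
    """Smallest eigenvalue of the 3x3 correlation matrix with
    corr(A,B) = corr(B,C) = x and corr(A,C) = y."""
    m = np.array([[1.0, x,   y],
                  [x,   1.0, x],
                  [y,   x,   1.0]])
    return np.linalg.eigvalsh(m)[0]

x = 0.6                      # below 1/sqrt(2), so corr(A,C) may be <= 0
lower = 2.0 * x**2 - 1.0     # = -0.28, the smallest feasible corr(A,C)

e_boundary = min_eig(x, lower)         # ~0: boundary of valid matrices
e_invalid = min_eig(x, lower - 0.05)   # < 0: not a valid correlation matrix

# With x = 0.8 > 1/sqrt(2), corr(A,C) = 0 is infeasible, so it must be positive
e_zero = min_eig(0.8, 0.0)             # < 0
```

The matrix stops being positive semidefinite exactly when corr(A,C) drops below 2x² − 1, matching the algebraic argument in the linked answer.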
