Limit Laws for Random Vectors with an Extreme Component

Models based on assumptions of multivariate regular variation and hidden regular variation provide ways to describe a broad range of extremal dependence structures when marginal distributions are heavy tailed. Multivariate regular variation provides a rich description of extremal dependence in the case of asymptotic dependence, but fails to distinguish between exact independence and asymptotic independence. Hidden regular variation addresses this problem by requiring components of the random vector to be simultaneously large, but at a smaller scale than that of the marginal distributions. In doing so, hidden regular variation typically restricts attention to the part of the probability space where all variables are simultaneously large. However, since under asymptotic independence the largest values do not occur in the same observation, the region where variables are simultaneously large may not be of primary interest. A different philosophy was offered by \cite{heffernan:tawn:2004}, which allows examination of distributional tails other than the joint tail. That approach uses an asymptotic argument which conditions on one component of the random vector being large and finds the limiting conditional distribution of the remaining components as the conditioning variable grows. In this paper, we provide a thorough mathematical examination of the limiting arguments, building on the orientation of \cite{heffernan:tawn:2004}. We examine the conditions required for the assumptions of the conditioning approach to hold, and highlight similarities and differences between the new and established methods.
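As a heuristic sketch of the conditioning idea in the bivariate case (the normalizing functions $\alpha(t)>0$ and $\beta(t)$ and the nondegenerate limit $G$ are assumed notation for this illustration, not fixed in the abstract), the basic assumption can be written as
% hedged sketch: center and scale X by functions of the level t of the
% conditioning variable Y, then let t grow
\begin{equation*}
  \lim_{t\to\infty}
  P\!\left[\frac{X-\beta(t)}{\alpha(t)}\le x \,\Big|\, Y>t\right]
  = G(x),
\end{equation*}
so that, after an affine normalization depending on the level of the conditioning variable $Y$, the remaining component $X$ has a nondegenerate limiting conditional distribution.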