Search Results

You are looking at 1 - 6 of 6 items

  • Author: Jan-Frederik Mai

Abstract

Two simulation algorithms for hierarchical Archimedean copulas in the case when intra-group generators are not necessarily completely monotone are presented. Both generalize existing algorithms for the completely monotone case. The underlying stochastic models for both algorithms arise as a particular instance of a more general probability space studied recently in Ressel, P. (2018): A multivariate version of Williamson’s theorem, 1-symmetric survival functions, and generalized Archimedean copulas. Depend. Model. 6, 356–368. On this probability space the inter-group dependence need not be Archimedean; however, we highlight two particular circumstances under which a hierarchical Archimedean copula is nonetheless obtained.
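The completely monotone building block that the two algorithms above generalize is the classical Marshall–Olkin sampling scheme: if the generator ψ is the Laplace transform of a positive random variable M, then U_i = ψ(E_i/M) with i.i.d. unit exponentials E_i yields a sample from the corresponding Archimedean copula. A minimal sketch for the Clayton family (the choice of Clayton and the parameter θ are illustrative, not from the paper):

```python
import numpy as np

def sample_clayton(n, d, theta, rng=None):
    """Draw n samples from a d-dimensional Clayton copula, theta > 0,
    via the Marshall-Olkin algorithm for completely monotone generators."""
    rng = np.random.default_rng(rng)
    # Mixing variable M ~ Gamma(1/theta, 1) has Laplace transform
    # psi(t) = (1 + t)^(-1/theta), the Clayton generator.
    M = rng.gamma(shape=1.0 / theta, scale=1.0, size=(n, 1))
    E = rng.exponential(size=(n, d))        # i.i.d. unit exponentials
    return (1.0 + E / M) ** (-1.0 / theta)  # U_i = psi(E_i / M)

U = sample_clayton(1000, 3, theta=2.0, rng=42)
```

In the non-completely-monotone intra-group setting treated in the paper, no such global mixing variable exists, which is precisely what the two presented algorithms work around.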

Abstract

There is an infinite exchangeable sequence of random variables {X_k}_{k∈ℕ} such that each finite-dimensional distribution follows a min-stable multivariate exponential law with Galambos survival copula, named after [7]. A recent result of [15] implies the existence of a unique Bernstein function Ψ associated with {X_k}_{k∈ℕ} via the relation Ψ(d) = exponential rate of the minimum of d members of {X_k}_{k∈ℕ}. The present note provides the Lévy–Khinchin representation for this Bernstein function and explores some of its properties.
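For context, the Lévy–Khinchin representation referred to is the standard integral form that every Bernstein function Ψ admits:

```latex
\Psi(x) = a + b\,x + \int_0^{\infty} \bigl(1 - e^{-xt}\bigr)\,\nu(\mathrm{d}t), \qquad x \ge 0,
```

where a, b ≥ 0 and ν is a Lévy measure on (0, ∞) satisfying ∫ min(t, 1) ν(dt) < ∞. The note's contribution is the explicit triplet (a, b, ν) for the particular Ψ associated with the Galambos sequence.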

Abstract

Min-stable multivariate exponential (MSMVE) distributions constitute an important family of distributions, among others due to their relation to extreme-value distributions. Being true multivariate exponential models, they also represent a natural choice when modeling default times in credit portfolios. Despite being well-studied on an abstract level, the number of known parametric families is small. Furthermore, for most families only implicit stochastic representations are known. The present paper develops new parametric families of MSMVE distributions in arbitrary dimensions. Furthermore, a convenient stochastic representation is stated for such models, which is helpful with regard to sampling strategies.
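The best-known example of an MSMVE law with an explicit stochastic representation is the classical Marshall–Olkin construction, where each component is the first arrival among exponential shocks hitting subsets of components. A minimal sketch (the shock rates below are illustrative choices, not taken from the paper):

```python
import numpy as np

def sample_marshall_olkin(n, d, lam, rng=None):
    """Marshall-Olkin MVE: X_i = min over subsets S containing i of an
    Exp(lam[S]) shock arrival time. lam maps tuples of indices to rates."""
    rng = np.random.default_rng(rng)
    X = np.full((n, d), np.inf)
    for S, rate in lam.items():
        shock = rng.exponential(scale=1.0 / rate, size=n)  # arrival of shock S
        for i in S:
            X[:, i] = np.minimum(X[:, i], shock)
    return X

# d = 2 with individual shocks (rate 1.0 each) and one joint shock (rate 0.5),
# so each margin is exponential with rate 1.0 + 0.5 = 1.5
lam = {(0,): 1.0, (1,): 1.0, (0, 1): 0.5}
X = sample_marshall_olkin(10000, 2, lam, rng=0)
```

The number of shock rates grows exponentially in the dimension, which is one reason explicit, conveniently parameterized representations in arbitrary dimensions, as developed in the paper, are valuable.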

Abstract

We present a list of challenges one faces when given the task of modeling dependence between stochastic objects, with a special focus on financial applications. Our aim is to draw the readers' attention to common (and not so common) pitfalls and fallacies, and we particularly address readers who are new to dependence modeling. The presented list of challenges is clearly not complete, but it gives a flavor of how difficult and subtle the task of dependence modeling can be. Moreover, the readers shall get some intuition about which challenges are structural and cannot be overcome, and which allow for a better solution than common practice might suggest.
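One classical pitfall of the kind alluded to above (chosen here as an illustration; the paper's own list is not reproduced) is mistaking zero linear correlation for independence:

```python
import numpy as np

# Zero correlation does not imply independence: y is a deterministic
# function of x, yet their linear correlation is (close to) zero because
# Cov(x, x^2) = E[x^3] = 0 for a symmetric distribution.
rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)
y = x ** 2
corr = np.corrcoef(x, y)[0, 1]  # near zero despite full dependence
```

Examples of this type are a standard motivation for describing dependence through copulas rather than through a single correlation number.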

Abstract

Some empirical studies suggest that the computation of certain graph structures from a (large) historical correlation matrix can be helpful in portfolio selection. In particular, a repeated finding is that information about the portfolio weights in the minimum variance portfolio (MVP) from classical Markowitz theory can be inferred from measurements of centrality in such graph structures. The present article compares the two concepts from a purely algebraic perspective. It is demonstrated that this heuristic relationship between graph centrality and the MVP does not originate from a structural similarity between the two portfolio selection mechanisms, but instead is due to specific features of observed correlation matrices. This means that empirically found relations between both concepts depend critically on the underlying historical data. Repeated empirical evidence for a strong relationship is hence shown to constitute a stylized fact of financial return time series.
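The two portfolio selection mechanisms being compared can be stated in a few lines: the MVP weights have the closed form w = Σ⁻¹1 / (1ᵀΣ⁻¹1), while a common centrality measurement is the principal eigenvector of the correlation matrix. A minimal sketch on simulated data (the toy inputs are illustrative, not the article's empirical data):

```python
import numpy as np

def mvp_weights(cov):
    """Closed-form minimum variance portfolio: w = Sigma^{-1} 1 / (1' Sigma^{-1} 1)."""
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)
    return w / w.sum()

def eigenvector_centrality(corr):
    """Principal eigenvector of the correlation matrix, normalized to sum to 1."""
    _, vecs = np.linalg.eigh(corr)
    v = np.abs(vecs[:, -1])  # eigenvector of the largest eigenvalue
    return v / v.sum()

rng = np.random.default_rng(1)
R = rng.standard_normal((500, 5))   # toy "return" sample, 500 days x 5 assets
cov = np.cov(R, rowvar=False)
corr = np.corrcoef(R, rowvar=False)
w = mvp_weights(cov)
c = eigenvector_centrality(corr)
```

The article's point is that any numerical agreement between w and c on real data reflects properties of observed correlation matrices, not an algebraic similarity between the two formulas above.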

Abstract

It is standard in quantitative risk management to model a random vector 𝐗 := {X_{t_k}}_{k=1,...,d} of consecutive log-returns to ultimately analyze the probability law of the accumulated return X_{t_1} + ... + X_{t_d}. By the Markov regression representation (see ), any stochastic model for 𝐗 can be represented as X_{t_k} = f_k(X_{t_1}, ..., X_{t_{k-1}}, U_k), k = 1, ..., d, yielding a decomposition into a vector 𝐔 := {U_k}_{k=1,...,d} of i.i.d. random variables accounting for the randomness in the model, and a function f := {f_k}_{k=1,...,d} representing the economic reasoning behind it. For most models, f is known explicitly and U_k may be interpreted as an exogenous risk factor affecting the return X_{t_k} in time step k. While existing literature addresses model uncertainty by manipulating the function f, we introduce a new philosophy by distorting the source of randomness 𝐔 and interpret this as an analysis of the model's robustness. We impose consistency conditions for a reasonable distortion and present a suitable probability law and a stochastic representation for 𝐔 based on a Dirichlet prior. The resulting framework has one parameter c ∈ [0, ∞] tuning the severity of the imposed distortion. The universal nature of the methodology is illustrated by means of a case study comparing the effect of the distortion in different models for 𝐗. As a mathematical byproduct, the consistency conditions of the suggested distortion function reveal interesting insights into the dependence structure between samples from a Dirichlet prior.
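The decomposition X_{t_k} = f_k(X_{t_1}, ..., X_{t_{k-1}}, U_k) can be made concrete with a hypothetical AR(1) log-return model (the model, φ and σ below are illustrative assumptions, not the paper's case study):

```python
import numpy as np
from statistics import NormalDist

def f_k(history, u, phi=0.3, sigma=0.01):
    """One step of a hypothetical AR(1) return model: the past enters through
    the last return, the i.i.d. uniform risk factor u through a Gaussian
    quantile transform."""
    prev = history[-1] if history else 0.0
    return phi * prev + sigma * NormalDist().inv_cdf(u)

def simulate(U):
    """Map the i.i.d. uniform vector U = (U_1, ..., U_d) to X = (X_{t_1}, ...)."""
    X = []
    for u in U:
        X.append(f_k(X, u))
    return X

rng = np.random.default_rng(2)
U = rng.uniform(size=10)   # the undistorted source of randomness
X = simulate(U)            # accumulated return would be sum(X)
```

In this picture, the paper's distortion leaves f untouched and instead replaces the i.i.d. uniform vector U by a dependent vector driven by a Dirichlet prior, with the parameter c tuning how far the distorted U departs from the i.i.d. case.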