Program Abstract (Video/Slide)
Speaker |
|
Time |
08:30 - 09:30 on 27 February (Tue) |
Title |
Advanced statistical models for cryptocurrency research |
Abstract |
Cryptocurrencies have lately commanded global attention on a number of fronts: monetary, technological, and as investment vehicles. Cryptocurrency differs from fiat currency in many ways: no institutional control, near-instantaneous transactions, and a fast-changing cryptocurrency community. Yet many controversies surround cryptocurrency: its monetary role, price formation, investment vehicles, etc. As cryptocurrency gains acceptance, even by some major banks, we aim to derive advanced time series models to understand its statistical properties, including its notoriously wild volatility. This speculative nature has fuelled debate over cryptocurrency's monetary role versus its role as a speculative asset. We highlight some stylized facts about the volatility dynamics, including oscillatory persistence and a leverage effect, and relate these results to the respective cryptographic designs. The data analysis begins with a broad scope of 224 cryptocurrencies, proceeds to a more detailed study of the top 5 by market capitalization, and concludes with a specific focus on Bitcoin. The results favour a Gegenbauer long memory filter over a standard long memory filter for modelling the logarithm of the daily range. |
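As a minimal sketch of the Gegenbauer long-memory filter mentioned in the abstract: the filter $(1 - 2uB + B^2)^{-d}$ has MA coefficients given by the standard Gegenbauer polynomial recursion, and a long-memory series can be generated by filtering white noise. The parameter values below are illustrative assumptions, not taken from the talk.

```python
import numpy as np

def gegenbauer_coeffs(d, u, n):
    """Coefficients c_j of (1 - 2*u*B + B^2)**(-d) = sum_j c_j B^j,
    via the standard Gegenbauer polynomial recursion."""
    c = np.zeros(n)
    c[0] = 1.0
    if n > 1:
        c[1] = 2.0 * d * u
    for j in range(2, n):
        c[j] = (2.0 * u * (j + d - 1.0) * c[j - 1]
                - (j + 2.0 * d - 2.0) * c[j - 2]) / j
    return c

def gegenbauer_filter(eps, d, u):
    """Generate a Gegenbauer long-memory series by applying the truncated
    MA(inf) representation to a white-noise input eps."""
    c = gegenbauer_coeffs(d, u, len(eps))
    return np.convolve(eps, c)[: len(eps)]

rng = np.random.default_rng(0)
eps = rng.standard_normal(500)
# u = cos(omega_g) sets the Gegenbauer frequency; d controls the memory decay
y = gegenbauer_filter(eps, d=0.3, u=np.cos(0.5))
```

For $u = 1$ the filter reduces to the familiar fractional difference filter $(1 - B)^{-2d}$, which provides a quick consistency check on the recursion.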
Presentation | Video / Slide |
Speaker |
|
Time |
10:50 - 11:50 on 27 February (Tue) |
Title |
Explicit Solutions to Correlation Matrix Completion Problems, with an Application to Risk Management and Insurance |
Abstract |
We derive explicit solutions to the problem of completing a partially specified correlation matrix. Our results apply to several block structures for the unspecified entries that arise in insurance and risk management, where an insurance company with many lines of business is required to satisfy certain capital requirements but may have incomplete knowledge of the underlying correlation matrix. Among the many possible completions we focus on the one with maximal determinant. This has attractive properties and we argue that it is suitable for use in the insurance application. Our explicit formulas enable easy solution of practical problems and are useful for testing algorithms for the general correlation matrix completion problem. Based on the accepted manuscript: Georgescu, D. I., Higham, N. J. and Peters, G. W. (2018), “Explicit Solutions to Correlation Matrix Completion Problems, with an Application to Risk Management and Insurance”, Royal Society Open Science. |
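The simplest instance of an explicit maximal-determinant completion is the 3x3 case with one unspecified entry, where the optimal value is the product of the two specified correlations (equivalently, zero partial correlation given the middle variable). A small numpy sketch, with illustrative correlation values:

```python
import numpy as np

def maxdet_completion_3x3(r12, r23):
    """Max-determinant completion of a 3x3 correlation matrix whose (1,3)
    entry is unspecified: the optimal value is r12 * r23, i.e. zero partial
    correlation between variables 1 and 3 given variable 2."""
    r13 = r12 * r23
    return np.array([[1.0, r12, r13],
                     [r12, 1.0, r23],
                     [r13, r23, 1.0]])

R = maxdet_completion_3x3(0.6, -0.4)

# numerical check: perturbing the free entry can only shrink the determinant,
# since det is concave in r13 with maximiser r12 * r23
det_opt = np.linalg.det(R)
for h in (-0.05, 0.05):
    Rp = R.copy()
    Rp[0, 2] = Rp[2, 0] = R[0, 2] + h
    assert np.linalg.det(Rp) < det_opt
```

The block-structured cases treated in the paper generalise this scalar formula; the sketch above only illustrates the idea in the smallest dimension.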
Presentation | Video / Slide |
Speaker |
|
Time |
14:10 - 15:10 on 27 February (Tue) |
Title |
High-dimensional Localized Regression |
Abstract |
We introduce the localized Lasso, which is suited for learning models that are both interpretable and highly predictive in problems with high dimensionality d and small sample size n. More specifically, we consider a function defined by local sparse models, one at each data point. We introduce sample-wise network regularization to borrow strength across the models, and sample-wise exclusive group sparsity (a.k.a. the ℓ1,2 norm) to introduce diversity into the choice of feature sets in the local models. The local models are interpretable in terms of the similarity of their sparsity patterns. The cost function is convex, and thus has a globally optimal solution. Moreover, we propose a simple yet efficient iterative least-squares based optimization procedure for the localized Lasso, which does not need a tuning parameter and is guaranteed to converge to a globally optimal solution. The solution is empirically shown to outperform alternatives on both simulated and genomic personalized medicine data. |
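To make the cost function concrete, here is a hedged numpy sketch that evaluates an objective of the form described in the abstract: per-sample squared loss, a network term tying neighbouring local models together, and an exclusive (ℓ1,2) penalty. The variable names (`W`, `R`, `lam_net`, `lam_exc`) and the exact penalty weighting are illustrative assumptions, not the authors' notation.

```python
import numpy as np

def localized_lasso_objective(W, X, y, R, lam_net, lam_exc):
    """Evaluate a localized-Lasso-style cost.
    W: (n, d) local weight vectors, one per sample.
    X: (n, d) design matrix, y: (n,) targets, R: (n, n) sample-similarity graph."""
    # per-sample squared loss, each sample scored by its own local model
    loss = np.sum((y - np.einsum('ij,ij->i', X, W)) ** 2)
    # network regularization: sum_{i,j} R_ij * ||w_i - w_j||_2 borrows strength
    diff = W[:, None, :] - W[None, :, :]
    net = np.sum(R * np.linalg.norm(diff, axis=2))
    # exclusive sparsity: sum_i (||w_i||_1)^2 pushes diversity in feature sets
    exc = np.sum(np.sum(np.abs(W), axis=1) ** 2)
    return loss + lam_net * net + lam_exc * exc
```

Each term is convex in W, consistent with the abstract's claim of a globally optimal solution.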
Presentation | Video |
Speaker |
Prof. Nourddine Azzaoui (Université Clermont Auvergne) |
Time |
15:30 - 16:30 on 27 February (Tue) |
Title |
A theoretical framework for extracting temporal and frequency features in non-stationary fractional signals
Abstract |
In this talk we begin by recalling the theoretical representation of a fractional non-stationary Brownian process and its principal features, which are the mean function and the spectral density. We are especially interested in processes having both a piecewise mean function $\mu(t)$ and a piecewise spectral density $f(\omega)$. We first motivate the importance of this kind of process in many applied fields, such as signal, image and speech processing and physiological data. We then show how one can extract, in some cases, temporal features from $\mu(\cdot)$ and frequency features from $f(\omega)$. We illustrate this for a speech signal and show how to extract the energies of specific frequencies or bands that can be linked to specific behaviour in the studied signal (for example, structural noise). |
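The band-energy extraction step mentioned at the end of the abstract can be sketched with a one-sided FFT; the bands, sampling rate and the two-tone "speech-like" test signal below are illustrative assumptions.

```python
import numpy as np

def band_energies(x, fs, bands):
    """Energy of signal x in the given frequency bands (Hz), summed from the
    one-sided power spectrum |X(f)|^2."""
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    spec = np.abs(np.fft.rfft(x)) ** 2
    return [spec[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands]

# toy signal standing in for speech: a strong 300 Hz tone and a weaker 2 kHz tone
fs = 8000
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * 300 * t) + 0.5 * np.sin(2 * np.pi * 2000 * t)
e_low, e_high = band_energies(x, fs, [(100, 1000), (1500, 2500)])
```

Comparing band energies over successive frames of a real speech signal would then reveal which frequency bands carry the behaviour of interest.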
Presentation | Video / Slide |
Speaker |
|
Time |
08:30 - 09:30 on 28 February (Wed) |
Title |
Spatial Field Reconstruction and Sensor Selection in Heterogeneous Sensor Networks with Stochastic Energy Harvesting |
Abstract |
We address two fundamental problems in heterogeneous sensor networks: spatial field reconstruction and sensor selection. We consider the case where two types of sensors are deployed: the first consists of expensive, high-quality sensors, and the second of cheap, low-quality sensors, which are activated only if the intensity of the spatial field exceeds a pre-defined activation threshold (e.g., wind sensors). In addition, these sensors are powered by means of energy harvesting, and their time-varying energy status impacts the accuracy of the measurements that may be obtained. We address the following two important problems: (i) how to efficiently perform spatial field reconstruction based on measurements obtained simultaneously from both networks; and (ii) how to perform query-based sensor set selection with a predictive MSE performance guarantee. We solve the first problem by developing a low-complexity algorithm based on the spatial best linear unbiased estimator (S-BLUE). Next, building on the S-BLUE, we address the second problem and develop an efficient algorithm for query-based sensor set selection with a performance guarantee. Our algorithm is based on the Cross-Entropy method, which solves the combinatorial optimization problem in an efficient manner. |
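A minimal sketch of the two ingredients: a BLUE-style field estimate at a query point, and subset selection by predictive MSE. The covariance model, noise level and brute-force search below are illustrative assumptions; the talk's algorithm replaces the brute-force loop with the Cross-Entropy method.

```python
import numpy as np
from itertools import combinations

def s_blue(c_fy, C_yy, y):
    """BLUE of a zero-mean field value from noisy sensor readings y:
    f_hat = c_fy' C_yy^{-1} y."""
    w = np.linalg.solve(C_yy, c_fy)
    return w @ y

def best_sensor_subset(cov, noise_var, target, candidates, k):
    """Brute-force stand-in for the Cross-Entropy search: return the k-sensor
    subset minimising the predictive MSE at the target location."""
    best_S, best_mse = None, np.inf
    for S in combinations(candidates, k):
        S = list(S)
        C_yy = cov[np.ix_(S, S)] + noise_var * np.eye(k)
        c = cov[target, S]
        mse = cov[target, target] - c @ np.linalg.solve(C_yy, c)
        if mse < best_mse:
            best_S, best_mse = S, float(mse)
    return best_S, best_mse

# assumed setup: 1-D sensor locations with a squared-exponential covariance;
# the last point is the query location, the first three are candidate sensors
x = np.array([0.0, 0.5, 1.0, 0.45])
cov = np.exp(-(x[:, None] - x[None, :]) ** 2 / 0.1)
S, mse = best_sensor_subset(cov, noise_var=0.1, target=3,
                            candidates=[0, 1, 2], k=2)
```

As expected, the selected subset contains the sensor closest to the query point, since it is the most informative under this covariance.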
Presentation | Video |
Speaker |
|
Time |
09:30 - 10:30 on 28 February (Wed) |
Title |
Revisiting the resampling mechanism used in importance sampling methods (joint work with R. Lamberti, Y. Petetin and F. Desbouvries, Telecom SudParis/SAMOVAR, France) |
Abstract |
Sequential Monte Carlo algorithms, or particle filters, are Bayesian filtering algorithms which propagate in time a discrete and random approximation of the a posteriori distribution of interest. Such algorithms are based on importance sampling with a bootstrap resampling step which aims at combating weight degeneracy. However, in some situations (informative measurements, high-dimensional models), the resampling step can prove inefficient. In this talk, I will describe a recently proposed revisit of this fundamental resampling mechanism, which leads us back to Rubin's static resampling mechanism. More specifically, we propose an alternative rejuvenation scheme in which the resampled particles share the same marginal distribution as in the classical setup, but are now independent. This set of independent particles provides a new alternative for computing a moment of the target distribution, and the resulting estimate is analyzed through a CLT. We next adapt our results to the dynamic case and propose a particle filtering algorithm based on independent resampling. This algorithm can be seen as a particular auxiliary particle filter with a relevant choice of the first-stage weights and instrumental distributions. Finally, a class of parameterized solutions which make it possible to adjust the trade-off between computational cost and statistical performance will be described. |
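For reference, the classical (Rubin) multinomial resampling step that the talk revisits can be sketched in a few lines; the particle values and weights below are illustrative.

```python
import numpy as np

def multinomial_resample(particles, weights, rng):
    """Classical (Rubin) multinomial resampling: draw N indices from the
    categorical distribution defined by the normalised importance weights,
    then reset all weights to 1/N."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

rng = np.random.default_rng(1)
particles = np.array([-1.0, 0.0, 2.0, 5.0])
weights = np.array([0.7, 0.1, 0.1, 0.1])
new_p, new_w = multinomial_resample(particles, weights, rng)
```

The resampled particles obtained this way are only conditionally i.i.d. given the original particle set; the independent-resampling scheme described in the talk removes that dependence while preserving the marginal distribution.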
Presentation | Video / Slide |
Speaker |
|
Time |
10:50 - 11:50 on 28 February (Wed) |
Title |
Some challenges in wireless communication |
Abstract |
This talk will discuss different modelling issues in wireless communication and electromagnetic field reconstruction. The deployment of heterogeneous networks faces several challenges, interference being one of them. We study the impact of sparse use of the resource by many asynchronous users, which creates non-Gaussian interference at the gateways. To increase the number of possible active transmitters, adapted signal processing has to be performed. Modelling the interference, as well as its dependence structure when several antennas are used at the receiving side, is a crucial step that we will discuss; a special focus on a LoRa-like system will serve as an illustration. A second difficulty is the energy consumption of the nodes, which encourages research on new types of electronic components. One attractive solution is neuro-inspired circuits. We will address this problem from a signal processing point of view with, as an objective, the design of wake-up radios. Finally, we will address a slightly different topic dealing with electromagnetic field reconstruction. The basic idea is to measure the field at several points, reconstruct it at any point, and propose a visualization of the result using augmented reality. In a first approach we used Gaussian processes for the reconstruction, and first results will be presented. |
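The non-Gaussian character of interference from sparse asynchronous users can be illustrated with a toy simulation: when only a small random subset of users transmits in each slot, the aggregate interference is impulsive and shows large excess kurtosis relative to a Gaussian. The model below (Bernoulli activity times Gaussian channel gains) is a deliberately simplified assumption, not the model from the talk.

```python
import numpy as np

def aggregate_interference(n_slots, n_users, p_active, rng):
    """Toy interference model at a gateway: in each slot a sparse random
    subset of users transmits, so the aggregate is a Bernoulli-Gaussian
    mixture and markedly non-Gaussian (impulsive)."""
    active = rng.random((n_slots, n_users)) < p_active
    gains = rng.standard_normal((n_slots, n_users))  # per-user channel gains
    return (active * gains).sum(axis=1)

rng = np.random.default_rng(0)
z = aggregate_interference(n_slots=100_000, n_users=50, p_active=0.02, rng=rng)

# excess kurtosis > 0 signals impulsive, non-Gaussian interference
kurt = np.mean((z - z.mean()) ** 4) / np.var(z) ** 2 - 3.0
```

For this model the theoretical excess kurtosis is 3(1-p)/(np) ≈ 2.94, far from the Gaussian value of 0, which is why Gaussian interference models break down in the sparse-activity regime.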
Presentation | Video |
Speaker |
|
Time |
13:10 - 14:10 on 28 February (Wed) |
Title |
Modelling optimal decisions in retirement under expected utility stochastic control framework |
Abstract |
In this talk we present an expected utility model for optimal behaviour in the decumulation phase of Australian retirees with sequential family status, subject to consumption, housing, investment, bequest, and the government-provided means-tested Age Pension. The model is calibrated using empirical consumption and housing data from the Australian Bureau of Statistics 2009-2010 Survey, and it fits the characteristics of the data well, explaining the behaviour of Australian retirees. The model is solved numerically as a stochastic control problem and allows us to find the optimal housing, consumption and risky asset allocation as functions of age and wealth. We then extend the model with a stochastic interest rate and additional optimal decisions with respect to house upgrades/downgrades, reverse mortgages and access to annuities. In the one-dimensional case we solve the model numerically using a quadrature method, and for multi-dimensional cases we develop a Least Squares Monte Carlo method accounting for transformation bias and heteroscedasticity. The key findings are as follows. First, the optimal policy is highly sensitive to the means-tested Age Pension early in retirement, but this sensitivity fades with age. Second, the allocation to risky assets shows a complex relationship with the means-tested Age Pension. As a general rule, when wealth decreases the proportion allocated to risky assets increases, because the Age Pension works as a buffer against investment losses. Couples can be more aggressive with risky allocations owing to their longer life expectancy compared with singles. Third, the means-tested Age Pension crowds out annuitisation, and the means-test rules lead to a preference for reverse mortgages over scaling housing.
Related publications: (1) Andréasson, J. G. and Shevchenko, P. V. (2017). Optimal annuitisation, housing decisions and means-tested public pension in retirement. Available at SSRN: https://ssrn.com/abstract=2985830 (2) Andréasson, J. G. and Shevchenko, P. V. (2017). Bias-Corrected Least-Squares Monte Carlo for Utility Based Optimal Stochastic Control Problems. Available at SSRN: https://ssrn.com/abstract=2985828 (3) Andréasson, J. G. and Shevchenko, P. V. (2017). Assessment of Policy Changes to Means-Tested Age Pension Using the Expected Utility Model: Implication for Decisions in Retirement. Risks 5, 47:1-47:21. Preprint: https://ssrn.com/abstract=2875551 (4) Andréasson, J. G., Shevchenko, P. V. and Novikov, A. (2017). Optimal consumption, investment and housing with means-tested public pension in retirement. Insurance: Mathematics and Economics 75, 32-47. Preprint: http://arxiv.org/abs/1606.08984. |
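The core regression step of the Least Squares Monte Carlo method mentioned above can be sketched as follows: simulated continuation values are regressed on a polynomial basis of the state variable (here wealth) to approximate a conditional expectation. The basis, state distribution and noise level are illustrative assumptions; the bias correction for transformed values developed in the cited papers is not shown.

```python
import numpy as np

def lsmc_conditional_expectation(w, v, degree=3):
    """Least Squares Monte Carlo step: approximate E[v | wealth = w] by an
    ordinary least-squares fit of simulated values v on a polynomial basis
    of the state variable w."""
    coeffs = np.polyfit(w, v, degree)
    return np.polyval(coeffs, w)

rng = np.random.default_rng(2)
w = rng.uniform(0.5, 2.0, size=5000)             # simulated wealth states
v = np.log(w) + 0.1 * rng.standard_normal(5000)  # noisy realised values
v_hat = lsmc_conditional_expectation(w, v)       # smoothed continuation value
```

In a full backward-induction solver this regression is repeated at each decision date, and the fitted continuation value is compared against the immediate-decision alternatives.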
Presentation | Video / Slide |
Speaker |
|
Time |
14:10 - 14:40 on 28 February (Wed) |
Title |
Which Risk Factors Drive Oil Futures Price Curves? Speculation and Hedging in the Short and Long Term |
Abstract |
We develop a consistent estimation framework, which builds on the familiar two-factor model of Schwartz and Smith (2000), to allow for an investigation of the influence of observable covariates, such as inventories, production or hedging pressure, on the term structure of crude oil futures prices. Using this novel Hybrid Multi-Factor (HMF) model, we obtain closed-form futures prices under standard risk-neutral pricing formulations, and we can apply state-space model estimation techniques to consistently estimate both the structural features related to the convenience yield and spot price dynamics (long- and short-term stochastic dynamics) and the structural parameters that govern the influence of the observed exogenous covariates on the spot price. Such models give significant insight into the futures and spot price dynamics in terms of interpretable observed factors that influence speculators and hedgers heterogeneously, which is not attainable with existing modelling approaches. |
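For reference, the closed-form log futures price in the underlying Schwartz-Smith (2000) two-factor model (short-term deviation chi, long-term level xi) can be written down directly; the parameter values in the sketch are illustrative, and the covariate extension described in the abstract is not shown.

```python
import numpy as np

def ss_log_futures(chi, xi, tau, kappa, sigma_chi, sigma_xi, rho,
                   mu_xi_star, lam_chi):
    """Closed-form log futures price in the Schwartz-Smith (2000) model:
    ln F(t, t+tau) = e^{-kappa*tau} * chi + xi + A(tau)."""
    e = np.exp(-kappa * tau)
    A = (mu_xi_star * tau
         - (1 - e) * lam_chi / kappa
         + 0.5 * ((1 - np.exp(-2 * kappa * tau)) * sigma_chi**2 / (2 * kappa)
                  + sigma_xi**2 * tau
                  + 2 * (1 - e) * rho * sigma_chi * sigma_xi / kappa))
    return e * chi + xi + A

# illustrative parameters: at tau = 0 the futures price collapses to the spot
ln_f = ss_log_futures(chi=0.1, xi=3.0, tau=0.5, kappa=1.5,
                      sigma_chi=0.3, sigma_xi=0.2, rho=0.3,
                      mu_xi_star=0.01, lam_chi=0.05)
```

As maturity grows, the e^{-kappa*tau} factor kills the short-term component, so the long end of the curve is driven by the long-term factor alone; this is the separation the HMF model exploits when attributing covariate effects to speculators versus hedgers.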
Presentation | Video / Slide |
Speaker |
|
Time |
14:40 - 15:10 on 28 February (Wed) |
Title |
Mortality Models Incorporating Long Memory Improve Life Tables: A Comprehensive Analysis |
Abstract |
Forecasting life expectancy and mortality are two important aspects of demography. We demonstrate in this work that the existence of long memory in mortality data improves the understanding of mortality, and that a model incorporating a long memory structure provides a new approach to enhancing mortality forecasts. To achieve this, we first demonstrate the existence of long memory in mortality data using the Hurst exponent, estimated by several empirical estimation methods. The dynamics of the long memory across genders, age groups, countries and time periods are then analyzed. These results motivate us to develop new mortality models by extending the Lee-Carter model to death counts and incorporating a long memory structure. Furthermore, there is no identification issue in the proposed model; hence the constraints of the Lee-Carter model are relaxed. Bayesian inference is applied to estimate the model parameters. The Deviance Information Criterion is used to select among the different Lee-Carter extensions we propose, in terms of both in-sample fit and out-of-sample forecast performance. The models are then applied to death count data sets from 16 countries, divided according to gender and age group. Estimates of mortality rates are used to calculate life expectancies when constructing life tables. Comparing different life expectancy estimates, the results show that the Lee-Carter model without the long memory component may underestimate life expectancy. This underestimation has a great impact on old-age support programs in social security and pensions, and may eventually lead to insufficient funds in a pension scheme. In summary, it is crucial to investigate how the long memory feature in mortality influences life expectancies in the construction of life tables. |
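One of the classical empirical estimators of the Hurst exponent mentioned in the abstract is rescaled-range (R/S) analysis: the slope of log mean R/S against log window size. A minimal numpy sketch on synthetic data (window sizes and test series are illustrative; the small-sample bias of R/S, corrected in more careful estimators, is not addressed):

```python
import numpy as np

def hurst_rs(x, window_sizes):
    """Hurst exponent via rescaled-range (R/S) analysis: regress
    log mean(R/S) on log window size and return the slope."""
    log_n, log_rs = [], []
    for n in window_sizes:
        rs = []
        for start in range(0, len(x) - n + 1, n):
            w = x[start:start + n]
            z = np.cumsum(w - w.mean())   # cumulative deviations from the mean
            r = z.max() - z.min()         # range of cumulative deviations
            s = w.std()                   # window standard deviation
            if s > 0:
                rs.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs)))
    slope, _ = np.polyfit(log_n, log_rs, 1)
    return slope

rng = np.random.default_rng(3)
white = rng.standard_normal(10_000)           # no memory: H near 0.5
H = hurst_rs(white, [16, 32, 64, 128, 256])
```

Applied to mortality index series, estimates of H significantly above 0.5 would indicate the long memory that motivates the proposed model extension.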
Presentation | Video |
Speaker |
|
Time |
17:40 - 18:10 on 28 February (Wed) |
Title |
Quantile diffusions
Abstract |
This research focuses on the development of quantile diffusions with the aim of gaining insight into the dynamics of data sets. A multivariate diffusion process is considered where each marginal is equipped with an associated distribution. In order to allow for a flexible dependence structure among the marginals, i.e. beyond the one imposed by the multivariate diffusion itself, a generic copula function is introduced. By a composite transformation, the vector-valued diffusion is then mapped to a multivariate quantile function-valued process that satisfies a stochastic differential equation. Various applications are envisaged ranging from the modelling of financial indices and portfolio allocation, time series analysis and risk measurement, to characterisations of market inefficiency. This is joint work with Andrea Macrina (University College London) and Gareth W. Peters (Heriot-Watt University). |
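The composite transformation described in the abstract can be sketched in its simplest univariate form: map a driving diffusion through its own marginal CDF, then through the quantile function of a target law, producing a process with the prescribed marginal at each time. The driving Brownian motion and the exponential target below are illustrative assumptions, not the construction used in the research.

```python
import numpy as np
from scipy.stats import norm, expon

def quantile_transform_path(x, t, mu=0.0, sigma=1.0):
    """Composite transformation sketch: X_t ~ N(mu, sigma^2 t) is mapped
    through its marginal CDF F_{X_t}, then through the quantile function of
    a target law (here Exp(1)), giving a process with the target marginal."""
    u = norm.cdf(x, loc=mu, scale=sigma * np.sqrt(t))  # F_{X_t}(X_t), uniform
    return expon.ppf(u)                                # target quantile function

rng = np.random.default_rng(4)
t = 1.0
x = rng.standard_normal(100_000) * np.sqrt(t)  # samples of X_t (BM at time t)
q = quantile_transform_path(x, t)              # marginal should be Exp(1)
```

In the multivariate setting of the talk, a copula supplies the dependence between the marginals before each coordinate is pushed through its quantile function.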
Speaker |
|
Time |
18:10 - 18:40 on 28 February (Wed) |
Title |
Environmental indicators towards green finance |
Abstract |
As countries press for action on global environmental problems such as climate change, how to procure the funds necessary for solving them is an urgent issue. Interest has been increasing in recent years in green finance, which raises funds for implementing green projects. In green finance, it is crucial to certify the environmental improvement effect with environmental indicators. In this talk, several environmental indicators are introduced. |
Speaker |
Prof. Konstantin Markov (Aizu U.) |
Title |
Personality recognition from Facebook status updates |
Presentation | Video |
Speaker |
Prof. Tor Andre Myrvoll (SINTEF Digital) |
Title |
Public Transport Passenger Counts Using WiFi Signatures of Mobile Devices |
Presentation | Video |
Speaker |
Prof. Daisuke Murakami (ISM) |
Title |
A spatiotemporal generalized hyperbolic model for heatwave monitoring |
Presentation | Video |
Speaker |
Mr. Xiaoming Zhang (Bloomberg) |
Title |
Finance of Green Bonds: Qualitative and Quantitative Aspects |
Presentation | Video |